aaronpk / aperture
Aperture is a Microsub server. Currently in beta.
Home Page: https://aperture.p3k.io
License: Apache License 2.0
Like an indie Dopplr
I would make a channel and add everyone's feeds of their travel plans.
Alternatively, this could be a special channel that is populated with travel plans found in another channel.
Apologies if I am posting this in the wrong place!
Using a self-hosted install from master that I got working tonight. Many microformats sources seem to work well.
However, when attempting to add http://tantek.com/ as a source, Aperture reports no feeds whatsoever, despite an Atom feed being listed in the head and an h-feed in the page.
The fragment ID URL http://tantek.com/#updates gives the same "No feeds were found at the given URL" response.
I'm not sure where to look in the logs to see what might be happening, here.
You seemed to think that in a mark-as-read query, a single entry[] item should work. However, it doesn't seem to work with Aperture currently. Below is my POST body:
action=timeline&channel=f9j5PuPeFZgSsekVAwcWr02J7zQRURC0&method=mark_read&entry[]=10862
My response was a 500 error and the following error:
ErrorException: Undefined variable: entryIds in file ~/aperture/app/Http/Controllers/MicrosubController.php on line 445
I'm thinking it's missing an "else" branch. The current code is:

if(!is_array(Request::input('entry'))) {
    $entryIds = [Request::input('entry')];
}
$result = $channel->mark_entries_read($entryIds);

When "entry" is already an array, $entryIds is never set, hence the undefined variable error. It seems the following is needed between those two statements:

else {
    $entryIds = Request::input('entry');
}
Feeds include additional data such as a name, image and description. In the case of podcast feeds, the podcast artwork is a critical component of the feed and should be presented when viewing entries from the feed.
XRay will need to return podcast artwork as author info first.
Aperture should also add the concept of feed-level name/photo/description in order to enhance the display in readers. Here is the current proposal for adding this to Microsub: indieweb/microsub#13
It would be nice to be able to import an OPML file from an existing feed reader so that you don't have to add feeds one by one.
Either build in WebSub (by using Switchboard as a library) or add support for connecting to an external WebSub hub.
By enabling WebSub support for each timeline feed, a client can get real-time notifications of new posts within a timeline. This helps enable things like Push Notifications for ANY channel.
I follow a bunch of people on Micro.blog and the IndieWeb. Right now all their content goes into a standard channel. What I'd love to do is have a text based timeline to read that looks like a Twitter layout, but have all the Photos/Images go into a channel that looks more like Instagram.
I like the idea of having a channel for photos and a channel for text. Right now, I can make a channel that is JUST photos, but I can't prevent those photo posts from appearing in the regular timeline channel, so then I end up seeing duplicate posts.
Free users will have some limitations, such as how long items are stored in channels.
Breaking this out to Aperture from indieweb/microsub#6
I'm going to first implement this as an Aperture-only feature, in order to be able to experiment with the UI and features faster than if it were in the Microsub spec.
I created a token using https://gimme-a-token.5eb.nl with scopes read follow mute block channels
On an attempted GET call to /microsub/1?action=channels I receive:
{
"error": "unauthorized",
"error_description": "The access token provided does not have the necessary scope for this action"
}
Otherwise Aperture appears to be functioning as expected. Feeds are added, sent to watchtower, etc.
Given your (or a channel's) following list, find birthdays, then create h-events for upcoming birthdays. A Microsub client that shows events will be able to show upcoming birthdays without understanding birthdays specifically. Could also index your website's nicknames cache to find birthdays there, or maybe even provide a way to sync to Google Contacts.
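As a rough illustration of the idea (a hypothetical sketch, not Aperture code): given (name, birthday) pairs pulled from h-cards, compute which birthdays fall in an upcoming window so they could be emitted as h-events.

```python
from datetime import date

def upcoming_birthdays(contacts, today=None, window_days=30):
    """contacts: (name, birthday) pairs, where the birthday's year may be
    the birth year or a placeholder. Returns (name, next occurrence) pairs
    falling within the window, soonest first. (Feb 29 handling is omitted
    for brevity.)"""
    today = today or date.today()
    results = []
    for name, bday in contacts:
        next_bday = bday.replace(year=today.year)
        if next_bday < today:  # already passed this year; use next year
            next_bday = next_bday.replace(year=today.year + 1)
        if (next_bday - today).days <= window_days:
            results.append((name, next_bday))
    return sorted(results, key=lambda pair: pair[1])

print(upcoming_birthdays(
    [("Alice", date(1990, 3, 14)), ("Bob", date(1985, 12, 25))],
    today=date(2024, 3, 1)))
```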
Currently you can add an "API Key" in order to write content into channels. This is actually Aperture acting as a Micropub endpoint. It should be possible to log in to existing Micropub clients as one of these API keys, doing the IndieAuth flow and resulting in providing the client with the API key. This would provide a way to set up fancier integrations without copying and pasting tokens.
Presently, read/unread tracking is always on. It'd be nice to be able to configure a particular channel to disable read/unread tracking and just deliver everything as "read" by default.
I'm receiving this error from ?action=channels
{
"error": "unauthorized",
"error_description": "The access token provided does not have the necessary scope for this action"
}
At first I thought this might be a scoping issue. I checked my token endpoint and received:
{
"me": "https://eddiehinkle.com/",
"client_id": "*******",
"scope": "create read"
}
In IRC we talked and thought it might be because it is expecting scope to be an array. However, I tried the following commit: EdwardHinkle@5c0e9b4, which still presents the same error. So it doesn't seem quite as easy as I thought.
I see a lot of blank author images from non-IndieWeb sites. I think it would be smart to fall back to the site favicon/icon if possible.
If a post only has plain text, then Aperture should autolink it and provide an HTML version of the post for clients that want to display HTML.
This gives the Microsub server the ability to autolink using its own rules, such as the user's own nicknames cache, linking to hashtag pages, etc.
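A minimal sketch of the idea in Python (illustrative only; Aperture itself is PHP, and the URL pattern here is a deliberate simplification): escape the plain text as HTML, then wrap bare URLs in anchors.

```python
import re

# Simplified URL pattern: http(s) scheme up to the next whitespace or
# HTML-significant character.
URL_RE = re.compile(r'https?://[^\s<>"]+')

def autolink(text):
    """Return an HTML version of a plain-text post with bare URLs linked."""
    escaped = (text.replace('&', '&amp;')
                   .replace('<', '&lt;')
                   .replace('>', '&gt;'))
    return URL_RE.sub(
        lambda m: '<a href="{0}">{0}</a>'.format(m.group(0)), escaped)

print(autolink('see https://aperture.p3k.io for details'))
```

A real implementation would also handle trailing punctuation, @-names, and hashtags, per the server's own rules.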
Micropub config says:
{
"error": "forbidden",
"error_description": "",
"messages": []
}
logging in with monocle gives:
Error
There was a problem trying to load the channels from your Microsub endpoint.
Microsub endpoint: https://aperture.p3k.io/microsub/63
Your website: http://known.kevinmarks.com/
The endpoint returned the following response.
{
"code": 403,
"header": "HTTP/1.1 403 Forbidden\r\nServer: nginx/1.14.0\r\nContent-Type: application/json\r\nTransfer-Encoding: chunked\r\nConnection: keep-alive\r\nX-Powered-By: PHP/7.2.7-1+ubuntu16.04.1+deb.sury.org+1\r\nCache-Control: no-cache, private\r\nDate: Thu, 16 Aug 2018 13:41:31 GMT",
"body": "{\"error\":\"forbidden\",\"error_description\":\"The token endpoint could not verify this access token\",\"token_endpoint\":{\"url\":\"http:\\/\\/known.kevinmarks.com\\/indieauth\\/token\",\"code\":404,\"response\":\"Client mismatch.\"}}",
"error": "",
"error_description": "",
"url": "https://aperture.p3k.io/microsub/63?action=channels",
"debug": "HTTP/1.1 403 Forbidden\r\nServer: nginx/1.14.0\r\nContent-Type: application/json\r\nTransfer-Encoding: chunked\r\nConnection: keep-alive\r\nX-Powered-By: PHP/7.2.7-1+ubuntu16.04.1+deb.sury.org+1\r\nCache-Control: no-cache, private\r\nDate: Thu, 16 Aug 2018 13:41:31 GMT\r\n\r\n{\"error\":\"forbidden\",\"error_description\":\"The token endpoint could not verify this access token\",\"token_endpoint\":{\"url\":\"http:\\/\\/known.kevinmarks.com\\/indieauth\\/token\",\"code\":404,\"response\":\"Client mismatch.\"}}",
"headers": {
"Server": "nginx/1.14.0",
"Content-Type": "application/json",
"Transfer-Encoding": "chunked",
"Connection": "keep-alive",
"X-Powered-By": "PHP/7.2.7-1+ubuntu16.04.1+deb.sury.org+1",
"Cache-Control": "no-cache, private",
"Date": "Thu, 16 Aug 2018 13:41:31 GMT"
},
"rels": []
}
The known endpoints are
<link rel="authorization_endpoint" href="http://known.kevinmarks.com/indieauth/auth">
<link rel="token_endpoint" href="http://known.kevinmarks.com/indieauth/token">
<link rel="micropub" href="http://known.kevinmarks.com/micropub/endpoint">
Sometimes I want to archive a channel so that it no longer shows up in my main apps. For example a channel created to follow a conference hashtag. I want a way to archive a channel so that it doesn't show up in the list anymore, either until it's been un-archived, or maybe if there is some option to temporarily show archived channels.
Currently when adding a source, when [Find Feeds] is activated, a list of feeds is presented. These feeds do not display the title which may have been given by the site in markup.

Example: https://ascraeus.org/, in addition to h-feed mf2 markup, has the following HTML markup:

<link rel="alternate" type="application/atom+xml" title="Ascraeus Main Feed" href="https://ascraeus.org/index.xml" />
<link rel="alternate" type="application/atom+xml" title="Ascraeus Micropost Feed" href="https://ascraeus.org/micro/index.xml" />
<link rel="alternate" type="application/json" title="Ascraeus JSON feed" href="https://ascraeus.org/jsonfeed/index.json" />
<link rel="self" href="https://ascraeus.org/" />

These feeds are displayed as:

microformats: https://ascraeus.org
atom: https://ascraeus.org/index.xml
atom: https://ascraeus.org/micro/index.xml

To aid clarity, these "atom" identifiers should display the content of the title attribute set in the page's HTML.
Given an existing channel, create a meta-channel containing only h-cards with the last known location of everyone from the main channel. Useful as a "where are all my friends" channel if shown on a map.
I should start by saying this is because I am lazy. I was trying to use IFTTT to push notifications to Aperture using the Webhooks "channel", but it does not support adding headers, so I can't pass "Authorization: Bearer XXX". Would it be possible to add support for access_token in the body as well?

Looking at the code, it seems like you are checking for Authorization: Bearer and returning a 401 if it's missing; if I am missing something, please let me know.
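For illustration, here is a sketch of the requested fallback in Python (hypothetical names; Aperture's actual code is PHP/Laravel): check the Authorization header first, then fall back to an access_token form parameter, which OAuth 2.0 Bearer Token Usage (RFC 6750, section 2.2) permits.

```python
def extract_token(headers, form):
    """Return the bearer token from the Authorization header, falling back
    to an access_token form parameter (for clients like IFTTT Webhooks that
    cannot set headers). Returns None if neither is present."""
    auth = headers.get('Authorization', '')
    if auth.startswith('Bearer '):
        return auth[len('Bearer '):]
    return form.get('access_token')

print(extract_token({}, {'access_token': 'xyz'}))  # xyz
```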
Since one of the principles of the IndieWeb movement is to 'own your data' and to comply with the GDPR right to data portability, Aperture should have a feature to export all subscriptions.
Possible alternatives:
When creating a new channel now, it gets sort order 0. That leads to both the Notifications channel and the new channel having sort order 0, and when you try to move the new channel down one, it moves the one beneath it up to 0 and gets stuck.
Recommendation: New Channels get placed at the bottom.
Alternate suggestion: the new channel gets sort 1, and all other channels get moved down by 1.
RFC5005 is a standard that was published in 2007 for "Feed Paging and Archiving" for Atom and RSS. I'd like Aperture to support consuming feeds using at least section 2, and preferably also section 4, of this standard.
Section 2, "Complete Feeds", should be easy, I hope. If a feed contains an empty <fh:complete/> tag, then Aperture should discard any saved entry whose GUID is no longer present in the feed document. In other words, the entries that are currently in the document should be treated as the complete set of entries for this feed. In this mode, if an entry had been present in the past but is missing now, that's because it was deleted upstream, not because it scrolled out of the most recent 10 entries.
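The section 2 reconciliation can be sketched as follows (Python, hypothetical names; a sketch assuming stored entries are keyed by GUID, not Aperture's actual storage model):

```python
def reconcile_complete_feed(stored, current_guids):
    """When a fetched feed carries <fh:complete/>, drop every stored entry
    whose GUID is absent from the current document: those were deleted
    upstream. stored: dict of guid -> entry."""
    return {guid: entry for guid, entry in stored.items()
            if guid in current_guids}

stored = {'a': 'first post', 'b': 'second post', 'c': 'third post'}
print(reconcile_complete_feed(stored, {'a', 'c'}))  # 'b' was deleted upstream
```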
It might be nice if Microsub would also signal to clients that this feed is complete, allowing them to offer different UI if they want to. I'm not sure they strictly need it though. Doing this purely in the Microsub server should go a long way.
Support for section 4, "Archived Feeds", would also be great. In that mode, a feed should not have the <fh:complete/> tag, but instead should have a <link rel="prev-archive"> tag, where the href attribute points to another feed document. If you successfully walk the prev-archive links all the way back until there aren't any more, then you should have a complete feed, as in the first case.
For efficiency, however, you're permitted to treat the archived feed documents as if they have far-future Expires headers. If the publisher needs to change any of the archived feed documents, it needs to generate a new URL for them, so the next time you fetch the main feed you'll see that its prev-archive link has changed. This means you need to keep enough information to notice that some archived feed documents you've fetched before aren't part of the history any more, and also to discard any entries that were in those feed documents but haven't been copied into the new feed documents.
In addition, the spec says that an entry with the same GUID may appear in multiple feed documents, and if so, you should only use the version from the most recent feed document. This is a trade-off the publisher can choose to make: if an old entry is changed, it can avoid forcing existing clients to redownload the old archives, at the cost of making new clients download a larger total number of entries.
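The archive walk described above can be sketched like this (Python; fetch() is a hypothetical stand-in returning each document's entries plus its prev-archive link). Because the walk starts at the newest document, keeping the first version seen per GUID implements the "most recent feed document wins" rule:

```python
def collect_archived_feed(start_url, fetch):
    """Walk prev-archive links from the main feed document back to the
    oldest archive, returning guid -> entry with the newest version of
    each entry. fetch(url) -> (list of (guid, entry), prev_archive_url)."""
    entries = {}
    url, seen = start_url, set()
    while url and url not in seen:  # guard against prev-archive loops
        seen.add(url)
        doc_entries, url = fetch(url)  # url becomes the prev-archive link
        for guid, entry in doc_entries:
            entries.setdefault(guid, entry)  # first seen = newest doc wins
    return entries

docs = {
    'main':   ([('3', 'new'), ('2', 'updated')], 'arch-1'),
    'arch-1': ([('2', 'old'), ('1', 'oldest')], None),
}
print(collect_archived_feed('main', lambda u: docs[u]))
```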
I hope all of that is easier to do than it is to explain. :) I think the simplest thing is to recompute the entire set of entries whenever the feed changes. But since the common case is that only recent entries get updated, and since a large feed may have a lot of archived pages, it'd be nice to avoid most of the recomputation when possible.
It also might be nice to load archived pages lazily, in response to Microsub clients using the paged API. That strikes me as more complicated state to keep in the server but it's probably worth doing eventually.
Adding support for section 4 means there are some new error modes that might need to be reported to Microsub clients. In particular, fetching one of the archive pages might fail even though newer ones succeeded, and I'm not sure how you'd inform the user of that failure mode.
In short: I think supporting RFC5005 section 2 is easy and I'd love to see that done first; then section 4 is more useful but opens up some additional questions that may need more thought.
Something is very wrong with the paging mechanism in certain circumstances. I think the common failure mode is when a bunch of entries get added to a channel all at the same time.
I am not really a sysadmin, so maybe I'm missing a command I should be running, but: of my 50GB, 44GB was filled with aperturexxxx files in my /tmp folder, freezing my server :)

A little search through the code points to these lines:

Aperture/aperture/app/Listeners/EntrySavedListener.php, lines 119 to 120 in 9896b58

At the end of the method there is an unlink(), but that does not seem to work (maybe an fclose() is missing here?). There is also the $resized file, which does not get unlinked.
One for once Aperture is more stable. I'm lazy, so a Docker image would be great and would make it easier for others to self-host.
When I log in with http://kevinmarks.com I get user 89 and one config.
If I then connect using monocle I get an error because monocle authed http://www.kevinmarks.com
When I log in with http://www.kevinmarks.com I get user 90.
Logging in with monocle then works.
I think the 301 redirect from kevinmarks.com to www.kevinmarks.com should change the URL that Aperture stores too
Currently, anything related to channels requires a channels scope, however that scope should only apply to managing channels. Reading channels should just require the read scope.
When attempting to add http://audreywatters.com/feed.xml to a channel, Aperture responds that no feeds were found at that URL, which is itself a feed.
Standalone XRay reports that it is a feed and finds content there: https://xray.p3k.io/parse?url=http%3A%2F%2Faudreywatters.com%2Ffeed.xml
Currently each source is treated independently, and if two sources end up having the same entry, the entry will appear twice in the channel. This is only a problem when, for example, I add a Twitter search from granary.io as a source while also having my streaming search script running, and a tweet matches both.
A channel should have an option to dedupe entries by the entry URL across all sources in the channel. This is potentially unsafe, since any source can claim an entry URL on any domain, so this should not be the default.
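The opt-in dedupe can be sketched as follows (Python, hypothetical names): keep only the first entry delivered for each URL across all of a channel's sources.

```python
def dedupe_by_url(entries):
    """Drop entries whose URL has already been delivered by another source
    in the same channel. Entries without a URL are always kept."""
    seen, result = set(), []
    for entry in entries:
        url = entry.get('url')
        if url and url in seen:
            continue  # another source already claimed this URL
        if url:
            seen.add(url)
        result.append(entry)
    return result

entries = [
    {'url': 'https://example.com/1', 'source': 'granary'},
    {'url': 'https://example.com/1', 'source': 'streaming-search'},
    {'url': 'https://example.com/2', 'source': 'granary'},
]
print(len(dedupe_by_url(entries)))  # 2
```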
Improve reply contexts by inspecting the reply URL, as documented on https://indieweb.org/reply-context#Minimal_text_reply_contexts
Trying to delete a channel, I keep getting a 500 error with an HTML page (not JSON) saying "Whoops, looks like something went wrong."
I think the POST request looks fine; I've just been trying it in Postman.
should move all existing entries from the old channel to the new channel
Dunno if you want a separate issue for each feed-parsing problem I run into, or a single issue with a list of problematic feeds.
Anyway, https://miklb.com/ is not picking up an author property, although it does seem to work with the php-mf2 parser.
Aperture won't find a feed generated by a URL such as, e.g., http://feeds.pinboard.in/rss/u:ats/. If I save that feed to a pinboard.rss file at my web server, then Aperture can find it, parse it, and add it to a channel.
Can Aperture be updated to find and parse the file served by the original URL?
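One way to handle this, sketched in Python (illustrative only, not Aperture's actual detection code): classify the fetched URL by its Content-Type response header, with a light body sniff as a fallback, instead of relying on a file extension in the URL.

```python
# Common MIME types that indicate the URL is itself a feed document.
FEED_TYPES = {
    'application/rss+xml', 'application/atom+xml',
    'application/xml', 'text/xml',
    'application/json', 'application/feed+json',
}

def looks_like_feed(content_type, body):
    """Decide whether a fetched resource is a feed, regardless of its URL."""
    mime = content_type.split(';')[0].strip().lower()
    if mime in FEED_TYPES:
        return True
    # Fallback sniff: XML prolog or a feed root element near the top.
    head = body.lstrip()[:100].lower()
    return head.startswith('<?xml') or '<rss' in head or '<feed' in head

print(looks_like_feed('text/xml; charset=utf-8', ''))  # True
```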
When you use IndieAuth to set up a new Micropub token for a channel, you should be able to create a new channel and then use that channel for the Micropub token being created
(Originally published at: https://eddiehinkle.com/2018/07/26/12/reply/)
Ran into an issue where I think Monocle saved my site URL with a trailing slash, but my token endpoint returns my URL without a trailing slash, so I get the invalid_user error.
Looks to me like this code should either handle query-string responses or include the Accept: application/json header.
But it might just be my token endpoint.
Without visiting the web interface of Aperture, no channels were created (and I could not add a new one using Together), so some database setup seems not to be triggered by first use of the Microsub API.
Aperture allows me to exclude reposts, likes, etc., but it does not allow me to do the opposite: to see only reposts.
I follow several people on Twitter who occasionally post original Tweets, but usually just retweet other people's stuff. I'd like the ability to follow their original posts in one channel, and dump all their retweets into another channel. Some (sometimes, most) of their retweets are interesting and worth skimming; but I'd like to segregate them from original tweets.
I realize this may be somewhat orthogonal to the intent of Aperture, and this functionality could be handled by a different service that munges the Twitter feeds and presents new feeds to Aperture. Feel free to close this "WONTFIX" if that's the case.
According to https://indieweb.org/Microsub-spec#Channels_2:
The response will contain a channels property with the list of channel uids and names. The uid=default and uid=notifications channels are not part of this response, so for users who have only the default channels, the response will be an empty array.
Currently, Aperture returns all of the channels, including the default and notifications channels.
Currently a bunch of entries from new feeds are added at the top of a channel, completely overwhelming it when it's first created. On the first add of a feed, use the published date to backdate the entries.
When I try to sign-in I get an error message:
> missing authorization endpoint
> Could not find your authorization endpoint
It would be great if instead it would fall back to supporting RelMeAuth, perhaps via indielogin.com, so anyone who has already set up rel=me for signing in to indieweb.org could also sign in to Aperture!
(Originally published at: http://tantek.com/2018/177/b1/support-fallback-relmeauth)