But all kidding aside, web directories should be much more powerful now than in the 90s. Websites have RSS, and directory websites should be able to automatically monitor things like uptime, and leverage RSS to preview a site's most recent post.
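For what it's worth, here is a minimal sketch of what that could look like, assuming Python with the third-party `feedparser` and `requests` packages; the feed URL is just a placeholder:

```python
# Sketch only: assumes feedparser/requests; the URL below is a placeholder.
import feedparser
import requests

def preview_site(feed_url: str) -> dict:
    """Fetch a site's feed once: doubles as a crude uptime check and a preview."""
    response = requests.get(feed_url, timeout=10)
    feed = feedparser.parse(response.content)
    latest = feed.entries[0] if feed.entries else None
    return {
        "reachable": response.ok,
        "site_title": feed.feed.get("title"),
        "latest_title": latest.get("title") if latest else None,
        "latest_link": latest.get("link") if latest else None,
        "latest_published": latest.get("published") if latest else None,
    }

print(preview_site("https://example.com/feed.xml"))
```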
I've considered maintaining my own directory on my personal website (a one-way webring if you will), but always stopped because the sites I linked to either died, or were acquired and became something very different.
I prefer Mark Twain's “History never repeats itself, but it does often rhyme.”
It's pretty obvious that we have reached a stagnant period of online content. There's a desire to move past the glamour of Instagram, the political fights on Twitter, and the ad-revenue-optimised videos on YouTube, but I don't think personal websites are coming back.
Those were cool because only a specific type of person was able to build websites. Then code-free services for sharing content came along and everybody got an online presence, but because the medium is the message, we are kind of getting tired of the message. There seems to be a search for a new medium. The next verse feels just around the corner, but I don't think we have found it yet!
It might just be observation bias, but it seems to me that personal websites and blogs are coming back, at least for the kinds of people who might have had one on the old web. Perhaps the trend won't wash over the whole web, but having a subculture that is at least as large as the old web would be great, no?
Regarding the lifetime of a site, it might be possible to submit requests to the Internet Archive or a similar service whenever a site is added to a directory or a new post is found on it. That would also make it easier to see when a site is no longer active or has turned into something else. Then, when it's deactivated, the web directory could just point to the archive first.
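As a rough sketch of that idea in Python (using only `requests` and the public Wayback Machine "Save Page Now" and availability endpoints; nothing here is specific to any particular directory):

```python
# Sketch only: public Wayback Machine endpoints, requests as the sole dependency.
import requests

def request_archive(url: str) -> None:
    # A simple GET to /save/<url> asks the Wayback Machine to queue a capture.
    requests.get(f"https://web.archive.org/save/{url}", timeout=30)

def latest_snapshot(url: str) -> str | None:
    # The availability API returns the closest archived snapshot, if any exists.
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url},
        timeout=10,
    )
    snap = resp.json().get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap else None
```

A directory could call the first function on submission and fall back to the second once a site stops responding.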
I wonder how soon people will start to collaboratively train ML models for curation, by their acts of curation, much like spam filters are trained today.
I love to see this. The death of blogs and RSS is highly exaggerated. The idea that Google "killed" blogs by killing Google Reader is a meme that is more destructive than Google's act in itself.
There are countless healthy and active blogs that you can read via RSS. There are great RSS reader apps.
We technically-minded folks need to keep being proactive about helping people read the web via RSS, improving discovery, and continually making RSS a first-class option on the sites we build.
I think people see them as dead because only a small percentage of internet users engage with them, but people forget that billions more people have access to the internet now. Even though it's a smaller percentage, the actual number of users has, in my experience, still gone up.
A shameless plug: I've recently built a simple yet powerful blog reader that could satisfy the needs of most people: https://lenns.io. Why? Because it solves three primary issues with "traditional" readers:
1) You can follow websites (based on blog-post titles) when there isn't a proper RSS feed.
2) You have control over how many results per feed are listed (e.g. your whole feed isn't overloaded with the posts of one source).
3) You can assign priorities to sources and categories so that you have control over what's at the top of your feed.
... if someone here bothers to give it a try, I'd be happy to receive any feedback.
> As most blogs and websites don't export the full content of their posts, that leads to a mixed reading experience. Quite often, you click the title of a post within a conventional reader only to see that there's nothing there apart from another link to the original post.
It’s the opposite for me. Almost every blog exports the full text. It used to be the other way, but that was in the GReader times.
> The feed reader for people that want to be in control
I find that a questionable headline with no self-hosting option.
And it's missing screenshots; I never sign up blind for anything.
This sounds interesting to me. Not because RSS is broken, I hardly ever encounter a site without a feed. But the "only one best article per site per day" sounds very smart.
However...do I get this by e-mail? In an app? Web interface? Maybe you could add some screenshots to your page :-)
Why did people create podcasts for free and distribute them via RSS? Because there once was a time when advances in computing had use cases that weren't driven entirely by commercial interests.
We still do! But the percentage of such publishers is smaller than back then, maybe against the desires of would-be publishers. Do you think it was easy to offer paid online content and get paid in 1998?
You can always choose to. Not everyone will join you, of course, and you might be discouraged because of that, but there are plenty of us putting websites and content online without commercial interests.
On my blog I get about 800 hits per month. It’s not enough to generate meaningful revenue with ads. So my blog is more about branding and SEO for my name, and slapping an ad or even a donate button feels cheap to me.
If I had 10x the visitors I might see things differently but I think a lot of small blogs are in my boat.
Actually, not really. It depends on the post, of course, but for a VC ad channel, HN has plenty of users who are very much not aligned with that mindset, and more, let's say, idealistic comments regularly get upvoted here.
I have been maintaining a personal website since 2001 and the core interest of my website is to share things I find interesting. RSS does not go against that. On the other hand, RSS makes it easy for subscribers to find out when I have shared something new.
I would say that providing the first paragraph of text in the RSS feed might actually attract more users/readers to a site to read the full article (paid or not).
Wasn't it Basecamp/37Signals who said to emulate drug dealers and give the first try away for free? ;)
Drug education has always been all kinds of divorced from reality. But once you step away from literal drugs then the concept of a freebie to hook you is not at all uncommon.
They wouldn't, but who cares about them? RSS is for people who blog to share their interests freely, and to help their readers get the content more easily.
> Is this just basic nostalgia, people wanting to recreate the dial-up days or even BBS days?
That's certainly not why I created my search engine. Old isn't an end, it's a means to cut the bullshit.
Like, I read a lot of old books, not because I'm nostalgic for yellowed paper, but because they consistently have a much better signal-to-noise ratio than most of what you'll find on a screen or printed past 1990 or so. (When people bought books in physical bookstores and weren't primarily ordering them online, books weren't judged by their page count as a proxy for how much content they contained, and thus had a lot less filler and fewer anecdotes.)
If you gave me a method of selection that was as reliable for identifying good books among contemporary books, I'd probably read more contemporary books as a result.
I believe that much of it is, indeed, basic nostalgia. Some of this, however, is also the recognition that not all of the early web was bad. Much of the early internet is viewed through rose-tinted glasses, but some of it was really good. For example, the ability to use a directory to find _exactly_ what you're looking for, while search just feeds you two pages of paid result listings. Likewise, Gopher made information available on even the most modest of machines (Gemini is trying to recreate something along those lines, somewhat improved), while the modern web can spin up the cooling fans on a high-end laptop from 2019.
This is the best thing for RSS in a long time.
What I miss though for RSS streams is commenting. 'Nobody' reads the articles on link aggregators (*edit: just the comments). In a way, RSS is a link aggregator that limits its user base to the ones who read and don't comment.
I am wondering what will happen if RSS readers find a way to share comments on posts. Maybe ActivityPub makes that possible.
The only service of which I am aware that allows for comments on RSS is https://linklonk.com/ .
Are there other approaches to bring comments to RSS streams?
Maybe ooh.directory can use ActivityPub to allow commenting and voting on the entries. Comments on HN are great to check for problems with an article. That should also be true for comments about entire blogs.
> I am wondering what will happen if RSS readers find a way to share comments on posts
The IndieWeb community is focused on this with Social Readers[1] and Webmentions[2]. The core idea is your reader also ties into your own published feed, so you can make a comment right in your reader that publishes the comment to your own feed and sends a webmention to the original article so they know about it.
Barriers to entry are still kinda high (much like making a website 25 years ago) so any adoption should lead to a better signal/noise ratio. Unless it becomes popular enough for bots to start spamming the webmentions...
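For anyone curious, here is a simplified Python sketch of the sending side of that flow. Real endpoint discovery also checks `<link>`/`<a rel="webmention">` tags in the HTML body; this only reads the HTTP Link header, so treat it as an illustration of the protocol rather than a complete client.

```python
# Sketch only: partial Webmention discovery (Link header only), requests as dependency.
from urllib.parse import urljoin
import requests

def send_webmention(source: str, target: str) -> bool:
    """source = URL of your published reply, target = URL of the post replied to."""
    head = requests.head(target, timeout=10, allow_redirects=True)
    endpoint = None
    for link in requests.utils.parse_header_links(head.headers.get("Link", "")):
        if "webmention" in link.get("rel", "").split():
            endpoint = urljoin(target, link.get("url"))
            break
    if not endpoint:
        return False  # no endpoint advertised in the Link header
    # The endpoint expects a form-encoded POST naming source and target.
    resp = requests.post(endpoint, data={"source": source, "target": target}, timeout=10)
    return resp.ok
```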
Webmention depends on the original source to link back to those comments for discovery though, right? I think perhaps a better approach would be a way to comment on any URL and see comments from other users (or communities) that you have subscribed to. That way the author of the content being discussed is not in a position to limit the discussion to what they approve of and you can discuss things that are not even opted into this system.
Ideally this system would also be integrated into browsers so you can see and write comments even when visiting a URL directly.
That's exactly what my vision is for comments on Haven[1]. Without webmentions, you're just telling your friends/followers what you think, which enables private comments/conversations on public content.
> What I miss though for RSS streams is commenting.
>In a way, RSS is a link aggregator that limits its user base to the ones who read and don't comment.
I share this view. What do you think about a concept wherein a collection of followed feeds is presented in a timeline with the possibility to 'comment' with an email form? It would look like a regular comment textarea under some blog post, but the commenting is done by email. The reaction isn't immediately visible under the blog post (if at all), and therefore everything works at a humanly slow pace. But there's also no signup required, so it's more anonymous and openly accessible.
>a collection of followed feeds is presented in a timeline with the possibility to 'comment'
From my point of view, that's the future.
Using mail is dangerous because it is another protocol, and it still requires account management because nobody will post to arbitrary pages with their primary mail address. That said, mail still has the potential to become the standard for social networks.
Why should comments not be immediately visible? People can already write mails to authors. That's not what builds momentum. The interesting part is the interaction of the audience.
We can build social interaction that improves direct human interaction in the same way that cars improve human movement. I wouldn't focus on slowing it down but on speeding up the filtering, so that it is easy to find the individually preferred audience among all possible reactions to a URL. The difficult part is to maintain group identities and momentum when all comments are dissected and re-aggregated.
It's problematic to manage comments on the server of a blog post. This creates silos and echo chambers, because it is difficult to find a user's comments on other blogs, and creators rarely allow their audience to create their own posts. Newspapers with their comment sections already offer that protocol. We have Facebook because newspapers missed out on coming together and offering the missing parts.
What is missing at Linklonk, apart from opening up to ActivityPub, is long-term conversations on posts. On aggregators like HN, there is an audience at the top comments. Hardly anybody writes comments on old posts. To make comments on off-momentum articles worthwhile, it would be necessary to have a notification function that is triggered not only by replies but also by new comments when they pass a threshold.
LinkLonk supports comment notifications for any item that you have subscribed to. You get notified through browser notifications and email. And you can mute/follow individual branches of the comment tree. Details: https://linklonk.com/item/396040801387544576
There is definitely more to improve, but I'd like to see more use of the comments section first to know what needs to be improved next.
As for ActivityPub, I just saw https://go-fed.org/ on HN and it looks like a good fit to add ActivityPub support to LinkLonk (since it is written in Go). There are multiple ways to do it, so let me know how you would like to see it working. Do you want to follow posts of Mastodon users on LinkLonk (similar to how it shows you RSS content)? Do you want Mastodon users to be able to follow your channels on LinkLonk (e.g. @s3000@linklonk.com)?
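To illustrate the second option: if LinkLonk exposed handles like @s3000@linklonk.com, the first thing a Mastodon server would do is a WebFinger lookup (RFC 7033) to map the handle to an ActivityPub actor document. A hedged Python sketch, purely to show the shape of that lookup; LinkLonk does not serve this endpoint today:

```python
# Sketch only: standard WebFinger lookup; LinkLonk would need to implement the endpoint.
import requests

def resolve_actor(handle: str) -> str | None:
    """Resolve 'user@host' (or '@user@host') to the actor document URL."""
    user, host = handle.lstrip("@").split("@", 1)
    resp = requests.get(
        f"https://{host}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{host}"},
        timeout=10,
    )
    if not resp.ok:
        return None
    for link in resp.json().get("links", []):
        if link.get("rel") == "self" and "activity+json" in link.get("type", ""):
            return link.get("href")
    return None
```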
>LinkLonk supports comment notifications ... can mute/follow individual branches
Great!
>how you would like to see it working
That's a difficult question. My primary focus is on the availability of discussions. Nevertheless, integrating LinkLonk in both directions with Mastodon should help with growth. There doesn't seem to be an algorithmic timeline, which LinkLonk could provide. Establishing LinkLonk as an integrated RSS reader and ActivityPub client could attract a bigger user base, which would make calculating suggestions easier.
What ActivityPub can bring to new social networks like LinkLonk is the ability to share the comment section with various other providers. E.g. RSS readers or annotating browser plugins can pool their comments in one place to have an active discussion even if each tool has a small number of users.
However, pooling discussions is generally avoided:
1. When there is a dupe on HN, the discussion in the old submission is not continued.
2. There exist various aggregators, like lobste.rs, reddit.com/programming or tildes.net, that discuss almost the same articles as HN, but they are different communities.
3. Twitter and Mastodon could pool the replies to posts of the same url, but they don't.
The trivial approach for LinkLonk would be an integration with lemmy.ml and all Mastodon posts that discuss the same URL. That way, most posts would come with a discussion and LinkLonk would instantly have a vibrant community.
With this pool-avoiding behavior in mind, some adjustments might be needed, but I haven't figured out what they could be.
> Why should comments not be immediately visible? People can already write mails to authors. That's not what builds momentum. The interesting part is the interaction of the audience.
I agree, but I was thinking of some filtering function wherein the webmaster can whitelist/blacklist comments. If a certain site or topic attracts - for whatever reason - visitors who abuse the comments, the webmaster needs to be able to block stuff afterwards or even upfront. In an ideal world, every visitor/commenter would only do nice things.
> Hardly anybody writes comments on old posts.
Interesting idea. I always like it when I get responses on my videos/comments on Youtube from years ago.
At the end of each of my blog posts I include a link to the tweet where I announced the post. Works well for me. If Twitter dies I guess I'll have to swap them out at some point.
Alternatively, you could use something like a Disqus embed for comments, but I didn't want to have one more thing to manage.
A nice addition would be having accounts and being able to like entries. Likes would not be public, but would instead combine to show you things that other people also liked. Since they are not public, hopefully that would discourage gaming the system.
DMOZ was and is freely licensed though. This one makes for a nice proof of concept, but there's no mention of availability or reusability for any of this data.
- Newsletters aren't included. Some sites are a blog and a newsletter, with identical content, but only those which mainly seem like a blog are included.
- Only blogs updated within the past year or so are added.
- Tumblrs are only included if they’re either focused on a specific topic or feature original content.
- Link blogs are only included if they include original commentary about each link.
- No blogs promoting hate speech, denial of climate change, anti-vax ideas, etc.
> - No blogs promoting hate speech, denial of climate change, anti-vax ideas, etc.
I've heard of Twitter accounts getting banned because they mentioned some of these subjects in the context of criticizing them (which is the opposite of the result that would be expected).
How their curation process works is just as important as the rules themselves. If it's transparent and there's a person behind it (not just an automated algorithm), is there also a recourse process for false positives or bad decision-making?
My personal opinion is that Phil is a busy guy and doesn't have time to carefully analyze your request with any nuance.
Here's an example of the difficulty.
"Phil, can you review the article [link to blog] where you banned us from? We are critically analyzing a very socially sensitive topic and while we are disagreeing with the majority of people, we actually encouraging unity and are not encouraging any hate. You can clearly see if you read our entire article."
Or, "I only linked to Trump's tirade to point out how insane it is. Can you re-read my article....?"
> I've heard of Twitter accounts getting banned because they mentioned some of these subjects in the context of criticizing them (which is the opposite of the result that would be expected).
My comment above is the very reason for this discussion. Suggesting not using the site is the same as suggesting not commenting here, which is nearly nonsensical.
This looks like a really cool project, excited to browse through over the holiday break. Just submitted my art blog (https://freezine.xyz), but realized I don't publish an RSS feed. Will have to address and resubmit.
Why, oh why....?
Stuck in hospital, so much free time, so many restrictions, and yet so few - what am I missing, or likely to have missed, being on mobile?
Sorry to hear you're in hospital, I hope you get well soon. The warning is there because I don't design the site for mobile, plus there is content hidden around the site that is easiest to find by viewing source or otherwise exploring in a way that mobile browsers don't make easy.
Nowadays, I find Feedly topics a good place to explore RSS sources. I believe it is human curation, so it is kind of a directory too (though less "indie" and not restricted to blogs). You can sort by Followers (=popularity) and articles/week.
I also recently started making some portal with curated links and feeds. I call it Discovery Portals. The very small beginning can be found at https://www.heyhomepage.com/discover/
This gave me a flashback to the final season of "Halt and Catch Fire", which I enjoyed a lot and recommend to anyone who has nostalgic feelings about computing and the internet in the 90s.
Smart request! I recently started collecting blogs on https://www.heyhomepage.com/discover/ (almost nothing to see there right now) and I plan to also publish the lists of links as easily searchable and shareable OPML files.
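For the OPML part, a minimal sketch using only the Python standard library (the blog entries below are placeholders):

```python
# Sketch only: placeholder data; OPML 2.0 outline elements as feed readers expect them.
import xml.etree.ElementTree as ET

def blogs_to_opml(blogs: list[dict], title: str = "Curated blogs") -> str:
    opml = ET.Element("opml", version="2.0")
    head = ET.SubElement(opml, "head")
    ET.SubElement(head, "title").text = title
    body = ET.SubElement(opml, "body")
    for blog in blogs:
        ET.SubElement(
            body, "outline",
            type="rss", text=blog["name"],
            xmlUrl=blog["feed"], htmlUrl=blog["site"],
        )
    return ET.tostring(opml, encoding="unicode", xml_declaration=True)

print(blogs_to_opml([
    {"name": "Example Blog", "feed": "https://example.com/feed.xml",
     "site": "https://example.com"},
]))
```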
I miss directories. The hard part is keeping them up to date and deleting unreachable sites, as well as flagging the inactive ones. I don't know if this site does this.
At the moment I've only added blogs that have been updated within the past year or so. But over time some blogs will obviously stop updating.
It's currently possible to filter each category to only show blogs that have been updated within the past week/month/year, which should help.
If it gets to the point where there are lots of blogs that are still live, but haven't been updated for a long time, maybe I'll make the default to only show blogs that have been updated within a year or so. We'll see.
I can see which blogs generate errors of different kinds when fetching their feeds, so I'll be able to spot blogs that disappear. If that happens they'll be removed/hidden from the site.
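A rough sketch of that kind of health check, assuming Python with `feedparser` and `requests`; the one-year threshold simply mirrors the inclusion rule mentioned above:

```python
# Sketch only: classify a listed blog as gone, stale, or active from its feed.
import time
import feedparser
import requests

ONE_YEAR = 365 * 24 * 3600

def classify_blog(feed_url: str) -> str:
    try:
        resp = requests.get(feed_url, timeout=10)
        resp.raise_for_status()
    except requests.RequestException:
        return "gone"      # fetch error: candidate for removal/hiding
    feed = feedparser.parse(resp.content)
    if not feed.entries:
        return "gone"
    newest = feed.entries[0].get("published_parsed") or feed.entries[0].get("updated_parsed")
    if newest and time.mktime(newest) < time.time() - ONE_YEAR:
        return "stale"     # still live, but not updated within the past year
    return "active"
```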
Terrific. Brings back early internet vibes. I want to see more of this — human-curated content. AI brought the promise of improved content suggestions; I find it has done exactly the opposite. At the end of the day, humans are better at curating content for humans than machines are.
I still have an instance of Fever running on one of my servers.... but I really need to update the feeds I'm tracking with it. I haven't updated them in ages.
Looks very nice. Unfortunately a lot of good bloggers now use paywalled Substack (I don't blame them, I don't have time to write for free either), but this is a good resource for the few that are still gratis.
An error occurred during a connection to ooh.directory. PR_END_OF_FILE_ERROR
The page you are trying to view cannot be shown because the authenticity of the received data could not be verified.
Please contact the web site owners to inform them of this problem.
I don't think it's that simple. It was useful then because web discoverability was an unsolved problem, and web content was tiny compared to today. A directory sort of fit the need. As the web grew, directories became bloated and hard to navigate, and search became more useful.
This is more of a nostalgia or niche link list. It doesn't serve the same purpose as Yahoo did in the 90s.
I think that search engines like Google are unbeatable for finding content we already know about. On the other hand, these lists are great for discovering what we don't already know about, such as new topics. Just like the awesome lists on GitHub.
My thought exactly, and also why I don't just search 'latest news' on Google every morning. Human curation is a thing. (not sure how much human curation goes into ooh.directory, but I'm pretty sure they don't use Google's ranking algorithms to surface links)
Fun and utterly off topic: I'm on a boat for my partner's birthday, and this "FortiGuard" web security thing they use on their satellite internet that yesterday temporarily prevented us from watching porn together is today preventing me from viewing this cool site, on the grounds: "Newly Observed Domain."
Wondering who thought new domains should be blocked "just in case" and how they determine that. What percentage of requests the service receives are for domains it's never seen before? I assume 90% or so are for the likes of Google, Facebook, etc., but what if someone has a phone app that calls weird API domains? Actually, that might explain some of the random weird failures I've been seeing on this trip...
"No blogs promoting hate speech, denial of climate change, anti-vax ideas, etc."
Well that's a shame, there's quite a few out there that pose credible and interesting questions and discussions around some of these topics.
Basically this is saying: "No conservative blogs allowed" which erases at least 50% or more of the population's musings.
I'd love to see an aggregator that isn't politically motivated and biased as this one is. The internet would be a better, freer place.
We may even find that these "hateful", "alarming" ideas are in fact mainstream after all and being unfairly suppressed by sites like these as well as legacy media, social media, etc. under the guise of the greater good.
The United States, judging by voting numbers in the last 2 Presidential elections. 50% is obviously a rough rounding, and serves as a figure of speech.
Spreading misinformation is a VERY different thing from constructive discussion of different takes on an issue. Anyone taking offense to a request like this can't tell the difference.
Valid discussions and critiques on the prescribed narratives are all labeled as "misinformation" conveniently and expressly because the holders of the legacy media narratives are in power and do not wish to be challenged. Common examples are: medical practitioners blowing the whistle on vaccine injuries, or recommending treatments to illnesses that have been demonized by media outlets.
You would be shocked at how many things no longer exist but previously existed with incredible support and numbers behind them. Everything from Facebook groups, YouTube videos, blog posts, websites, etc. -- too many mediums to list, and yet unreferenceable here because they've been taken down by the liberal moderation machine, which is unwilling to view its opponents as valid participants in grander dialogues and so instead enacts policies of erasure rather than honest debate.
It is not a fair game when those who are in power _are_ the ones deciding what can and cannot exist in the public scene; this lets them remain unchallenged and prevents their political opponents from obtaining power by suppressing their use of the standard avenues of communication.