There's a Matrix room for NNCP https://matrix.to/#/#nncp:matrix.org that's also bridged to #nncp on irc.oftc.net. It's pretty small and not especially active, but it's a good place to ask questions about the project or see what others use it for. jgoerzen hangs out there too!
I keep meaning to try building DNews, which is a single backend with NNTP, web and mail interfaces.
Matrix/IRC/Zulip/Slack are not fundamentally incompatible with Usenet so much as philosophically incompatible. It's the difference between casual mostly-realtime and expected delay.
Was a firehose of unimaginable proportion. There's nothing subjunctive about it. (-:
Interestingly, back in the days when Usenet was a big thing, the big nodes that carried the binaries newsgroups didn't use NNTP to exchange messages. There was too much latency in the protocol. They had duplex protocols that attempted to make the most of the link bandwidth in both directions.
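To put rough numbers on the latency point (the figures below are my own assumptions, not anything from the post): classic lockstep IHAVE pays a couple of round trips per article, so on a long link the RTT, not the pipe, sets the ceiling; a pipelined or custom duplex feed keeps the wire full instead.

    # Back-of-the-envelope: why lockstep IHAVE throttles a long-haul feed.
    rtt = 0.150          # seconds; a plausible 1990s transatlantic round trip
    link = 45e6 / 8      # bytes/sec on a 45 Mb/s DS-3
    avg_article = 250e3  # bytes; a typical multi-part binary segment

    # Classic IHAVE: offer, wait for "send it", send body, wait for the ack.
    lockstep_per_article = 2 * rtt + avg_article / link
    lockstep_rate = avg_article / lockstep_per_article

    # Fully pipelined (streaming CHECK/TAKETHIS, or a custom duplex protocol):
    # offers and bodies stay in flight, so the wire is the only limit.
    pipelined_rate = link

    print(f"lockstep:  {lockstep_rate / 1e6:.2f} MB/s")   # roughly 0.7 MB/s
    print(f"pipelined: {pipelined_rate / 1e6:.2f} MB/s")  # roughly 5.6 MB/s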
Going back to the UUCP distribution model is actually a step back from where NNTP was.
Additionally, if one has a mechanism for reliably and securely distributing files, which is what NNCP is claimed to be, this does prompt the question: why on Earth would one layer Usenet on top of it to do the same job of sending files around? Especially if it were the traditional Usenet model that didn't even include yEnc encoding, let alone MIME and content signatures? Why would one care about binaries newsgroups if one already had a self-styled "darknet" binaries distribution system in place as the underlying transport mechanism?
I remember building news.Skynet.be up into the Top 100 with some very careful use of Diablo and splitting all my feeds along binary bucket sizes, so that I'd have like eight or ten simultaneous streams going to each peer, and usually the individual buckets stayed roughly the same amount full.
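Not Diablo's actual configuration, just a toy sketch of the bucket idea: route each outbound article into one of a handful of size-based queues per peer, so several parallel streams stay roughly equally loaded instead of one stream stalling behind a huge binary. The bucket limits and the peer name below are made up.

    from collections import defaultdict

    # Hypothetical upper bounds (bytes) per bucket; the last catches the rest.
    BUCKETS = [4_000, 64_000, 256_000, 1_000_000, float("inf")]

    queues = defaultdict(list)  # (peer, bucket_index) -> list of message-ids

    def enqueue(peer, msgid, size):
        for i, limit in enumerate(BUCKETS):
            if size <= limit:
                queues[(peer, i)].append(msgid)
                return

    enqueue("peer.example.net", "<a1@site>", 2_500)      # lands in bucket 0
    enqueue("peer.example.net", "<b2@site>", 700_000)    # lands in bucket 3
    print(dict(queues))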
And Diablo had some cool techniques that amounted to offering a news article to all the peers as soon as the unique news article was offered to it, but before the actual body of that unique news article had been delivered. Then you could go rapidly deliver it to all your peers.
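As I understand the trick, it amounts to something like the sketch below (my paraphrase in code, not Diablo source): the moment an unseen message-id is offered to you, re-offer it downstream rather than waiting for the body, so when the body lands you already know who wants it.

    seen = set()
    pending = {}   # msgid -> peers that accepted our early offer

    def send_check(peer, msgid):
        print(f"CHECK {msgid} -> {peer}")
        return True        # stub: pretend every peer wants the article

    def send_takethis(peer, msgid, body):
        print(f"TAKETHIS {msgid} ({len(body)} bytes) -> {peer}")

    def on_offer(msgid, downstream_peers):
        if msgid in seen:
            return
        seen.add(msgid)
        # Fan the offer out immediately, before the body has arrived.
        pending[msgid] = [p for p in downstream_peers if send_check(p, msgid)]

    def on_body(msgid, body):
        # Body finally arrives; everyone who already said yes gets it at once.
        for peer in pending.pop(msgid, []):
            send_takethis(peer, msgid, body)

    on_offer("<unique@example>", ["peer-a", "peer-b"])
    on_body("<unique@example>", b"...article body...")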
So, odds were that if you had a diverse feed, you'd see more news articles come into your server and then successfully flow out of your server to your other Top 100 peers, than anyone else.
I spent months tweaking those newsfeed bucket sizes and peer lists, and I would peer with anyone I could. And I actively searched for new peers that seemed to have a diverse set of news articles that I wouldn't have otherwise seen.
Our peers in Belgium had a love/hate relationship with us over this. They loved us because it meant we had a very fresh feed to give them. But they hated us because that traffic was a "local" network hop across the Belgian exchanges on their high-priced Belgian WAN lines, instead of coming in over their much higher bandwidth, much lower cost international T-3 connections.
We also had a satellite feed for news articles, which helped us fill in the content over 64KB in size, because that was the cutoff between our non-binary and binary news feeds.
Then Belgacom decided to bring Skynet back inside the parent company and all the people who had a clue left like rats leaving a sinking ship.
And I guess that satellite feed was Cidera/SkyCache? I have fond memories of stumbling across that while scanning transponders on the Sirius satellite at 5 degrees East back then. No encryption whatsoever, and it didn't take long to reverse engineer the protocol, which was based on UDP multicast over DVB-MPE. I still remember the packet headers beginning with 0xdeadd00d.
The service broadcast over 200 gigabytes of Usenet news each day. The satellite transponder was almost 1000 times faster (45Mb/s) than dialup, which was just completely mind blowing at the time.
I also remember configuring the INN news server as well, but sorely lacked storage space to carry any binaries.
The whole thing involved writing a FreeBSD kernel device driver for the satellite card, together with a user-space daemon to extract the IP packets from the raw MPEG-2 transport stream (yes, it used the same modulation as TV: DVB-S). All done from my bedroom, when I was 18, with a small portable satellite dish in the window.
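For anyone curious what "extracting the IP packets" involves, here's a heavily simplified sketch of the idea (it assumes no adaptation fields, assumes each MPE section fits in the TS packet where it starts, and skips the CRC check; real code reassembles sections across packets):

    TS_PACKET = 188      # fixed MPEG-2 transport stream packet size
    DATA_PID = 0x1FF     # hypothetical PID carrying the data service

    def extract_ip_datagrams(ts, pid=DATA_PID):
        for off in range(0, len(ts) - TS_PACKET + 1, TS_PACKET):
            pkt = ts[off:off + TS_PACKET]
            if pkt[0] != 0x47:                       # TS sync byte
                continue
            pusi = (pkt[1] >> 6) & 1                 # payload_unit_start_indicator
            this_pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
            if this_pid != pid or not pusi:
                continue
            payload = pkt[4:]                        # assumes no adaptation field
            payload = payload[payload[0] + 1:]       # skip pointer_field
            if payload[0] != 0x3E:                   # DSM-CC section used by MPE
                continue
            section_len = ((payload[1] & 0x0F) << 8) | payload[2]
            section = payload[:3 + section_len]
            yield section[12:-4]                     # strip MPE header and CRC-32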
Back then the sheer amount of unencrypted IP data out there in the clear on satellite transponders was absolutely incredible to see.
I still have the source code to the FreeBSD kernel driver for the satellite card, and have uploaded it at https://pastebin.com/QegFzVWy
To be honest, I don't remember what the service was. It could well have been SkyCache. I do remember them having 45Mbps of throughput over multicast, which at the time was more than our entire international bandwidth. For what it was, it was a pretty amazing service.
It wasn't until later that we upgraded our international link to be a T-3 that was still cheaper than the E-1 links we had inside the country.
Took me a while to persuade management to let me test Diablo (for various reasons) but it was such an improvement over INN and (earlier) Highwinds. Shame the company weren't really behind the news service once it started costing money.
That was another Matt Dillon special, before Matt decided he needed to fork FreeBSD and create DragonFlyBSD. By that time, I think his interest in Diablo had waned, though.
But at the time, it was far and away the best USENET News routing program that I knew of.
I'm not sure what to make of that (aside from the large Italian/French contingent) but the top 10 includes some normal fare like rec.food.cooking, rec.arts.tv, and rec.bicycles.tech
Is HN available (e.g. "comp.culture.hn") on USENET? It would be nice to have the Web forum be interoperable with a USENET group.
The pull model (consume based on your topic/activity) of USENET newsgroups fits forum discussions much better than the "visit all pages you care about daily" Web model, IMHO.
The usenet model also is much better for reading because it tracks what you have already read. If I come to this article tomorrow I'll see all the comments from today that I already read, and someplace mixed in there are new comments that came after I finished reading. Some of those comments are very insightful, but no web forum I've seen has any concept of "you already read this so we will make it easy to skip that by default".
Technically USENET doesn't have that concept either, but all the software I have ever used to read it does, and that is what counts. (unfortunately all the software is local only so I can't switch between my phone and my desktop for reading - then again I haven't looked at USENET in 25 years so I don't know what is current)
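For reference, the mechanism is basically just the old .newsrc file: one line per group with a compact list of read article-number ranges. A rough sketch of the bookkeeping (the group name and ranges are made up):

    # .newsrc-style read tracking, e.g. "comp.misc: 1-1024,1030,1041-1050"

    def parse_ranges(spec):
        ranges = []
        for part in spec.split(","):
            if not part:
                continue
            lo, _, hi = part.partition("-")
            ranges.append((int(lo), int(hi or lo)))
        return ranges

    def is_read(ranges, artnum):
        return any(lo <= artnum <= hi for lo, hi in ranges)

    read = parse_ranges("1-1024,1030,1041-1050")
    print(is_read(read, 1028))   # False -> show it to the user
    print(is_read(read, 1045))   # True  -> collapse or skip by default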
It's sad that the threaded reading capabilities of most computer-mediated communications systems are actually worse nowadays than they were in the heydays of Netscape Communicator and trn.
The best solution there would be using some sort of web based hosted solution, which may well not exist. Thinking something like how TT-RSS replaced Google Reader. Basically, self hosted Google Groups, back when Google Groups was synonymous with web based usenet.
Shouldn't be too crazy to cobble together, if it doesn't already exist -- just an nntp client on top of a lamp stack.
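The reading side really is that small. Here's a sketch of pulling a group's overview data for a web frontend to thread and render, using the stdlib nntplib module (still shipped through Python 3.12, removed in 3.13); the server and group names are placeholders:

    import nntplib

    def fetch_overview(server, group, last_n=50):
        with nntplib.NNTP(server) as nn:
            _, count, first, last, _ = nn.group(group)
            start = max(first, last - last_n + 1)
            _, overviews = nn.over((start, last))
            return [
                {
                    "number": num,
                    "subject": over["subject"],
                    "from": over["from"],
                    "refs": over["references"],   # what the threader keys on
                }
                for num, over in overviews
            ]

    for row in fetch_overview("news.example.org", "comp.misc"):
        print(row["number"], row["subject"])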
There is a hacker news API so somebody could set up an NNTP server with an HN newsgroup. Looks like it's already been done, at least for personal use. If a newsgroup is read-only and only on one NNTP server, does it count as usenet? Is that like the sound of one hand clapping?
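A rough sketch of the gateway direction, using the public HN Firebase API to render one item as a Netnews-style article. The group name is the hypothetical comp.culture.hn from above, and a real gateway would convert the HTML in the 'text' field to plain text:

    import json
    import urllib.request
    from email.utils import formatdate

    HN_ITEM = "https://hacker-news.firebaseio.com/v0/item/{}.json"

    def hn_item_to_article(item_id):
        with urllib.request.urlopen(HN_ITEM.format(item_id)) as resp:
            item = json.load(resp)
        headers = [
            f"From: {item.get('by', 'unknown')}@news.ycombinator.com",
            "Newsgroups: comp.culture.hn",
            f"Subject: {item.get('title') or 'Re: (comment)'}",
            f"Date: {formatdate(item.get('time', 0))}",
            f"Message-ID: <hn-{item['id']}@news.ycombinator.com>",
        ]
        if item.get("parent"):
            headers.append(f"References: <hn-{item['parent']}@news.ycombinator.com>")
        body = item.get("text") or item.get("url") or ""
        return "\r\n".join(headers) + "\r\n\r\n" + body + "\r\n"

    print(hn_item_to_article(1))   # any HN item id works here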
Yes it is, but only barely. Most of the posters are quite old and in a lot of groups when someone doesn't respond for a while, people become concerned for their health.
Private usenet is a much better model for corporate discussion than e.g. Slack and the typical alternatives. More natural message threading, integration with email clients, binary and document handling built in. Not sure about search, but that's easily done on the client side.
Private nntp servers never seemed to be a thing though.
Yes. Many groups are pretty much dead, but there are some that are active (a handful that are QUITE active). The quality of the active groups ranges from "60% flames" to "these are Ph.D.s talking to each other about things I will never understand". Though most, of course, are somewhere in-between.
alt.pub.coffeehouse.amethyst (This is somewhat low-traffic but is unique; people write in the third person imagining themselves stepping into a friendly coffeehouse)
Surprised to see alt.arts.poetry.comments on this list. Yes it remains active, but the regular posters there[1] are for the most part the remnants who triggered the Great Poet Flight to various online (moderated) forum venues 20 years ago.
Some of those troll fights were fun. Some even involved poetry.
Live: rec.arts.sf.written -- discussion of written science fiction and fantasy
Nearly dead, but if you post cogently, someone will respond: rec.audio.high-end -- discussion of high-quality (not necessarily high cost) audio reproduction
I follow comp.os.vms
and sci.electronics.design is always hot. I think there are probably a lot of little corners of usenet which are still pretty active. I was sad to see comp.dsp die off, but the same people are about as active on stackexchange now.
It's incredibly alive thanks to piracy. Look at r/usenet for a peek at what people are doing with it nowadays. There are numerous sites which index binaries on Usenet; many use obfuscation to avoid DMCA notice-and-takedowns.
Obfuscation and quasi-uncrackable passwords, usually. The big problem is that when a .rar file goes missing (and not enough parity files were posted), the download is dead forever but hard to flag as such.
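For anyone who hasn't run into the parity files: the idea (PAR2 these days) is that N recovery blocks can rebuild any N missing data blocks. PAR2 uses Reed-Solomon; the one-missing-block case degenerates to plain XOR, which is easy to show:

    # Toy illustration only: real PAR2 uses Reed-Solomon over many recovery
    # blocks; a single XOR block can repair exactly one missing piece.

    def xor_blocks(blocks):
        out = bytearray(len(blocks[0]))
        for b in blocks:
            for i, byte in enumerate(b):
                out[i] ^= byte
        return bytes(out)

    parts = [b"part-one", b"part-two", b"part-3!!"]   # equal-sized pieces
    parity = xor_blocks(parts)                        # the "recovery block"

    survivors = [parts[0], parts[2]]                  # part two never propagated
    rebuilt = xor_blocks(survivors + [parity])
    assert rebuilt == parts[1]
    print("recovered:", rebuilt)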
I tried setting up one of the open source servers and it was an awful experience. I dug around and it seems like most providers are using proprietary implementations or proprietary forks of servers that are no longer maintained. I was interested at the time in understanding how the big guys like Giganews and so on set up nntp, peering, storage, etc. Didn't get very far.
Also, games grew well beyond what would be reasonable to send over Usenet. There was really only a window in the late 90s when games were small enough not to require splitting into an absurd number of parts. That happened to coincide with the peak popularity of Usenet, but as a distribution platform today it's just not practical. Still useful for music albums and DVD movies I guess, but as the parent post pointed out, nobody carries alt.binaries anymore. The pirates have long since moved on.
There are plenty of full-size games and UHD movies being distributed over alt.binaries. Pirates pay to use newsservers that carry the massive alt.binaries.
And at Belgacom Skynet we had people dedicated to the task of finding the illegal photos and purging them from our systems. I sat next to Thierry who was doing a lot of that work. Poor guy.
I caught only glimpses of some of the stuff he had to witness, and I would not wish that job on my worst enemy.