Beautiful. The nostalgia hit me very hard with this.
I remember in the original version you had all sorts of options, and could even combine screensavers.
For people who may not remember: screensavers used to be a necessity, as old CRT tubes would 'burn in' an image if they held it static for too long, leaving a shadow of the static image permanently. Screensavers kicked in after a few minutes and displayed something dynamic so as to save your screen from burn-in.
In the '90s, having a custom screensaver was cool (around the same time that having a custom ringtone on your 3210 was), and you'd pay money for software like After Dark.
Screensavers are a thing again on modern TVs because OLEDs apparently have a pretty serious burn-in problem. There isn't much creativity there, probably because a screensaver shouldn't be annoying, so it's stuff like art galleries or subtle fireworks effects that also 'train' the pixels.
The best screensaver I ever had was the BSOD one from xscreensaver; so many people fell for it.
Second best is the nyan cat screensaver.
Currently have an Apple image gallery of space themed pictures.
The latest betas include Apple Music/Spotify integration and speed control, and bring back the ability to put your videos anywhere (including an external drive!) on Monterey.
It's open source too. According to the description, it streams and doesn't cache videos, and apparently it handles 4K HDR. If data usage is a concern to you, be aware that some videos are close to 1 GB.
I use this one on my TV, and it's great, highly recommend.
If bandwidth is a concern, you can also point it towards a local JSON file with whatever URLs you want, so you can mirror the screensavers locally and then stream them from there.
Part of the reason these are so captivating is that they are slowed down instead of real time. This gives them an other-worldly quality where you can see extreme detail but in a different motion than you would normally see it.
I don't believe many of the cityscape ones are slowed down, at least the ones I have seen. If you watch the movement of traffic, it appears to be moving at normal speed. Even the orbital ones appear to be real-time.
The otherworldliness of the video seems to come more from the smoothness of the motion and the uncommon perspective. Because many of the scenes are captured with drones instead of helicopters or airplanes they have a literal bird's perspective in terms of velocity and altitude.
It's been an ongoing debate I've had with some Aerial users, but I agree with you. Most if not all are real time or barely slowed down (in the latest beta I've added a slider to speed a video up/down, partly to try and settle this).
They definitely are massively stabilised though and Apple used to update videos from time to time, tweaking stabilization, colors or sometimes length and pushing new versions. They haven't done that in a long while though.
Some seem shot from fairly high altitude and may use planes instead, and at least for some videos, I've heard they used 3rd parties to shoot them.
The other reason is that you have to have biggest-company-in-the-world-sized connections (and pocketbook) to get permission to run a 120fps 8k drone over an active LAX.
I'm sure for the right opportunity they'd work with anybody. I suspect it's not a matter of connections, but rather, "We'll put a gorgeous shot of your facility on millions of TVs worldwide" is a slam-dunk pitch.
In high school in the very early 2000s my friends and I thought it was hilarious to add bluescreen slides to our PowerPoint presentations. Most of the computers ran Windows 98 and it was a common sight, we’d get gasps in class.
I knew someone who failed to prepare for their presentation and used this as an excuse: the presentation computer "wasn't working", so they had to reschedule.
My favorite trick was to take a screenshot of the desktop, hide all the icons and the task bar, then set the screenshot as the wallpaper. It gets even the most advanced users every time.
I did this to a girl at school. It ended up moving all of her songs, and iTunes found them "again", so half of her songs would show a "can't find this file" error, because iTunes was pointing to the old location.
It's my last day at my current job. Thank you for inspiring me to set up a 1 minute BSOD screensaver for the next person who uses my current workstation.
> It's my last day at my current job. Thank you for inspiring me to set up a 1 minute BSOD screensaver for the next person who uses my current workstation.
What company won't wipe and reimage it before redeployment?
Hell, I usually wipe my own work PCs before surrendering them after an upgrade, to make sure it's done properly.
If it's a desktop workstation with lots of expensive development tools installed or a delicate toolchain (especially one in a lab), why spend the time and effort wiping the thing just to reinstall all of the same stuff over again? It might even be a shared computer. At my help desk summer job, I worked at a hot desk that was staffed 24/7, so the same computers were used by first, second, and third shift. There were a lot of things that had to be logged into (mainframes, customer email accounts, etc.) and not all of the desktops covered the same customers, so it would have been a nightmare to handle all of the images and configurations. Whenever a configuration had to change, they would make a new backup to cover that specific desktop in case it died.
The electron guns in CRTs also take about 30 minutes to warm up. Before then the black level is pretty high and the image is washed out. So there is benefit to running a screensaver on a CRT but OLEDs do best when just put to sleep quickly.
When was the last time you looked at one? Was it today? I own half a dozen PVMs and two PC monitors and have been looking at CRTs consistently since the '90s.
It takes about 5 seconds for my LCD monitor to wake up after a sleep, which is a bit annoying. A screensaver disappears in a fraction of a second, even on old hardware.
Working on a next-gen operating system for a large tech company in the Pacific Northwest in the previous century, I was able to reliably bluescreen the OS when stressing the network using an internal network protocol driver.
Later that day, I read in one of the trade rags about a columnist talking about the efficacy of a bluescreen screensaver.
1+1 = 2
Less than an hour later, most of it spent transcribing the bluescreen text output, I had a BSOD screensaver. Released it to the company intranet and waited for the ensuing hilarity.
Wasn't long before the guys in the build lab took advantage of the screensaver.
They installed the BSOD screensaver and disconnected the mouse and keyboard.
The main dev on the project comes in, sees the bluescreen and proceeds to restart the computer! Oops.
A few years later, on another multi-year large software project at this PNW tech company, the morning that the software was supposed to be signed off and released to manufacturing, the build lab, different group of people, installed the BSOD screensaver and disconnected the mouse and keyboard on the dogfood server for the project.
When the project manager arrived, he went to check on the status of the server, only to find the BSOD. This time a server restart was averted.
So, it's all fun and games until a server is hard rebooted.
> We've designed the OLED screen to aim for longevity as much as possible, but OLED displays can experience image retention if subjected to static visuals over a long period of time. However, users can take preventative measures to preserve the screen [by] utilizing features included in the Nintendo Switch systems by default, such as auto-brightness function to prevent the screen from getting too bright, and the auto-sleep function to go into 'auto sleep' mode after short periods of time.
Both. The internet was full of early adopter, tech enthusiast types. You'd certainly pick up on this. Maybe a bit of the magic was your youth, but the tech was new and magical to everyone.
Every article I read back then was written by a real person about something they cared about. Now it feels like every search result is just 100 pages of AI-generated, generic content used to drive clicks to the site. And what content is actually written by a real person is in service to their brand and tuned for engagement and sharing on social media.
A lot of software back then was open and didn't hide too many of the details. Today, all the technical bits are hidden away and user experiences are carefully controlled and tuned for engagement. Think IRC vs Slack or FB Messenger. On IRC, you could whois and DCC; the technical bits were everywhere in the UI.
In the 90s, there was a question of whether or not you should include ads and banners on your site. Later, it seemed like the ads went away, but actually they took over. So much of the internet experience now is about monetization. Even malware is about making money. You gotta jump into the new tech to get that feeling of wonder and possibility.
No, I knew that "push" was BS. The concept and the promise of the Internet as a whole was exciting but most of the startup scene (especially in the frothy late 90s) was pure land-grab easy-money types.
> This new medium doesn't wait for clicks. It doesn't need computers. It means personalized experiences not bound by a page - think of a how-to origami video channel or a 3-D furry-muckers VR space. It means information that cascades, not just through a PC, but across all forms of communication devices - headlines sent to a pager, or a traffic map popping up on a cellular phone. And it means content that will not hesitate to find you - whether you've clicked on something recently or not.
In the most generous framing, Wired was (is?) a monthly magazine focused on how tech and the people in it are changing the world, oriented to a general audience.
Every article needs to be about something that's world changing (positive or negative) and they need enough articles to put in between all the ads they sold. If there isn't enough world changing stuff in a given month, or the writers got started on the wrong things, they've got to hype up what they've got.
I remember someone once describing Wired as a dumbed-down version of High Technology and Byte dressed up in Mondo 2000's clothes. That's pretty accurate. Sadly, Wired has somehow been the one that still exists...
I ran software for Dell's portables during the height of the PointCast craziness - they eventually banned it because about 2/3 (IIRC) of the company's bandwidth for the Product Group offices was being consumed by PointCast news updates and animations!
Haha, PointCast was one of the earliest "next big thing" apps I installed on my Mac in the Dot-com era. I never really found it useful apart from the always available implementation of SameGame.
Man, but didn't it feel like the future? All this information in real time being shown on-screen! What's the weather? What is Ciena's stock price right now?
Never mind those ads that keep showing up, that's just a minor inconvenience, right?
And yeah, I need to keep my modem on all the time and tie up my voice line but maybe someday we'll get that T1 installed in the office.
Oh, so that's why! Until now, I believed they were used to reduce energy consumption when the screen wasn't used. That's why I believed they were just some gimmicky, useless piece of software.
A bit off topic, but I also miss the screen obliterating hammer we used to play with as kids. Any modern, web rendition of that piece of stress-reliever?
Yes, that's the one. I totally forgot that there were other options too.
It's sad that such cultural artifacts are more ephemeral than a Mesopotamian clay tablet. Humans four thousand years from now may wonder what happened that made us culturally unproductive, unaware that almost everything we made has simply ceased to exist.
This reminds me of recently learning more about ancient Rome. Since most stuff except for some monuments was built with wood, it's all gone now.
Most people lived in 3- or 4-story (IIRC) rowhouses. The richest were at the bottom, nearest the latrine, and had their own kitchen and servants to work the kitchen. The middle class were in the middle, with some kitchen facilities they operated themselves. The poorest were at the top with no way to cook and relied on street carts.
I was kinda shocked to read this description and how similar it was to a modern city. But since all that was wood, it's all gone. We only know even this much because Rome was such a powerful and rich culture that there was a lot of writing, and comparatively a lot of it got saved for centuries. (Although the vast majority has still been lost, just estimating from works cited by ancient authors that we don't have copies of.)
Writing wasn't cheap, and preserving and transmitting writing isn't cheap. Writing about obvious everyday stuff everyone knows in such an environment has a much smaller chance of making it through the many historical hoops it needed to make its way to us.
Many other more prehistorical places could well have had significant buildup that we would recognize as modern in all but industrialization, but unless someone went to the trouble to build with stone instead of the more common natural materials like wood... it's all gone.
Who knows what went on that we just will never know about because it didn't get written down enough or the culture that wrote it wasn't powerful enough or long lived enough to preserve enough of those writings?
(IANA historian and might be a bit or a lot off. Biggest sources are "Rome: A history in seven sackings" and "Against the grain" Not trying to make strong claims here, just a general sense of wonder at how big the gap between what we assume about the past and what actually was could be. As a layman just dipping my toe into learning more about early history / prehistory, it's shocking and fascinating)
Occasionally we find clay tablets like the one where the guy is bitching about the quality of copper ingots he was supposed to buy for his boss. It's kind of a fun look into normal, everyday life that almost never made it past the recycle bin of history.
It looks like the program you are talking about, "Desktop Games," is still hosted on: http://www.gemtree.com/program.htm. No idea if it works on modern Windows or not.
Not just CRTs - I had an LCD around 2006 that I left on a browser window for hours and I had the browser's window & menu bar slightly burned in for a good few months after that.
Yep, there are two different 5K panels on iMacs, off the top of my head LG and Samsung, and the LG one does have slight transient burn in.
It's not "true" burn-in (like CRTs or OLED) as it goes away after a bit but it's fairly easy to trigger with the current trend of very white/light grey designs in macOS, the easiest way to notice it is move from a whitish area to a mid grey one (Daring Fireball is a great example of that problematic grey).
I also have been unlucky on that draw on my 2015 iMac. I barely noticed it at first but newer macOS versions have made it very noticeable sadly. One more reason to run Aerial ;)
That's not the whole reason, though (at least for the original intent). Turning off CRT monitors means both a cycle added (which is not good for the electromagnets and capacitors) and diminished brightness due to cycling (this is due to the "warming up" needed to ensure that the screen is as bright as it can be, which takes up to around 30 minutes).
Turning off and showing a black image are not the same thing. Some old CRTs had a low power mode where the power was on - thus the parts stayed warm - but no image was displayed.
Though to be fair, I think the low power mode wasn't signaled by a black image but instead by turning off the sync signals or something like that. (I hope you get the idea that I don't know exactly how it worked here.)
In the early 1990s, there was no power saving. You could turn off the monitor with the front power switch, or use the screensaver.
A few years later, power saving options began to be introduced. The computer could signal to the monitor that it should go into a low(er) power state. Turning on again took at least 5 seconds for a visible picture, sometimes 15-20. That might not be appropriate if the computer had to be used in response to something like a phone call, or visitor at reception.
CRTs couldn't really show "black". It was more of a dull gray. That's what made the plasma display so amazing with its true black. LEDs and now OLEDs are the same, except they do not have the mass of a dying star.
The screen is charged. That charge causes burn in. The best way to not burn in, short of turning the screen off, is to charge the screen as little as possible. The least charge you can give it is a black signal, which means least burn in.
>The least charge you can give it is a black signal
The point is that you could not give it no signal without it being off. LEDs can do this. The closest to black from a CRT I saw was from Sony's professional studio reference monitors that were $32K for a 32" screen. When Sony brought out their OLED reference monitors, they did a side-by-side comparison of their best CRT, an LCD and the new OLED. All 3 were receiving the same signal, and when the demo started with black, the CRT was clearly "on" but the OLED looked "off", with the LCD in between. Just about the time I was thinking to myself that the CRT brightness was turned up, they switched to reference bars and all were calibrated correctly.
Arguing that CRTs could display true black is arguing against history.
I think you've got the wrong end of the discussion, there.
If you can't or don't want to turn your CRT off... then what's the next best thing for avoiding burn-in?
Send it a black signal, or send it a bright signal?
It's the black signal, because it charges the screen less. Charging the screen is what causes burn in. If you charge it less you get less burn in.
> Arguing that CRTs could display true black
Nobody argued this. Where did you think you read this? I said 'show a black image'. Send it a black signal. It's the minimum signal you can send it without turning it off.
Or just send it a random signal, such as static or, even better, an aesthetic screensaver. Showing black on a CRT didn't save any power, and the computer stayed on anyway (since it often took 3-5 minutes to boot up in those days). Black could mean something is wrong or the signal is disconnected.
I know one of the authors of Flying Toasters and sent him the link. Here's what he had to say:
"It's a rough approximation, which is super cool, but misses all of the nuance.
"You see, back in the day, there were very strong competitors to After Dark that had as much or more animations. The thing that set After Dark was a "je ne sais quoi" that felt like an elevated experience. Both the customers & the competitors thought this was some sort of accident, or that the animations were important, but neither of these things was true.
"The secret was that people loved the nuance of interacting with the product (& subtlety in the animations) and it was very carefully designed with cognitive science. I pushed the team very hard before I would let anything be released because I insisted on the nuance."
When I was a kid in the 90s learning to program, writing screensavers was one of the best ways to demo your "skills". Also just something creative to do when you didn't really have a project in mind. I still think they're an underappreciated art form, and it's kind of disappointing that OSX only ships with a couple of really basic ones. Apple should bundle in all the After Dark modules (there were several releases).
There was also an amazing set of screensavers which I don't remember the name of, but one of the set was called 'kaos', and a lot of them were based on fractals. If anyone recalls the name of that software...
I knew Tom Dowdy's name as a 14 year old kid in the early 90s; from DarkSide, which was an important inspiration to me trying to write graphics demos at the time, and I knew I'd seen his name in Apple credits. I've just been reading today how admired and loved he was, and I fell deep into his journals from the culinary academy, and his over-the-top method of making cassoulet, which is my favorite dish on the planet and seems like it was one of his. Many years later, my ex and I also talked about quitting the racket and going to culinary school, but I didn't have the guts. He sounds like an extraordinary fellow, and his work left an impression in my brain. I got DarkSide running on macintosh.js today and took a screenshot of his lovely "Kaos" algorithm, which builds these abstract works over time; I remember watching it for hours on a Mac II.
Johnny Castaway [0] was my all-time favorite. Loved the Easter Eggs on certain days, loved that it adapted to the system clock to match day and night themes with real life and found the story entertaining (almost binge-worthy I might dare to say).
I just want to add that my first introduction to PC's was a 286 or 386 at my friend's house in elementary school. His dad had this screensaver and I remember watching the guy on the island.
Someone reminded me of the iconic toaster screensaver the other day. And I totally forgot that people used to go to a physical store, purchase a diskette or disc with screensavers on it and then install it on their computer for cool points.
Just wild to think of having to go through all of that for something you can download in less than a few milliseconds today.
takes <40% of idle low range gaming GPU = nothing, but 30% of one gaming 4GHz CPU thread, what is going on there? Chrome GPU process jumps between 20 an 40% while sliding 6 bitmaps around, something you could even do on a Virge in 1996.
Some time ago I noticed an older 3GHz i5 laptop, otherwise perfectly usable, would start to struggle decoding an h264 webcam stream because the webmaster decided to use a CSS animation for a scrolling title bar superimposed over the video feed.
This might be a situation where we need to publicly shame Google into optimizing Chrome. Something akin to 1998 CSS1 Test Suite ACID tests https://thehistoryoftheweb.com/the-rise-of-css/ making a case for IE/Netscape fixing their bad implementations.
Me too. They were designed in a way any modern hardware-accelerated GUI 2D/3D engine should be able to perform entirely in hardware, with the CPU just pre-filling command lists. When I researched the problem I found they might be much leaner on Safari, which isn't surprising considering they were proposed by WebKit in 2007, and Apple (in its fight vs Flash) would want them optimized for phone battery. On Chrome they are second-class citizens. There isn't even a visibility check - a CSS animation will run on 100% obscured or offscreen elements.
You can edit
.rain { top: -4800px; }
to top: 4800px and CPU utilization doesn't change. It sure feels like some procedure spinning in place, burning CPU and doing nothing. Just a reminder: Rain is implemented in a clever way, it's blitting one 800x600 bitmap 6 times per animation frame to a frame buffer. Worst case scenario that's ~600MB/s of fillrate, which should be handled entirely by the hardware-accelerated 2D engine, yet Chrome is using ~30% of one 4GHz CPU core doing this.
Hmm, maybe it's reading the buffer back from the GPU and compositing in software? I did an experiment and deleted 5 of the 6 DIVs, reducing the load to ~100MB/s of fillrate, and CPU utilization didn't even budge... so Chrome is doing something stupid like reading the GPU buffer back to main memory and pushing it to the GPU again. This explains why my older i5 laptop was so loaded by a stupid tiny text scroll.
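If it helps illustrate the point, here's a minimal sketch of the compositor-friendly version (the keyframe name and timing are made up, not taken from the actual repo): animating top forces the browser to redo style and layout work every frame on the main thread, while an equivalent transform: translateY() animation can usually be handed straight to the compositor.

  /* Before: animating top, which triggers per-frame style/layout work on the main thread */
  .rain {
    position: absolute;
    top: -4800px;
    animation: rain-fall 8s linear infinite;
  }
  @keyframes rain-fall {
    to { top: 0; }
  }

  /* After: animating transform, which the browser can run on the GPU compositor */
  .rain {
    position: absolute;
    top: 0;
    transform: translateY(-4800px);
    will-change: transform; /* optional hint; the real win is moving from top to transform */
    animation: rain-fall 8s linear infinite;
  }
  @keyframes rain-fall {
    to { transform: translateY(0); }
  }

Same visual result, but the animated property is one the compositor can interpolate without touching layout.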
Original dev here. I took another look at these and I found a lot of animations that we could convert to transforms to improve performance. I just created a new issue here: https://github.com/bryanbraun/after-dark-css/issues/9
Which is funny, because I'm aware of jwz and the history, but only clicked on the link because of the referrer joke. Maybe I should open a club down the street from DNA called DnB.
Is there a way to slow down the animation or make it more choppy, so it better recreates the experience of running After Dark on my Performa 25 years ago?
Last year I went down a long rabbit hole of trying to get the original After Dark screen saver to run on macOS. I got pretty close; following a guide and using a Wine wrapper I was able to create a stand alone application that when launched runs After Dark full screen. Wiggling the mouse would exit full screen and close the app. The only part I am missing is getting macOS to launch an app instead of a traditional screen saver.
I had this as well. It also included sound effects, so you'd stumble to bed late, late at night after a long session of _Pathways into Darkness_, or maybe after meeting Achenar for the first time in _Myst_, and just as you're drifting into sleep you'd be jolted awake by the sound of a communicator establishing a connection, a red alert klaxon, or be more gently roused by tribbles cooing. You'd have to stumble back to the computer to turn off the power to your beige Yamaha powered computer speakers.
When I "played" with the PC of my step dad, I somehow made everything a game.
Paint, PowerPoint, even screensavers. Counting elements, making bets with my brother which will appear more often. Pointing fingers at the screen in the hopes nothing will touch it.
It is rare that something I read getting ready for work actually brightens my day, but this made me actually smile. Thanks for this and thank HN for being the kind of place that a post like this can hit the top of the page.
If Berkeley Systems reproduced these screensavers for modern Macs and sold them in the Mac App Store for ten bucks, they would make an absolute killing.
My favorites were the "Totally Twisted" screen savers from Berkeley Systems - particularly the "Voyeur" screen saver where you saw the silhouettes of all the people in their apartments doing crazy things at night.
I liked the daredevil, trying to jump schoolbuses and stuff with his motorbike. Once in a blue moon, he'd be driving a schoolbus trying to jump other stuff.
What's funny is that these screensavers were _expensive_, and thus people pirated them. I remember the most active warez BBS in the area had a whole file section dedicated to screensavers.
One of the reasons I turned to open source is that I did pay for most of the After Dark series, and they all disappeared when Sierra(?) bought them out and shelved the company. That really killed any interest in commercial software for me.
Every time I passed this appliance store, which had a small computer on display, I had to stop and stare through the window for minutes to watch the flying toasters thingie. I was mesmerized. My 486 was nice, but that was sooo cool... so high res and smooth.
I remember the time when I was a kid, sitting in my mom's office in front of a Windows XP box, browsing through all the different screensavers, themes, start button styles, etc.
Good memories.
I was just playing around with an old version of After Dark yesterday and the "Warp" one is the only one out of these that was on it. These mostly seem to be from After Dark 2.0 and later.