As a kid, I cobbled together one working Xerox Star from several partially working abandoned units. My friend's mom was a master typesetter, word processor, and data entry ninja. She used the Star for her business. She loved it.
First and last time I got to play with a Xerox. I had an Apple //e (bought with my own money) at home and used WordStar (right?). After seeing the Star, it felt like I was banging the rocks together. I still think it's crazy how far ahead Xerox PARC was for the time. (Ditto The Mother of All Demos.)
For some reason, technology preservation warms my heart. The same feeling I got chatting with the folks at the Computer Museum who are restoring a mainframe to working order.
Best I can tell, many things were developed in R&D and on mainframes that are being "rediscovered" today on desktops and laptops. This is because the price-to-performance of ICs has reached the proper point, and because computing was at the time insular (or at least seems to have been).
The computer industry goes through cycles of amnesia. Every new generation of comp-sci grads bears some responsibility for this amnesia->re-invention cycle - by ignoring the lessons of "the previous generation, who are clearly antiquated and out-dated by virtue of being 'old'", new grads put themselves in the precarious situation of having to re-invent the very solutions they were ignoring in the first place. This is a cultural phenomenon - a clash of generation-based prejudice - and not at all technological.
Manage developers - and the hiring of such - for more than a decade and you will see this pattern occurring. Youth gain amnesia through prejudice, and then re-invent the technology they ignored in the process. This happens time and again.
I'm not sure if it is prejudice, or simply an artifact of the education system.
This in that students learn straight theory, not specific implementations or the history of thought in computing.
On top of that the labs are by now using networked PCs, not terminals hooked to a mainframe.
Thus unless one takes an independent interest in computer history, one kind of gets a jump from the Univac to the modern PC, maybe touching base on the C64 and Apple II.
It's easy to forget how "rocketpunk" CRTs were: High voltage anodes! Dangerous chance of implosions and/or X-Rays!
Amazing to think it was only a few decades ago that just about every household had at least one CRT, if not one in just about every room. Sometimes we forget the past was also a strange place.
I remember my dad explaining how the electron beam scanning worked when I was about 8 or 9 and I totally didn't believe him until he showed me the magnet-on-the-TV-screen trick... (And it just occurred to me that there are probably some younger people here who haven't seen that one. You can probably find it on YouTube if you don't have a CRT handy.)
I have fond memories of synchronized degaussing in high-school computer science class.
We'd all open up the menus on our monitors and then select the degauss option at exactly the same time. Made a big ol' noise with that one. Can't do that with an LCD, unfortunately.
Wow, I had forgotten about degaussing. I worked in a plasma-physics lab in the 90s. Degaussing was required when an experiment was running-- strong electromagnets distorted the CRTs.
"Amazing to think it was only a few decades since where just about every household had at least one CRT,"
Still got one to play games on. Reminds me of physics at high school, when we got to the part where the teacher, Thurgy (Mr. Thurgood), was explaining magnetism and electron fields and how CRTs work. Next class we got further explanation, plus the bonus story of one student ringing him at twelve o'clock at night asking how long it would take for the purple blotches on the television to disappear before his parents got home.
Student Tim had decided to locate the largest magnet in the house and place it over the screen to "just see what happens?". The blotches take about two hours to fade, depending on the strength of the magnet and whether you've damaged the aperture grille. Can't do that with LED screens.
I remember one of my grandmothers had one of those early microwave ovens that was a giant lead box that ate up the majority of a countertop. Sometimes you see how thin and lightweight the microwave ovens in stores these days are and wonder how much changes in steel/aluminum, wave beamforming (magnetrons!), and a general sense of mundanity have impacted the size of the device...
Also interesting to think just how much we do with microwave beams these days (particularly the good old 2.4GHz and 5GHz, our WiFi friends).
The microwave was launched into a misjudged market.
It was thought necessary for the device to be perceived as able to replace a conventional oven, so it was engineered to handle something as large as a whole turkey, with the advantage of cooking it faster and with less waste heat into the kitchen.
In physics, cooking always requires more time than heating. The pure heating speed advantage of the microwave was not leveraged until somebody got the idea to make small low-powered units, not primarily for cooking but instead for simply heating coffee and snacks in the office. That market eventually saturated, the price dropped low enough for consumers, and by then consumers were already familiar with the device.
Once millions of people could easily put them in their shopping carts, they flew off the shelf.
Source: I was hired as the first computer programmer for an appliance company about the time these were being launched, but I had already been studying businesses for a while.
It required two deliverymen on a truck to carry and install a microwave at this time in the '70s, when the driver alone could usually handle a refrigerator. Microwaves were priced comparably to high-end conventional ovens.
OK, I'm old, let's get nostalgic.
It's not the real vintage turkey-sized microwaves like your granny's that I miss having around. Lots of them probably ran for decades after they were made anyway, some might be running still.
What about the low-end bargain units that mainstream consumers buy the most of?
Each decade these achieve lower levels of quality never before thought possible.
Mainstream quality lost forever as the majority of the population becomes too young to remember what it was like.
Fortunately the Alto was not (intentionally and/or largely) engineered to have a finite lifetime. Almost the opposite was the mindset at the time. It was expected to require quite some time to explore the capabilities of such a novel unit, so the electronics were supposed to last for many many years into the future.
That's what makes this restoration quite possibly fully within reach without having to rely on overwhelmingly unrealistic amounts of good fortune.
Also, I do have good ideas about how to handle some of the circuit boards from the Alto pictures I have seen. In my field I kept some '70's-built scientific data systems going continuously since the mid-'80s until retiring them still-working two years ago. These have similar densely populated processing, ROM, and RAM PCB's. All soldering must far exceed NASA specs and be done by a loving operator . . . and that's the tip of the iceberg.
Also remember converting a less-vintage, cheap, but actually-available non-Ball monochrome CRT to replace an original unobtainium old Ball, to interface with a proprietary display system. Just had to figure out where to solder so it would work with a display card it was not intended for.
It can be useful being hands-on to get the best performance out of the electronics itself. A few courses in vacuum tube electronics at an early age seems like it would still be helpful for a lifetime of natural science. Edison was a very clever fellow.
I suppose I would have been more clever not to have discarded a few dozen kilos of Xerox pre-PC office machines two years ago to make room for my working science gear.
When I worked as a field engineer I used to have to fix the bloody things (sometimes on site as well). I'm perfectly happy to forget all about CRTs after all the electrical jolts and other hair-raising moments I used to get off them :)
The explanation of raster scanning in portrait mode reminded me of an interesting peculiarity in some other equipment.
Some of the first HP test equipment with raster CRT screens had the displays oriented in a landscape position, but the rasters were vertical. So they scanned from one side to the other instead of from top to bottom.
I wonder what the reasoning behind that was and how they dealt with that in the framebuffer and character generator.
I can top that. Back in 1973-4 I was running Fortran programs on a CDC 6400 via a remote batch terminal. It consisted of a card reader, a line printer, and a console with CRT and keyboard. It was connected via a 9600bps modem -- a rare device that I'm sure cost thousands of dollars back then.
Anyway, the CRT had a scan pattern I've never seen since. Instead of two levels -- a fast horizontal scan nested inside a slower vertical scan -- it had three: there was a slow vertical scan (presumably 60Hz), a medium-speed horizontal scan that happened once, not for each pixel row, but for each character row, and then a fast character-height vertical scan to draw the character rows. So it drew each character row from left to right one pixel column at a time.
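If I'm remembering the geometry right, the beam order works out to something like this (a rough sketch in Python; the cell size and column count are my guesses, not the terminal's actual specs):

    # Sketch of the three-level scan: slow vertical, medium horizontal,
    # fast vertical stroke per pixel column. Dimensions are assumptions.
    CHAR_ROWS, CHARS_PER_ROW = 24, 80   # text rows, characters per text row
    CELL_W, CELL_H = 7, 9               # pixel columns / pixel rows per character cell

    def beam_order():
        for char_row in range(CHAR_ROWS):              # slow vertical scan (once per frame)
            for x in range(CHARS_PER_ROW * CELL_W):    # medium horizontal sweep (once per text row)
                for y in range(CELL_H):                # fast vertical stroke (one per pixel column)
                    yield (char_row * CELL_H + y, x)   # (scan line, pixel column) hit by the beam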
The whole thing must have cost well into six figures. Nonetheless it was a little flaky. Sometimes the video on the console would go out -- I found a particular spot on the case where I could whack it with my hand and the video would come back.
That's interesting and an example of what tricks can be accomplished if your hardware is far more integrated --- I'd guess that the scan order was designed specifically to require the least amount of buffering/auxiliary RAM, and also keep the character framebuffer in a linear order. Constraints on character ROM access probably played a part in this too --- the more common pattern of scanning one row of each character in a line requires changing the addresses of the framebuffer and character ROM far more often, while this "character-at-a-time" scan probably means they could pipeline accesses to the framebuffer and ROM.
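To put rough numbers on that (another sketch with made-up dimensions, ignoring whatever line buffering the conventional design might add):

    CHARS_PER_ROW, CELL_W, CELL_H = 80, 7, 9   # assumed geometry

    # Conventional raster: every scan line of a text row re-reads every
    # character code in that row from the framebuffer.
    conventional = CHARS_PER_ROW * CELL_H      # 720 framebuffer fetches per text row

    # Character-at-a-time scan: each character code is fetched once, then
    # its whole glyph is streamed out of the character ROM column by column.
    char_at_a_time = CHARS_PER_ROW             # 80 framebuffer fetches per text row

    print(conventional, char_at_a_time)        # -> 720 80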
"Display quality is further enhanced by scanning the raster vertically at a rate of 36,205 Hz with the vertical lines formed from left to right. Since most rapid trace variations occur vertically, this gives a smoother trace by reducing the jagged line appearance compared to a horizontally scanned raster."
The frame buffer and character generator live in an earlier stage of the display block diagram, so they aren't really impacted by the choice of scan direction.
You also tended to see a lot of vertically-oriented CRTs in arcade games back in the day, so the concept wasn't unique to the Alto. That's the first place I'd look for a replacement CRT.
For a scope, it kind of makes sense. If the wave you are showing changes between scan lines, the display may get confusing. If you scan along the data acquisition, it'll look neater.
As an owner of a vintage monitor (Zenith ZVM-121) in need of repair, this was very interesting, thanks.
The combination of posting on HN and being in the Bay Area really gives this restoration the best possible odds. Oh, and it doesn't hurt that the Alto is a much beloved and pivotal piece of hardware.