> removed the 2.5” HDD from the SPARCbook, added it to the USB enclosure, popped it into my Sun Blade 1500 and mounted the slice layout on the hard disk. Even if I didn’t already know Solaris, the SPARCbook manual that I downloaded detailed everything possible about the hardware and the Solaris operating system, including the default slice layout
This is why manuals are so damn important! Communicating to people 20+, maybe 50+ years in the future, what you've done, how the thing works, so it can be fixed and put back into service. All of today's undocumented soon-to-be-black-bricks--cough, I mean, electronic devices--are maddening. If you are lucky, there is a manual somewhere online, and even then, probably not the level that details hardware for fixing and replacing. Heck, you're lucky to find anything that even documents all the software functions available![1] (Recent Xfinity TV thingy, I'm looking at you). The attention to detail in older manuals was incredible. Most things these days are throwaway pieces of junk, not expensive, serviceable items that have durable documentation.
[1] I mean, why would you document the software of a device that is going to upgrade its software transparently as the manufacturer races to finish the features it should have shipped with, or alter the UI to inject ads into it later when their revenue model doesn't work out? Besides, you expect consumers to just give up on it and buy the next thing in 3-5 years.
I once went into a music store and they sold an old studio tape recorder - I say old but it looked new, everything aluminium and all. The thing about that one is that it came with a manual that included all the schematics for the electrical components, so that you can repair, replace or rebuild the electronics if needs be. They weren't so worried about others copying them at the time, I guess - I mean it was a specialist product for a niche market, and just because you have the plans doesn't mean you can reproduce it in a cost effective manner.
Here in Brazil, in the 80's, (probably still happens today) you could buy suitcases filled with many pages of reverse engineered schematics of popular TV and radio sets.
If you had any problem with your TV, you could take it to a workshop; the technician could diagnose it, look up the schematic for that specific model and replace any defective part.
There's so much of this. Think of how many perfectly good battery cells from powerbanks end up in landfills because of cheap micro USB connectors that snap off the PCB.
> doesn't mean you can reproduce it in a cost effective manner.
But you could fix it if, e.g. a capacitor blew. And capacitors do have a tendency to blow. Thankfully recapping something typically doesn't require a manual, since they are labelled, but you get the point. Repair is important.
Not just professional equipment, pre-90s every electric/electronic hardware itself larger than a book had at least a high level block diagram on a sticker on the back or on an inside wall, and also more detailed diagrams on manuals. Complexities then went overboard and those diagrams became rarer.
I remember those diagrams also didn't include mechanical drawings, so the last part of your comment is right too. You aren't going to be making magnetic read/write heads or even DC motors from information taken from those diagrams, merely the topology of control circuits - which is an opposite view from how we'd see it today.
My TEAC A-X55 Mark II sound amplifier manual comes not only with the full block schematic, exploded view but with the full BOM and even all the PCB printouts... you can basically build the thing from scratch: https://imgur.com/a/hZeWFUF
That SPARCbook's price was $21,000. The site doesn't say whether that's inflation-adjusted to today's dollars, and, the internet being what it is today, the top search links and Wikipedia reference links point back to this site as the source.
Still, you're advocating paying $20-40,000 for a laptop with the dual justification that "not expensive" is bad and that 20+ years in the future someone can give a presentation on it to a user group interested in its novelty value. That does not seem like a value-for-money proposition.
Can't find an original ad but looks like similar SPARCbooks were $7,500 in their day. You can still spec out apples and thinkpads to this price, so I think the critique is legitimate.
> Introduced in August 1995, the PowerBook 5300 series were the first PowerPC PowerBooks, ... The 100 MHz 5300c, with active matrix color, sold for $3,900 for the 8/500 configuration, and $4,700 for 16/750.[1]
> Announced in March 1998, ... The PowerBook G3 Series started at $2,299 for 233 MHz with no floppy drive and a 12" screen, and cost around $7,000 fully loaded.[2]
> If 2007 had a title, it might well be "Year of the Laptop." The US notebook market boomed in 2007, with laptop shipments rising 21 percent to a total of 31.6 million units. Desktops still outsold laptops in 2007, but the gap between the two has shrunk as desktop sales continue to decline.[3]
Laptop PC prices came down from the $6,000 range to the $2,000 range over the decades leading up to 2005 or so (they didn't climb up from zero), at which point the laptop established itself as the default form factor for personal computers. So going back in time, they only get more expensive, despite inflation.
Back when engineering software ran on Unix, exclusively, you had no choice but to use one of these. I remember the Mentor sales guy having one to demo the latest IC design software.
> This is why manuals are so damn important! Communicating to people 20+, maybe 50+ years in the future,
Nobody making a commercial product at consumer (or general business-friendly) pricing today, cares about 50+ or even 20+ years into the future. You damn well better be selling something new to that customer, not supporting them 20 years in the future for your otherwise obsolete product.
I would say that if this is the justification, then documentation be damned.
This goes for super expensive digital items too, and it's really sad - High end Phase One gear, and basically every other manufacturer in this market, has so much hacking potential. It's all just serial interfaces, so far, but you have to reverse engineer it all :/
What you said can be found in areas like industrial, aerospace and the like, as the manuals are part of the product delivery contract, but is not valued in consumer electronics anymore.
Not really. Apple's success merely shows that their methods are good for making them lots of money. There are many ways to judge the quality of a product. I'd argue that how much profit it makes is a bad one for literally everyone except the seller.
The Apple II manuals were Great Literature, not merely Excellent Documentation.
The listings of Woz's riveting 6502 code and his brilliant schematics were the Jane Eyre of its genre.
Or maybe it's more appropriate to compare the fold-out Apple II schematics to the original Marilyn Monroe centerfold of Playboy's first issue in December 1953.
I got the chance to use an SGI Fuel in the early 2000's, and also some O2's. I don't know exactly why; I never bought one, and I don't even like proprietary or closed solutions, but those 90's UNIX workstations... I have an unreasonable attraction to them.
I remember trying some of them for the first time in the mid 90's. I was really impressed by Sun SPARCstations and IBM POWER machines. Computers on which you could spend the whole day programming with no threat to system stability, even after consecutive segfaults, impressed me. And it impressed me even more knowing that in the UNIX world it had always been like that. Seeing an x86 running Windows again felt mostly like a toy after that. Certainly one of the reasons that influenced me to use Linux.
Those machines were expensive, in the range of tens of thousands of dollars. So working on one of them was a symbol of status and success. Owning one of them, to me, was like owning a yacht or an exclusive sports car.
One thing I like about cheap ARM SBC's these days is their performance... these machines perform, I think, about the way those UNIX workstations felt. Linux is way more advanced than those old unices, but the feeling is still there. I can install wmaker and get a very interesting nostalgic feeling.
Just to try it, I installed the latest CDE version on my experimental Debian 11 system. It is incredibly usable and performant. Being able to have a well integrated system, with everything needed for development was just another level in those days. While on the PC people rejoiced because of Delphi and Visual BASIC, you could write real GUIs in C with CDE and application builder. I'm not claiming it is better in any way, but it felt incredibly more powerful.
I learned 3D modeling on an SGI O2 back in the late '90s, and subsequently also got started with programming on it. I still remember the joy and rush when my programs compiled successfully, and then half the time wondering how the heck they built in the first place. Not much has changed since. ;o)
I still have that O2; it's packed up in its original box in my apartment. I also have two of those old SGI flat panel displays, which were pretty amazing back then, when flat panels were basically unheard of. I think one of them is broken though, sadly.
I've been meaning to boot it back into action, to see if any of my old "art" is still on there. If it even starts, that is.
Thanks for your comment, I had no idea. Out of curiosity, do you know if it's dangerous or otherwise unadvisable to try and start it before making this modification?
It did in fact start back up again, but to what extent I don't know because it seems the power supply to the flat panel I have is fried. The O2 did play the guitar sound when starting though, and boy did it stir up memories!
The fan is also a lot louder than I remember. Thanks for the tips sir, I'll try to get the screens running to see if I can find any fun data on the drive.
Thank you, I'll check it out! I did get the O2 started, but no screen because the power brick seems to be fried. Gonna see if I can score a new one, but otherwise I'll ask the user group to see if I can access the machine some other way. Many thanks!
My main memory from working on SGIs in the 90s was the mess with mixing 32 bit and 64 bit libraries and such. It's long enough ago that I don't remember the details, but I remember it always being a pain to compile stuff because of having both 32 and 64 bit stuff present on the system.
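For anyone hitting the same confusion on a modern system: the word size a binary or library was built for is recorded right in the ELF header, at byte 4 (EI_CLASS), so a few lines of Python can tell them apart. This is the standard ELF convention, nothing IRIX-specific -- just a minimal sketch:

```python
# Identify whether an ELF object is 32- or 64-bit from its header.
# Bytes 0-3 are the magic b"\x7fELF"; byte 4 (EI_CLASS) is 1 for
# 32-bit (ELFCLASS32) and 2 for 64-bit (ELFCLASS64).

def elf_class(header: bytes) -> int:
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    ei_class = header[4]
    if ei_class == 1:
        return 32
    if ei_class == 2:
        return 64
    raise ValueError("invalid EI_CLASS byte: %d" % ei_class)

def elf_class_of(path: str) -> int:
    # Only the first 5 bytes of the file are needed.
    with open(path, "rb") as f:
        return elf_class(f.read(5))
```

Trying to link objects of mismatched classes is exactly what produced those compile-time headaches; `file` and the linker are reading this same byte.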
This thing was like the coolest/rarest thing back then, it was almost legendary cause no one had seen one but everyone rumored they existed.
I had various Sun & SGI workstations at my internships in the 1990s.. all these machines were just cool. The SGI Indy wasn't super impressive, but the fact that it had a video camera was pretty insane in 1996 or so when we saw them. People were so enamored of the video camera that if you had an Indy in the lab at your office, people would take so many videos/pictures with it that they'd fill up its disk space. A lot of people I knew got their first digital picture of themselves from an Indy, because digital cameras were horrible & stupidly expensive at that point. The O2 was a pretty amazing machine too.
There's something retro cool about the new Apple Silicon Macs, it's like we've returned to the days of RISC Unix machines finally. Different OSes and Platforms were much more diverse back then and it contributed to a real sense of wonder when you got to work with a really unique computer. A lot of that is lost today where it seems like 99% of everything is Windows, OSX, or Linux. My first job we were testing on like 15 different versions of Unix + Windows NT.
I still have Ultra 1 Creator with 1GB of RAM that I plan to use for some oldskool development. At the moment it has Solaris 2.5 and SunPC DX2 card for running VB 4 and VC++ 1.52 on Win 3.11... And also for playing Castle of Winds.
Modern hardware and software feels so boring compared to all this madness...
I had a SPARCstation at my first full time developer job. The Sun type 4 keyboard, OpenWindows, 21 inch Trinitron... it was so exotic and different I felt like I was being paid to live in some hacker dreamworld.
Moving to Windows machines after that was a real drag, but the Mac brought me back to some of that UNIX magic.
I used one once in a server room in a quasi-government agency in the very early 2000s. Compared to the plastic Toshiba Windows junk around at the time, it was a total revelation. I tried to get my boss to buy me one for customer demos; he negotiated me down to a Sun Blade 100 for my desk instead. I still felt like I'd won.
That machine is fabulous. That the author had clear and correct documentation to be able to get into the system after so long is a testament to the manuals.
As an aside - It's a bit strange to think how cyclical things in tech are, now we're back at a strange echo of the classic bespoke architecture UNIX Workstation phase with Apple's new series of computers.
>As an aside - It's a bit strange to think how cyclical things in tech are, now we're back at a strange echo of the classic bespoke architecture UNIX Workstation phase with Apple's new series of computers.
I see what you mean. Let's see: IBM had (and has) POWER (Is RT/PC's CPU an ancestor of POWER? Or a different genus?) HP had PA-RISC, DEC had Alpha, Sun of course had SPARC. (Oracle still sells new SPARC hardware, yes? Although I doubt any true workstation-class devices.) Did Apollo have its own custom silicon? Who else am I forgetting from the Workstation Wars days?
Oh yeah I remember that, I think my uncle had one. I remember being mesmerized with the motion of the keyboard, and my uncle getting pissed with me opening and closing it repeatedly to try and figure out how it worked. Good times!
I had one running OS/2 and it had a PCMCIA modem card I used to dial into the bank's token ring network. That thing was the bees knees back in the day!
Sun made such fantastic products for their time and it's a real shame they never made it into the next millennium intact and profitable. So much innovation happened there.
I worked for Sun. The company mindset never shifted from building big-iron megaservers. Which is a shame because the workstation market lulled for a bit, but is actually back with a vengeance nowadays with heavy-duty GPUs, beefy CPUs, etc, for media and video editing--think $20k workstations. Apple has a slice of that market and a few PC manufacturers that focus on that, and it's very profitable. Not million dollar server profitable, but profitable. Sun neglected its own very profitable workstation line in favor of the server market, which was completely rocked by the dot com bust.
Sun, with Java, which hides SPARC vs x86, could have made the entire cloud happen a decade earlier. But SPARC was a performance loser compared to Intel, and Sun just stubbornly refused to see the horizontal-scaling forest for the giant E10k trees.
I don't know that SPARC was such a performance loser to x86 so much as Intel simply having a commanding lead in process technology at the time. Even Apple capitulated. That lead gave them a market domination that allowed them to reinvest (think Amazon) in their own fabs. x86 was a dog that Intel spent $1B/yr on improving so that your code ran fast. They took that strategy to the bank for as long as they could which was at least a decade longer than anyone thought they could.
As I think you're pointing out, Java wasn't a pivot. It was rather an excuse not to pivot. Sun really needed a Gates style pivot with Internet Explorer or a Bezos style pivot with his services manifesto. Instead you got a ponytail.
Also, before missing the cloud, Sun missed routers. Still, they were a pretty successful failure. We should all fail so well.
SPARC rapidly became a performance loser to x86. And, at the time, Intel had no process advantage. None at all.
Intel Pentium II was almost exactly 100% as fast as UltraSPARC IIi, on the same process (350 nm), at about 1/20th the cost. That was the last competitive Sun SPARC product, after that they were all brutally slow. The less said about US III and US IV, the better.
By the time that Sun put itself on the auction block, their internal CPUs weren't even good enough for legacy customers. They were obliged to start re-selling Fujitsu SPARC products just to keep SPARC loyalists inside the tent.
You're right. When I compare SPARC with x86, they seem to have been on the same process node at roughly the same time. I probably meant manufacturing rather than process. Intel at the time was colossal. It was possible for an innovator like Transmeta to come along and develop low-power designs. But then Intel could motor past them on a marketing promise, engineering, and finally outmanufacturing them. This manufacturing prowess was a lot of the reason for Apple's PowerPC-to-x86 switch in 2005. Moreover, advantage begat advantage, until x86 ran into various walls.
Things have changed since and now TSMC, Transmeta's fab then, is the new colossus. And Apple has even dumped x86 for their own ArmV8 designs (manufactured by TSMC).
It really is amazing that Sun and HP had the hubris to design their own RISC processors in SPARC and PA-RISC; everything that was wrong with the Unix workstation market in the 90s can be summed up there, in a way. Proprietary CPUs and custom repackagings of Unix, all slightly different, but in reality not different enough to justify it. And this just left room for Microsoft and Linux and x86 to come up the middle and make them all obsolete.
Lots of vendors throwing around "open" this and "open" that but none of them meant it. Just try writing a GUI app for a Unix machine in the mid 90s. Motif was closed source, with licensing fees. Things like GTK and QT were still primitive. Xt was garbage. It was just a hodge podge of things with the assumption that anybody entering the market had hugely deep pockets.
If they'd all just united around POWER architecture at least maybe they'd have had a chance?
In the 1990s it actually made some sense. If you go back to, say, 1993, every RISC vendor was selling chips that were much, much faster than x86. It was working like gangbusters.
The problem is that development costs rose and rose and rose without bound. Even if you could outsource fabrication, just designing your chip would bankrupt you.
HP, to their credit, saw the writing on the wall. The Itanium started as a joint Intel/HP project. It was HP's attempt to get Intel to pay for their new generation of chips, and sell it to the broader industry, in the hope they could have a new generation that would be (somewhat) backwards compatible with PA-RISC, but get off the treadmill of escalating development costs.
It was a clever plan, but as we know now, it didn't work worth a damn. Practically nobody but HP actually bought the IA64 chips. Intel quickly realized it wasn't worth any further investment and let HP's badly-needed chip series wither on the vine.
Sun didn't even have that much of a plan. They kept making chips they could not afford, and when "Rock" threatened to bankrupt the company... they just gave up. Even though giving up also threatened to bankrupt the company.
All of this is true except that there was no need for every vendor to have their own RISC ISA.
Which is why I'm saying with 20-20 hindsight it feels like they should have all just piled in on POWER. Of all of them born in that era, it's the only one still around. (Apart from ARM of course whose target was something else. MIPS is kinda just on life support.)
I don't remember the politics around the consortium there. I guess it could be that Sun and HP and the others just saw IBM as too much of a competition to want to be in bed with them. But given most of these vendors had shipped 68k arch workstations previously, I would have thought they'd be down with Motorola's offerings?
The problem in the end is these people all saw the other workstation vendors as competition rather than the actual competition, which was Windows/x86 (and later, Linux).
POWER was not originally open for use by others, under any license terms. It also was on the trailing edge of performance for many years. The original POWER 1 and POWER 2 boardsets were enormous power-sucking bastards with price tags to match. (Yes, boardsets. They were not even microprocessors.)
SPARC and MIPS were relatively open from day one but there were fewer takers than you would expect
SPARC had a number of licensees, but most of them chose to either re-use Sun software (Fujitsu, Solborne) or sell their chips directly through Sun (Ross, Weitek)
MIPS had many licensees but it ended up being bought by one of its most successful customers, SGI. Turns out, controlling the product direction is pretty important!
The point is, there were a lot of market forces piling up to drive people to play their own game, even if it looks very silly 30 years on.
Sun experimented with an UltraSPARC-branded ultra-wide CMT design in their last days -- Niagara. It was very unsuccessful, because, realistically, what can you do with 32 low-performance cores that you couldn't do with 8 high-performance cores?
The answer turned out to be "nothing"
------------
The "conventional" UltraSPARC designs -- US IIi, US IIIi, US IV, and the canceled "Rock" project -- were just really very mediocre chips sold at tens of times the cost of a similar x86.
Ironically Solaris with SPARC ADI is one of the few UNIXes where C is properly tamed with hardware memory tagging, while Intel borked yet again their attempt to provide such capability on their CPUs (MPX).
Oh, it was totally. Sun lost at least two generations of microarchitectures to mismanagement. They resisted doing out-of-order execution far too long and they just could never keep up with single-core performance. Even process technology would not have been able to rescue the comparatively sophomoric micro-architecture. Intel and AMD and IBM just totally outclassed them.
I used SPARC for a decade, from 1998 to 2007.
In all this time all the SPARC servers and workstations that we had were ridiculously slow in comparison with Intel/AMD CPUs.
Nevertheless they were used because there were a lot of EDA/CAD tools that were available only for Solaris.
As soon as Cadence, Mentor etc. ported their programs to Linux, SPARC was dead.
Towards the end of that decade, the difference became extreme, my laptop with a 64-bit AMD Turion CPU could run a SPICE simulation much faster than a very large and very expensive multiprocessor SPARC server.
Despite their slowness, the Sun or Fujitsu computers had many nice features before they became also available on Intel/AMD computers.
For example, in 1998 I was impressed that the Sun workstation we had could be powered on and off from its keyboard, because this was before it became normal for Intel/AMD computers too. At that time we still had PCs with PC/AT power supplies, not ATX power supplies, so power on and off had to be done with traditional switches.
I remember impressing my boss in about '97 - I got the first Mac in the company as my work computer, and I could switch it on (and off again) with a dedicated key on the keyboard. That was unheard of in his PC-centric world.
ATX power supplies did become popular soon after that, though.
It was really hard to resist the tidal wave of commodity x86 hardware at that point in time. Nobody did, not even Apple. And the server market got eaten by Linux, Solaris had no serious advantage anymore despite being a quality product.
It really is a different story now, though. There's room for more diversity, though they'd probably have to have made the switch to Linux.
Open sourcing Solaris earlier on before Linux ate the whole market -- or embracing Linux, or taking some hybrid approach, could maybe have kept them relevant.
I've been posting those amazing old tech adverts out of Byte, Compute, Antic and Zzap64 magazines from the 90s, 80s and older, looking at adverts for retro games, retro electronics and more. Hope to see you there.
I read this article and then went on to a few others. The author of this blog has something, I don't know what it is, but I like it, it brightened up my morning. What a great bit of nostalgia, thanks for posting this Marco!
> Moreover, the source code files, test data and /etc/passwd entries had the full names of the people on the project that I could easily Google. Many of their LinkedIn profiles or academic biographies listed their detailed work on the project during their tenure at Nortel. This information was key to me constructing what the project was and where it went.
Looks like Nortel wasn't very careful with their company data... it may be ancient today, but I guess when that laptop was first taken out of service, it was still a bit "fresher" and more valuable?
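For anyone wondering how full names end up recoverable from /etc/passwd: the classic format keeps them in the fifth colon-separated field (GECOS), traditionally "Full Name,office,work phone,home phone". A quick sketch of pulling them out; the field layout is the standard one, but the sample entry below is invented:

```python
from typing import NamedTuple

class PasswdEntry(NamedTuple):
    user: str
    uid: int
    gid: int
    gecos: str   # traditionally "Full Name,office,work phone,home phone"
    home: str
    shell: str

def parse_passwd_line(line: str) -> PasswdEntry:
    # Classic layout: user:password:uid:gid:gecos:home:shell
    user, _pw, uid, gid, gecos, home, shell = line.rstrip("\n").split(":")
    return PasswdEntry(user, int(uid), int(gid), gecos, home, shell)

def full_name(entry: PasswdEntry) -> str:
    # The full name is the first comma-separated GECOS subfield.
    return entry.gecos.split(",")[0]
```

So `full_name(parse_passwd_line("jdoe:x:100:10:Jane Doe,Dept 5:/home/jdoe:/bin/sh"))` hands back "Jane Doe" -- exactly the kind of breadcrumb the author followed to LinkedIn.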
My first real job included system administration on a pair of SparcStation 20s (which were replaced with a Sun Enterprise 450, which was then replaced with two Sun Fire V250s). The screenshots bring back fond memories.
ISDN was very, very popular outside the United States. Hell, Tokyo's payphones used to have ISDN jacks for laptop users.
Tadpole later became a vendor that almost exclusively targeted the US military, but it didn't start out that way. I imagine those ISDN jacks saw a lot of use by European and Asian customers.
I went to work for a US bank in London as a sysadmin in 1994 and pretty much the first thing they did was have BT install 2x ISDN channels in my flat. It was incredible speed with what felt like almost no latency back then. What a treat! Moved to USA (also to work for that US bank) and it was like "ISD what???"
ISDN was still prevalent in commercial settings for several years. Businesses would commonly have an ISDN PRI circuit with x channels for data, y channels for voice, and z channels for BRI miscellaneous applications.
Dial-up still very much reigned supreme in the US in '97. However, we had finally stopped saddling it with punitive pricing models that discouraged people from actually taking full advantage of the Internet.
I don't remember cable modems becoming commonplace until maybe 4-5 years later. (though there was a fair bit of overlap, of course)
We had cable at our early-20s-hacker-raver-crazy-person crashpad in Edmonton, Canada in 97, 98. And DSL was available around then, too.
The year before, though, I worked in an office in Toronto with dual ISDN hookup, and it felt fast and state of the art but holy crap... looking back it certainly was not.
Telephone tariffs were (and are) state-specific. At least in California, for POTS, unlimited local calling had been available for a long time and wasn't too expensive. I don't think California ever got an unlimited ISDN tariff though, which made that a very expensive proposition.
I think this is a big part of why a lot of the US got cable a bit later than other places.
Because we dropped hourly billing for ISPs and didn't really have a tradition of time-billing for local POTS calls, dial-up Internet really wasn't very expensive at all by '97.
As a result, cable-modem services were probably quite a bit more expensive than dial-up when they were new for the majority of people.
This was the era when we were still trying to convince most adults that the Internet was actually something they needed access to, so justifying the kinds of pricing we have now would likely have been impossible for all but a handful of geeks in those days.
The @Home venture got me "high speed" internet over coax in 1996, from the same company I got my cable TV from. It wasn't DOCSIS-compliant. Road Runner was a competitor at roughly the same time.
Yeah, I'm from SE Florida, so that timing sounds about right.
I recall when it was first being rolled out, it was some sort of weird hybrid service that did phone-up/cable-down. Not sure how long that nonsense lasted, because they weren't doing that anymore when my parents finally got it.
I had several iterations of this in S. Fla.--first "MediaOne" 1-way cable (downlink via a General Instrument SURFboard SB-1000 ISA card, with uplink via modem), then an external SB-1200, which combined cable-modem downlink, modem uplink, and an Ethernet gateway. Finally, they threw all that away and gave us a DOCSIS modem.
I still remember what a pain it was to get that 1-way modem working on linux. I had to print the documentation out from Windows, and then reboot. On the other hand, learning how to get that driver[1] to compile and loaded as a kernel module was very educational.
Back in the dot com days, we had our whole stack running on Solaris (nothing clever, just WebLogic/Oracle). Our president (?) got a beast of a Tadpole to do live demos in the field. At the first major demo, he set up to present/pitch, got a cord wrapped up, and managed to spill a glass of water on it. (He then proceeded to get the funding anyhow, but we had all been looking forward to inheriting that laptop.)
It was nice to see Nortel mentioned in the article because I did my "gap year" at Bell Northern Research/Standard Telephones & Cables Ltd (R&D arm of Northern Telecom) back in the 90s and there were several of these SPARCbooks that the Meridian folks used to take with them on remote jobs. I was a mere youngster "helping" to write C code to convert Nortel switch signalling into OSI-standard CMIS. I never got to use one of the SPARCbooks of course but for that year I had an incredible SPARCstation IPX and learned so much from all the crusty old telco engineers. Great post about the SPARCbooks and what a great purchase!
When I started reading this article, I would have bet 100 bucks this laptop was used in a military / government setting for SIGINT (Signals Intelligence) software.
Solaris, solid metal hardware, tens of thousands of $$ - just screams SIGINT to me.
I agree. I have a Compaq Armada E500 lying around, and it's fantastic to even just touch the device. I'd love to use one with a similar build, but updated interior.
This pig actually weighs more than a 1996 Panasonic Toughbook that is built into its own briefcase with a carrying handle and everything, and can survive being dropped on its corner from waist height. I'd take that any day over a cheesy SPARC.