Hacker News
Don Estridge: A misfit who built the IBM PC (every.to/the-crazy-ones)
259 points by dshipper on June 6, 2024 | 126 comments



It's rare that a tech story brings me to tears, but I couldn't help feeling one welling up while reading the final paragraphs.

> Eventually, it was Wilkie who made the first move. Overwhelmed with emotion, his eyes red and swollen with grief, he stepped forward and detached the red rosette from the lapel of his suit jacket. It was the same one Don had given him years before. Leaning down, he gently placed the rosette on the casket.

It feels like there should be a movie made about this story...


I feel the same way. The end of the story is just sad. I wish more companies could break their own structures to offer rewards, bonuses, and more freedom to teams like this. The kinds of people who thrive on opportunities like this tend not to do as well in general corporate culture.

So many times a relatively small upstart team with enough freedom will accomplish greatness, only for corporate culture to completely destroy what they built.


Although I sometimes wonder how much survivor bias there is in these stories.

How many misfit teams failed?

Which is not to defend corporate hyper-control at all. But I suspect that knowing how often skunkworks projects work, and how success can be affected by different personalities and corporate contexts, might be useful information.


It is more complex than that. In some fields there was a culture of labs and special projects. IBM, Bell Labs, Xerox, and many other companies followed that route, and these were corporations with a lot of bureaucracy.

Good thread in /r/AskHistorians: https://www.reddit.com/r/AskHistorians/comments/1kxnd1/what_...


Over a decade ago, at a startup, I had the chance to interact with a senior Microsoft person who had joined the firm as CTO. One piece of advice he gave stuck out:

"Heroism doesn't scale"

The wisdom of it was instructive (and you can see it in Estridge's early struggles with success), but so too was the sense that perhaps this "wisdom" was part of what underlay Microsoft's malaise of the late 2000s against Apple and Google (and IBM's in this story and your final comment).


"Heroism doesn't scale"

You can pretty much replace Heroism with Leadership or whatever.

The question is: do we really need it to scale?

Corporatism thrived off the back of the industrial revolution. It is not bad; it has just taken us as far as it can go.

Something more decentralised and organic should take its place for large-scale human development and space exploration.


Microsoft has plenty of similar stories, perhaps most famously the story of how Xbox was developed.

I was lucky to be at MSFT during what might have been the last gasps of the era when a single person could have a huge impact, and it was something I saw from time to time, but by the '00s, and certainly the 2010s, it was in steep decline compared to the stories told about the 1990s.

Edit: Someone posted below about how Windows 3.0 was also a misfit project.

There is also F#, which has had a huge impact on C#.

The entire async framework in .NET was originally developed by a small team headed up by one brilliant engineer.

Silverlight started as a project by a small team in Microsoft Research to see if they could shrink .NET's runtime down to be small enough to compete with Flash.


I love the story... But don't forget this story is a careful selection of events, with textual glue and interpretation, to make it feel like a novel.

Some statements belong more to the glue than to History, and they should remind us this is a real-life-based *novel*. I especially noted this one: "nobody at IBM had any real experience with [microcomputers]".

IBM senior management was certainly reluctant, but "nobody"... They even had microcomputer products that hit the market:

- IBM 5100 (1975), IBM's first personal computer

- IBM 5110 (1978), an updated 5100 aimed at a larger market

- IBM System/23, developed in parallel with the IBM PC and released one month earlier, in July 1981: many IBM PC features were shared with or taken from it (an 8-bit processor from the same Intel family, the 8080 vs. the PC's 8088; the very same expansion connector; reuse of expansion cards such as the serial card; the exact same keyboard, just in a different housing and with different function keycaps...)


Small fix: the IBM System/23 Datamaster was based on the Intel 8085, an improved version of the 8080 (binary compatible, more features, requiring less supporting circuitry).


For people who are interested in this era, read Fire in the Valley. I never read a book so fast in college; a great read!

https://www.amazon.com/Fire-Valley-Making-Personal-Computer/...


>> His divisional heads always had the same answer. Microcomputers—home computing—were a fad. They were low-cost and low-profit. Let others scrabble around in the metaphorical dirt of home computing. The real money was in the markets that IBM’s divisions already dominated—selling vast mainframes and minicomputer systems to large businesses. Cary was even told to buy Atari, which by then had established itself as America’s home video game system of choice. That’s all home computers were good for: gaming.

This attitude was so short-sighted. A friend of mine's dad was using their Apple II for work-related spreadsheets and thought it was the greatest thing ever. Not sure how IBM folks could not see this opportunity just because it was smaller scale than "what they did". 20 years later, Intel seemed to miss the mobile market due to a similar attitude.


None of those division heads were trying to honestly assess the microcomputer market. They were trying to stay in harmony with opinion at their level and higher in IBM.

That's what you get at that level in a company that big. Anyone who is two or more levels from the top of the org chart and also two or more levels from the bottom lives in a reality that consists entirely of the attitudes and opinions of other people, weighted by each person's ability to impact their career. If they saw that the building they were in was on fire, their thought process would go something like: "Bob isn't here today because he's at that sales meeting. When he hears about the fire he'll downplay it as something minor, so I shouldn't evacuate or he'll think less of me. But Bob's boss Don is here. If Don evacuates and I don't, that might make Don feel embarrassed and emasculated, and he'll take it out on Bob. So I need to evacuate if and only if Don evacuates. Bob won't mind me evacuating if Don does it. But Don's office is on the other side of that wall of approaching flames. Shit. My only chance is if he's in a meeting on this side of the building, so I can track him down and see what he's doing. Let me check his calendar real quick...."


A ridiculous and totally unrealistic example, and also the funniest description of office social politics I think I’ve ever seen.


It's close to what Clayton Christensen describes as disruptive innovation (his examples were the steel industry and radios): incumbents are forced higher up the chain by low-quality competitors ("home computers are only good for gaming") that answer an unanswered need well enough. Once these competitors gain a foothold, quality improves and incumbents have less and less of a market.


It’s exactly what CC was writing about. It is the way large organizations think unless someone has the courage to recognize that if they don’t start a new division, some little upstart will eat their lunch. Kodak laughed at the early, feeble digital cameras. Bell Labs sat on DSL because it would kill T1 revenue.

But man is it ever hard to execute. Andy Grove made every exec at Intel read Innovator’s Dilemma, but still it was hard to turn that ship.


You can see the parallels between Kodak creating the first digital camera in 1975 and Google publishing "Attention Is All You Need".


> Not sure how IBM folks could not see this opportunity just because it was smaller scale than "what they did".

I encourage everyone to get a copy of the Hercules emulator and a copy of the "Turn Key 5" MVS distribution and spend a little time using it. The mainframe idea of "computing" and "running jobs" is so comprehensively different it's really hard to map any previous consumer computing experience into it. It's also just a lot of fun because of that.

The whole experience is centered around efficient use of machine resources while providing a comprehensive batch execution and scheduling system for centralized job execution in this environment. The level of accounting, reporting, repeatability, and job language features is actually something worthwhile to dive into.

In any case, I'm willing to bet that IBM's internal ideology was that end users wouldn't want to do the computing themselves, but would instead go to middlemen who would purchase computing either directly from IBM or as some form of "remote job entry" through a third-party provider. To that end they were rapidly building out the infrastructure to do just that.

> Intel seemed to have missed the mobile market due to a similar attitude.

In both cases, they're still here, although Intel did a much better job of recovering from its past mistakes.


IBM on S/360 also did a lot of early interactive computing, not just batch processing.

What was common, though, then and now, was renting computers.

Essentially, cloud computing.


This IBM viewpoint of PCs is well entrenched in enterprise IBM Mainframe support teams.

I remember in 2010, having to present to a team of Mainframe techs at a bank about how we would be integrating an Identity solution (that ran on Windows servers) into their Mainframes.

They couldn't stop making comments about how useless Windows was and how it was just a gaming platform. One guy ranted and raged so hard that he stood up and stormed out of the room.

I remember my Project Manager who was in the meeting looked at them and said 'Guys, we talked about this earlier'...

I can see where that mindset comes from; these guys have been drinking the IBM Kool-Aid for a long time.


> Not sure how IBM folks could not see this opportunity just because it was smaller scale than "what they did".

Bureaucracy can be like that. Big bosses who might be really interested in increased profits rely on their subordinates to see the market, but subordinates are risk averse and don't want to change anything. Add corporate politics, people fighting not for innovations or for a market share but for promotions, and you'll get the picture.

It seems to me that, besides all that, they were ideological: they believed that size mattered and scorned those who made computers smaller than theirs. Ideology means people have trouble seeing anything that contradicts it. Peer pressure, social desirability, and all these things reinforce individual biases.


>> His divisional heads always had the same answer. Microcomputers—home computing—were a fad. They were low-cost and low-profit. Let others scrabble around in the metaphorical dirt of home computing.

Those views seemed to be relatively common. Just look at those home computer upstarts, many of which were scrambling to make business machines. Apple's follow-ups to the Apple II were the Apple III and Lisa, both intended for business. Commodore seemed to have business computers on their plate most of the time. Tandy also pursued the business market. TI was a bit of an outlier in that they were into minicomputers before personal computers, and quickly jumped ship when the latter turned out to be low profit. Maybe it was different in Europe, but certainly not in North America.

I'm not entirely sure they were wrong either. A lot of companies rose then fell in the home computer market. IBM themselves haven't pursued the home market in decades at this point. Many, if not most, segments consolidated to the point that there was just one company with any meaningful market share. It could even be argued that the real money these days isn't in the hardware or software for the home market, but in the services they enable access to.


They hamstrung and killed OS/2 similarly.


Yup. I'll be covering OS/2 when I look at operating systems in this period.

The level of foot-shooting by IBM on that one was ridiculous.


I was working at IBM in Boca Raton in 1990-91 when OS/2 was being developed. Wandering the hallways one afternoon, I passed by the OS/2 team where I overheard one engineer explaining to another engineer, "See, when you drag a file to the trash can, it should be a Move operation, not a Copy." I thought, OMG, this project is hosed. This was just a few months before it was supposed to be released.

The first release of OS/2 was a complete disaster. IBM was inundated with calls from customers who were having issues. They pulled every single person on the site into service as customer reps, without any training in OS/2! I was working on a UNIX project at the time and I was an Apple person - I had no clue how to help people with OS/2 or PCs but my manager did not like it when I tried to explain that. So I probably am listed somewhere as the worst OS/2 customer support person ever.


> Wandering the hallways one afternoon, I passed by the OS/2 team where I overheard one engineer explaining to another engineer, "See, when you drag a file to the trash can, it should be a Move operation, not a Copy."

But you didn't hear the other engineer respond, "No, no, you don't understand. This is an advanced prototype using functional programming. Moving the document to the trash would have side effects. Creating a copy and placing that in the trash can, however..."


Could IBM have succeeded given Microsoft’s betrayal? Or did Microsoft just give up on IBM ever delivering something?


Arguably they could have avoided Microsoft's "betrayal" as well. Ultimately, Windows 3.0 was a skunkworks project of a single engineer working against the corporate decisions of Microsoft, and only when it was much closer to complete did it start getting management buy-in.

Pretty sure that for a long time it was "cloaked" as a stop-gap solution, a continuation of the lesser-known "windows runtime embedded in application" option that some software shipped with.


They had different goals. But it's not clear that Microsoft's goals--a largely hardware-independent OS--ever made sense for IBM. As it turned out, a more proprietary PC architecture didn't really make sense for IBM either but that was sort of beside the point.


In the case of Intel, based on what I saw, they were just desperate/convinced to turn the x86 into a beachhead for mobile (but Flash will be the same!) but that ended up not making sense.


> This attitude was so short sighted.

Reminds me of Kodak.


A friend of mine's father was the head of Digital[0] in Australia and was later sent to Boston after being promoted. I distinctly recall speaking to him in around 1995 regarding Linux. He, along with, I believe, a large number of commercial Unix vendors, turned his nose up at Linux, suggesting it was a passing fad and would never challenge their "serious" Unix. This is interesting because Jon 'Maddog' Hall[1], then CTO of Digital (before it was acquired by Compaq in 1998, in turn acquired by HP in 2002), certainly did get it... I interviewed him once in Sydney circa '99 and had a good long chat once in Taiwan circa '01 after crossing paths by chance. He was traveling the world proselytizing Linux in shorts and flip-flops, had a firm belief in embedded Linux changing the world (Android[2] wasn't released until nearly a decade later in 2008), but was yet to announce he was gay (took another decade). Fast forward 30 years: practically nobody younger than 40 has even heard of the company, Linux is in every household, and the very idea of a commercial Unix is a joke.

Furthermore, in perfectly delicious irony, IBM's own modifications to Linux[3] to support the allocation of workloads to its giant server hardware have enabled the popularization of containers, further reducing demands for server equipment, increasing portability between desktop and server environments, and substantially drawing down the cost of provisioning for cloud services - the arch rival to traditional mainframe mentality. Today, in a world awash with dirt cheap and ever-present processing power and storage, as well as recently unimaginable levels of connectivity, we stand almost at the point where the term "server" itself has become an anachronism and consumption-oriented devices draw consumers toward "services" (often as paid for subscriptions).

IMHO some industries which will look nothing like today's version in 30 years' time: food, oil, transport, construction, clothing, health, and education. Carpe diem.

[0] https://en.wikipedia.org/wiki/Digital_Equipment_Corporation [1] https://en.wikipedia.org/wiki/Jon_Hall_(programmer) [2] https://en.wikipedia.org/wiki/Android_(operating_system) [3] https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...


> Not sure how IBM folks could not see this opportunity just because it was smaller scale than "what they did"

They thought it was a fad - that centralized systems (coincidentally, the machines they made) would be the computing platform that people would pay-per-minute/pay-per-hour/pay-per-month to access remotely. They wanted to be an information utility - a supplier to all - instead of selling a small, low margin box for one-time revenue.

"It is difficult to get a man to understand something, when his salary depends on his not understanding it." - Upton Sinclair


So you are saying that they were too far ahead of their time.


Sure, we can go with that


Are you talking about Microsoft Azure?


No, but thanks for playing


google, fecebutt, amazon, and azure seem to be doing okay with centralized systems providing continuing revenue


I didn't say the model was wrong - I was just saying they were focused on it


it's true, they were


I was living in Texas at the time of Flight 191. I was no fan of the IBM PC but it was still gut-wrenching to hear that the father of the machine had been killed.

https://en.m.wikipedia.org/wiki/Delta_Air_Lines_Flight_191

This was the crash that brought the term "microburst" into the national consciousness.


A fascinating training video provided to American Airlines pilots discussing windshear and microbursts; Flight 191 is used as an example: https://www.youtube.com/watch?v=FxXwqAm1a-Y


Wow, the wiki page is nearly as good a read as the professionally written article. I broke the IBM-related text under the Passengers section out into its own paragraph, as it changes from statistics to specifics. Hopefully it will remain that way.



There is a "high tech middle school" with his namesake in Boca Raton, FL, next to the former 1960s IBM R&D complex.

https://www.palmbeachschools.org/DonEstridgeMiddle


Reading Boca Raton reminded me: they used the names of cities close to the offices as the internal names for products. I co-oped at IBM in Austin in the late 1980s and we tested the new models of the PCjr, the Portable, the PS/2, and peripherals; the internal codenames were things like Boca Raton and Cedar Key (which none of us got until our boss told us), but I can no longer remember which was which.


Well, TIL. When I see a school named after someone who is not widely famous I assume they were an educator or politician.


Alonzo Mourning and his ex-wife have a North Miami HS named after them.


One consistent theme you get from business history is:

There is very little penalty for being wrong. There is often a huge penalty for being right, if the powers-that-be opposed you.


One of the biggest advantages of being young is still not knowing how real this can be.


that's why you don't take credit for anything


I grew up in South Florida in the 80s and 90s. I was familiar with the IBM office in Boca Raton, nicknamed T-Rex, and had a few school friends who worked at IBM on the IBM PC. From what I can remember, the Boca campus was like garden leave. IBM sent you there when they didn't want you but couldn't fire you. So it was full of IBM misfits who were thrown out of HQ. I never made the connection to Flight 191, and assumed it was because of Hurricane Andrew. But once the PC market took off, IBM wanted that team brought back into the fold. A lot of my friends moved to Cary, NC, in the area more famously known as the Research Triangle.

Miami, and South Florida overall, is kind of a crazy place to be. Every couple of decades people out west or from up north rediscover we actually exist. There are good engineers here but the West and Northeast have loads of money. So once CS/SWE really took off as a career the companies down here couldn't/wouldn't compete. Trust me, if you were an Asian/Indian kid in Florida in the 90s and told your parents you wanted to work in software they were going to beat some sense into you.

I've watched money flood into the area and then get carried back out when the financial tides changed. I always imagined Miami could have been kind of like a Silicon Valley, but the politics, money, and geography work against it.


The IBM PC in particular was probably never especially significant to IBM as a whole, whatever its effect on the computer industry generally.

As someone who had IBM as a client for a number of years, I observed that there seemed to be a lot of IBM folks who basically ended up in some Siberia, in one form or another.


I always console myself that one day New Orleans will be Miami with drinking water.


Classic example of Worse is Better.

All competing architectures were better than the IBM PC's: the PC BIOS was bad, the chosen processor's instruction set was the worst, and the MS-DOS operating system was bad. Only the keyboard was good.

What made it a winner was the open architecture, the 80-column screen, and the IBM name.


I did some historical research to understand why the PC caught on (it made no sense to my 1980s teenage mind).

A PC with an 80-column card, 64 KB of RAM, and a floppy drive cost about the same as an Apple II Plus with the same specs (US$2,700).

A BBC Micro would set you back about US$1,500 (£900). It didn't offer slots, but did have 80 columns standard. It also had a lot of ports.

You couldn't even argue that the 8088 was much faster than the 6502. BASIC ran a lot faster on the 2MHz Beeb than on the PC.

The only thing that makes sense to me is that the people who bought it on launch were planning to use more than 64KB of RAM (which was rather expensive then).


It was an open platform. In those days, you got a set of printed manuals, including schematics of the machine, when you bought one. It created an ecosystem of clones and expansion hardware. PC-DOS/MS-DOS provided an easy path to port CP/M software.


it's the soft power of the IBM name. it's not possible to describe it now, or for people nowadays to understand, what that name meant.

you went to a big company, they ran IBM. you want to get the same kind of computer 'that they use at work'. and what they have at work is IBM.

like this concept just makes no sense whatsoever in today's culture. computers used to be secret. they were tools of the priesthood. you wanted to join the priesthood. that's why you got IBM.

it's like coming to America: you want to learn to speak English, not Esperanto.


It was the 80's equivalent of green text messages vs. blue text messages.


not that many people did buy the ibm pc on launch, because you're right. it was released in mid-01981, and sold less than 200'000 in 01982. the commodore 64 sold 360'000 in 01982, and it wasn't even released until halfway through it. apple was selling a billion dollars a year that year, entirely from the apple ][ line; it wouldn't introduce the lisa until the next year. so that's on the order of half a million apple pcs shipped in 01982

i think ibm's reputation was a pretty important factor, not so much for making people buy it (though it did do that) but for convincing them that other people would buy it. it's easy to forget in 02024 just how dominant, and how malignant, that ibm was in the computing world at the time. they'd built aiken's first computer at harvard, they'd introduced ascii (then spent decades battling it—the pc was their first ascii product), they'd invented fortran, they'd invented relational databases (but kept pushing ims), they'd provided the hardware lisp and timesharing were developed on, and they utterly owned business computing, more thoroughly than microsoft does today

pretty quickly there was a lot of software out there for it. partly this was because programmers were convinced that users would buy it, but also, it was easy to port cp/m software to it. at a time when 'serious' pc software was almost entirely in assembly, you couldn't do that with the apple ][+ (unless you bought another computer from microsoft to plug into one of its slots to run cp/m). also, it was easy to make peripherals for it (though this was just as easy for the apple). and microsoft licensed ms-dos to other vendors like tandy and zenith, resulting in not-quite-compatibles like the z-100 (01982), the dec rainbow (01982), the tandy 2000 (01983), and the sharp pc-5000 laptop (01983). the software written for those machines was also usually easily ported to the ibm pc

you know how ebay and airbnb are utterly dominant because they own a two-sided market? if you want a place to stay, you go on airbnb because that's where the listings are, and if you want to rent out a place to travelers, you list it on airbnb because that's where the travelers are? the ibm pc owned a three-sided market: users, software vendors, and peripheral vendors. not at first, of course, but pretty soon; the s-100 systems weren't that dominant in a market fragmented between apple, commodore, atari, osborne, kaypro, etc.

then, once compaq came out with their ibm-compatible portable computer in 01983, ibm didn't own the market anymore. even less once phoenix started selling their bios in 01984. and that was what really made the ibm pc catch on: no single company's missteps could sink the platform the way commodore did with the amiga and the way apple did with the iigs's successors. ibm did in fact try to avoid introducing an 80386-based ibm compatible in order to avoid cannibalizing their minicomputer and mainframe lines, just as apple did with the iigs, so compaq beat ibm to market in 01986. by like a year!

there are also technical questions. the 8088 is a lot faster than the 6502 at running compiled c, especially with the crappy compilers of the time. it's also noticeably faster at numerical code. and the apple ][+ was running its 6502 at 1 megahertz, not 2. the beeb not having slots was a fatal flaw for much of the market; it turns out there are really a lot of peripherals that work badly over a serial port

and once ram prices came down a bit, 640k of ram became standard; the macintosh shipped with 128k in 01984 and quickly changed that to 512k. using 640k of ram on the 8088 was a lot easier than using it on a 6502, although intel's braindamaged segmentation scheme (avoided on the iigs's 65816) forced you to use 'garbage kludges' like lim ems (01985) to use more than a megabyte at all. that there was so much pressure to be able to use more than a megabyte as early as 01985 should tell you something about how far memory prices had come down and how important it was that the 6502 and 8085 couldn't handle more ram
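To make the segmentation point above concrete, here is a rough, purely illustrative sketch (in Python, not part of the original comment) of how real-mode segment:offset addressing works on the 8086/8088 and why roughly 1 MB is the ceiling without bank-switching schemes like LIM EMS:

    # Illustrative only: 8086/8088 real mode builds a 20-bit physical address
    # from two 16-bit values, so about 1 MB is addressable without bank switching.
    def physical_address(segment: int, offset: int) -> int:
        # The segment is shifted left by 4 bits (multiplied by 16), then the
        # offset is added.
        return (segment << 4) + offset

    print(hex(physical_address(0x0000, 0x0000)))  # 0x0: bottom of memory
    print(hex(physical_address(0xA000, 0x0000)))  # 0xa0000: 640 KB, start of the reserved upper area
    print(hex(physical_address(0xFFFF, 0xFFFF)))  # 0x10ffef: just past 1 MB (wraps on an 8088)

This is only a sketch of the addressing arithmetic; LIM EMS itself worked by swapping banks of expanded memory into a window inside that 1 MB address space.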


Besides their name, they chose their market correctly, i.e. where the money/momentum was—small business. Third-party support was great.

Most people in the early 80s had no idea what to do with a computer at home; that's why they were mostly bought by enthusiasts, tinkerers, and gamers. I remember one of the main uses listed on the boxes was "keeping track of recipes." Haha, imagine spending thousands of dollars on a giant clunky thing to organize recipes when a box of index cards would do.


keyboard wasn't great either, to begin with!


Model M keyboards are ridiculously good. They are consistent for every key, have great tactile feedback and are extremely durable.


The Model M came with the IBM PC/AT, years after the original PC and the subsequent PC/XT. Those came with the Model F keyboard, which had a terrible layout.


The Model M did not appear until 1985, nearly four years after the original IBM PC.


The Model F came out in '81 with the same buckling springs


The Model F sucked. The key mechanism was fantastic, but the layout was utter trash. The weird return key that's long vertically but has a one-key-sized raised section in the middle is the worst part of it.

You can buy brand-new keyboards with this mechanism now from some small business (sorry, don't have a link handy), but even they offer Model M-like layouts because the original Model F layout is so awful.

The only good thing I can say about the Model F layout is that almost all microcomputers at that time had terrible keyboards, though the reasons they were terrible varied. Compared to junk like the Atari computer keyboards, it probably seemed great, though of course the IBM PC was far more expensive. For really great keyboards, at that time, you had to look at the business-level terminal keyboards and such.


Probably this company: https://www.modelfkeyboards.com


I'm a bit curious about this part of the article:

> Unlike all of its major rivals—including the Apple II—the IBM PC was built with an open architecture.

The Apple II, designed by Woz, is famously open, to the point that the original model came with full schematics and ROM listings, which made it trivially cloneable. I'm curious why this isn't considered an open architecture.


The Apple II had a proprietary design. You could not clone it legally.

Just like posting source code does not make the code open, publishing schematics does not make the design open.


no, you could clone the Apple II very easily, and it was done many, many times

it's true that eventually Apple started suing people, but until Apple vs Franklin it was unclear if you could even copyright a BIOS. And once that was determined, people had to clean-room reverse engineer the BIOS but it was possible to make clones (see the Laser 128 and many many others)

this is just as open as the IBM PC was. You couldn't just drop a copy of the PC bios into your clone, you had to go to a 3rd-party reverse-engineered BIOS


Even more, Apple tried to block the Laser 128 and failed, since VTech didn't actually do anything illegal; they reverse engineered the Apple APIs and licensed BASIC directly from Microsoft.

They were actually pretty cool computers; half the price of the name-brand Apple while being almost fully compatible with all the software. I've only played with one once, but I thought it was pretty cool. Sad that VTech only makes crappy kids' toys now.


The key was that by publishing the BIOS source in the open, IBM made it hard to prove that you hadn't been exposed to the source code when you wrote your own BIOS.

The clean-room approach was a neat hack that solved the problem, but it was hard to find people who had never seen the IBM source and could prove it.

https://www.allaboutcircuits.com/news/how-compaqs-clone-comp...


This part is factually incorrect:

> The easiest way to set that standard wasn’t just to sell machines; it was to let other companies sell parts, software, and even whole computers that would be compatible with your machine. Unlike all of its major rivals—including the Apple II—the IBM PC was built with an open architecture.

The Apple II was effectively as open as the PC. And IBM didn't want clones any more than Apple did. Both the Apple II and the PC were eventually legally cloned, and neither company could do anything about it.


It might make an interesting business book--maybe I'll write it--what realistic business strategies companies that are widely viewed as failures could have followed through industry changes without boards/shareholders revolting.

I'm not even sure IBM is a great example. It had a really rough stretch but is still there as a very profitable dividend-paying large corporation even if it's not considered cool.


> "The system would do two things. It would draw an absolutely beautiful picture of a nude lady, ..."

Lena? (https://en.wikipedia.org/wiki/Lenna)


I did try and find out if it was. Sadly couldn't find anything to confirm it!


Probably monochrome for the very first prototype.


But she is not nude in this pic, right?


The square crop used as a test image is just her face, but the original, full-page shot shows her whole body, and she is very much nude.


I see, thank you for this tip.


there’s a sense in which IBM was right to fear the PC - it, in fact, killed their main industry, and they were not able to compete well in the new space, despite defining the standard. maybe they could have pursued it more enthusiastically and done better in the 1990s, but, it still would have been fighting against the tide


It also killed their very successful Selectric business!


In 1981 my sister was just out of college and worked on the PC as her first job. She said the loss of Estridge was devastating, and IBM changed some of their policies around executives traveling together because of it.


well in the long run the naysayers were right. the PC business is strewn with dead companies scrounging for pennies. it is basically a loss leader. FAANG - which of those makes PCs? oh right, none of them except Apple, which has like a 1 percent PC market share that is used to make apps and videos for phones.

my favorite screwdriver shop (PC parts, cases, cpus, fans, etc) just closed. one of the last in the city. decline in business.



i believe that estridge was being head-hunted by apple as ceo before they eventually hired sculley. sad thing is that if they had hired him, he might even be alive today, but he preferred to stay at big blue.


I'd never heard of this, but it's in the article, too. I think the tech world would have looked very different if Estridge had taken over instead of Sculley.


A lot of the time, I really am turned off by these articles in short story form but this one flows well. Good job author!


The end of this article is so beautifully written it made me tear up


For people who are interested in this era, Halt and Catch Fire is a terrific portrayal of the sorts of characters and battles that defined it, albeit from more of a startup perspective.

https://www.youtube.com/watch?v=pWrioRji60A


I really enjoyed the first season (especially the first couple of episodes) as the focus of the story is the release of the product and the struggles associated with it.

The second season seems to become the typical personal drama / relationship / betrayal / writers' kung-fu story arc / etc. that every series comprising more than one season seems to spiral into these days.

So, highly recommend the first season!


Came to say the same thing :D

The character of Cameron was highly inspired by Romero. In fact, the book Masters of Doom is kind of a blueprint for the show in some ways


One of my favourite shows of all time. Wish there was more of it...

Thanks for the reminder to rewatch it. I really need that show now.


Fantastic show, highly recommend.


Absolutely horrible show and I advise people to not waste their time.


Such insightful commentary!

I didn't find it a perfect show--especially latterly--but it captured a lot of the era, such as COMDEX, pretty well.


As insightful as the comment I replied to.


Precisely


Why is that?


It's extremely contrived and deus-ex-machina all the way through.

The "history" the show goes over is crammed into it's Drama first story. The history is there for nostalgia bait, not to celebrate the history or educate. That's why pretty much everything interesting that happened in computing was mangled into coming from the same like 5 people.

And the characters are all just narcissistic assholes who are self destructive because it means the show gets to carry on for another season. It also has the cliche density to feel like a high schooler's homework project.

If you find yourself addicted to reality TV drama, you will enjoy it.

Imagine if you tried to take "It's always sunny in Philadelphia" seriously, and also were watching it because you cared about Philadelphia history. I felt actively patronized the whole time.


I was annoyed by the C64s that were shown as having "C:>" prompts as if they were MS-DOS machines. I think I get where that was coming from -- people who were too young to remember the period were designing the sets and vaguely remembered that "old-timey" computers had "C:>" prompts, and given that C64s are old computers, assumed they did too.


Completely inaccurate analogy using IASIP and Philly. lol.


Wrong. That's like saying the Patrick O'Brian Aubrey/Maturin novels are "contrived." Hello, it's fiction. It isn't there to educate, except insofar as the situations the characters find themselves in teach you something about what it was like back then.

Season 2, though, I couldn't watch.


The acting is horrible. The writing is terrible. The story veers off into unrelated areas. There is a moment of a thoroughly unrelated homosexual encounter. (What was the point of that?!) Moments I know about are stretched too far to be believable. And on and on.

So bad I couldn't finish the series.


"The writing is terrible"

and what fiction have you ever written?

I lived through that era, too. As historical fiction, HACF is pretty decent. "areas unrelated" -- hello, it's FICTION about a specific era, and, without knowing what specifically you're talking about: maybe those "unrelated" areas are period-setting, or character-developing.

HACF not for you. Let's leave it at that. You can't please everyone.


I think the homosexual encounter is there to show how Joe and folks like Joe use sex as a tool to exert power over other people.


Sorry you have bad taste and don’t understand character development outside of in your face plot points ¯\_(ツ)_/¯


[flagged]


> The fact that one of the main characters is a "cool" girl at a time where all the characters were a bunch of nerdy guys

From what I recall seeing as a kid, and when working in my teens, that wasn't too unlikely.

There were a lot of women in computing, of various pre-Yahoo-IPO eras.

I think the relative numbers diminished dramatically with the dotcom gold rush, and the newer bro cultures, and gatekeeping.

Only in recent years have we started seeing more women in "tech" again.


I'll add Judy Estrin to the list of cool girls from the time, in addition to the non-famous ones I worked with. Her little company stomped our big company at making X terminals back in the 80's. We were probably 70:30 M/F among our ~two dozen new grads.


No, you're wrong. I was there. It was mostly guys, but there were plenty of women as well. Radia Perlman, to name just one famous one. I recall another from Lawrence Livermore whom I met at an IETF. I went to grad school with several of them.

There certainly is a lot of woke revisionist BS around. No doubt about that.

Edit: in my current book "This New Internet Thing" one of the characters, Cassie, wants to adopt a child as a single woman. I got advice on that process from Heidi Buelow (RIP), who was last at Oracle but worked on the Xerox Star, and adopted two children of her own (Cassie is not modeled on Heidi, except for that). Heidi unfortunately died while I was writing it, and I can't find any online obituary for her.

Since we're talking about "back in the day" naturally some of those people have crossed the Great Divide. You can find a few women in here as well:

https://decconnection.org/memorials.htm

Not a lot, of course. But not none.


i know a significant number of women who were important in the tech industry back then too


You guys have in fact proven my point by bringing up all two of them.


And since two is more than zero, it's not actually unrealistic to write fiction about such a character.

A fiction protagonist is often the outlier because they have more interesting interactions than the median type and hence there's more material for the story. This is a pretty basic tenet of drama, not some woke invention.


I agree.


This is the first time I've heard someone refer to DEC as "Digital." Is that an Australian quirk? Not that it's wrong, as it is part of the name and could likely be a regional expression and/or historically accurate, and in any case it's before my time in the industry.

> but was yet to announce he was gay (took another decade)

I don't know why this detail was included; not that it's anything to be ashamed of. It just doesn't seem relevant at all to the other points you have raised, and seems a bit insensitive or judgemental imo.


There's nothing wrong with including irrelevant details if they make a comment more interesting; and I don't see what was insensitive or judgmental in the GP.

"Please don't pick the most provocative thing in an article or post to complain about in the thread. Find something interesting to respond to instead." - https://news.ycombinator.com/newsguidelines.html

We detached this subthread from https://news.ycombinator.com/item?id=40598981.


> There's nothing wrong with including irrelevant details if they make a comment more interesting; and I don't see what was insensitive or judgmental in the GP.

It’s insensitive/judgmental in my view to casually mention the public disclosure or lack thereof of someone’s sexual orientation as a curiosity to be commented upon. The original commenter characterized their lack of coming out of the closet as some kind of moral failing or incongruity by juxtaposing it alongside other life changes that are matters of personal preference and nearly commodity-level interchangeability (their clothing and operating system choices). This is problematic because of the implication that being “out” is a mere personal preference with minimal stakes, and is offensive because it is close to shaming them for not being out sooner, and close to saying that sexual orientation is a choice, which is a known right-wing talking point and dehumanizing to many in the gay community.


It's a Digital thing.

IIRC, the company itself promoted the use of "Digital" rather than "DEC", with the latter being an accident of funding availability when it was founded.


Ha! I had no idea what parent meant with 'Digital', thanks for clearing things up :)


PC shutdowns have no place on HN, we deal in facts. It was included because I personally felt that it was an interesting historic tidbit... especially since he'd flipped from suit to flip-flops and commercial unix to Linux, but still kept this secret. No doubt in those days executives with career plans probably had to keep such things unknown. Can't comment on Digital analogues.

Re. classic PC shutdown emotive, character mud-slinging focused response below, check again: I did not state an opinion, I did exactly the opposite. Unsure what you think you are "calling me out" on - all I expressed was a (very public) fact[0] (furthermore, about which discussion has been intentionally invited), and when questioned, expressed the purpose for doing so was an interest in sharing that fact for others, since it is historically notable with respect to distinction from the current era. If my interest in sharing is now a matter subject to your offense, feel free to be offended, but don't post about it.

[0] https://www.linux-magazine.com/content/view/full/55727


Please don't get sucked in to spats like this on HN. I know it's not always easy but it's an effort we all need to make.

I realize the latter part of the GP comment was provocative but please don't respond to a bad comment by breaking the site guidelines yourself. That only makes things worse.

"Don't feed egregious comments by replying; flag them instead."

https://news.ycombinator.com/newsguidelines.html


[flagged]


[flagged]


[flagged]


I'll elaborate then.

@contingencies was telling a personal anecdote about his own life, which happened to involve meeting someone while that someone was still in the closet.

Unprovoked, you decided to take umbrage at this. There was no reason to do so.

In doing that, you dragged the thread off-topic and continued to escalate.

That was wrong. You shouldn't conduct yourself in such a fashion.


Please don't get sucked in to spats like this on HN. I know it's not always easy but it's an effort we all need to make.

https://news.ycombinator.com/newsguidelines.html


Sorry dang. I tried to explain myself clearly without making things personal, but there's no rescuing this kind of engagement once it starts. The only winning move is not to play. I'll do better.


> Unprovoked, you decided to take umbrage with this. There was no reason to do so.

I found it curious that they mentioned it, and described why I thought it was inappropriate. Contrast this with them, who presented their own off-topic anecdote without justification. The original “provocation” as you call it was by them, by presenting someone’s sexual orientation as a curiosity like some kind of carnival barker.

> In doing that, you dragged the thread off-topic and continued to escalate.

Asking for clarification and pleading for adherence to the site guidelines is not escalation. It’s a conversation. You and they are the ones making it adversarial, then gaslighting me about the original points I made.

> That was wrong. You shouldn't conduct yourself in such a fashion.

I disagree, but even if your points were justifiable, which I don’t believe they are, you are dogpiling by framing the issue as you have done. That was and is wrong. You shouldn’t conduct yourself in such a fashion.


Please don't get sucked in to spats like this on HN. I know it's not always easy but it's an effort we all need to make.

https://news.ycombinator.com/newsguidelines.html





