50 Years Later, We’re Still Living in the Xerox Alto’s World (ieee.org)
270 points by samizdis on March 1, 2023 | 84 comments



This is an excellent book explaining the wonders of Xerox PARC and what they wrought for us all. It also explains succinctly why and how Xerox failed to monetize all of the innovations created within. (Their sales force didn't know how to sell something without a lease and a per-imprint charge (where's the click), and management didn't really understand why what PARC created was important or how to monetize it. Also, the people at PARC had some measure of trouble knowing how to commercialize anything.)

I read the book as a 17-year-old shortly after it came out, and found it very, very compelling.

https://www.amazon.com/Dealers-Lightning-Xerox-PARC-Computer...

Also, here is a thread about the book with Alan Kay - https://news.ycombinator.com/item?id=22379275

Another thread with Albert Cory is here - https://news.ycombinator.com/item?id=31626413


> This is an excellent book explaining the wonders of Xerox PARC and what they wrought for us all.

There's another great book which puts Xerox PARC into context, "The Dream Machine" by M. Mitchell Waldrop [0]. It goes into much detail about how Xerox PARC came to be and how the ideas behind it developed. It's the best book on computer history I've read.

[0] https://press.stripe.com/the-dream-machine


Thanks for the shoutout. Dealers of Lightning is indeed excellent.

My book takes the path of putting fictional characters in it (except for Dan, who is me), characters who don't know how it's going to turn out. I had the help of nearly everyone who's still alive, and all the actual events really happened. Xerox really did have a guy with a roll of $100 bills for paying off the unions at the trade shows.

As for the 40+ year-old debate about what Xerox should have done, Jerry Morrison and I considered that at length here:

https://www.albertcory.io/lets-do-have-hindsight


That was a great post!

I didn't realize you did 3+Mail too. The whole 3Com office suite and hardware is something almost lost to history, same with the 3Com NBX PBX.


Thanks! 3+Mail is in the second book: https://www.albertcory.io/the-big-bucks

Actual events like me pulling apart the ThinNet Ethernet cable and taking down the entire company for a few seconds (no one noticed).

The Lisa adopting Hungarian notation because of the Simonyi influence (in one part of the code) - see the sketch at the end of this list.

DataPoint having a terrific LAN before anyone else, and going nowhere with it.

3Com buying Bridge, a leader in routers, and running it into the ground, meanwhile ignoring Cisco.

3Com'ers spending all day arguing about a post on WantAds, because they were so desperate for a social network.
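
A minimal sketch of what Simonyi-style Hungarian notation looks like, in C; the identifiers are illustrative only, not taken from the Lisa source:

    #include <stdio.h>

    /* Hungarian notation: a short prefix encodes each variable's type or
       role. Illustrative names only, not actual Lisa code. */
    int main(void)
    {
        const char *pszTitle = "Untitled"; /* psz = pointer to zero-terminated string */
        int         cchTitle = 8;          /* cch = count of characters */
        int         fDirty   = 0;          /* f   = boolean flag */
        long        cbBuffer = 4096;       /* cb  = count of bytes */

        printf("%s: %d chars, dirty=%d, %ld-byte buffer\n",
               pszTitle, cchTitle, fDirty, cbBuffer);
        return 0;
    }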


Yeah, I read about the Datapoint thing in the book about Datapoint. I was surprised; it was a parallel universe that I knew nothing about.


I thought I was the only person who read that :)


I suspect we are one of either tens or hundreds on here who have read it ;-)


Xerox PARC is the stuff of legend. It's an accident of time (ICs are becoming available), money (Xerox is happy to pour money into the thing with no real objectives), and people (a literal who's who of talent).

New, but possible, industries last only for a moment. Gathering talent for that moment and just letting them create results in an explosion. It's rare precisely because it's impossible to manufacture.

These days money is easy. People are harder. Right-Time is impossible. Computing is ubiquitous; nothing new will have the same depth as PARC. The biggest change since PARC has been touch screens. That's significant, but a single technology.

When PARC happens again it won't be in tech, but in some other field, something that's new, but possible.

I have lived a life in the shadow and shade that PARC created. I tip my hat, raise a glass and salute you all.


> Right-Time is impossible.

I think we're right around the Right-Time era for biotech and genetic engineering. CRISPR will easily have a Xerox PARC level of influence on the next era of tech.


Yup - we have higher res graphics, and (maybe) quicker networking, but we're doing almost exactly the same things, in almost exactly the same manner. Progress??

I despair that we've all got astonishing computer power but it hasn't got us far at all. If anything, we just accept poorer apps and less efficient systems. Though they do look prettier. Sigh.


Strong disagree. Quick off the top of my head:

Kids today on their laptops have more music tools than a $500,000 studio would have had 20 years ago.

VSCode compared to the tools I used in the 90s blows my mind every day. I'll be like 'hmm, you know what would be a nice tool/feature', go to the marketplace, and yep, someone made it and I am now using it 5 minutes after having the thought.

My son moved to a remote location yet can keep in touch with friends and family easily sharing videos/playing games together. I talked to my grandparents once every other month over a poor quality phone call. Cost of communication isn't even a consideration when I reach out to my son today.

https://www.ableton.com/en/ https://code.visualstudio.com/ https://store.steampowered.com/


Obviously I didn't make my point clearly, because your examples confirm what I'm thinking.

Electronic music has been around for ages - I was a great fan of Kraftwerk. Your kids have huge amounts of music; I had a bit - they are doing the same thing, just more of it.

You love VSCode - but you're using it to cut code at the same (or similar) level of complexity to what I was cutting 40 years ago. Maybe quicker, or easier, but essentially the same activity.

Comms is cheaper and easier (I remember when email became a thing - wow - we had a stable route from Oz to MIT - back in the uucp days!) but we're doing what we always did - talking and writing to each other.

Where are the new things? The things that didn't exist previously? What are we using this tech for, that we just couldn't do before?

The only answers that have been a little convincing are gaming and big data. It wasn't possible/practical to do large-scale simulation before. But many of our games are largely the same old activities ... shoot-em-ups, DnD. The tiny "retro" games might be one of the few things that just didn't exist before "micro" tech. And big data (e.g. ChatGPT) just wasn't possible before enormous amounts of accessible data ... but it's looking like it's just a great Eliza :(


There's VR. There's AR that's only been possible with smartphones getting better. I will be the first to say I'm disappointed that a lot of the cool, new technology just isn't practical outside of gaming.

However there are some new useful things like Apple Pay and Google Pay, more convenient and accurate map applications as opposed to old GPS systems, instant text recognition and translation of images on your phone. But these aren't based on flashy technology, just incremental improvements like your examples.


World-improving tech constantly appears.

E.g. payment systems and logistics solutions.

The cloud also changed a lot.


Considering what happened during the last 2660 or so years, I think whether payment systems improved the world is still up for debate.


How about the Web?


If you mean web not internet then there is a similar confluence there, Cern, money into research, Tim Berners-Lee and advocates of what he created.


I meant "The biggest change since PARC has been touch screens" -> hasn't the Web been a bigger change than touch screens?


In the early 1980s, I got a job at Western Union (the old telecom company, not the thing it became). I was programming special reports in 4GL languages like RAMIS and MARK IV plus a little COBOL, and I was asked at one point to see if I could find a use for the Xerox Star, a descendant of the Alto. It was the first time I had ever heard of a mouse, and I had never seen a real GUI before. The system was very cool, but my experience was that it was best used as a word processor and typesetting machine. It had a programming environment, but I didn't have a language manual and was never able to write any programs for it.

I loved the machine, and even wrote part of a short story using the word processor. They moved it to corporate headquarters after two weeks, so I did not get very far with it, but I remember loving the experience. Windows was a poor substitute for the feel of the Star interface.



> Windows was a poor substitute for the feel of the Star interface.

I'm intrigued. Any specifics about design decisions come to mind?


I saw one of these in the early 1980s at a Xerox engineering facility in LA.

It was shocking!

They also had a LAN where H/W engineers submitted their (graphically captured) schematics to the in-house PCB fab, via electronic transfer.

A week or two later, their prototype PCBs would arrive via the mail delivery robot!!! I'm not kidding, the robot would follow its route and beep in front of an individual's office when they had mail on the cart.

It was WAY more back-to-the-future than any bullshit dispensed by M$ and Goggle today.

And WHY haven't we advanced beyond the human/computer interface introduced 50 years ago? Quite simple really: technical innovation is not seen as driving shareholder value.

Why should a company invent something new when it's "more productive" to just squeeze their vendors, employees and customers more tightly?

This is really in the same vein as M$ brain damaging billions of people into thinking "this is what a computer is".

There are no current trends in opposition to this corporate domination.

If you think housing, economic equity, or employee rights are trampled today, just wait. It's only going to get worse, until the mass of the population wakes up to who is actually screwing them!

It wasn't hippy tree huggers that sent all US manufacturing to China, undermining the entire US workforce and leading to the current situation, where those same assholes are now crying about the advancement of China's efforts to enact its own world domination. CEOs paid for that rise in Chinese aggression by buying the crack of products produced with cheap slave labor, while continuing to dumb down the US population and sell the same computer that Xerox didn't know what to do with 50 years ago.

Prepare yourself, it's only going to get worse...


> And WHY haven't we advanced beyond the human/computer interface introduced 50 years ago?

I like the current interface of mouse, monitor and keyboard.

I hate using phone apps. The keyboard is too small, and I hate typing with my thumbs, and the screen is too small. When the iPad came out, Apple fanboys said everyone would only use phones and tablets within 5 years. It has been about 12, and a lot of people are still using laptops and desktops.

And voice interface will probably be a disaster. Everyone thinks it will be like "Star Trek", where they have actors who know how to deliver lines. Most people do not speak that well.

To give an extreme example: I know one guy (with a CS degree from UT, no less) who honestly says more filler words than actual words. "Um" and "ah" are his favorites. Then when he does get a sentence going, he stops halfway through, gives you another ten seconds of filler words, and starts a different one. I think if this guy tried a computer with a voice interface, it would achieve the singularity just to smack him.

Amazingly enough: His wife is from Taiwan, he met her when he spent a few years in Taiwan, and I have seen icons w/Chinese characters on his screen. I wonder if he speaks Chinese the same way he speaks English.

Robert Smith of the Cure said one day he and his wife noticed they had left a camera on after recording a message, so they watched the rest. He said they quickly devolved and were shocked at how dumb they sounded.


Just last night, I was faced with painfully "typing" Mythbusters into my TV; it offered to let me speak, and I groaned and tried it for the first time. It worked, to my amazement. But voice recognition (VR) does seem to work pretty well with my voice.

As for the "um" and "ah": people adopt a stilted, formal method of speaking with VR systems. It's no more natural than typing an email.


Frankly I wish more people spoke more formally all the time, myself included.


> And WHY haven't we advanced beyond the human/computer interface introduced 50 years ago? Quite simple really: technical innovation is not seen as driving shareholder value.

Well, in and of itself it doesn't. You need to come up with technology that has applied uses that people are willing to pay for.

The Xerox WIMP (Windows, Icons, Menus, Pointer) UI was (and still is) very well adapted to desktop computers with increasingly large screens and increasingly powerful CPUs (multi-tasking - windowed UI), as well as to humans, who are highly visually driven and adapted to using their hands (mouse) to manipulate things.

Finally, in recent years we are moving past WIMP, especially on mobile, in the form of user interfaces driven by gestures, voice and now (ChatGPT) natural-language interaction. These are again things that play to human interaction modalities, but they have only been enabled (as well as demanded, by the mobile form factor) by technological advance.


There's an element of "network effect" here. Most (non phone) apps use the WIMP interface, therefore all OSes support it. All the apps are written for the WIMP interface because 1) that's what the OS supports, 2) that's what all the other apps do, and 3) that's what everybody knows.

To break through that, a new interface paradigm has to be massively better - enough better that it's worth buying a new computer, a new OS, and all-new apps (from a very limited selection) to get it. That's an insanely high bar, which no alternative to the WIMP interface has cleared. So the WIMP interface continues unchallenged... except for phones. It's going to continue to rule the desktop/laptop world, though, and not because of "corporate greed". WIMP will continue because nothing is enough better to be worth the cost of switching.


Up until Windows 95, WIMP could have been toppled by something achievable and better. That was nearly 30 years ago, so maybe something better that couldn't be achieved back then is being suppressed by the status quo.

WIMP on handheld devices isn't great (ask early Windows Mobile). It's just too fiddly. But if touchscreen interfaces were better at a desk, we'd be doing that on desktops and laptops now. Touchscreens are readily available; it's just way more effective to have an interface that isn't occluded by using it.

Voice-based interfaces clearly work for some, but they're really fiddly for non-linear access.


> But if touchscreen interfaces were better at a desk, we'd be doing that on desktops and laptops now.

The ergonomics of touch desktops don't really work. Lifting your arm (big muscle groups) and trying to manipulate fingertip sized widgets (fine motor muscles) in concert is very strenuous. There's a reason writing apologies on a chalkboard used to be punishment in schools.

For one-off activities or very short sessions it might work for some people. However, the WIMP interface, with its proxied inputs of a keyboard and mouse, works far better for more people over longer durations.

Touch and direct input like a stylus works much better on displays that can sit horizontal (or mostly so). No matter how good a touch interface is, people just physically can't use them for long periods on a vertical display.


> Touch and direct input like a stylus works much better on displays that can sit horizontal (or mostly so). No matter how good a touch interface is, people just physically can't use them for long periods on a vertical display.

Yes, and if you set up a horizontal display for a desk, you're going to end up with neck strain. You could maybe do something at an angle, like a drafting table, but probably still balancing neck and arm issues. And you'd probably need to adjust display construction to reduce the distortion from viewing it at such a high vertical angle.


You're completely right. The horizontal display is only better than the vertical touch display; it's still worse than the vertical non-touch display with horizontal input devices.


    So the WIMP interface continues unchallenged... except for phones.
I used Siri on the iPhone for the first time while trying to return a lost item to its owner. The experience left much to be desired. Siri failed.


Do you realize the concepts for computing tablets came out of PARC as well?

Alan Kay gave a couple lectures at Stanford on "How to Invent the Future" and went over all the stuff that came out of there. https://www.youtube.com/watch?v=id1WShzzMCQ

Much of what we use today was conceptualized 50+ years ago. PARC's outputs paid off immensely for Xerox over a very long period of time since they picked a tech (laser printing) that was perfect for their wheelhouse. Companies and our culture these days don't allow for exploration like PARC was allowed. There is too much focus on monetizing everything and having an ROI.


Yes - so much of modern computing came from PARC. Ethernet too. The reputation of PARC though has always been that it was a squandered asset - they invented so much genuinely useful technology and Xerox failed to capitalize on the majority of it.

For a while, blue-sky research labs such as PARC, IBM Watson, and Bell Labs seemed to have disappeared, but arguably they are back now (different companies/labs, though) with a focus on ML/AI. I'm not sure which commercial labs are still doing fundamental materials science research, though.


I think we will see engineering innovation in the AI/ML space as new processors are developed. We may get more interesting software as well. One problem is that academics need million dollar compute budgets at the moment to even get close to the output of Google/OpenAI/FAIR. Hopefully the DOD, DOE, NSF and NIH and ultimately Congress will act.


> It wasn't hippy tree huggers that sent all US manufacturing to China, undermining the entire US workforce and leading to the current situation.

The working class still hasn't recovered from what NAFTA did to our share of wealth. (I'm trying, in vain, to find the tweet + graph where I learned this.)

29 years later and we haven't caught up. The TPP (NAFTA, but for China) would have made it worse.

Further reading: https://www.pewresearch.org/social-trends/2020/01/09/trends-...


The TPP specifically didn’t include China.


The entire point of the TPP was to exclude China...


and stay off my lawn!

I also wonder at the amount of DARPA/DoD/secret funds that went into the early advances, and where they are now (a CIA guy told me they saved Sun from bankruptcy early on by buying a bunch of machines, Smalltalk's killer app was a hyperlinked encyclopedia created by the CIA, etc.)


> Smalltalk's killer app was a hyperlinked encyclopedia created by the CIA,

no, it was the aliens on a side trip after they visited Roswell.

Smalltalk didn't have a "killer app."


There was one cool app called "The Analyst", used by the CIA, it is said. I once saw a demo of how to use it as something like an expert system. It had a spreadsheet where any cell could be any Smalltalk object. But it seems it has fallen by the wayside and was never ported to newer Smalltalk platforms (?)

http://www.bitsavers.org/pdf/xerox/xsis/XSIS_Smalltalk_Produ...

https://comp.lang.smalltalk.narkive.com/hZvXNNyC/xsis-the-an...


"killer app" is a term for something that breaks a technology wide open, like VisiCalc for the PC, or Mosaic / Netscape for the Internet. Do you really want to put The Analyst on that plane?

Smalltalk's never been any more than an intellectual influence, albeit a big one.


I think you are both right. The term "killer app" can be used for something that causes people to buy something else, no matter how expensive, just to have that first thing: VisiCalc for the Apple II, Lotus 1-2-3 for the IBM PC and clones (previous apps ran equally well on non-clone MS-DOS machines), and the web browsers as you mentioned.

Normally it is associated with making something very popular, and in that regard the Analyst certainly wasn't. But it did get a small group to buy absurdly expensive D-machines from Xerox just to run it, though I think it was more than those who bought them to run Lisp:

http://www.bitsavers.org/pdf/xerox/dolphin/1100_Scientific_I...

More about the Analyst from later in its life:

http://www.bitsavers.org/pdf/xerox/xsis/


> I'm not kidding, the robot would follow its route and beep in front of an individual's office when they had mail on the cart.

The future I was promised... sigh.


Yes, individual offices sound nice.


I love modern "actually we don't have enough space for all the people who come here to work" architecture.

Nice roof garden though.


“Why should a company invent something new when it's "more productive" to just squeeze their vendors, employees and customers more tightly?”

That’s why I don’t like having tech giants like Apple and MS. Their output in terms of innovation is very low compared to their size. But they are big enough to either suppress or buy any innovation that’s coming from smaller players.


> If you think housing, economic equity, or employee rights are trampled today, just wait. It's only going to get worse, until the mass of the population wakes up to who is actually screwing them!

A lot of people have been voting for who has been screwing them since they pulled the lever for Saint Ronny with both hands. Why would they realize anything now?


Hah, I love the rant. Maybe if we get psilocybin approved, all the depressed programmers and tech folks will develop the creative insight to build the next generation of PCs!


You can be too early.

The Alto in 1975, when I first saw one, was a very expensive machine. It was a minicomputer, a modified Data General machine. Affordable hardware to do that was over a decade away. Along the way was the Xerox Star (too expensive), the Apple Lisa (too expensive), the original Macintosh (too underpowered), quite a few workstations based on Unix and Motorola 68000s (expensive but usable), and the forgotten era of shared-logic word processors (affordable, but dumb). It wasn't until the late 1980s that a mass-market Smalltalk machine was affordable.


That was the entire point of what they were doing at PARC: build a possible computing future. They were not trying to build a mass market solution, they were trying to pull what could be commonplace 10-15 years in the future with smart, motivated people and money-is-no-object hardware budgets. Sure, they couldn't turn it into products in the 70's, but they would have had one hell of a product roadmap for the next 50 years if management understood what they had.


That was indeed the thinking in the mid-1970s. But that was just after Xerox's peak profit years. Wikipedia: "Following these years of record profits, in 1975, Xerox resolved an anti-trust suit with the United States Federal Trade Commission (FTC), which at the time was under the direction of Frederic M. Scherer. The Xerox consent decree resulted in the forced licensing of the company's entire patent portfolio, mainly to Japanese competitors. Within four years of the consent decree, Xerox's share of the U.S. copier market dropped from nearly 100% to less than 14%.[31]"


True, but it was never Xerox's thinking and they didn't do it out of altruism. The government was breathing down their necks re: antitrust in the late 60's (i.e. the peak profit years) and taking on some of the ARPA projects at PARC helped buy Xerox some time on that front. They also only had a 5 year commitment to the projects and based on a couple of oral histories I've heard, it sounds like they started dismantling things as soon as they could.


"Xerox PARC" is not one word. There was PARC, and then there was SDD, which was trying to turn it into a business product (not a mass market). Something that Xerox would sell for "knowledge workers" in Fortune 500 companies.

Xerox never had any interest in selling for home use, and it was a genuine shock when Apple proceeded to do exactly that, and office workers started bringing their Apple II's to work so they could use VisiCalc.

Something everyone at Xerox would prefer to forget: they rushed out the 820 to stop that shit, a CP/M based PC that they called, believe it or not, "the Apple killer." You probably think I'm making that up.

I was there. Star 1.0 had no spreadsheet, believe it or not. Spreadsheets were in people's minds, but it was way too late to add one in.

Lastly, it's claimed by multiple Xeroids that the laser printers made enough money for Xerox to pay for all of PARC's expenses, easily. I tried to verify that, but it's just not possible.


I wasn't there, but I've heard/read accounts from those who were... so I believe you. None of the 'serious' tech companies back then took 'personal' computing (whether at home or the office) seriously. They were too entrenched in the server room mentality. When the upstarts made it obvious how much business they were losing out on, they scrambled to shove some products out the door but ultimately failed, at least in part because they still didn't understand the space.[1] This is part of why none of them are around to any significant degree today.

[1] Another example was HP not wanting Woz working on their PC... I guess the 'professionals' had everything under control?


> None of the 'serious' tech companies back then took 'personal' computing (whether at home or the office) seriously.

With, of course, the minor exception of that easily forgotten serious tech company (IBM).


No, I would include IBM high on the list. IBM did a half-assed PC design to rush something to market[1] after being late to PCs, which was a significant factor in why they lost control of the PC market a handful of years after entering it.[2] They actually had a few shots at a comeback (the ThinkPad was a great laptop, etc.) but didn't like the margins and decided to become a services business instead[3]... how's that working out?

[1] They decided the OS wasn't that important... sure, let Microsoft own that. The CPU architecture wasn't that important... sure, let Intel own that. Just glue a bunch of off-the-shelf logic together for the rest. After all, the real value is in the IBM logo on the box. What could possibly go wrong?

[2] Sure, they were calling the shots for the 8086 and (mostly) 80286 generations. They tried to rectify their mistake over half a decade later with the PS/2 (i.e. proprietary everything) but by then it was too late. Compaq and the clone makers owned the market from the 80386 on.

[3] Out of the frying pan and into the fire?


I think this analysis is truthy, as opposed to true.

> did a half-assed PC design to rush something to market

the alternative to "half-assed" and "off the shelf" would have been "an IBM-wide committee taking 5 years to do an all-IBM device that cost $30,000." They "rushed it out" because the market was moving fast and a ponderous bureaucracy would have missed it.

For a number of years, "IBM-compatible" was the buzzword in PCs. The market was simply too big for them to keep owning it. There were still good profits to be made on customers who prided themselves on being all-IBM. They could have led the army instead of being the army as they had in the mainframe days. That would have required a subtlety that just wasn't in them.

The PS/2 and OS/2 were indeed mistakes and helped make them a joke.


> the alternative to "half-assed" and "off the shelf" would have been "an IBM-wide committee taking 5 years to do an all-IBM device that cost $30,000."

Which IBM did. They built the IBM Instruments Computer System Model 9000. Came out in 1982.


A hot seller, that was. Wikipedia says:

> Reasons cited for the failure of the System 9000 were its poor performance and high price, which led to the IBM PC being used where price was of concern, and to other 32-bit microcomputers being used where performance mattered.[10] IBM closed its Instrument division in January 1987, reassigning the approximately 150 employees that had worked for it to other positions.

150 employees? They'd use that many people to give names to the broom closets.


That story is in our page on what Xerox should have done:

https://www.albertcory.io/lets-do-have-hindsight

Miraculously, they broke free of the IBM bureaucracy. For a while.


The Alto could emulate different instruction sets via microcode (the way their previous machine emulated a PDP-10), and it came with a built-in emulation of the Data General Nova, but internally it was a very different machine. The biggest similarity was that it used the same ALU chip, the 74181.

The article mentions that "Kay gambled his budget on Lampson and Thacker's proposal, calling it the 'Interim Dynabook.'" That budget was for Kay's group to buy more Novas and add frame buffers and high-resolution displays to them, as they had already been doing. Smalltalk-72 initially ran on one of these modified Novas. With the default emulation, the Alto ran all this old code right away.

Later on the Alto's flexibility allowed them to do things the Novas couldn't, like implement BitBLT and run Smalltalk or Mesa bytecodes directly. So they were not only a bridge to the past, but to the future as well.
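
For anyone who hasn't met the term: BitBLT ("bit block transfer") copies a rectangular block of pixels from one bitmap to another, optionally combining source and destination with a rule such as XOR. A minimal sketch of the idea in C, assuming one byte per pixel for readability (the Alto's microcoded version worked on packed 1-bit pixels and supported several combination rules):

    #include <stddef.h>

    /* Minimal BitBLT sketch: copy a w-by-h block of pixels from src to dst.
       Assumes 8 bits per pixel for readability; the Alto operated on packed
       1-bit pixels and implemented the combination rules in microcode. */
    void bitblt(unsigned char *dst, size_t dst_stride, int dx, int dy,
                const unsigned char *src, size_t src_stride, int sx, int sy,
                int w, int h)
    {
        for (int row = 0; row < h; row++) {
            unsigned char       *d = dst + (size_t)(dy + row) * dst_stride + dx;
            const unsigned char *s = src + (size_t)(sy + row) * src_stride + sx;
            for (int col = 0; col < w; col++)
                d[col] = s[col];   /* "copy" rule; XOR/AND/OR would combine here */
        }
    }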

Some people who wanted an Alto but couldn't get one made their own:

https://en.wikipedia.org/wiki/PERQ

https://en.wikipedia.org/wiki/Lilith_(computer)


The Alto did things that even modern computers don't do as well. There's a demo I like to point at for this: https://youtu.be/AnrlSqtpOkw

Look how programmable this is by the user! Today's OSes and their division of software into "applications" erect barriers to software composition that don't need to be there.


Programmability only matters to users who can program - which is a tiny minority. There is no sense in which HN or a PhD-packed centre like PARC represents the interests of typical users, who are perfectly happy to use relatively dumb information appliances that can only be customised with ready made applications controlled through some vaguely graphical UI/OS.

PARC's real legacy was the creation of a workable model for non-expert UIs. The Smalltalk/LISP angle was never going to be more than incidental.

I'm personally a huge fan of it. But it's completely opaque to a typical user and always will be.

AI promises to be at least as transformative because it won't be long - more than five, less than ten years - before it allows complex customised context-aware task-oriented voice commands that aren't mired in the thousands of random switches, file location inconsistencies, and other dinosaur bullshit of the UNIX command line.


"Can program" is a spectrum, not a binary. It depends not only on the user's skill, but on the environment they're working in.

Introduce kids to Scratch and they will program. Have someone use Excel and they will program, though they won't say that's what they're doing!

> I'm personally a huge fan of it. But it's completely opaque to a typical user and always will be.

I don't think we can say this with any certainty. Most people have never had an environment like this available to them.


I guess the experience was faster than most Electron-based "apps" nowadays.


And yet slowly-slowly-slowly, we're moving back towards this: "These and the other mainframes and minicomputers of the era were room-size affairs, almost always located somewhere away from the user and almost always under the control of someone else."

More and more computing is going up into the 'cloud', which is just a "fancy" name for someone else's mainframe.


Today's cloud would seem like absolute magic to those of the mainframe era.


I wonder if Google is today's Xerox and whether all the research happening at Google is equivalent to the stuff PARC did in the '80s.


Doubtful. There are glimmers of good ideas, just like there were out of Microsoft and Apple research in previous decades. But ultimately, the scope and scale of what PARC was working on (and prior to that (D)ARPA) hasn't been attempted since AFAIK.


Anyone else spending an unreasonable amount of time thinking about why the mouse is on the left of the keyboard in this picture? I thought left handed people generally still used the right hand for the mouse? Is that not true? Why did they use this picture specifically, is the mouse on the left supposed to mean something related to the article?


I'm left handed but always used a mouse right handed growing up for a few reasons:

1. My family was poor, I didn't have access to a computer that was ours so I only used other people's and it was set up with mouse on right

2. Video games would require so much effort to rebind to play comfortably with mouse on left

Now, as an adult, I have a symmetrical mouse and occasionally switch which side I use during the workday (where I'm mostly just typing and not mousing) for comfort.


I'm left handed and can use the mouse on both sides. I started using it on the left when I was young, but was almost forced to use the right hand because of space issues in computer labs, where they try to fit as many computers as possible next to each other. Whenever I tried to use it with my left hand, I ended up entangled with the person on my left.


I do think that you are overthinking it.

That mouse is not shaped for the right hand. There is no reason to assign significance to it, in my opinion.


Xerox Alto, the PayPal Mafia. Any other groups figure importantly in the history of tech? I love reading about this stuff.



Well, the Macintosh team certainly thought they were.

There's a YT video about General Magic that tries to paint them that way, and they thought of themselves that way at the time.

https://www.youtube.com/watch?v=JQymn5flcek

Take it all with a grain of salt.


TOTO's Universal Design Research Centre in Japan. Over 900 researchers improving the human condition. https://www.totousa.com/toto-universal-design-center


You might enjoy “The Idea Factory” about Bell Labs. I found it hard to put down.


I personally live in the world of smartphones. I also tried VR: both the Valve Index and the Quest 2, and the second one is the nearest thing to what I'd call a breakthrough. Another couple of years and we may have something really interesting in that field.


We are living with computers similar to the Xerox Alto. Steve Jobs was inspired by the Alto.

Then we also have The Mother of All Demos, which was influential on the Xerox Alto.

The mother of all demos: https://en.m.wikipedia.org/wiki/The_Mother_of_All_Demos https://youtu.be/yJDv-zdhzMY


Pity that Mesa, Mesa/Cedar, Interlisp-D and Smalltalk did not have the same luck in OS design and system languages.


But it's not the inventor (Xerox) who made the money; it's others who got rich (Apple, MS, Google).



