>Active-matrix was the much better, much more costly alternative. The difference was that passive matrix could not handle motion well, so it was easy to lose track of the mouse cursor if it moved across the screen too rapidly.
For the kids today, this is why we used to have Mouse Trails in settings!
I just checked on my Mac, and we no longer seem to have that option.
It was an option in classic Mac OS; passive-matrix screens were available on PowerBooks even into 1998's WallStreet PowerBook G3 (a machine that can officially run Mac OS X!).
This didn't work with early LCDs; the area near the cursor literally stayed blank for a fraction of a second. Wiggling only made it worse, and the negative space it created couldn't be spotted either, because the UI was mostly blank in the first place (dark or light).
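For anyone who never saw it, the trail effect itself was simple: the cursor just gets redrawn at its last few positions, so even on a slow panel you see a chain of ghosts you can follow. A minimal sketch of the idea in C, with hypothetical draw_cursor()/erase_cursor() calls standing in for whatever the windowing system actually provides:

    /* Minimal sketch of the "Mouse Trails" idea. draw_cursor() and
       erase_cursor() are hypothetical stand-ins, not any real API. */
    #define TRAIL_LEN 5

    extern void draw_cursor(int x, int y);   /* hypothetical */
    extern void erase_cursor(int x, int y);  /* hypothetical */

    struct point { int x, y; };
    static struct point trail[TRAIL_LEN];
    static int trail_count = 0;

    void on_mouse_move(int x, int y)
    {
        /* The oldest ghost ages out: erase it and shift the rest down. */
        if (trail_count == TRAIL_LEN) {
            erase_cursor(trail[0].x, trail[0].y);
            for (int i = 1; i < TRAIL_LEN; i++)
                trail[i - 1] = trail[i];
            trail_count--;
        }
        trail[trail_count++] = (struct point){ x, y };

        /* Redraw the cursor at every recent position, newest on top. */
        for (int i = 0; i < trail_count; i++)
            draw_cursor(trail[i].x, trail[i].y);
    }

On a passive-matrix panel the ghosts lingered anyway, which is exactly why the trick helped.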
I make the pointer bigger on both macOS and Windows. Unfortunately, at least on macOS, it becomes too imprecise for clicking if you max out the size, but I can't deal with hunting and pecking for my pointer so I do push it as far as I can.
Probably stems from the days of using computers with much lower resolutions where the mouse pointer was therefore relatively large and easy to find. My Amiga 500 typically ran at either 320 x 256 or 640 x 256 (with rectangular pixels), but the mouse pointer was a 16 x 16 hardware sprite, which locked to the lower resolution IIRC, so it was always 5% of the width of the screen, and 6.25% of the height. This is absolutely enormous by today's standards, even with the mouse cursor enlarged to, not its maximum size on macOS, but its maximum useful size.
Interesting. And by now every platform has enough information to define the mouse pointer size in physical units rather than pixels.
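The arithmetic is just a DPI conversion. A minimal sketch, assuming the platform can report the display's physical DPI (pointer_size_px is an illustrative name, not any real platform API):

    /* Sketch: sizing the pointer in millimetres instead of pixels,
       assuming the platform reports the display's physical DPI. */
    #include <math.h>
    #include <stdio.h>

    #define MM_PER_INCH 25.4

    /* Pixels needed for the pointer to span target_mm on this display. */
    static int pointer_size_px(double target_mm, double display_dpi)
    {
        return (int)lround(target_mm / MM_PER_INCH * display_dpi);
    }

    int main(void)
    {
        /* The same 7 mm pointer needs very different pixel counts: */
        printf("%d\n", pointer_size_px(7.0, 110.0)); /* ~30 px, desktop monitor */
        printf("%d\n", pointer_size_px(7.0, 254.0)); /* ~70 px, HiDPI laptop    */
        return 0;
    }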
If anyone at Apple is listening: highlighting the screen where the pointer is (dimming the others), or just an option to reset its position to a known place, would work just fine.
Sometimes I like to imagine a world where Commodore and Atari saw the writing on the wall. Instead of competing with each other to death while the IBM PC open architecture took off, they collaborate to create a joint open architecture of their own.
How different might the IT world look today if we had had a deluge of Amiga/ST clones?
I wish Commodore had had a quarter of Apple's marketing skills. More powerful, cheaper hardware with a significantly more capable and extensible OS could have made the Mac a footnote if executed properly and would have supplied some interesting competitive pressure to the wider market.
Not sure. While the ST was an awesome home computer when released in the mid eighties, there was little, if anything, truly innovative about it. It's rather true to its marketing slogan, "power without the price". Without competition between Atari and Commodore, prices would surely have been higher, but then, what would be left?
I think about that as well although I like to imagine that both Atari and Commodore survived as did BeOS, RadioShack/Tandy, OS/2, and all the machines I've only heard about but never used (esp. the European machines).
Although I suspect that even if all that stuff had survived well into the internet era, the rise of web-based UIs would have erased everything interesting about each platform and rounded off every corner, delivering the boring, ugly cross-platform software that is so popular today.
I think only Evernote did a good job of cross-platform where they wrote a platform specific UI layer onto a common foundation that did all the work and communicated with the servers. Even that didn't last because eventually they also bought into Electron which is basically the gray goo of software.
I never used an Atari computer, nor did I know anyone who did, but I always wonder what the world would be like if Windows and macOS didn't "win".
If Atari and Amiga had won instead, what would the world look like?
What would the server world look like? Would there be some weird "Amiga Server Enterprise Edition"? Would servers just be Linux without any meaningful competition?
Would Atari have shaken the world by introducing a new CPU that resulted in amazing battery life compared to the Amiga competition? Would some of us be using AtariPhones? Would Android be a thing?
Would retrocomputing folks talk about their Windows 3.1 boxes the way that Ataris and Amigas are currently talked about?
I'd assume it would have been pretty similar, but we'd be running on Motorola CPUs (or a descendant of them).
The platforms would have needed to be opened up to clone builders to reach critical mass.
Amiga was a lot like DOS/Classic Mac OS, single user, unprotected memory... we would have seen it added onto like Windows 3.x/9x, until a rewritten version with the right stuff took over (like Windows NT/2000/XP did).
Someone like Linus would have still likely written a UNIX clone and open-sourced it.
The minicomputers and UNIX servers/workstations would have still hung around for a while. The real trick is that the Amiga CPU and the rest of the hardware would have needed to keep improving: catch up in speed, reach 64 bits, gain SMP...
> Amiga was a lot like DOS/Classic Mac OS, single user, unprotected memory
AmigaOS actually implemented real preemptive multitasking and was much more "modern" than MacOS and Windows of the same era, and a lot of things were actually quite Unix-like! Comparing it to DOS is practically an insult :)
But yes, memory was unprotected, because most Amiga CPUs didn't have an MMU (but if yours did, you could use this: https://www.nightvzn.net/portfolio/web/amiga_monitor/archive... ).
In the IBM compatible world, the clones drove down the price then drove forward progress. It is doubtful that much of a clone market would develop in the Amiga/Atari world since the parent companies were already competing against IBM compatibles on price. Without clones to break free (as happened in the IBM compatible world), there wouldn't be clones to drive forward progress. I'm not even sure cloning the Amiga would be practical. Apparently Commodore had enough trouble "cloning" the Amiga (i.e. developing higher performance, yet compatible machines). Without the clones driving progress, companies like SGI and Sun would likely still be in the picture.
If Amiga/Atari domination somehow did happen, I suspect the CPU situation would be flipped around, with Motorola having both the incentive and finances to continue on with a fully compatible 680x0 line of processors and Intel chasing after its own equivalent of an 680x0-to-PPC transition.
As for the retrocomputing thing: DOS/Windows 3.x nostalgia is very much a thing in today's world. In that alternate reality, they would likely be higher profile (as Amiga/Atari are today).
I think if Atari and Amiga had won, the world would have a lot more focus on the media side of things; the playful, graphical, musical expression would be more evident. I don't think we'd have spent as long in the beige-box era, and maybe we'd still have little colorful iMacs running something Haiku-esque, with Enlightenment on top and some breakcore tracker music on bootup.
This is my fantasy, you're welcome to enjoy it while you're here and remember, no shoes on the couch.
> the playful, graphical, musical expression would be more evident.
This was also the Mac's distinguishing feature at the time. It still is, to some extent. A lot of what drove mass adoption of home computers was that everyone wanted to have the same computing environment, i.e. OS, at home that they used at work or at school. This was DOS/Windows or System 7.
We also had (well, have) a Unix-like extension for the shipped TOS/GEMDOS/GEM OS in the form of MiNT/FreeMiNT, which went on to form the foundation of the official MultiTOS, which unfortunately died along with the ST when Atari Corp did.
It had/has a POSIX(ish) API, device files, mountable filesystems, pre-emptive multitasking, TCP/IP, etc. while still being able to run classic TOS binaries.
You can run this in an emulator or on hardware still today and it still gets active development, under the GPL.
Atari HATED to share anything with people outside the company. They couldn't even help developers build software for their machines, let alone let someone copy and commoditize their hardware. The Apple II was incredibly open and extensible, and successful. Macs were not, and were never more than a minor player until computers shifted to a mobile, general-consumer product and Apple out-executed everyone and leveraged its single ecosystem.
> If Atari and Amiga had won instead, what would the world look like?
If the PC didn't establish itself, Commodore might have opted to release the 900 as a Unix workstation. With any luck, they'd port Coherent to the new Amiga and it'd be a Unix machine from day 1.
From watching YouTube, the Atari may have been the best computer to be exposed to in one's teenage years, for so many reasons. It was very capable and efficient for its CPU. It still might be, for learning to build with constraints.
> The server would still be UNIX, but the big iron UNIXes, not GNU/Linux with some BSDs and Windows in the mix; there would be no reason for Linux.
Linus would still get frustrated by the AIX'es and Solaris'es of the world, AT&T would still try to get rid of any BSD being publicly offered and Linux would still be invented under the GPL and get the GNU userland for free.
The biggest difference is that it would be currently used on more architectures than the two it's mostly used on these days.
Without the PC as it was, most likely Linus would never have had Minix to play with, or been frustrated with his MS-DOS-compatible PC, so the genesis wouldn't have taken place.
> And vertical integration, plenty of it, as it has become the norm again nowadays.
Except, not really. If you work at a startup or business that has to deal with "vertical integration" at cost, your first goal is to get rid of it. Fly.io, Datadog, managed K8s: all of this stuff is literally the first to go when scaling a business that needs to be profitable. Business-conscious users know they're being fucked over whether it's Apple, Microsoft or Oracle - you can't market it as "integration" to people that see the monthly bill.
And in the EU, vertical integration from large companies that can EEE their competitors is under extreme scrutiny now. Many execs have exploited it to give themselves an unfair advantage and deserve the same scrutiny Microsoft got for its "integration" case.
If American governance shows the same competence, "vertical integration" will be the most feared business model of the 21st century and innovation will be put back on the table. For everyone.
The only PCs left that aren't subject to vertical integration are gamer PCs, and even those are losing to consoles as people focus on other devices for their daily computing activities.
The large majority of the population is using unupgradable laptops, where at most the memory sticks, hard drive, and battery can be changed.
I haven't touched a desktop at work since 2006.
Some even make do with a tablet for their computing needs.
Likewise, for those that have servers in-house, they are no longer PC towers under a desk but rather pizza-box slices in server racks.
I am a huge Amiga fan, but the Amiga was going nowhere and was never going to win. The OS is just as terrible as classic Windows and MacOS from a reliability standpoint; yes, not using a message pump for timeslicing was a really nice property, but in most ways the design was _worse_ in terms of any hope of eventually getting memory protection in place.
I love the Amiga - it represented a unique point in time that coalesced a lot of interesting technologies and people together trying to do something interesting - but it was as far from a technology that had long term potential as you could get, pretty much in every way.
Ironically, the Atari ST's OS -- much maligned as 'primitive' by Amiga users -- was not like this. It had a proper syscall mechanism through TRAPs -- so proper memory protection was entirely possible on the 68000 architecture, with user/supervisor separation etc. -- and an event loop with message passing (though rarely used). Later extensions to add unix-like multitasking (MiNT -> FreeMiNT) actually ended up fairly elegant, and memory protection is a possibility for some things.
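To make the contrast concrete, here is a hedged sketch (m68k GCC inline-assembly syntax, untested; the wrapper name is mine, though the TRAP #1 convention is real GEMDOS). The ST call crosses into supervisor mode through an exception vector; the Amiga equivalent is an ordinary user-mode JSR through a per-library jump table.

    /* Sketch (untested): a TOS/GEMDOS call vs. an AmigaOS library call
       at the instruction level. */
    #include <stdint.h>

    /* TOS: GEMDOS function 9 (Cconws) writes a string to the console.
       TRAP #1 raises an exception, so the 68000 switches to supervisor
       mode and dispatches through a vector that the OS owns. */
    static int32_t gemdos_cconws(const char *s)
    {
        register int32_t ret __asm__("d0");
        __asm__ volatile(
            "move.l %1,-(%%sp)\n\t" /* push the string pointer         */
            "move.w #9,-(%%sp)\n\t" /* push the GEMDOS function number */
            "trap   #1\n\t"         /* enter the OS via the exception  */
            "addq.l #6,%%sp"        /* caller pops its own arguments   */
            : "=r"(ret)
            : "g"(s)
            : "d1", "d2", "a0", "a1", "a2", "memory", "cc");
        return ret;
    }

    /* AmigaOS, by contrast, is in spirit just:
     *
     *     jsr _LVOOpenLibrary(a6)   ; a6 = library base pointer
     *
     * an indirect user-mode subroutine call through the library's jump
     * table, with no privilege transition anywhere in the path. */

Same 68000, two quite different ideas about where the OS boundary sits.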
My understanding is that AmigaOS syscalls were basically JSRs?
The original shipped OS was basically a CP/M fork and PC-DOS-ish, but GEM on top of it showed more attention to cleaner architectural concerns, though it was never really used to its full intent.
I was a die-hard Amiga fan and I agree. The way to memory protection on some "what could have been" Amiga would have been running a copy of the OS for every application and changing the IPC if you wanted two programs to talk to each other.
It was .. bearable .. back in those days I wrote a lot of database recovery software, and mostly worked on an undelete tool for the Progress 4GL Database which I customized for the customer needs - so all I really ever needed to edit on the Portfolio, while on a cross-country tiger-team flight, were the constraints/extents/geometry of the deleted databases, so it was never a full strain .. back at HQ, I of course just used my dumb terminal and MIPS machine for 'real development', but it was always fun to arrive at the catastrophe, wire up the Portfolio, and either run the DOS binary or do a recompile on the target recovery machine ..
I did write a few games for it, entirely on the machine, in-between flights .. that was fun. Still out there somewhere (swar.c, a space-war game..)
I recently bought an Atari Portfolio that was in working order. But it stopped working pretty quickly and now only shows random characters on the screen. Too bad, because I was really looking forward to that easy money.
I suppose these might have been attractive to very well-heeled musicians because of the MIDI ports, which was one of the reasons that the full-sized ST was popular with them.
Yes, the mini-MIDI port on a 1991 laptop is truly unique.
But probably there wouldn't have been much of a market for that. Computer-driven live music performance was still very exotic. Laptop jockeys were a decade away.
It was due to the ST's rock-solid MIDI sequencing, combined with the advent of the Akai S1000 sampler and the move away from the Amiga-dominated 'tracker' scene once Cubase took over as the primary DAW.
Computer-driven live music performance was very much a thing long before 1991. The 'computers' in question were analog sequencers using control voltages, and things like the LinnDrum providing click tracks to trigger sync. Roland expanded on this with the release of their TR-808 drum machine and sequencer in 1980, utilising a precursor to MIDI known colloquially as DIN-Sync or Sync24:
https://en.wikipedia.org/wiki/DIN_sync
This gave way with MIDI to the sequencing of outboard gear via a variety of hardware sequencers and computer/DAW combos - bringing us to the Atari ST and the first few generations of PPC and G3 Towers as we entered the true age of the PC DAW.
Zappa even took his Synclavier on the road in 1988. You can hear it all over the albums from that tour. It was almost certainly the most expanded (and expensive) unit on the planet. By 1991 it had an astounding 768MB of RAM.
Synclavier was a serious powerhouse that straddled the analog/digital era - additive, digital, and FM synthesis with unique sampling features. They were lucky with the cross-pollination in the US University scene at the time.
It was originally envisaged to be the 'Dartmouth Digital Synthesizer', borrowing the then innovative FM synthesis technology from Stanford which was eventually the basis for the Yamaha DX line of synths, with the DX7 being the indisputable king of late 80s popular music.
That 24-bit, 50kHz sample rate and the AD/DA converters were glorious, but even the workflow and palette editing functionality were so unique and revolutionary that there's value in a full 1:1 software emulation. I've had a lot of fun playing 'guess the hit single' with the Synclavier and Fairlight emulations in the Arturia Collection.
> Computer-driven live music performance was very much a thing long before 1991. The 'computers' in question were Analog sequencers using control voltages, and things like the LinnDrum providing click tracks to trigger sync. Roland expanded on this with the release of their TR-808 drum machine and sequencer in 1980, utilising a precursor to MIDI known colloquially as DIN-Sync or Sync24 https://en.wikipedia.org/wiki/DIN_sync
On one level, I'm absolutely onboard with this perspective. On the other hand I think this is bending the definition a little bit too far. What we're specifically discussing here is using general purpose portable computers as part of a live performance.
The Fairlight CMI falls into an interesting middle ground because, at least in theory, you could probably have created and run general purpose software applications on it. Would have made a pretty wild (and ludicrously overpriced) word processor or spreadsheet station. But, of course, the software it ran was all geared towards music production, and is a very direct forerunner of the kinds of music production software that would become increasingly available for general purpose computers.
Definitely fair points re: the Synclavier and the Fairlight.
That said, from memory I'm pretty positive there were a few 'sidenotes' in the era which would have utilised general-purpose portable computers as part of a live performance: the UK synthpop acts cobbling together gear post-Depeche Mode's 1981 'Speak & Spell' album, with stuff like the Alpha Syntauri setup for the Apple II used by Herbie Hancock and Laurie Spiegel coming to mind.
It gets even more niche, for the sake of academic argument, with the Amiga demo and mod scene, which often focused on the use of tracker MODs for live performance and 'DJing' on COTS consumer hardware.
I'd also eat my hat if there weren't jazz and new-wave artists utilising the FM chips in the early NEC PC-88-style line at the time - i.e. the natural progression of the chiptune scene going full polyphony and fidelity from the MOS chip in the C64.
The Atari ST wouldn't have been so onerous to bring on tour. Even the display wouldn't be the biggest/heaviest piece of gear a band would bring with them.
That is a... beautiful laptop. It looks modern. With a beefier CPU, display, memory, and disk, something in that case could be released today and it'd sell.
Though it's edged out by the Amiga, the Atari ST was truly a thing of beauty in its day. My wife was pretty chuffed to hear that a model in the line has her name (of course, the STacy).
No, it doesn't. It looks better in certain important ways:
1) the keyboard has real keys, not those stupid "island" keys that are all the rage now.
2) the screen has a taller aspect ratio, which is better for actual computing work, whereas laptops these days all have wide screens because of economies of scale with TVs and because people want to watch HD video full-screen instead of doing real work.
This looks more similar to machines from the golden age of laptops, which was probably between 2000 and 2010.
I just bought the largest laptop I could find for work to get the vertical space. Turns out they don't make bags for those anymore though so I have a gigantic backpack that I have to squeeze in somewhere :-)
Yeah, the 16" Samsung Galaxy Book 3 Pro 360 taxes modern bags to the point I've been putting it in a sleeve and double-bagging it in my overnight bag when traveling.
I've been considering using an old ThinkPad briefcase I had, and did bring home a ThinkPad laptop bag which was surplus from work, and will probably use that in the future.
Meanwhile, my work laptop (with much smaller, but almost as wide screen) fits in a sling bag.
Biggest non-modern tell for me -- the keyboard is at the bottom of the case and there's no wrist rest. That shift was about as drastic as the hardware-keyboard-to-blank-glass transition of the iPhone. Pre-PowerBook, keyboards were at the bottom of the system, with weird side-mount trackballs or trackpoints. Post PowerBook 100/140/170 -- trackball in the bottom center, keyboard above that.
Trackpads came later, but didn't really affect the overall layout.
My ASUS has a power button where my End key used to be... I thought we stopped mixing power buttons into keyboards ages ago, for obvious reasons.
My Macbook Pro has a power key: in the top right, next to F12. I'm generally not a fan of them either although, in this case, it doesn't really cause me any problems.
When Eject buttons became obsolete, the obvious thing for Apple to do was to finally put a Delete key there. But nope. For a while it was some weird "lock" thing, and now it is indeed a power button.
You really think people Backspace away old E-mails and files they want to delete (for example)?
I wondered if everyday users noticed the omission. Then I was waiting for help in an Apple store and heard a woman come in and tell a salesperson that she and her daughter were happy with their new MacBooks, except for one thing they hated: the lack of a Delete key; she asked if there was a way to remap a key to be Delete.
Backspace vs. Delete is a non-issue for 99% of consumers because they have those keys.
They probably do. On my newer MacBook Pro, the key says "Delete", even though it's really backspace.
On my desktop, I do most of my email in gmail and mutt, so I've never used "delete" to delete emails. Command-backspace does the same thing as Command-delete in Finder, if you want to delete files that way. I never do.
And I'm quite sure this is the cumbersome method consumer-level Mac users wade through day after day. Even if you have a full-sized keyboard with a Delete key, you're stymied by Apple's bizarre ignorance of its own trash can in Finder. If you select stuff in Finder and press Delete, nothing happens. It should move it to the trash can, which Windows users have managed to handle for decades. But nope; there's not even an OPTION to make it work.
Mac users also put up with double the keystrokes they should need to delete characters: hitting the right-arrow key over and over to get past the text they want to delete, then backspacing it away.
Apple used to use full mechanical keyboards on their laptops -- the PowerBook 1xx era (with the exception of the 100) had wonderful keyboards. Good feel, good travel. Over time they flattened down; by the iBook/TiBook era they were super flat. Still mechanical, and popping the tabs at the top opened it up for memory/disk access. The downside was that they were flexy, and not great to type on.
The island keyboards came in when they put the keyboard base inside the aluminum case and screwed it in with 60 of the smallest screws you've ever seen. Pain in the ass when you needed to replace one, but _way_ more solid and better typing experience.
Interesting to see that Atari did complete their notebook project. As far as I know (note: this might be decades of internet tall tales), Commodore was also trying to get one rolling, but eventually gave up and released merely a small-form-factor breadbox system, the A600, albeit shipped with ready support for internal hard drives, as well as the then barely standardized PCMCIA interface.
The fact that the Atari ST's standard display was monochrome helped a lot, as they could use a cheap (and power efficient) laptop display and still maintain compatibility with most applications. That wouldn't have worked for the Amiga of course...
The Amiga A600 has nothing to do with a laptop project. It was the pet project of one engineer who tried to make a cheaper Amiga 500 with an analog genlock built in (it was called the Amiga 300), supposed to be $50 cheaper. But it didn't end up cheaper, and for some reason management canceled the Amiga 500 and Amiga 500+ and sold it as the Amiga 600: they canceled an Amiga 500 that was still selling well in favor of an Amiga 600 that didn't sell well.
I often like to daydream of a world where some SGI engineer used a titanium casing to build the "SGI tiBook" instead of Apple, and we ended up in a world where SGI, not Apple, is the trillion-dollar computing company of the 21st Century.
Of course, it's just a fantasy, but somehow I feel like an SGI tiBook would've won over a lot of nerds, a lot faster, than Apple's variant did ..
Some have built portable setups with an Indy and an Indy presenter. I remember 25 years ago someone showing off with such a custom portable configuration, with an Indy motherboard and PSU, a keyboard and track ball and an Indy presenter crammed inside a small metal suitcase, very chic (but required external power).
Mock versions were made for the movies "Twister" and "Congo", but AFAIK these were completely non-functional (the actual Indy driving the screen was off camera somewhere).