I'm still wondering why the Amiga didn't become ubiquitous. I had one and I kept getting called a liar about its price - for the performance compared to PCs of the time, surely I omitted a zero.
Like Atari with the ST, Commodore basically failed to capitalize on the original Amiga, and by the early '90s PCs had mostly caught up.
While the Amiga 1000 was revolutionary in 1985, AGA[1] was not that special in 1992 (especially as it wasn't particularly beneficial to 3D games like Doom, which were becoming the new hot stuff).
And frankly, by the late '80s the 68000 architecture itself had hit a dead end performance-wise. Maybe Motorola could have pulled off what Intel did with the Pentium, papering over the aging CISC with RISC internals, but instead we got PowerPC.
I'm not sure if you ever used a PowerPC Mac back when they ran a mix of emulated 68k and native PowerPC code, but they were notoriously unstable. The combination of no memory protection and emulated CPU instructions would have been just as bad on the Amiga and Atari ST, whose OSes likewise lacked the memory protection and safety features of more modern operating systems.
I have an Atari Falcon 030, Atari's last and best machine. It is a really, really nice machine. But it was hobbled by poor software support -- it's only now that hackers are discovering what they can do with the combination of the 68030 and the Motorola 56k DSP in it (for example, Quake 3 has recently been ported/rewritten for it, using the 56k for 3D acceleration).
I used to wonder what the world would have been like if the 68000 systems won out. But now we're seeing a world where the ARM belatedly wins out, which is kind of neat, tho ARM is arguably now as "evil" as Intel :-)
What is an interesting mental exercise is imagining what would have happened if the 6502 or 6809 architectures had expanded and done well. Those architectures had insanely fast interrupt processing and very fast (single CPU cycle) memory access. Some really neat machines could have been made if they'd continued to advance them, gotten past the 64k memory address limit and into high clock rates. The Western Design Center stopped at the 65C816, a 16-bit variant of the 6502. Something faster and funner than the Amiga could have been built with a 32-bit 6502 descendant and chipsets similar to what was in the Amiga. That would have been really neat.
That was possible only because at the time, memory was faster than the core and could keep up. Modern CPUs run the core at several times memory speeds, and there is latency involved due to physical constraints.
One difference between Motorola and Intel is that Motorola was less concerned about breaking backwards compatibility.
If the 68k family had continued to evolve past 1994's 68060, I'm sure they could have just dropped backwards compatibility for some of the more complex addressing modes, or devoted less silicon to them, making what remained faster. Kind of like what happened with the ColdFire version of the 68k family.
The not-implemented-instruction trapping in the 68K could easily have been evolved to cover less frequently used instructions, to make room for more optimized implementations of the frequently used ones.
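As a purely hypothetical sketch of the idea (the frame layout, the opcode check, and the notion of dropping MULS are all made up for illustration, not how any real 68K handler was written):

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical register frame saved on an "unimplemented instruction" trap. */
    typedef struct {
        uint32_t d[8], a[8];
        uint32_t pc;
    } Frame;

    /* Trap handler: emulate in software what the silicon no longer provides,
       then resume execution just past the faulting instruction. */
    static int emulate(Frame *f, uint16_t opcode)
    {
        if ((opcode & 0xF1C0) == 0xC1C0) {          /* pretend MULS.W Dn,Dm was dropped */
            int dst = (opcode >> 9) & 7, src = opcode & 7;
            f->d[dst] = (int16_t)f->d[dst] * (int16_t)f->d[src];
            f->pc += 2;                             /* skip the 2-byte instruction */
            return 0;
        }
        return -1;                                  /* genuinely illegal: signal the OS */
    }

    int main(void)
    {
        Frame f = {0};
        f.d[0] = 6; f.d[1] = 7;
        if (emulate(&f, 0xC1C1) == 0)               /* "MULS.W D1,D0" */
            printf("emulated result: %u\n", (unsigned)f.d[0]);   /* prints 42 */
        return 0;
    }

That's roughly what the 68060 already did for the long multiply/divide forms it left out: trap, emulate, continue.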
What really killed the 68000 was the move to RISC, in particular Apple's move to PowerPC. That took away any hope of future evolution (they even managed to release the 68060 after that, but that was it) and collapsed the high-end 68K business.
The Motorola Series 900 machines were interesting - I had one under my desk at work for quite a while. They had stackable units, including one that contained a SCSI 3.5" floppy drive that was way faster than regular ones. We also had a DG unit with the m88k.
Yep. The one thing Intel frets over these days is cache misses. Hyperthreading is all about keeping the pipeline busy even if the original thread encounters a cache miss.
> That was possible only because at the time, memory was faster than the core and could keep up. Modern CPUs run the core at several times memory speeds, and there is latency involved due to physical constraints.
That begs the question: Would we be better off if CPU clock speeds were set such that the memory could keep up again, and we software developers learned to work within real constraints again, rather than expecting the CPU makers to keep working miracles to deliver ever more performance? I have no wish to go back to programming in Applesoft BASIC or 6502 assembler as I did in my childhood and early teenage years. But programming a 32-bit processor clocked to match the speed of memory, in C++ or Rust, wouldn't be so bad.
> Would we be better off if CPU clock speeds were set such that the memory could keep up again
Absolutely not, because of the locality principle. As Terje Mathisen used to say, "All programming is an exercise in caching."
Locality isn't a property of a specific coding style or methodology, it's just the way programs work. No matter what kind of architecture we end up using 50 years from now, it will have a fast cache of some kind, backed up by slower memory of some kind. We'll have a different set of problems to confront in day-to-day development work, but hobbling the CPU won't be the answer to any of them.
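To make the locality point concrete, here's a trivial (modern, purely illustrative) C example; both functions do the same arithmetic, but the access pattern alone decides whether the cache, and therefore the core clock, can do you any good:

    #include <stdio.h>

    #define N 4096
    static double a[N][N];              /* ~128 MB, far larger than a typical cache */

    /* Row-major walk: consecutive loads share cache lines, so most of them
       are served from L1/L2 and the core stays busy. */
    static double sum_rows(void)
    {
        double s = 0.0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                s += a[i][j];
        return s;
    }

    /* Column-major walk: each load lands N*8 bytes past the previous one,
       so nearly every access misses and waits on DRAM. */
    static double sum_cols(void)
    {
        double s = 0.0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                s += a[i][j];
        return s;
    }

    int main(void)
    {
        printf("%f %f\n", sum_rows(), sum_cols());   /* same answer, very different runtime */
        return 0;
    }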
Sure, caching is important. But today, we have multiple layers of cache: registers, L1, L2, sometimes L3, and RAM, all of which are caches for nonvolatile (increasingly flash) storage. All of that layering surely has a cost. So what would we get if a processor with no caches between registers and RAM were manufactured using a current process (say, 14 nm), clocked such that DRAM could keep up (so, 100 MHz if another comment on this thread is accurate), and placed on a board with enough RAM for a general-purpose OS as opposed to an RTOS for a single embedded application? Would the net result be any more power efficient than the processors that smartphones use now?
L1/L2 cache levels are transparent optimizations layered between the registers and RAM, so eliminating them in a RAM-bound application would save you transistors (and power) without losing performance. But although a few particular RAM-bound applications might perform equivalently, you've destroyed every other class of application in the process.
Power efficiency is more complex; it's often better to burst briefly and then get back to sleep sooner, rather than drag things out at 100 MHz, but a specific answer would depend on many factors.
> That begs the question: Would we be better off if CPU clock speeds were set such that the memory could keep up again
Memory latency is at best about 10 ns. I don't think a 100 MHz CPU would be better in any way than what we have now. Well, except power requirements would sure be very low.
Yep, I have some here, and there's even a board for putting it and the 6809 into Atari 8-bit computers (!). It's a fine processor, but still limited to a 64k address space. Fast, though, and fun to play with.
Your last paragraph describes the first ARM chip :) ARM was all that (fast interrupts, fast RAM) and more (30k hand-laid transistors). It took a while, but ARM is taking over from the bottom up, making Intel ignore the high end and concentrate all of its efforts on power efficiency (raw performance has all but stopped improving in the last 5 years).
It's really hard to give people from this generation a sense of how fast things advanced and how big the leaps were in the 80s and early 90s. We went from 8 bit to multimedia 32 bit in less than 10 years. For devs it was a new box every year or two. Sometimes two new computers in a year.
My first 386 dev box was purchased in 1987. Within a year of that, I was using a Compaq Portable 386 (http://en.wikipedia.org/wiki/Compaq_Portable_386). The 386 was a big deal because it finally got us Intel devs a flat address space... so no more trying to fit data into tiny 64k segments. The 386 killed the chief advantage of the 68k architecture, and for whatever reason Motorola just couldn't get the clock speed up fast enough.
There were two things that were interesting about the Amiga in '87: the Video Toaster for doing cheap video effects (think the intro sequence to Better Call Saul, not awesome demos) and gaming.
But gaming on the PC made a huge leap in 1987 when IBM shipped ALL of their new PS/2 computers with a 256-color video adapter called the VGA (I seem to remember the lowest-end models only doing 256 colors in 320x200 mode... but that was good enough). Eventually Truevision and even ATI had video cards that could do the same sorts of things as an Amiga, or better.
So many great computer ideas died in the 80s and early 90s... but it was really evolution... most of them died because a generalized solution (i.e. VGA with video out + software) eclipsed a specialized solution (i.e. Amiga with video toaster).
My parents insisted on buying a PC for home use. It was mainly so they could do accounting for their business at home. I had a friend who had an Amiga and I spent pretty much as much time as possible at their house using it. We even had an Amiga only store in our local mall (in the U.S.!).
I think that, in the same way Apple products are now showing up in workplaces because people prefer them at home, the reverse happened in the '90s: people wanted or needed to bring work home, and their offices supplied them with PCs.
The productivity situation on PCs was always just a bit better, or more standardized, than on Amigas.
What mystified me more was that, during this time period, the Apple Macintosh took over the creative market -- especially in visual arts. The Amiga always came across to me as a far better creative machine, with better tooling, than the stuffier Mac. Again it may be due to better support for WYSIWYG output during printing and pre-press, better color matching etc. But the Amiga just felt more creative and fun to me.
Also, by the time the 68040 came out, it was starting to become clear to everybody that Motorola wasn't going to be able to keep the performance edge up. Apple switched to PowerPC but Commodore couldn't afford to. There was a whole plethora of PowerPC cards for the Amiga, to try to keep them going, but it was really obvious by then that it was game over, and people started to hunt around for the next system.
Amiga "owned" the TV market for a long time because it happened to get some important products first (Video Toaster for example).
Apple got better desktop publishing tools first.
E.g. if you wanted to do TV, for a certain period you would want a product like the Video Toaster. If you wanted to do newspapers, you'd want Quark.
While there may have been certain platform quirks that tilted the initial creation of those tools in one direction or another (such as genlock support for video for the Amiga), platform mattered far less than application, and early application traction in a niche would paper over a lot of other platform issues.
WYSIWYG output for printing was largely still an application issue, not a platform issue, for example. Exactly for those kinds of reasons, an application lead also translated into a platform lead in those niches where people would buy the platform to support an application rather than the other way around. People would buy Quark, and a Mac to run it, not pick a system and see what desktop publishing software would run on it. If you loved the Amiga's pre-emptive multitasking and "colourful" (compared to the Mac..) environment, tough - it couldn't run Quark (been there - had exactly that discussion back in those days).
Regarding PPC, note that the PPC cards for the Amiga appeared after Commodore had already gone bankrupt, as far as I know. At least, PowerUP first appeared in '97, after Amiga Technologies announced Amiga going PPC in '95. It had been largely obvious the game was over at least from '95-'96, even for most die-hard supporters.
Interestingly, had Commodore continued, it's clear the next-generation Amigas would most likely have been different - the prototype "Hombre" chipset was an SoC that included an HP PA-RISC core [1]. Commodore apparently chose PA-RISC primarily with the intent of being able to run Windows NT (at the time of the decision, the lower-priced PPC - and MIPS - alternatives were not supported by NT) - something which would have been massively controversial with a lot of Amiga users.
The Apple II family straddled the divide between work and play pretty well, though it was probably more of a play machine in later years. In 1988, when I was almost 8 years old, my parents bought an Apple IIGS as our first computer. My mother, who is an accountant, ran accounting software on that machine, but the rest of us also had a lot of fun with it, in both the 8-bit Apple II emulation mode and the 16-bit native mode. A couple of years later, she bought a PC. Whether it was so she could have better accounting software, or just the same accounting software as her colleagues, or because the rest of us liked the GS so much, I don't know. I don't think she ever told me directly. In any case, the net result was that I could spend more time on the GS, both playing around and learning to program. Of course, my siblings spent quite a bit of time playing games on the GS as well.
I didn't know anyone with an Amiga, and there weren't any at school. The only Commodore machine I ever got my hands on was my paternal grandfather's Commodore 64, which seemed quite limited compared to the Apple IIGS we had at home. From what I've read, it seems that the Amiga had better graphics than the GS. And of course, the Amiga's processor was faster, unless one added an accelerator card to the GS, which we never did. The GS's sound chip (an Ensoniq) was more advanced in some ways; it had 32 oscillators. But samples had to be stored in that chip's own RAM, and there was only 64K of that. Still, there were a few good trackers for the GS; the best one was NoiseTracker from the FTA.
"What mystified me more was that, during this time period, the Apple Macintosh took over the creative market -- especially in visual arts. The Amiga always came across to me as a far better creative machine, with better tooling, than the stuffier Mac."
The Mac had a couple of years' head start on the Amiga, and Apple had a bigger brand name and better relationships with retailers that catered to businesses than Commodore did. By 1985, Commodore (despite the CBM name) was fairly synonymous with games. There were games for Apple as well (Macs, IIe, etc.) but there wasn't as much of a stigma around Apple as a 'game computer company' at that point.
And... it cost more. We all know when something costs more it must be better, right? ;)
"Again it may be due to better support for WYSIWYG output during printing and pre-press, better color matching etc."
Basically that. Whenever you wonder why something "odd" gets a foothold, look for the money trail.
One thing to note is that there were a couple of Amiga variants that lived on in broadcast media, as the machine was very capable at video work.
BTW, BYOD kinda happened back in the day as well. There is a claim that accountants brought their personal Apple IIs to work so they didn't have to fight for mainframe time.
Edit: oh, and I wonder how much the dock connector had to do with the long-term uptake of the iPhone in the corporate world. Never mind that Apple was quick to offer a WSUS-like service to handle app rollouts.
The old saw goes that VisiCalc sold more Apples than Apple sold VisiCalc. Of course, then Lotus 1-2-3 came onto the market and even MultiCalc struggled to compete with that.
There never was a high-volume machine that really used the '040 to its potential. The NeXT Cube was pretty good, and there was an Alpha from DEC with it as well, but that chip would have been a very nice one to have in a machine like the ST Falcon. In the end it was mostly heat (or power consumption, if you wish) that killed it, rather than a lack of raw performance.
I seem to recall that right around the time Intel released the 80486 they started getting into the MHz wars and clock-multiplying the hell out of everything. DX2, then DX4. The fastest 68040 maxed out at what... 40MHz, while the 486 ended up somewhere around 150MHz or so.
But the low volumes definitely hurt Motorola's ability to keep up with Intel's R&D. Clock for clock, the 68k architecture was faster, but Intel figured out how to throw a lot more clocks at the problem, and they kept doing that until PowerPCs weren't really a consumer-level home computer chip anymore.
> getting into the Mhz wars and clock multiplying the hell out of everything. DX2 then DX4
For those reading who aren't old timers, this needs to be elaborated on.
This wasn't any trick or sleight on Intel's part. The parts really did run 2x as fast or 4x as fast internally. That was a major achievement. An instruction that took 5 clocks to execute at 33 MHz took 5 clocks to execute at 50 MHz and took 5 clocks to execute at 100 MHz. This was all accomplished over a relatively brief period of time compared to today's rate of CPU speed advancements.
What slowed the CPU down was the growing mismatch between the internal operation and the external memory bus, which continued to run at (usually) 33 MHz. It was possible to run the external bus at 50 MHz, but most designs didn't. The 8K-byte on-chip cache helped mitigate the mismatch.
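Back-of-the-envelope numbers (mine, not from any datasheet) for why that cache mattered so much on a DX2:

    core at 66 MHz:  a 5-clock register instruction takes 5 cycles / 66 MHz ≈ 76 ns
                     (vs. ~150 ns on a plain 33 MHz 486)
    bus at 33 MHz:   a cache-miss read still costs several 30 ns bus cycles,
                     exactly as it did before the clock was doubled
    => code that fits in the 8K cache sees close to the full 2x;
       code that streams through main memory sees much less.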
>> The parts really did run 2x as fast or 4x as fast internally.
Actually, the Intel 486DX4 ran at only 3x the speed, despite the name. Intel couldn't use the DX3 name because of a trademark owned by AMD, who also had a part called the Am486DX4. The AMD part was also available with a 40MHz bus and a tripled 120MHz CPU clock.
By the time Intel sold > 66 MHz 486's, the Pentium was already out. They remained a low end alternative, but still topped out at 100 MHz. AMD took their 486 clone (called the 5x86) eventually up to 150 MHz.
Which also reminded me that it probably didn't help that UNIX workstation vendors often replaced the 68K with their own RISC architectures. MIPS was an attempt at a standard, but...
I think if they could have got a version of the Amiga into the same price range as a video game console, they probably would have been ok. The pricing at PC levels meant it was a "family" decision and the PC was going to get picked.
The other problem is the Amiga never had Adobe as a developer. That would have made a huge difference.
They were pretty much doomed by then. Plus the fun quote from the Wikipedia article "Ultimately, Commodore was not able to meet demand for new units because of component supply problems." Lovely.
Jay Miner was lost that same year. It is just so sad.
The article nails it talking about the distribution channels. The small shops that were selling Apples and Ataris refused to carry Commodore computers when Commodore started selling VIC-20s and C-64s at big-box stores like Toys R Us. When Commodore had a more powerful, and more expensive, computer available, it was too expensive for those stores and they could not support it.
Commodore outsold Apple on the low end by a large factor for many years (and continued to do so in terms of units on a worldwide basis pretty much to the bitter end).
They'd have continued to support Commodore if Commodore hadn't continuously messed them around. A major factor was the price war that was to be Jack Tramiel's parting shot: Commodore overnight announced a massive price drop and left their dealers to take the hit on all the inventory they already had on hand.
That was typical hard-ball from Tramiel, but it also severely hurt a lot of businesses that had bet on Commodore.
You see the difference when you look at the US market vs. Europe - Europe was handled by subsidiaries that often handled their dealer networks far better. Particularly the UK and Germany were Commodore strongholds. The UK subsidiary actually tried to get financing for a buyout of Commodore International after the bankruptcy, and survived for quite a while on their own financial strength.
Commodore's more significant mistake was as the article said "Commodore hedged their bets everywhere — except in the Amiga’s most obvious application as a game machine, from which they ran terrified."
Which is the exact opposite of what they should have done. Instead, they should have split their market in two: one line focusing on high-end workstations, the other on games and home computing.
I remember when they were talking about the next generation of custom graphics chips and there was all this hold-up, which was so unnecessary because there was a simple solution: just use three of the same chip, one for each color (red, green, and blue), or one for every third scan line (I had even read of someone doing this with three separate Amigas). There's your next-gen workstation right there.
> they should have split their market into two, one focusing on high-end workstations and the other game and home computing.
That's exactly what they did (and very successfully), with the 500 and 2000 models.
The main problem for mainstream adoption at the time was that the machine, being so successful as a multimedia and games machine, was regarded as a toy.
But the stake through the heart of the Amiga, and all the other proprietary platforms of the time, was that the IBM PC architecture became a standard that was open for all manufacturers to produce for. You can't compete with that.
(Unless you are Steve Jobs. In fact the accepted wisdom at the time, with Apple and the Mac teetering on the brink of annihilation, was that they should open up their hardware platform and become a software company.)
>That's exactly what they did (and very successfully), with the 500 and 2000 models.
I'd argue that while the 500 was what they needed, the 2000 was quite unambitious. What the Amiga really needed in its 1987 high-end product was a 14MHz 68020 (or EC020) based, 32-bit system, with a graphics chipset to match. Take the old chipset, but give it 2x the clock and 2x the memory bandwidth, and all sorts of things (including non-interlaced 8-bit 640x400 video) become possible.
If they had started working on this in late 1985, I think they would have had a chance at being taken more seriously. Instead, they didn't even start revising the chipset until 1988, and they didn't have a native 32-bit based system until the 3000 in 1990 (and even then, still with the old chipset).
In fairness, I understand the US-based side of Commodore was thinking along these lines for the 2000, but lost out to the Commodore German division's model of a cheaper 1000 with slots and a better case.
What Apple proved was that an alternative to the PC was possible, if you were clever enough in your technology and your marketing. Commodore was neither.
Unfortunately, while Exec (the kernel) and DOS (the file system) were not really tied that hard to the architecture, Intuition (the GUI) was very tied to the hardware. Change the video, and you have a new GUI to code for. Also, it may not have been all that possible to enhance the hardware from the given design [1].
[1] A bit-plane video architecture. For example, say the video is 256x256 (for simplicity) at 1-bit color. That's 8 pixels per byte, so an 8K video buffer. Want 2-bit color? Okay, the low bit of each pixel is in one 8K video buffer; the high bit of each pixel is in another 8K video buffer. Want 8 colors? Add another 1-bit 8K video buffer. Now you have one pixel spread across three bytes, one in each plane.
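A little C sketch of what that planar layout means for software (sizes match the footnote's example; put_pixel is just an illustrative name, not any Amiga API):

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    #define WIDTH    256
    #define HEIGHT   256
    #define DEPTH    3                     /* 3 bitplanes = 8 colors         */
    #define ROWBYTES (WIDTH / 8)           /* 32 bytes per row, 8K per plane */

    static uint8_t plane[DEPTH][ROWBYTES * HEIGHT];

    /* One pixel's color index is spread across DEPTH planes, one bit each. */
    static void put_pixel(int x, int y, unsigned color)
    {
        int     byte = y * ROWBYTES + x / 8;
        uint8_t mask = 0x80 >> (x % 8);    /* leftmost pixel is the high bit */

        for (int p = 0; p < DEPTH; p++) {
            if (color & (1u << p))
                plane[p][byte] |= mask;
            else
                plane[p][byte] &= ~mask;
        }
    }

    int main(void)
    {
        memset(plane, 0, sizeof plane);
        put_pixel(10, 20, 5);              /* color 5: bits set in planes 0 and 2 */
        printf("%02x %02x %02x\n",
               plane[0][20 * ROWBYTES + 1],
               plane[1][20 * ROWBYTES + 1],
               plane[2][20 * ROWBYTES + 1]);   /* prints "20 00 20" */
        return 0;
    }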
Never mind that Intuition also allowed access to the Copper [2], so you could also specify the screen resolution and colors. You could split the screen: the upper half 320x100 with dual playfields (three bitplanes defining the background, three bitplanes defining the foreground) with 32 colors, and the lower half 640x200 (interlaced) with four colors. You could also specify up to 8 hardware sprites.
Yup, Intuition was very tied to the video hardware.
[2] A special 3-instruction CPU to control the hardware registers based on the video beam position, allowing you to change the entire video setup virtually anywhere on the screen (any given scan line, within four pixels horizontally); colors, memory used for the video buffer, resolution, sprite locations. You could literally display all 12 bits of supported color on a single 1-bit video page by mucking with the register settings on a per-4-pixel basis using the Copper. It'd be a long Copper program, but the 68000 would be completely idle.
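For flavor, here's a minimal sketch of what a Copper list looks like when built from C (the COLOR00 offset and the end-of-list convention are the standard Amiga ones; this is just the shape, not a complete program):

    /* Each Copper instruction is two 16-bit words: MOVE writes a custom-chip
       register, WAIT stalls until the beam reaches a given screen position. */
    typedef unsigned short UWORD;

    UWORD copperlist[] = {
        0x0180, 0x0000,   /* MOVE #$000 to COLOR00: black background          */
        0x6401, 0xFFFE,   /* WAIT for the beam to reach scanline $64 (100)    */
        0x0180, 0x0F00,   /* MOVE #$F00 to COLOR00: red from line 100 down    */
        0xFFFF, 0xFFFE    /* WAIT for an impossible position: end of the list */
    };

Point COP1LC at that and the chip replays it every frame with zero CPU involvement, which is how the per-scanline palette tricks were done.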
The Amiga 1000 could support a 640x400x4 bitplane 30fps interlaced display with its 16-bit path to memory at 7.14MHz. If you double the clock, and double the width of the data bus, then a 640x400x8 bitplane at 60fps becomes theoretically possible, which would have made it an easier sell for business (the need for a non-interlaced 640x400 mode was obvious quite early on)
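Back-of-the-envelope (my numbers, ignoring DMA slot allocation and CPU contention, so only a plausibility check):

    640 x 400 x 4 planes @ 30 fps  =  640*400*4/8 bytes x 30 ≈ 3.8 MB/s
    640 x 400 x 8 planes @ 60 fps  =  640*400*8/8 bytes x 60 ≈ 15.4 MB/s  (4x)
    2x clock  x  2x bus width      =  4x raw chip-RAM bandwidth

So the required 4x in display bandwidth lines up with the doubled clock and widened bus, at least on paper.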
Obviously there's more to improving the graphics architecture than just improving the clock and data path. My point was that the easy benefits of Moore's law at that point would have made substantial improvements possible, if Commodore had been able and willing to make the attempt. "What amazing graphics thing are we going to do next?" was the question Commodore needed to be asking themselves in 1985, but it's clear they didn't. How much of that was due to money, and how much was due to corporate culture, is not clear to me.
And WRT Intuition: it was tied to the hardware, but you overstate the case. Later Amigas had new graphics modes, and Intuition had support for 8 bitplanes from the beginning. I don't think 640x400x8 @ 60Hz was impossible, or even that difficult, in 1987, from an OS perspective.
BTW, anyone else disappointed that there are no Amiga engineering alum who have discovered this thread?
I don't know what kind of machines you used in 1985, but 80x25 was easily done on an A1000, and was what you were competing against on the PC side in most instances.
There were many problems for Commodore, but lagging on the graphics side vs. the PC was not an issue until years later.
He said non-interlaced. You had to buy an expensive aftermarket scan doubler to get rid of the flicker on the Amiga. Biggest mistake they made... they should have included a true workstation-class high-resolution mono video mode.
Doing productivity apps on the Amiga in its highest resolution vs. the Atari in its highest (monochrome) resolution was no comparison.
The Atari did better in certain markets -- MIDI sequencing, big time, and to a much lesser extent DTP -- because of this.
But having that nice mono screen for productivity didn't make the ST a big success, so I don't think the original comment stands.
The Amiga was not a success because it wasn't an IBM PC.
I would also argue that the 500 wasn't enough of a game machine. It should have been sold with controllers and a cartridge slot. A home-computer expansion adding a keyboard and disk drive could have been bought as a bundle or add-on. If they had done that, I think they could have hung in there to be a player in the Nintendo age.
On the high end, the 2000 was also not enough; again, they were hedging too much, trying to keep costs down. They should have pumped up the specs and doubled the price. Instead of Sun workstations in their offices, they should have been using Amiga workstations.