The 68000 Wars, Part 3: We Made Amiga, They Fucked It Up (filfre.net)
139 points by mgunes on April 11, 2015 | 85 comments



Obligatory links, as they are not on the page...

  Part 1: http://www.filfre.net/2015/03/the-68000-wars-part-1-lorraine/    
  Part 2: http://www.filfre.net/2015/04/the-68000-wars-part-2-jack-is-back/


I sometimes like to escape into a fantasy daydream world when I'm at work, where the British government of the 80s realized the importance of controlling your own destiny, and just bought a million Acorn Archimedes machines for the entire civil service, military, universities etc., enough that it became a no-brainer for everyone else in the UK. We could be 10-20 years ahead of where we are now. Same if the Germans and Dutch had just bought Commodore and Atari outright and moved their production over.

Of course if they had they would probably have been tricked into buying Amstrad shit.


I live in a similar fantasy land; however, I think Acorn had ample help from the UK government. After all, pretty much every secondary school in the UK purchased them. That and free national TV advertising from the BBC.

The problem was that platforms limited to a single nation were not viable; Acorn never really broke into the US market. I don't believe there was even much adoption in Europe. You can see the same story in Japan with MSX.

That said, I agree that RiscOS was far superior to many competitors. But as we still see, it is adoption and the ecosystem around the platform, rather than the superiority of the basic technology, that is often the deciding factor in success.

Still more than a little nostalgic though... and I wish it had been able to address international markets better. Anyone have some cash to buy the rights to RiscOS and open source it? I think it would only take 30K GBP or so. :)


If just the cash were the problem I'd be happy to do it, but I think RiscOS's time has passed.

QNX now, that would be another matter, but I suspect RIM will want more than 30K for it. I already approached Quantum when they still owned it but there was absolutely no way to get them to even respond. Which is a pity, because I think something like QNX in open source form would absolutely rock.


RiscOS is open source https://www.riscosopen.org/content/

Hmmm... if you count the non-commercial restriction, parts of it have that.


That and free national TV advertising from the BBC.

Ah that is not true. In fact Acorn paid the BBC a royalty on every Beeb sold.


And hadn't the British government already had a pretty serious go at building a national champion in computing with ICL https://en.wikipedia.org/wiki/International_Computers_Limite... ? IIRC part of the problem was that when it came to building a global computer business on the foundation of a home-market advantage, US companies had by far the biggest and wealthiest home market.


That would require the UK civil service to treat STEM graduates as equals to the Classics grads - instead of as some below-stairs oily menial.


Yes the British civil service has a long history of ignoring British engineers and just doing whatever the Americans tell them. A key example is Black Arrow. Or the TSR-2. Exactly the same with the PC. Imagine how it could have been...


The UK managed to mess up the TSR-2 all by itself [1].

[1] http://www.rafmuseum.org.uk/documents/Research/RAF-Historica...


Thank you for posting that doc. My Dad worked on TSR-2 avionics (at Ferranti) and I grew up entertained by tales of TFR and moving map displays. Interesting reading.


I remember being taken out to CIT on a school trip and seeing the TSR-2 sadly sitting at the back of one of the hangars.

A plane so hot that one of the prototypes on a single engine could walk away from the fastest fighter of the era!

If I ever run a Laundry Files game, 666 Squadron will be using the advanced version of the TSR-2 rather than the Bombcords.


Yeah, sometimes I like to daydream too: what would have happened if Poland had released to the public their 16-bit computer built in the early 70s:

http://en.wikipedia.org/wiki/K-202


Hey! I loved my Amstrad.


what's that about Amstrad?


This is a great series. I can also really recommend the book the author wrote, which presents a more technical view of the Amiga:

https://mitpress.mit.edu/books/future-was-here


Yes, the whole "platform series" from MIT Press is fantastic.

The first book in the series, "Racing the Beam", especially so.


Are they complementary? As in, is it worth reading the book after reading the blog posts, or do they tell the same story?


Yes, very much so. The book contains some quite technical walkthroughs of the Amiga, both software and hardware. In enough detail that a technical person will be able to understand more or less exactly what's going on, but without requiring any knowledge about the Amiga.


Conversely, the blog posts tell the story of the Amiga's origin in more detail than the book. So they do complement each other.


Thank you both. I'll definitely have a look at it, I'm fascinated by computer history.


I'm still wondering why the Amiga didn't become ubiquitous. I had one and I kept getting called a liar about its price - for the performance compared to PCs of the time, surely I omitted a zero.


Some of it is in the article.

Like Atari with the ST, Commodore basically failed to capitalize on the original Amiga, and by the early 90s PCs had mostly caught up.

While the Amiga 1000 was revolutionary in 1985, AGA[1] was not that special in 1992 (especially as it wasn't particularly beneficial to 3D games like Doom, which were becoming the new hot stuff).

The Amiga of the 90s was the 3dfx Voodoo.

[1] http://en.wikipedia.org/wiki/Amiga_Advanced_Graphics_Archite...


And frankly by the late 80s the 68000 architecture itself had hit a dead end performance-wise. Maybe Motorola could have pulled off what Intel did with Pentium, paper over the aging CISC with RISC internals, but instead we got PowerPC.

I'm not sure if you ever used a PowerPC Mac when they were a mix of emulated 68k and PowerPC, but they were notoriously unstable. The mix of a lack of memory protection and emulated CPU instructions would have been the same for the Amiga and Atari ST, whose OSs also lacked the memory protection and safety features of more modern operating systems.

I have an Atari Falcon 030, Atari's last and best machine. It is a really really nice machine. But it was hobbled by poor software support -- it's only now that hackers are discovering what they can do with the combination of the 68030 and the Motorola 56k DSP in it (Example: Quake 3 has been ported/rewritten recently for it, using the 56k for 3d acceleration.)

I used to wonder what the world would have been like if the 68000 systems won out. But now we're seeing a world where the ARM belatedly wins out, which is kind of neat, tho ARM is arguably now as "evil" as Intel :-)

What is an interesting mental exercise is imagining what would have happened if the 6502 or 6809 architectures had expanded and done well. Those architectures had insanely fast interrupt processing and very fast (single CPU cycle) memory access. Some really neat machines could have been made if they'd continued to advance them, gotten past the 64k memory address limit and into high clock rates. Western Design Center stopped at the 65C816, a 16-bit variant of the 6502. Something faster and funner than the Amiga could have been built with a 32-bit 6502 descendant and chipsets similar to what was in the Amiga. That would have been really neat.


Maybe Motorola could have pulled off what Intel did with Pentium, paper over the aging CISC with RISC internals, but instead we got PowerPC.

I think one of the factors could be that 68k is a bit harder to decode than x86 - while the instruction set is more orthogonal, the encoding has less structure; compare http://goldencrystal.free.fr/M68kOpcodes.pdf (68k) with http://i.stack.imgur.com/07zKL.png (x86).

very fast (single CPU cycle) memory access

That was possible only because at the time, memory was faster than the core and could keep up. Modern CPUs run the core at several times memory speeds, and there is latency involved due to physical constraints.


One difference between Motorola and Intel is that Motorola was less concerned with breaking backwards compatibility.

If the 68k family had continued to evolve past 1994's 68060, I'm sure they could have just dropped backwards compatibility with some of the more complex addressing modes, or just devoted less silicon to them, making what remained faster. Kinda like what happened with the ColdFire version of the 68k family.


The not-implemented instruction trapping in the 68K could be easily evolved to cover less frequently used instructions to make room for more optimized implementations of the frequently used ones.

What really killed the 68000 was the move to RISC, in particular Apple's move to PowerPC. That took away any hope of future evolution (they even managed to release the 68060 after that, but that was it) and collapsed the high-end 68K business.


Note that Motorola also did the m88k which was their RISC approach. It generated a bit of interest, but was never successful. https://en.wikipedia.org/wiki/Motorola_88000

The Motorola Series 900 machines were interesting - I had one under my desk at work for quite a while. They had stackable units, including one that contained a SCSI 3.5" floppy drive that was way faster than regular ones. We also had a DG unit with the m88k.


A lot of what was the 88K was put into the PowerPC.


Yep. The one thing Intel frets over these days is cache misses. Hyperthreading is all about keeping that pipeline busy even if the original thread encounters a cache miss.


> very fast (single CPU cycle) memory access

> That was possible only because at the time, memory was faster than the core and could keep up. Modern CPUs run the core at several times memory speeds, and there is latency involved due to physical constraints.

That begs the question: Would we be better off if CPU clock speeds were set such that the memory could keep up again, and we software developers learned to work within real constraints again, rather than expecting the CPU makers to keep working miracles to deliver ever more performance? I have no wish to go back to programming in Applesoft BASIC or 6502 assembler as I did in my childhood and early teenage years. But programming a 32-bit processor clocked to match the speed of memory, in C++ or Rust, wouldn't be so bad.


Would we be better off if CPU clock speeds were set such that the memory could keep up again

Absolutely not, because of the locality principle. As Terje Mathisen used to say, "All programming is an exercise in caching."

Locality isn't a property of a specific coding style or methodology, it's just the way programs work. No matter what kind of architecture we end up using 50 years from now, it will have a fast cache of some kind, backed up by slower memory of some kind. We'll have a different set of problems to confront in day-to-day development work, but hobbling the CPU won't be the answer to any of them.
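
To make the locality point concrete, here's a minimal sketch of my own (not from the thread): the same summation done in row-major and column-major order over one array. The array size is arbitrary, and the claim that the strided version runs several times slower is an assumption about typical cache hierarchies, not a measurement.

  /* Minimal cache-locality demo: identical work, very different memory
     access patterns. N is arbitrary; actual timings depend on the machine. */
  #include <stdio.h>

  #define N 2048
  static double a[N][N];

  /* Walks memory contiguously: each cache line is fully used. */
  static double sum_row_major(void) {
      double s = 0.0;
      for (int i = 0; i < N; i++)
          for (int j = 0; j < N; j++)
              s += a[i][j];
      return s;
  }

  /* Strides N doubles per step: typically one cache miss per access. */
  static double sum_col_major(void) {
      double s = 0.0;
      for (int j = 0; j < N; j++)
          for (int i = 0; i < N; i++)
              s += a[i][j];
      return s;
  }

  int main(void) {
      printf("%f %f\n", sum_row_major(), sum_col_major());
      return 0;
  }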


Sure, caching is important. But today, we have multiple layers of cache: registers, L1, L2, sometimes L3, and RAM, all of which are caches for nonvolatile (increasingly flash) storage. All of that layering surely has a cost. So what would we get if a processor with no caches between registers and RAM were manufactured using a current process (say, 14 nm), clocked such that DRAM could keep up (so, 100 MHz if another comment on this thread is accurate), and placed on a board with enough RAM for a general-purpose OS as opposed to an RTOS for a single embedded application? Would the net result be any more power efficient than the processors that smartphones use now?


L1/L2 cache levels are transparent optimizations layered over the register-to-RAM path, so eliminating them in a RAM-bound application would save you transistors (power usage) without losing performance. But although a few RAM-bound applications might perform equivalently, you've destroyed all other classes of application in the process.

Power efficiency is more complex, often it's better to briefly burst then get back to sleep faster, rather than drag things out at 100 MHz, but a specific answer would depend on many factors.


> That begs the question: Would we be better off if CPU clock speeds were set such that the memory could keep up again

Memory latency is at best about 10 ns. I don't think a 100 MHz CPU would be better in any way than what we have now. Well, except power requirements would sure be very low.


>What is an interesting mental exercise is imagining what would have happened if the 6502 or 6809 architectures had expanded and done well.

It was, but for some reason Hitachi didn't publicize it, and word only escaped years too late: http://en.wikipedia.org/wiki/Hitachi_6309


Yep I have some here, and there's even a board for putting it and the 6809 into Atari 8-bit computers (!). It's a fine processor, but limited to 64k address space still. But fast, and fun to play with.


Your last paragraph describes the first ARM chip :) ARM was all that (fast interrupts, fast RAM) and more (30k hand-laid transistors). It took a while, but ARM is taking over from the bottom up, making Intel ignore the high end and concentrate all of its efforts on power efficiency (raw performance has all but stopped improving in the last 5 years).


When you say ARM is now as "evil" as Intel, are you referring to processor architecture, business practices, or both?


>(Example: Quake 3 has been ported/rewritten recently for it, using the 56k for 3d acceleration.)

Wow, what is that like? I'd love to see a demo video .. know of one?


Seems to be this (Quake 2, not 3, and still in development): https://www.youtube.com/watch?v=hDXSMgW-r5M&index=1&list=PLN...


Yes, sorry, Quake 2.

Long thread here: http://www.atari-forum.com/viewtopic.php?f=68&t=26775


Maybe Motorola could have pulled off what Intel did with Pentium, paper over the aging CISC with RISC internals

They did; that's what the mc68060 was. Too little, too late and (as you say) corporate attention directed at PPC.


If anyone is interested, Stuart Brown's "Doomed: The Embers of Amiga FPS" is a nice little documentary of the history mentioned above.

https://www.youtube.com/watch?v=Tv6aJRGpz_A


It's really hard to give people from this generation a sense of how fast things advanced and how big the leaps were in the 80s and early 90s. We went from 8 bit to multimedia 32 bit in less than 10 years. For devs it was a new box every year or two. Sometimes two new computers in a year.

My first 386 dev box was purchased in 1987. Within a year of that, I was using a Compaq portable 386 (http://en.wikipedia.org/wiki/Compaq_Portable_386). The 386 was a big deal because it finally got us Intel devs a flat address space... so no more trying to fit data into tiny segments (64K). The 386 killed the chief advantage of the 68k architecture, and for whatever reason Motorola just couldn't get the clock speed up fast enough.

There were two things that were interesting about the Amiga in 87: video toaster for doing cheap video effects (think intro sequence to Better Call Saul, not awesome demos) and gaming.

But gaming on the PC made a huge leap in 1987 when IBM shipped ALL of their new PS/2 computers with a 256 color video adapter called the VGA (I seem to remember the lowest end models only doing 256 colors in 320x200 mode... but that was good enough)... Eventually Truevision and even ATI had video cards that could do the same sorts of things as an Amiga (or better).

So many great computer ideas died in the 80s and early 90s... but it was really evolution... most of them died because a generalized solution (i.e. VGA with video out + software) eclipsed a specialized solution (i.e. Amiga with video toaster).


AFAIK the lowest end was called MCGA.


My parents insisted on buying a PC for home use. It was mainly so they could do accounting for their business at home. I had a friend who had an Amiga and I spent pretty much as much time as possible at their house using it. We even had an Amiga only store in our local mall (in the U.S.!).

I think that, in the way that Apple products are now showing up in workplaces due to people preferring them at home, the reverse happened in the 90s: people wanted or needed to bring work home, and their offices supplied them with PCs.

The productivity situation on PCs was always just a bit better and more standardized than on Amigas.

What mystified me more was that, during this time period, the Apple Macintosh took over the creative market -- especially in visual arts. The Amiga always came across to me as a far better creative machine, with better tooling, than the stuffier Mac. Again it may be due to better support for WYSIWYG output during printing and pre-press, better color matching etc. But the Amiga just felt more creative and fun to me.

Also, by the time the 68040 came out, it was starting to become clear to everybody that Motorola wasn't going to be able to keep the performance edge up. Apple switched to PowerPC but Commodore couldn't afford to. There was a whole plethora of PowerPC cards for the Amiga, to try to keep them going, but it was really obvious by then that it was game over, and people started to hunt around for the next system.


Look towards applications. It's that simple.

Amiga "owned" the TV market for a long time because it happened to get some important products first (Video Toaster for example).

Apple got better desktop publishing tools first.

E.g. if you wanted to do TV, for a certain period you would want a product like the Video Toaster. If you wanted to do newspapers, you'd want Quark.

While there may have been certain platform quirks that tilted the initial creation of those tools in one direction or another (such as genlock support for video for the Amiga), platform mattered far less than application, and early application traction in a niche would paper over a lot of other platform issues.

WYSIWYG output for printing was largely still an application issue, not a platform issue, for example. Exactly for those kinds of reasons, an application lead also translated to a platform lead for those kinds of niches where people would buy the platform to support an application rather than the other way around. People would buy Quark, and a Mac to run it, not pick a system and see what desktop publishing would run on it. If you loved the Amiga's pre-emptive multitasking and "colourful" (compared to the Mac..) environment, tough - it couldn't run Quark (been there - had exactly that discussion back in those days).

Regarding PPC, note that the PPC cards for the Amiga appeared after Commodore had already gone bankrupt, as far as I know. At least PowerUP first appeared in '97 after Amiga Technologies announced Amiga going PPC in '95. It'd been largely obvious the game was over at least from 95-96 even for most die-hard supporters.

Interestingly, had Commodore continued, it's clear the next generation Amigas would most likely have been different - the prototype "Hombre" chipset was a SOC that included a HP PA-RISC core [1]. Commodore apparently chose PA-RISC primarily with the intent of being able to run Windows NT (at the time of the decision, the lower priced PPC - and MIPS - alternatives were not supported for NT) - something which would have been massively controversial with a lot of Amiga users.

[1] https://en.wikipedia.org/wiki/Amiga_Hombre_chipset


The Apple II family straddled the divide between work and play pretty well, though it was probably more of a play machine in later years. In 1988, when I was almost 8 years old, my parents bought an Apple IIGS as our first computer. My mother, who is an accountant, ran accounting software on that machine, but the rest of us also had a lot of fun with it, in both the 8-bit Apple II emulation mode and the 16-bit native mode. A couple of years later, she bought a PC. Whether it was so she could have better accounting software, or just the same accounting software as her colleagues, or because the rest of us liked the GS so much, I don't know. I don't think she ever told me directly. In any case, the net result was that I could spend more time on the GS, both playing around and learning to program. Of course, my siblings spent quite a bit of time playing games on the GS as well.

I didn't know anyone with an Amiga, and there weren't any at school. The only Commodore machine I ever got my hands on was my paternal grandfather's Commodore 64, which seemed quite limited compared to the Apple IIGS we had at home. From what I've read, it seems that the Amiga had better graphics than the GS. And of course, the Amiga's processor was faster, unless one added an accelerator card to the GS, which we never did. The GS's sound chip (an Ensoniq) was more advanced in some ways; it had 32 oscillators. But samples had to be stored in that chip's own RAM, and there was only 64K of that. Still, there were a few good trackers for the GS; the best one was NoiseTracker from the FTA.


"What mystified me more was that, during this time period, the Apple Macintosh took over the creative market -- especially in visual arts. The Amiga always came across to me as a far better creative machine, with better tooling, than the stuffier Mac."

The Mac had a couple of years' head start on the Amiga, and Apple had a bigger brand name and better relationships with retailers that catered to businesses than Commodore did. By 1985, Commodore (despite the CBM name) was fairly synonymous with games. There were games for Apple as well (Macs, IIe, etc.) but there wasn't as much of a stigma around Apple as a 'game computer company' at that point.

And... it cost more. We all know when something costs more it must be better, right? ;)


"Again it may be due to better support for WYSIWYG output during printing and pre-press, better color matching etc."

Basically that. Whenever you wonder why something "odd" gets a foothold, look for the money trail.

One thing to note is that there were a couple of Amiga variants that lived on in broadcast media, as the machine was very capable of doing video work.

BTW, the BYOD thing kinda happened back in the day as well. There is a claim that accountants brought their personal Apple IIs to work so they didn't have to fight for mainframe time.

Edit: oh, and I wonder how much the dock connector contributed to the long-term uptake of the iPhone in the corporate world. Never mind that Apple was quick to offer a WSUS-like service to handle app rollouts.


The old saw goes that VisiCalc sold more Apples than Apple sold VisiCalcs. Of course, then Lotus 1-2-3 came onto the market and even MultiCalc struggled to compete with that.


There never was a high-volume machine that really used the '040 to its potential. The NeXT Cube was pretty good and there was an Alpha from DEC with it as well, but that chip would have been a very nice one to have in a machine like the ST Falcon. In the end it was mostly heat (or power consumption if you wish) that killed it rather than that it didn't have the raw performance.


You might be right.

I seem to recall that right around the time Intel released the 80486 they started getting into the MHz wars and clock multiplying the hell out of everything. DX2 then DX4. The fastest 68040 maxed out at what... 40MHz, while the 486 ended up somewhere around 150MHz or so.

But the low volumes definitely hurt Motorola's ability to keep up with Intel's R&D. Clock for clock, the 68k architecture was faster, but Intel figured out how to throw a lot more clocks at the problem and they kept doing that until PowerPCs were not really a consumer-level home computer chip anymore.


> getting into the MHz wars and clock multiplying the hell out of everything. DX2 then DX4

For those reading who aren't old timers, this needs to be elaborated on.

This wasn't any trick or sleight on Intel's part. The parts really did run 2x as fast or 4x as fast internally. That was a major achievement. An instruction that took 5 clocks to execute at 33 MHz took 5 clocks to execute at 50 MHz and took 5 clocks to execute at 100 MHz. This was all accomplished over a relatively brief period of time compared to today's rate of CPU speed advancements.

What slowed the CPU down was the growing mismatch between the internal operation and the external memory bus which continued to run at (usually) 33 MHz. It was possible to run the external bus at 50 MHz but most designs didn't. The 8K byte on-chip cache helped mitigate the mismatch.
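
A rough illustration of that core/bus mismatch, with made-up figures rather than real 486 timings:

  /* Back-of-the-envelope for the core/bus mismatch described above.
     The "2 bus clocks per off-chip access" figure is an assumption. */
  #include <stdio.h>

  int main(void) {
      double bus_mhz = 33.0;                        /* external memory bus  */
      double core_mhz[] = { 33.0, 66.0, 100.0 };    /* DX, DX2, "DX4" cores */
      double bus_clocks_per_access = 2.0;           /* assumed              */

      for (int i = 0; i < 3; i++) {
          /* the same off-chip access costs more *core* cycles as the
             core clock is multiplied up against a fixed-speed bus    */
          double core_cycles = bus_clocks_per_access * core_mhz[i] / bus_mhz;
          printf("core %3.0f MHz: off-chip access ~ %.0f core cycles\n",
                 core_mhz[i], core_cycles);
      }
      return 0;
  }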


>> The parts really did run 2x as fast or 4x as fast internally.

Actually, the Intel 486DX4 ran at only 3x the speed, despite the name. Intel couldn't use the DX3 name because of a trademark owned by AMD, who also had a part called the Am486DX4. The AMD part was also available with a 40MHz bus and a tripled 120MHz CPU clock.


By the time Intel sold > 66 MHz 486's, the Pentium was already out. They remained a low end alternative, but still topped out at 100 MHz. AMD took their 486 clone (called the 5x86) eventually up to 150 MHz.


Which also reminded me that it probably didn't help that UNIX workstation vendors often replaced the 68K with their own RISC architectures. MIPS was an attempt at a standard, but...


Intel 486s topped out at 100MHz. I think the others went up to 133. (Let's ignore things that fit in a socket for a 486 but weren't really 486s.)


I think if they could have got a version of the Amiga into the same price range as a video game console, they probably would have been ok. The pricing at PC levels meant it was a "family" decision and the PC was going to get picked.

The other problem is the Amiga never had Adobe as a developer. That would have made a huge difference.


Well, they did try to make an Amiga games console as a last Hail Mary of sorts.

https://en.wikipedia.org/wiki/Amiga_CD32


They were pretty much doomed by then. Plus the fun quote from the Wikipedia article "Ultimately, Commodore was not able to meet demand for new units because of component supply problems." Lovely.

Jay Miner was lost that same year. It is just so sad.


The article nails it talking about the distribution channels. The small shops that were selling Apples and Ataris refused to carry Commodore computers when Commodore started selling VIC-20s and C-64s at big box stores like Toys R Us. When Commodore had a more powerful, and more expensive, computer available, it was too expensive for those stores and they could not support it.


Well those small shops, especially if they carried Apple, probably knew not to offend. Apple was no better to stores than MS was/is to OEMs.


Commodore outsold Apple on the low end by a large factor for many years (and continued to do so in terms of units on a worldwide basis pretty much to the bitter end).

They'd have continued to support Commodore if Commodore hadn't continuously messed them around. A major factor was the price war that was to be Jack Tramiel's parting shot: Commodore overnight announced a massive price drop and left their dealers to take the hit on all the inventory they already had on hand.

That was typical of Tramiel playing hard-ball, but it also severely hurt a lot of businesses that had bet on Commodore.

You see the difference when you look at the US market vs. Europe - Europe was handled by subsidiaries that often handled their dealer networks far better. Particularly the UK and Germany were Commodore strongholds. The UK subsidiary actually tried to get financing for a buyout of Commodore International after the bankruptcy, and survived for quite a while on their own financial strength.


Proprietary chips.

The IBM PC was basically off-the-shelf components bar the BIOS. And so when Compaq clean-roomed said BIOS, it was a free-for-all.

Especially as IBM was an old name in the corporate world, and so the PC had a foothold where the Amiga did not.


Commodore's more significant mistake was as the article said "Commodore hedged their bets everywhere — except in the Amiga’s most obvious application as a game machine, from which they ran terrified."

Which is the exact opposite of what they should have done. Instead they should have split their market into two, one focusing on high-end workstations and the other game and home computing.

I remember when they were talking about the next generation of custom graphics chips and there was all of this hold-up, which was so unnecessary because there was a simple solution. They only needed to put in three of the same chip, one for each color (red, green and blue) or one for every third scan line (I had even read of someone doing this with three separate Amigas). There's your next-gen workstation right there.


> they should have split their market into two, one focusing on high-end workstations and the other game and home computing.

That's exactly what they did (and very successfully), with the 500 and 2000 models.

The main problem for mainstream adoption at the time was that the machine, being so successful as a multimedia and games machine, was regarded as a toy.

But the stake through the heart of the Amiga, and all the other proprietary platforms of the time, was that the IBM PC architecture became a standard that was open for all manufacturers to produce for. You can't compete with that.

(Unless you are Steve Jobs. In fact the accepted wisdom at the time, with Apple and the Mac teetering on the brink of annihilation, was that they should open up their hardware platform and become a software company.)


>That's exactly what they did (and very successfully), with the 500 and 2000 models.

I'd argue that while the 500 was what they needed, the 2000 was quite unambitious. What Amiga really needed in their 1987 high-end product was a 14MHz 68020 (or EC020) based, 32-bit system, with a graphics chipset to match. Take the old chipset, but give it 2x the clock and 2x the memory bandwidth, and all sorts of things (including non-interlaced 8-bit 640x400 video) become possible.

If they had started working on this in late 1985, I think they would have had a chance at being taken more seriously. Instead, they didn't even start revising the chipset until 1988, and they didn't have a native 32-bit based system until the 3000 in 1990 (and even then, still with the old chipset).

In fairness, I understand the US-based side of Commodore was thinking along these lines for the 2000, but lost out to the Commodore German division's model of a cheaper 1000 with slots and a better case.

What Apple proved was that an alternative to the PC was possible, if you were clever enough in your technology and your marketing. Commodore was neither.


Unfortunately, while Exec (the kernel) and DOS (the file system) were not really tied that hard to the architecture, Intuition (the GUI) was very tied to the hardware. Change the video, and you have a new GUI to code for. Also, it may not have been all that possible to enhance the hardware from the given design [1].

[1] A bit-plane video architecture. For example, say the video is 256x256 (for simplicity) at 1-bit color. That's 8 pixels per byte, an 8K video buffer. Want 2-bit color? Okay, the low bit of each pixel is in one 8K video buffer; the high bit of each pixel is in another 8K video buffer. Want 8 colors? Add another 1-bit 8K video buffer. Now you have one pixel spread across three bytes.
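
To make that layout concrete, here is a small sketch of my own (the 256x256, 3-plane sizes follow the footnote; nothing Amiga-specific beyond the planar packing) showing how one pixel's colour index is gathered from three separate bitplanes:

  /* Sketch of reading a 3-bitplane (8-colour) planar frame buffer as
     described above: each plane holds one bit of every pixel, packed
     8 pixels per byte; sizes follow the 256x256 example.             */
  #include <stdint.h>

  #define WIDTH  256
  #define HEIGHT 256
  #define DEPTH  3                          /* 3 bitplanes -> 8 colours */
  #define PLANE_BYTES (WIDTH * HEIGHT / 8)  /* 8K per plane             */

  /* Gather the colour index (0..7) of pixel (x, y) from the planes. */
  static unsigned pixel_color(const uint8_t planes[DEPTH][PLANE_BYTES],
                              unsigned x, unsigned y)
  {
      unsigned byte  = (y * WIDTH + x) / 8;
      unsigned bit   = 7 - (x & 7);         /* leftmost pixel is the MSB */
      unsigned color = 0;

      for (unsigned p = 0; p < DEPTH; p++)
          color |= ((planes[p][byte] >> bit) & 1u) << p;

      return color;
  }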

Never mind that Intuition also allowed control access to the Copper [2] so you could also specify the screen resolution and color. You could split the screen (upper half, 320x100, dual play fields (three bit planes defining the background, three bit planes defining the foreground) with 32 colors, and the lower half 640x200 (interlaced) with four colors). You could also specify up to 8 hardware sprites.

Yup, Intuition was very tied to the video hardware.

[2] A special 3-instruction CPU to control the hardware registers based on the video beam position, allowing you to change the entire video setup virtually anywhere on the screen (any given scan line, within four pixels horizontally); colors, memory used for the video buffer, resolution, sprite locations. You could literally display all 12 bits of supported color on a single 1-bit video page by mucking with the register settings on a per-4-pixel basis using the Copper. It'd be a long Copper program, but the 68000 would be completely idle.
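
For a feel of what such a Copper program looks like, here is a tiny sketch from memory (the exact instruction encodings are worth double-checking) of a list that rewrites the background colour register, COLOR00, on successive scan lines - the classic "copper bars" effect:

  /* A tiny Copper list sketch, from memory; encodings may need checking.
     Each Copper instruction is two 16-bit words: a MOVE writes a value
     into a custom-chip register (COLOR00 is at offset 0x180), a WAIT
     (low bit of the first word set) stalls until the beam position.   */
  #include <stdint.h>

  static uint16_t copperlist[] = {
      0x5007, 0xFFFE,   /* WAIT for vertical position 0x50              */
      0x0180, 0x0F00,   /* MOVE red into COLOR00                        */
      0x5107, 0xFFFE,   /* WAIT for the next scan line                  */
      0x0180, 0x00F0,   /* MOVE green into COLOR00                      */
      /* ...one WAIT/MOVE pair per line...                              */
      0xFFFF, 0xFFFE    /* conventional end of list: an impossible WAIT */
  };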


The Amiga 1000 could support a 640x400x4 bitplane 30fps interlaced display with its 16-bit path to memory at 7.14MHz. If you double the clock, and double the width of the data bus, then a 640x400x8 bitplane at 60fps becomes theoretically possible, which would have made it an easier sell for business (the need for a non-interlaced 640x400 mode was obvious quite early on)
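
The back-of-the-envelope behind that, counting display DMA only and ignoring refresh and other overhead (my arithmetic, using the figures in the comment):

  /* Rough bandwidth check: display DMA only, no overhead. Frame sizes
     and clocks are the ones stated in the comment above.              */
  #include <stdio.h>

  int main(void) {
      double a1000_bus  = 2 * 7.14e6;                /* 16-bit bus @ 7.14 MHz, bytes/s */
      double a1000_need = 640.0 * 400 * 4 / 8 * 30;  /* 4 planes, 30 full frames/s     */
      double dbl_bus    = 4 * 14.28e6;               /* 32-bit bus @ 14.28 MHz         */
      double dbl_need   = 640.0 * 400 * 8 / 8 * 60;  /* 8 planes, 60 frames/s          */

      printf("A1000:   %4.1f MB/s of %4.1f MB/s goes to the display\n",
             a1000_need / 1e6, a1000_bus / 1e6);
      printf("doubled: %4.1f MB/s of %4.1f MB/s goes to the display\n",
             dbl_need / 1e6, dbl_bus / 1e6);
      return 0;
  }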

Obviously there's more to improving the graphics architecture than just improving the clock and data path. My point was that the easy benefits of Moore's law at that point would have made substantial improvements possible, if Commodore was able and willing to make the attempt. "What amazing graphics thing are we going to do next?" was the question Commodore needed to be asking themselves in 1985, but it's clear they didn't. How much of that was due to money, and how much was due to corporate culture, is not clear to me.

And WRT Intuition: it was tied to the hardware, but you overstate the case. Later Amigas had new graphics modes, and Intuition had support for 8 bitplanes from the beginning. I don't think 640x400x8 @ 60Hz was impossible, or even that difficult, in 1987, from an OS perspective.

BTW, anyone else disappointed that there are no Amiga engineering alum who have discovered this thread?


Poor old Commodore didn't have the money, so management couldn't deploy the resources until their backs were against the wall.

If the A1000 had managed an 80-column non-interlaced mode, it might have done better initially, but as it was, the machine was a joke for business.


I don't know what kind of machines you used in 1985, but 80x25 was easily done on an A1000, and was what you were competing against on the PC side in most instances.

There were many problems for Commodore, but lagging on the graphics side vs. the PC was not an issue until years later.


He said non-interlaced. You had to buy (an expensive after market) scan doubler to get rid of the flicker on the Amiga. Biggest mistake they made .. they should have included a true workstation class high resolution mono video mode.

Doing productivity apps on the Amiga in its highest resolution vs the Atari in its highest (monochrome) was no comparison.

The Atari did better in certain markets -- MIDI sequencing, big time, and to a much lesser extent DTP -- because of this.

But having that nice mono screen for productivity didn't make the ST a big success, so I don't think the original comment stands.

The Amiga was not a success because it wasn't an IBM PC.


I would also argue that the 500 wasn't enough of a game machine. It should have been sold with controllers and a cartridge slot. A home computer addition that added a keyboard and disk drive could have been bought as a bundle or add-on. If they had done that I think they could have hung on to be a player in the Nintendo age.

On the high-end the 2000 was also not enough, again they were hedging too much trying to keep costs down. They should have pumped up the specs and doubled the price. Instead of Sun Workstations in their offices they should have been using Amiga workstations.


They did become ubiquitous, but not in the United States.


My favourite computers of all time were my A500 and A1200. I learned the most about how a computer works and how to program it. I learned how to create and control sound and graphics and how these are represented inside the machine. I made my own games in a mix of high-level languages ( Amos! ) and 68000 assembly. I made 3D animations assembled frame by frame onto a VHS tape deck. I even used the Amiga to VJ at clubs and student parties.

Really loved those machines.


The Deathbed Vigil: The Last Day At Commodore https://youtu.be/jvJjFYHGTnU


I just realized that the Amiga story is exactly what would have happened to Apple in an alternative timeline where the company had filed for bankruptcy without Jobs. Or the other way around, if they had succeeded we all may be wearing Amiga Watches on our wrists right now.

It's very interesting to see how good/bad managerial decisions or tiny details can totally sink an advanced technology and change the course of the future. Can we imagine the present's technology without the advances that, for example, Apple has brought? Maybe we'd all still be using Nokia phones or Palm PDAs.


The Sharp X68000 was more successful and powerful than any other machine in the "68000 wars." It came a few years later, but outsold both Atari and Amiga (the number I heard was 15 million units sold). However it sold only in Japan, and really only for games -- despite having higher-spec'd multimedia and even having video input that could have made a "video toaster" type app like the Amiga's possible. It just shows you how massive the Japanese gaming market was in the 80s/early 90s that a niche machine sold only there could outsell them.


My favorite 68k machine is the 3B1. It had a windowing system with a mouse, but never any apps that would be interesting for the desktop user. Funny that Apple switched to UNIX fifteen years later. If they had started with it, instead of that stupid system where half the OS was in ROM, it would have been so far ahead.


I had an Amiga for a while. You couldn't hook up standard keyboards and monitors to it, everything was just different enough to be incompatible. It didn't bode well for Amiga's future.


A "standard" keyboard here means "a keyboard made for IBM PC AT clones". When you put everything in context (and consider the Amiga predates the first 386-based PCs by a couple years), it made no sense for the Amiga to have a PC keyboard connector any more than it would make sense for an Apple II to have one.


The monitor issue is/was more serious, though not really an issue early on. The problem there was that the Amiga graphics modes would not work with many cheaper PC monitors, so we had to spend extra on expensive multisync monitors or Commodore branded monitors. And unlike the keyboard, that was something people wanted to upgrade.


Yep, same nonsense on the Atari. Computer vendors back then really did frustrating things around peripherals. It was bad enough that they were so expensive, but external disks, monitors, mice, RAM, everything was proprietary, which made it worse. Eventually things settled on SCSI, then IDE, but there were many years there where it was the wild west on standards for all of the above.

I just bought an Atari TT off eBay. It will nicely drive a VGA monitor, but not for its highest resolution (1280x960), for which it needs a special ("ECL") monitor. An adapter from that to VGA is $175 from a hobbyist -- almost as much as I paid for the computer.



