
> they should have split their market into two, one focusing on high-end workstations and the other game and home computing.

That's exactly what they did (and very successfully), with the 500 and 2000 models.

The main problem for mainstream adoption at the time was that the machine, being so successful as a multimedia and games machine, was regarded as a toy.

But the stake through the heart of the Amiga, and all the other proprietary platforms of the time, was that the IBM PC architecture became a standard that was open for all manufacturers to produce for. You can't compete with that.

(Unless you are Steve Jobs. In fact the accepted wisdom at the time, with Apple and the Mac teetering on the brink of annihilation, was that they should open up their hardware platform and become a software company.)




>That's exactly what they did (and very successfully), with the 500 and 2000 models.

I'd argue that while the 500 was what they needed, the 2000 was quite unambitious. What the Amiga line really needed in its 1987 high-end product was a 14MHz 68020 (or EC020) based 32-bit system, with a graphics chipset to match. Take the old chipset, give it 2x the clock and 2x the memory bandwidth, and all sorts of things (including non-interlaced 8-bit 640x400 video) become possible.

If they had started working on this in late 1985, I think they would have had a chance at being taken more seriously. Instead, they didn't even start revising the chipset until 1988, and they didn't have a native 32-bit based system until the 3000 in 1990 (and even then, still with the old chipset).

In fairness, I understand the US side of Commodore was thinking along these lines for the 2000, but lost out to Commodore's German division, whose design was essentially a cheaper 1000 with slots and a better case.

What Apple proved was that an alternative to the PC was possible, if you were clever enough in your technology and your marketing. Commodore was neither.


Unfortunately, while Exec (the kernel) and DOS (the file system) were not really tied that tightly to the architecture, Intuition (the GUI) was very tied to the hardware. Change the video, and you have a new GUI to code for. Also, it may not have been all that feasible to extend the hardware beyond the given design [1].

[1] A bit-plane video architecture. For example, say the video is 256x256 (for simplicity) with 1-bit color. That's one bit per pixel, an 8K video buffer. Want 2-bit color? Okay, the low bit of each pixel is in one 8K video buffer; the high bit of each pixel is in another 8K video buffer. Want 8 colors? Add another 1-bit 8K video buffer. Now you have one pixel spread across three bytes.
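In code terms, that planar layout looks roughly like the sketch below (the names, dimensions and memory layout are illustrative, not the actual chipset interface): the color index of one pixel has to be assembled a bit at a time from DEPTH separate buffers.

    #include <stdint.h>
    #include <stdio.h>

    #define WIDTH     256
    #define HEIGHT    256
    #define DEPTH     3                     /* 3 planes -> 8 colors */
    #define ROW_BYTES (WIDTH / 8)           /* 1 bit per pixel per plane */

    /* Returns the color index (0 .. 2^DEPTH - 1) of pixel (x, y) by
       gathering one bit from each plane. Note that a single pixel is
       spread across DEPTH different bytes, one per plane. */
    static unsigned pixel_color(const uint8_t *planes[], int x, int y)
    {
        unsigned color = 0;
        for (int p = 0; p < DEPTH; p++) {
            uint8_t byte = planes[p][y * ROW_BYTES + x / 8];
            unsigned bit = (byte >> (7 - (x % 8))) & 1u;  /* MSB = leftmost pixel */
            color |= bit << p;              /* plane p supplies bit p of the index */
        }
        return color;
    }

    int main(void)
    {
        static uint8_t plane0[HEIGHT * ROW_BYTES], plane1[HEIGHT * ROW_BYTES],
                       plane2[HEIGHT * ROW_BYTES];
        const uint8_t *planes[DEPTH] = { plane0, plane1, plane2 };

        plane0[0] = 0x80;                   /* pixel (0,0): bit set in plane 0 ... */
        plane2[0] = 0x80;                   /* ... and in plane 2                  */
        printf("color index of (0,0): %u\n", pixel_color(planes, 0, 0));  /* 5 */
        return 0;
    }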

Never mind that Intuition also gave you access to the Copper [2], so you could specify the screen resolution and colors as well. You could split the screen: the upper half 320x100, dual playfields (three bit planes defining the background, three bit planes defining the foreground) with 32 colors; the lower half 640x200 (interlaced) with four colors. You could also specify up to 8 hardware sprites.

Yup, Intuition was very tied to the video hardware.

[2] A special 3-instruction CPU that controls the hardware registers based on the video beam position, letting you change the entire video setup virtually anywhere on the screen (on any given scan line, to within four pixels horizontally): colors, the memory used for the video buffer, resolution, sprite locations. You could literally display the full 12-bit palette of 4096 colors on a single 1-bit video page by mucking with the color registers on a per-4-pixel basis using the Copper. It'd be a long Copper program, but the 68000 would be completely idle.
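Footnote [2] is easier to picture as data. A Copper list is just pairs of 16-bit words in chip RAM; below is a rough, from-memory sketch of one that rewrites the background color register on every scan line. The instruction encodings and the COLOR00 offset are as I remember them, so treat them as illustrative; installing the list via COP1LC and the rest of the DMA setup are omitted.

    #include <stdint.h>
    #include <stdio.h>
    #include <stddef.h>

    #define COLOR00   0x0180                /* background color register offset */
    #define NUM_LINES 200

    /* 2 words per WAIT + 2 words per MOVE per line, plus a 2-word terminator */
    static uint16_t copperlist[4 * NUM_LINES + 2];

    static size_t build_gradient_copperlist(void)
    {
        size_t i = 0;
        for (unsigned line = 0; line < NUM_LINES; line++) {
            /* WAIT: first word = (vertical << 8) | horizontal, with bit 0 set;
               second word = compare mask (0xFFFE = match the position exactly) */
            copperlist[i++] = (uint16_t)((line << 8) | 0x07);
            copperlist[i++] = 0xFFFE;
            /* MOVE: first word = register offset (bit 0 clear), second = data.
               The data here is just a ramp through the 12-bit color space. */
            copperlist[i++] = COLOR00;
            copperlist[i++] = (uint16_t)((line * 20) & 0x0FFF);
        }
        /* Terminate by waiting for a beam position that never occurs. */
        copperlist[i++] = 0xFFFF;
        copperlist[i++] = 0xFFFE;
        return i;                           /* 16-bit words written */
    }

    int main(void)
    {
        size_t words = build_gradient_copperlist();
        printf("copper list: %zu words (%zu bytes), no 68000 time per frame\n",
               words, words * 2);
        return 0;
    }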


The Amiga 1000 could support a 640x400, 4-bitplane, 30fps interlaced display with its 16-bit path to memory at 7.14MHz. If you double the clock and double the width of the data bus, then 640x400 with 8 bitplanes at 60fps becomes theoretically possible, which would have made it an easier sell for business (the need for a non-interlaced 640x400 mode was obvious quite early on).
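A rough back-of-the-envelope check of that claim, treating the 7.14MHz, 16-bit chip-RAM bus as the whole pool that bitplane DMA draws from, and ignoring DMA slot allocation, blitter, audio, sprites and CPU contention:

    #include <stdio.h>

    /* MB/s a display mode needs: width x height x planes bits,
       refreshed fps times per second. */
    static double display_mbytes_per_sec(int w, int h, int planes, int fps)
    {
        return (double)w * h * planes * fps / 8.0 / 1e6;
    }

    int main(void)
    {
        /* A1000: 640x400, 4 bitplanes, full frame refreshed 30x/sec (interlaced) */
        double need_a1000 = display_mbytes_per_sec(640, 400, 4, 30);   /*  3.84 MB/s */
        double bus_a1000  = 7.14e6 * 2 / 1e6;   /* 16-bit bus at 7.14MHz: ~14.3 MB/s */

        /* Hypothetical 1987 machine: 640x400, 8 bitplanes, 60Hz non-interlaced */
        double need_hyp = display_mbytes_per_sec(640, 400, 8, 60);     /* 15.36 MB/s */
        double bus_hyp  = 14.28e6 * 4 / 1e6;    /* 2x clock, 32-bit bus: ~57.1 MB/s  */

        printf("A1000:        %.2f MB/s of %.2f MB/s raw (%.0f%%)\n",
               need_a1000, bus_a1000, 100 * need_a1000 / bus_a1000);
        printf("hypothetical: %.2f MB/s of %.2f MB/s raw (%.0f%%)\n",
               need_hyp, bus_hyp, 100 * need_hyp / bus_hyp);
        /* Both modes eat roughly the same fraction of the raw bus, which is
           why 2x clock + 2x width makes 640x400x8 @ 60Hz plausible. */
        return 0;
    }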

Obviously there's more to improving the graphics architecture than just improving the clock and data path. My point was that the easy benefits of Moore's law at that point would have made substantial improvements possible, if Commodore had been able and willing to make the attempt. "What amazing graphics thing are we going to do next?" was the question Commodore needed to be asking themselves in 1985, but it's clear they didn't. How much of that was due to money, and how much to corporate culture, is not clear to me.

And WRT Intuition: it was tied to the hardware, but you overstate the case. Later Amigas had new graphics modes, and Intuition had support for 8 bitplanes from the beginning. I don't think 640x400x8 @ 60Hz was impossible, or even that difficult, in 1987, from an OS perspective.

BTW, anyone else disappointed that there are no Amiga engineering alumni who have discovered this thread?


Poor old Commodore didn't have the money, so management couldn't deploy the resources until their backs were against the wall.

If the A1000 had managed an 80-column non-interlaced mode, it might have done better initially, but as it was, the machine was a joke for business.


I don't know what kind of machines you used in 1985, but 80x25 was easily done on an A1000, and was what you were competing against on the PC side in most instances.

There were many problems for Commodore, but lagging on the graphics side vs. the PC was not an issue until years later.


He said non-interlaced. You had to buy an (expensive, after-market) scan doubler to get rid of the flicker on the Amiga. Biggest mistake they made: they should have included a true workstation-class high-resolution mono video mode.

Doing productivity apps on the Amiga in its highest resolution vs. the Atari in its highest (monochrome) resolution was no comparison.

The Atari did better in certain markets -- MIDI sequencing, big time, and to a much lesser extent DTP -- because of this.

But having that nice mono screen for productivity didn't make the ST a big success, so I don't think the original comment stands.

The Amiga was not a success because it wasn't an IBM PC.


I would also argue that the 500 wasn't enough of a game machine. It should have been sold with controllers and a cartridge slot. A home-computer expansion that added a keyboard and disk drive could have been bought as a bundle or add-on. If they had done that, I think they could have hung on to be a player in the Nintendo age.

On the high end, the 2000 was also not enough; again they were hedging too much, trying to keep costs down. They should have pumped up the specs and doubled the price. Instead of Sun workstations in their offices, people should have been using Amiga workstations.



