I bought an A1000 when it came out. I then became good friends with the original team in Los Gatos, CA as we worked through pushing it into new places. My "day job" machine was a Sun, and my Amiga could do more than my Sun 3/50 could and cost a lot less.
One of the underappreciated challenges was that, as amazing as the silicon was, it really couldn't run at a high enough frequency to give you both acceleration AND a 640 x 480 non-interlaced display. That led to workarounds like the "flicker fixer", because in the US at least, the 60Hz interlace of the display and the 60Hz power line frequency "modulating" fluorescent lights often meant you got some rather annoying "beat frequencies" on your screen. I had a long-persistence monitor, but it was an artifact from a different age and not generally available (also kinda lame for animations because of smearing).
I always lamented that Commodore was set up as a giant tax dodge rather than a computer company. And I got frustrated when Jean-Louis Gassée would announce some new "revolutionary" feature of Macintosh that the Amiga already had.
It definitely would have been different had Commodore been a "real" computer company. I tried to get Sun to buy them but alas, by the time that was possible Sun was already running away from the desktop workstation into the server room where they hoped to make a stand against Windows NT.
The reason they didn’t do a full 640×480 was because they were focused on TV-compatible video hardware, not because an 8MHz 68000 couldn’t handle it.
Even without Commodore’s terrible corporate structure, they would have had a rough time improving the Amiga, because they had painted themselves into a corner with an expensive chipset that developers had to program directly for good performance, and that didn’t have much room to grow in a compatible fashion. The first real revision to the Amiga was 1990’s A3000; the A500 was just a value-engineered A1000, while the A2000 was just an A1000 with internal slots. Five years with no real change would have prevented market growth even if corporate management had been at all competent.
They would have had a lot more leeway to make more interesting systems faster without the crushing weight of compatibility engineering. Apple was incredibly smart on that front with Macintosh—and the early introduction of Mac XL helped ensure developers didn’t ignore the “don’t touch the hardware directly” directive, at least for business software. That made the Mac II possible in 1987, and by the A3000’s release in 1990 Apple had an entire line of full 32-bit workstations that ran virtually all Mac software without weird caveats. And Windows 3.0 was also in the mix at that point on 386 and better PC hardware…
> The reason they didn’t do a full 640×480 was because they were focused on TV-compatible video hardware, not because an 8MHz 68000 couldn’t handle it.
The interleaved memory access of the CPU & Agnus means that the amount of data that can be pulled from the RAM is directly related to the bus speed.
This, combined with the horizontal line frequency and the bus width, determines how many memory accesses there can be per line. To get a 640x480 progressive scan you'd have to double the number of accesses per second, and there aren't enough access slots to do that at 7 MHz.
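To put rough numbers on it, here's a back-of-the-envelope sketch; the slot and line timings below are approximate NTSC figures, not datasheet values:

```c
#include <stdio.h>

int main(void)
{
    /* Approximate OCS/NTSC figures */
    double slot_ns = 280.0;                      /* one 16-bit chip-bus DMA slot */
    double line_us = 63.5;                       /* one scanline at ~15.7 kHz */
    double slots = line_us * 1000.0 / slot_ns;   /* ~227 slots per line */

    int fetches = 640 * 4 / 16;                  /* hires, 4 bitplanes: 160 word fetches */

    /* A 480-line progressive scan needs a ~31 kHz line rate, i.e. half
       the line time, so only about half the slots are available. */
    printf("interlaced:  %.0f slots/line, %d needed for bitplanes\n", slots, fetches);
    printf("progressive: %.0f slots/line, %d needed -- doesn't fit\n", slots / 2.0, fetches);
    return 0;
}
```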
When the A1200 came out (too little, too late at the end of '92) with AGA, the CPU speed was doubled to 14MHz with a 32-bit-wide bus, which finally allowed higher-resolution progressive modes, more colours, etc.
Or you dual-port the VRAM. Or you have a division between RAM banks managed by the OS, e.g. “slow” and “fast.” Or you have the custom chips mediate access to the VRAM entirely, but offer a DMA feature for, say, block transfers of bits. Or…
There are lots of ways to support a full 640×480—or more—with an 8MHz 68000. Sun, Apollo, and HP all supported at least 1024×768 on their 68000/68010 workstations.
The A3000 wasn't really much of a revision. True, it had a hack that made it possible for the CPU to access chip memory 32 bits at a time, but the chipset still used a 16-bit data bus for all accesses. When running in the 640-pixel-wide hires graphics mode with 4 bits per pixel, the memory bandwidth was completely saturated. The really sad part is that a motherboard with the AGA chipset (which actually increased memory bandwidth for graphics, but not for the blitter and other parts of the chipset) was ready in an A3000 form factor about a year before the A4000 shipped. Heck, the A4000 couldn't ship in quantities to meet demand because Commodore was cash-starved and couldn't afford to pay HP to make more chips. What a series of complete and total failures by management.
I agree. What seemed to be a game changer at the time, and maybe was for a short period - the blitter and the bitmap concept - later turned out to be more of a burden than an asset.
That limitation was not about frequency, it was about supporting widespread TV hardware. Flickering was inherent to how TVs worked at the time, but Amiga software did not sufficiently account for that. If they had used the interlaced modes correctly, i.e. as a minor tweak over simple scanline doubling rather than a true increase in vertical resolution, the flickering would've been a non-issue.
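A minimal sketch of that approach, with hypothetical one-byte-per-pixel buffers: derive the 400-line interlaced frame from a 200-line source by scanline doubling, so the two fields carry nearly identical content and there are no sharp line-to-line transitions left to flicker.

```c
#include <stdint.h>
#include <string.h>

enum { W = 320, SRC_H = 200, DST_H = 400 };   /* hypothetical dimensions */

/* Each source line is emitted twice, so the odd and even fields differ
   by at most one source line and the 30 Hz flicker mostly disappears. */
void fill_interlaced(const uint8_t src[SRC_H][W], uint8_t dst[DST_H][W])
{
    for (int y = 0; y < DST_H; y++)
        memcpy(dst[y], src[y / 2], W);
}
```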
The Amiga only displays one interlace field per 1/60th of a second -- that is to say that it is effectively 30fps in interlaced mode. It's only using a traditional 200 line output, just phase shifted for the second field. It's basically a hack.
It flickers because the persistence of most CRT screens is less than 1/30th of a second, and so by the time you see the second field, the first field is already diminishing in brightness. IIRC Commodore released (or wanted to release?) a monitor that had a longer persistence.
> The Amiga only displays one interlace field per 1/60th of a second -- that is to say that it is effectively 30fps in interlaced mode. It's only using a traditional 200 line output, just phase shifted for the second field. It's basically a hack.
That's how TVs worked back in the day, yes. It would have led to flicker when you had lots of sharp changes in image content across scanlines, but not otherwise. E.g. filling the whole screen with a solid color, or even a smooth color gradient, would've worked just fine in interlace modes.
Unfortunately, this inherent limitation of interlaced display went unrecognized in the Amiga dev community - everyone would've known that interlaced screens tended to flicker a lot but most devs didn't know why, or how to properly work around the issue.
That wasn't my experience at all. Everyone knew what caused flicker and how to work around it. For example, that's why Workbench 2.0 had a much lower contrast black on grey default color scheme, instead of Workbench 1's more garish white on blue. (Supposedly those were picked to look more readable on cheap monitors.)
The problem with running Workbench in interlaced mode is that it just gave you tiny text and icons: the ROM only had a single bitmap font in 8- and 9-pixel heights, and adding custom fonts to your boot disk would've taken up a whole lot of space.
(Similar to running unscaled High-DPI today on systems that weren't designed for it out of the box.)
So back in the day, Commodore flew me out to Frankfurt, Germany to interview me for the role of CTO of Commodore[1]. We had a wide-ranging discussion about the role and Commodore's strength as a brand, etc etc. I wanted to dig into the financials though, because even then I had a reasonably good understanding that corporations are organized around their core mission; they invest in that mission to the exclusion of things that are "not core."
As we got to discussing things - the structure of holding companies holding holding companies, and subsidiaries in the Netherlands licensing technology out to other subsidiaries - it was obvious that Commodore invested most heavily in creating a corporate structure that was "perfectly legal" and could argue away any tax liability at any level of income. The CEO at the time, Rattigan, was a finance guy (in my opinion) not a computer guy. He liked the C64/C128 "better" because they had better margins and less of a support burden than the Amiga did. (emphasis mine)
I left unconvinced they were ever going to be serious about building computers, very much in the consumer/toy mindset of maximum margin/maximum volume. I politely let them know I would not be interested in pursuing the discussions further.
[1] Fun fact, the guy that took the job lives about 1/4 mile from me and we're friends :-)
I don't support this version. To me it looks much more like Commodore/Atari top management was very afraid of being accused of having a monopoly on the PC market, but in those years they couldn't create a semi-competitor like AMD, and they didn't have good enough control to make an open design like the IBM PC.
Sun first surpassed Commodore in revenue a couple of years after the Amiga launched, I think?
Though with the management chaos at Commodore maybe the Sun market cap had already surpassed them?
At the point Sun hit revenue parity, Sun was one of the fastest-growing tech companies, and Commodore had never reclaimed the peak revenue it reached in 1984 (when it exceeded a billion USD), so I'm not surprised it didn't seem attractive by the time it might have been financially possible.
At the time (early 90's) Commodore was very much struggling. Since I knew their hardware team and had a lot of respect for them, and they were 68K based and seriously considering doing UNIX-workstation-level things, my pitch to the corp dev folks was that for a "modest" investment, Sun could produce an economical workstation that leaned in on hardware acceleration (important note: Andy Bechtolsheim was a huge "don't bother with custom chips, CPUs will get fast enough" person) and gave Sun a way to create a flanking attack on Windows. Here's an OS that doesn't have the baggage of SunOS, can compete head to head with DOS, and cause Microsoft pain trying to swat it away.
Apparently it wasn't compelling enough of a pitch :-) But then again it was one of those times where I got a lesson in how folks cannot see what they cannot see.
It was just a little too late, as the workstation market was drying up and they arrived with a Sun 3/50 competitor in a market that Sun had already started to walk away from. You could buy an alternate CPU for the A4000, as it had the CPU installed on a daughtercard, and I think you could get a PowerPC setup in that chassis.
I am aware :-). It was the direction Amiga (although not Commodore) was headed. The key is that Commodore didn't see it as strategic nor could they justify re-organizing around it once the end of the runway was pretty clear.
Commodore had major management rot and corporate issues. Maybe that rot would have reached Sun, which would have been detrimental to the future of Sun at that time.
I think that without Sun, the world of computing today might have looked quite different.
Commodore was also surprisingly big, and Sun grew surprisingly fast, and I don't think that was a good match, as much as it'd have been intriguing.
Commodore reached a billion USD revenue in 1984. Throughout most of the rest of the 1980s, its revenue was in the $800+ million range.
On March 30th, 1986, the SF Examiner reported Sun's revenue as $115.2m.
By 1988, however, Sun passed $1bn in revenue, according to The Press Democrat, who also reported their 1987 revenue as $537.5m.
At some point in that interval it'd probably have been possible for Sun to get a good price for Commodore, but I think at that point Commodore would've already seemed like a bad deal to them given their growth trajectory. A couple of years later Commodore was in terminal decline while Sun had kept growing; Commodore at that point might have been small fry enough to pick up without changing Sun too much, but I'm guessing they no longer seemed very relevant to Sun.
The demoscene was brilliant. People shared demos sneakernet-style, like games, back then too, which speaks volumes about how good they were. Very fond memories.
It's not surprising the Village Voice covered this - NYC had a rich cyberculture during the mid 1990s, fostered by the various BBS/ISPs (MindVox, Interport, Panix, etc) as well as a number of startups in Silicon Alley (https://en.wikipedia.org/wiki/Silicon_Alley).
The Amiga was dead the second they put all of those whiz-bang chips in them that allowed them to do things that the PC and Mac couldn't do.
One company cannot out-innovate thousands for long -- so everything the Amiga could do in 1985, the PC could do better by 1987 with ISA cards from thousands of different manufacturers.
HAM in 1985 was great. VGA in 1987 was better.
2x 8-bit stereo channels in 1985 was great. SoundBlaster CMS in 1987 wasn't better, but it didn't take long before 16-bit audio hit the PC.
Video overlay with the A2000 was great, and carved out a niche in AV production for decades. But again, VGA was better, ubiquitous GUI accelerators like the Mach32 made windows fly so fast the Amiga could only dream, and frankensteining accelerators into a closed "complete" system was not the answer people were looking for.
Hell, I had, and still have, my A2000, and remember the torturous pain of upgrading the video in it: some apps would only display on the built-in video because the add-in cards couldn't support the native graphics; some apps would work on both until you dropped in the accelerator file thingy, and then they would only work on the add-in card; and motherboard RAM / chip RAM / fast RAM / expansion card RAM / CPU accelerator RAM could all lead to incompatibilities or performance gotchas.
Generally speaking, you stuck in a VGA card and CGA and EGA worked. When a faster card came out you stuck it in and loaded a TSR or drivers and things worked faster. Installed Windows? Get a card with Windows acceleration. 1280x1024@24-bits in 1992 while the Amiga was outputting TV resolutions.
I installed a Picasso video card and performance worsened because I had paid for the then-astronomical 8MB "fast" RAM upgrade and whoops! video cards only worked with up to 6MB of fast ram and any configurations with more had to use segmented RAM which tanked performance. Some programs would only work with the Picasso drivers, some with the cybergraphics drivers, and others still with only EGS drivers.
At almost the exact time this article came out I moved to PC, got to play Doom, and the rest was history.
My memory of the timing is very different. That's not to say your claim about sustaining innovation is wrong; it's just that the reality is very complex.
I remember that Microsoft continued with DOS for a long time, and Windows <= 3.x (and a bit later too) was [quasi] just a UI on top of DOS.
My main explanation in 2024 is that Microsoft is Microsoft, no matter what they sell. Microsoft is a business machine like nobody has seen before; they fail at zillions of projects and acquisitions but they continue to exist and grow! Consider that Apple only found their way very late on, so business-wise Commodore was doomed; it is not about the hardware. I remember in the late 80s joking with my friends about spending money on a Sound Blaster and a CGA... VGA... SuperVGA. I developed in assembler for the 68x line, and when I later learned some x86 I couldn't believe you needed to know about offsets for pointers. But Microsoft is Microsoft and Intel was Intel. The 386 line changed everything. It is documented in the Andy Grove book.
Nothing personal about your comment, but I think the HN community often doesn't understand the complexity of going to market and dealing with really messy business realities. That goes far beyond using fancy programming languages, frameworks, etc., or even receiving huge investments.
Indeed, based on Wikipedia it's a difference of ~4 months, but we are comparing a microprocessor with a full computer. The Amiga included capabilities (sound, graphics, sprites) that took more time to be incorporated into average PCs.
One last observation: I remember sprite support being an important capability for early microcomputers, but the PC completely ignored it.
In my view it is entirely about the hardware. By the time the Amiga 1200 came out, PC makers could sell you a 33 MHz 486 with <10 cycle multiply, and a VESA graphics card with a convenient 1 byte/pixel graphics memory arrangement. Against this, muh blitter and HAM and dual playfield mode mean nothing. Your dad's bitplane games might be smoother on the Amiga; meanwhile Wing Commander looks like nothing you have ever seen before, and let's not even get on to Wolfenstein 3D. (As a kindness to Amiga fans, I am not even going to mention Doom or Syndicate.) The PC games ended up better, and RIP Amiga.
By the time of the Amiga 1200, Commodore was effectively already dead. It was a walking corpse.
AGA, the chipset in the A1200, was a stopgap produced because the originally intended graphics chipset, started in 1988, had been beleaguered by delays, in part because of severe underfunding of R&D, and they needed what was effectively a quick and dirty intermediate hack, fast.
The "AAA" chipset that was intended to be the next generation, and which had e.g. chunky graphics modes and massively upgraded memory bandwidth and blitter speeds etc., and would've made all the difference for things like Doom, never made it out the door. The project was entirely cancelled in '93, as they concluded PC's would catch up to AAA by the time it'd be finished.
Instead, after AGA, Commodore made one last-ditch attempt and got frustratingly close to finishing a massively upgraded design with 3D acceleration and up to 32-bit chunky modes, built to scale from a standalone system (with an on-chipset PA-RISC CPU) that could be licensed for consoles and set-top boxes, up to a chipset for an upgraded Amiga, with an option to also sell it as a high-end graphics card.
[Of course, whether any of that would have saved Commodore was another matter - without a severe overhaul of corporate culture, it was just a question of time; Commodore had basically never, through its entire existence, even as it rapidly grew in the 8-bit days, been more than one unresolved crisis away from collapse]
As someone who lived through that era, I agree, the 486 murdered almost all of the competition (and nearly even killed Macintosh). Practically everyone I knew who played console games bought a 486 to play games on instead (and also that newfangled Internet thing didn't hurt, which was much nicer on a 486)
Further, Commodore made a big bet on the CD32 but 3DO was already out there, Sega Saturn was coming out and PlayStation was looming (and would have finished Commodore off had it lived another year)
Like Atari Corporation, Commodore failed to innovate enough and the result was not unexpected.
CD32 was not so much of a big bet as a hail mary while hoping to buy enough time for their other chipsets to be completed. They clearly did see a big future there - up to and including the point where their last chipset (Hombre) was planned to be usable either as a standalone chipset for consoles or set-top boxes or as a chipset for a larger computer (or a high end graphics card).
But the CD32 was little more than a CDTV-like variant of the A1200, with a quick and dirty attempt at mitigating the by-then disastrous miscalculation of not having any chunky modes in AGA: chunky-to-planar conversion in hardware in Akiko (a CD32-specific chip that otherwise served to cost-reduce a bunch of glue logic and controllers for the CD-ROM and game ports). It was very basic: write 32 8-bit chunky pixels to Akiko registers, then read 8 32-bit words (8 bitplanes' worth of 32 pixels) back out.
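For anyone curious what that transform actually does, here's a minimal software sketch of it (the real Akiko register layout and bit ordering may differ; this only shows the chunky-to-planar shuffle itself):

```c
#include <stdint.h>

/* Naive chunky-to-planar: 32 chunky 8-bit pixels in, 8 longwords out,
   one longword per bitplane -- the shuffle Akiko performed in hardware. */
void c2p_32(const uint8_t chunky[32], uint32_t planes[8])
{
    for (int p = 0; p < 8; p++) {
        uint32_t word = 0;
        for (int i = 0; i < 32; i++) {
            /* bit p of pixel i becomes bit (31 - i) of plane p,
               so the leftmost pixel lands in the most significant bit */
            word |= (uint32_t)((chunky[i] >> p) & 1u) << (31 - i);
        }
        planes[p] = word;
    }
}
```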
> As a kindness to Amiga fans, I am not even going to mention Doom or Syndicate.
I played the hell out of the Syndicate demo on my Amiga, maybe you're thinking of something else? It was a ton more playable on there than any of the Amiga attempts at Doom clones of the time, that's for sure.
There was definitely a point where PCs left the Amiga in the dust and it was around the time VGA and better started happening though. The A1000 was an amazing beast when it came out but Commodore basically sat on their ass when it came to improving the thing.
No, definitely Syndicate! On an Amiga 500, it felt much too slow compared to the PC version. Compared to the 68000, a 1970s design, the higher clock speeds and much better architecture of the 386 and better really helped. (No, you don't have as many registers... it is still a lot more effective than the 68000!)
It also looked a lot nicer as it ran in 640xsomething mode (640x400 or 640x480 I assume).
Perhaps it was a lot better on an A1200. The 68020 is also a much better CPU than the 68000.
> There was definitely a point where PCs left the Amiga in the dust
Yes but by that time you had other home computing platforms like the Acorn Archimedes, arguably the next evolutionary step in the home computing niche. x86 hardware would've been very expensive at the time and mostly used for business purposes, not in the home.
The Acorn Archimedes was never serious competition for Commodore in that space.
I remember - as an Amiga user - cheekily pitching software upgrades to Acorn to help address the massive shortcomings we saw in RiscOS compared to AmigaOS. Even ca. 1996, a couple of years after Commodore's demise, we were still smug about how much better we saw the Amiga as being.
We were willing to accept the hardware wasn't bad, but the Amiga userbase never really considered the Archimedes in the same ballpark as an overall system.
I got an Acorn A3010 to replace my Amiga 2000. It was a massive improvement! Not for games, as there weren't all that many, but the more serious software was amazingly good.
Impression Publisher ran rings around every DTP package on the Amiga. It was an all-around better machine for those kinds of tasks. I still miss the Draw program that was included in the ROM (!) of RiscOS.
But alas, the PC made it all history. And while I'm happy with my current hardware, it sure isn't as exciting as the older era...
I'm not surprised - the A3010 was 5 years newer. The more direct competitor for the A3010 would have been the Amiga 1200, but their specs meant they certainly targeted very different markets - "nobody" bought any of the "small" Amigas for DTP.
You should read up about the corporate shenanigans at Commodore. The company was effectively self-kneecapped by uninterested suits; they would likely not have survived regardless of any development in the field.
Amiga had good 3D games before the PC; breakthroughs like DOOM happened when the company was already on its way down, after a number of wrong strategic turns - including an unwillingness to foster a liberal ecosystem of third-party add-on manufacturers, which was what eventually dragged the IBM-PC into modern multimedia.
I am not saying that there was no breaking point, but I say that the Amiga had enough time and its demise was not inevitable. This was much before Doom. As was said in the thread, they could have seen themselves as a game console and marketed that. The Nintendo Switch is worse than the PS5, but they remain in the market, and they started out selling game cards. Maybe Commodore needed a Mario and a Donkey Kong. I think the three coprocessors in the Amiga were a breakthrough, before Intel included the FPU. This is why I return to the business execution part. And also, Commodore was confronting Microsoft and Intel directly. I also felt the Amiga was a good replacement for the Mac: it had a WYSIWYG word processor, 3D software, etc.
And yet modern computers are special snowflakes too, with the desktop market only alive due to the custom hardware used for AAA games and esports; we get whiz-bang chips in all of them.
The only difference is that instead of Assembly and DMA configurations, we now use the whiz-bang chips' vendor-specific APIs.
Commodore and Amiga went down due to bad management.
PCs weren't much better unless people were willing to buy whiz-bang sound cards from Creative Labs or AdLib, while coding for VESA cards wasn't much better without a wrapper library, as, contrary to the previous BIOS standards, using Super VGA required drivers from the card vendor.
Naturally not a problem for Windows 3.x games, but not everyone was that keen on adopting WinG from Microsoft.
No matter the audio, or the graphics capabilities, the Amiga still had a vastly superior operating system. AREXX was far, far ahead of its time. The multitasking was ahead of its time. Those are the things that kept me on Amiga for so long. No other computing platform really compared in the ways that Amiga really shined.
> The Amiga was dead the second they put all of those whiz-bang chips in them that allowed them to do things that the PC and Mac couldn't do.
Amiga wouldn't have existed, or mattered, in the first place without all of those whiz-bang chips in them, and wouldn't have survived the years it did without them even if some irrelevant version had happened without them. But that never would have happened to start with - Amiga, the company, was formed around the ideas of building a gaming system.
Looking at the history of Commodore, it's also not at all clear that this was the issue - Commodore was a deeply dysfunctional company, and had been for many years before they even bought Amiga, with extensive problems around underfunded R&D, randomly yanked projects, and a lack of product direction.
It became an issue certainly, because it meant that the dysfunction of Commodore was sufficient to kill the whole platform - a risk the PC hardware avoided thanks to clones, and the OS/hardware split of the PC platform avoided because even MS stumbling would've "just" led to another OS winning out.
But without the features of the chipset, nobody would have cared. It was almost only in the US that the Amiga was seen predominantly as a "professional" machine competing with PCs in the early years, and the US wasn't where the Amiga sold.
The bulk of the Amiga market was in Europe, especially the UK and Germany. Germany also had a more professional market for the Amiga - with e.g. the A2000 being driven by demand from Commodore's German subsidiary, but not only that.
To such an extent that after Commodore International failed, the Commodore UK management tried to finance a buyout of its own parent company, and kept operating for as long as they could still scrounge up stock.
[This US situation was created in large part because Tramiel, before he was fired and bought Atari, had seriously damaged Commodore's more "serious" US dealer market with price cuts made without preparing the dealers, followed by putting Commodore 64s in Kmart, and so when the Amiga was launched Commodore struggled to get sufficient distribution in outlets that could handle a much more expensive machine]
> Generally speaking, you stuck in a VGA card and CGA and EGA worked. When a faster card came out you stuck it in and loaded a TSR or drivers and things worked faster. Installed Windows? Get a card with Windows acceleration. 1280x1024@24-bits in 1992 while the Amiga was outputting TV resolutions.
This is very different from how I remember things. People were mocking the PC well after Commodore went bankrupt for how PC users had to muck around with drivers and weird command line settings, while on the Amiga you just dropped a driver in and had cards that supported AutoConfig [1], which meant we would laugh at PC users whenever they mentioned IRQs because all of that nonsense was automatic.
The Amiga was not originally meant to be a game console - the Amiga 1000 launch event featured Andy Warhol and Debbie Harry ("Blondie") using the computer to make art. For a period, the Amiga was the gold standard for video FX in TV stations and film studios. It just so happened that its features were very good for games, so the gaming industry produced a lot for it.
The abortive successor CD32 was a game console (trying to do what Playstation did a couple of years later), but it was a late hail-mary from an already zombified company.
The original Amiga was built with expansion ports, but Commodore saw them as a way to sell more widgets - very few add-ons were produced, and they were very expensive. Aftermarket manufacturers existed, but they couldn't grow significantly: unlike IBM-compatible PCs, the Amiga never managed to get a presence large enough to sustain a large hardware ecosystem.
The Amiga had been intended as a games machine in early development when it was still only a project at Amiga Inc. However, when the 1983 video games crash happened, they started looking at other purposes and decided to add keyboard and mouse support. When they got funding from Atari it wasn't entirely clear if it was going to become a games console, a home computer, or both.
Then Jack Tramiel was ousted from Commodore and took engineers with him. Commodore had to abandon the Commodore 900 project: a 16-bit business computer running Coherent (a clone of Unix), and was looking for a replacement: a hole which the Amiga filled, and then some.
The Amiga did not originate under Commodore, and the original business plan was very much a game machine.
By the time Commodore snapped it up right under the noses of Atari, that was still the goal.
While Commodore changed that goal, the artefacts of that origin are all over the system, from a built-in dual "playfield" graphics mode to the OS support for sprite multiplexing.
The original Amiga design had none of those. All of those were added at a point where the game machine market had collapsed and Amiga was running out of money and/or the final pieces after Commodore had snatched them up.
AmigaOS "famously" (to Amiga users) has weird inconsistencies because Commodore last minute had to license a BCPL based system to add the AmigaDOS subsystem which is inconsistent with every other part of the OS, for example, because the original system didn't have filesystems and the like in a working order and the company meant to provide it failed to do so. And so we were grumbling about having to deal with BCPL style "BPTR" pointers (had to be shifted to be dereferenced) for no good reason for years afterward.
The 500 was developed to compete in the 'desktop gaming' market but the original 1000 was designed as a productivity machine to compete with the Macintosh. Later 'expandable' Amigas were similarly targeted towards graphics professionals.
The Amiga models targeted toward consumer gaming did not AFAIK have the ability to add a video card, being in the more classic integrated keyboard form factor.
The original Lorraine chipset was developed to be a gaming console. The original Commodore Amiga (the "1000" was added retroactively) was designed as a productivity machine, but that started only once the bottom fell out of the gaming market and they were forced to find another way out, which ended with Commodore buying them.
(Amiga Inc only survived until the buyout thanks to a $500k loan from Atari - who intended to buy them, but Tramiel, who had taken over Atari after leaving Commodore, was famously cheap and strung them along expecting to pick them up for pennies if they defaulted on the loan)
Apple was the antithesis of Commodore in those days: aiming for the high-end market (post-Tramiel Commodore US tried that; the rest of Commodore focused on the low end, as Tramiel had from the start[1], with some exceptions - Commodore Germany did well in the business market - and the non-US subsidiaries sold far more Amigas that way), focusing on far simpler hardware designs, and selling into businesses and education.
And they too just barely survived. To the point that there was widespread talk of bankruptcy for several years. And there was that infamous cash injection from Microsoft.
There was also their brief flirtation with allowing clones.
At the same time Commodore was management-dysfunction central in a way that Apple, even with the Jobs ouster and return, never was (Commodore did oust its founder too - the aforementioned Tramiel -, but he bought their arch-rival - Atari - and went into head-to-head competition in their most lucrative markets instead of eventually returning).
[1] Tramiel's slogan in the early days of Commodore's entry into the home-computer market was "computers for the masses, not the classes".
AmigaOS 2 (I think this was the first 32-bit amiga OS) was, and this is putting it kindly, primitive compared to System 7.
Besides, in the early 90s when Amiga was dying Apple was also dying. They only survived because of extremely high-end desktop publishing and educational users buying IIfx's, Quadra 950s, and PowerMac 8100s, the fact that Apple moved to PowerPC (two months before this article was published) while Amiga was releasing the 4000T with a 68040, and their extremely desirable laptops.
People like to dismiss Apple but there were many years in the 80s, 90s, and 2000s when Macintoshes were the absolutely, irrefutably, and unquestionably fastest non-UNIX workstation personal computers you could buy at any price. And then they became UNIX workstations.
> there were many years in the 80s, 90s, and 2000s when Macintoshes were the absolutely, irrefutably, and unquestionably fastest non-UNIX workstation personal computers you could buy
Arguably true of PowerPC Macintoshes, but for quite some time the fastest m68k Macintosh you could buy was an Amiga running a Macintosh emulator.
I have an Amiga 1000 with an expansion package called A-Max. This allows my Amiga to use a real external Macintosh floppy drive. But it also allows me to use a separate partition on the A1K's internal hard drive where I've installed System 6. A-Max allows me to have what would essentially be a Macintosh Plus with crazy stats for the times, including a mild accelerator. The little thing flies. Two computers in one!
Since most software, except certain Mac games (cough!), never hit the Mac custom chips, virtually everything I throw at it works flawlessly.
MS Word, Mac Draw, Hypercard, Quarterstaff... It's pretty cool.
Almost nobody had the fastest Macs either, and high-end Macs were not the market segment Commodore competed in.
Maybe it'd have gone better if they did. But the vast majority of Commodore's Amiga sales were the low-end machines that were hurt by being overtaken by PCs in the games market.
There has never been any such distinction. The 68k has a 24-bit address bus and a 16-bit data bus, but addresses are 32 bits. As such, apart from a few stupid programs (cough Microsoft's Amiga Basic cough) that assumed they could use the top 8 bits of some pointers to store data, AmigaOS was 32-bit from the very start.
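To illustrate the kind of "stupid program" in question, here's a hypothetical sketch of the bad habit: on a 68000 the top eight address bits simply aren't wired up, so this "worked" until CPUs with full 32-bit address buses arrived.

```c
#include <stdint.h>

/* The 68000's 24-bit address bus ignores bits 24-31, so stashing a tag
   there went unnoticed -- until a 68020+ treated those bits as address. */
static void *tag_pointer(void *p, uint8_t tag)
{
    return (void *)(((uintptr_t)p & 0x00FFFFFFu) | ((uintptr_t)tag << 24));
}

static void *untag_pointer(void *p)
{
    /* also silently breaks any real address above 16 MB */
    return (void *)((uintptr_t)p & 0x00FFFFFFu);
}
```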
> and this is putting it kindly, primitive compared to System 7.
AmigaOS had preemptive multitasking from the start. I'm sure there were features of System 7 that were better than AmigaOS, but all I remember of the OS differences at that time is how ridiculously primitive everyone I knew considered MacOS to be for lacking basics like that.
Even a couple of years after Commodore's bankruptcy, I remember visiting a publisher, and marveling at how primitive MacOS seemed on the surface at least.
And it was the access to that software much more than the hardware that mattered - even when you could buy far more capable PC hardware for less, being able to run e.g. the Mac version of Quark outweighed any cost and performance considerations in those specific niches (I helped out with tech for a newspaper for a short while in that period, and it was beyond frustrating to have to deal with the printers who only used Macs).
>AmigaOS 2 (I think this was the first 32-bit amiga OS) was, and this is putting it kindly, primitive compared to System 7.
You CAN'T be serious. The multitasking on System 7 was a joke compared to the Amiga. And the Amiga had AREXX, which was absolutely ahead of its time. I easily wrote scripts that connected all kinds of Amiga applications together; nothing like that existed on System 7, and I doubt it really exists today on OS X.
>People like to dismiss Apple but there were many years in the 80s, 90s, and 2000s when Macintoshes were the absolutely, irrefutably, and unquestionably fastest non-UNIX workstation personal computers you could buy at any price.
AREXX sounds very similar to AppleScript/Apple Events, which came with System 7 and exist in macOS to this day.
Apple Events even worked over a network so you could trivially make scripts that remotely controlled other Macs over AppleTalk. This was great for making HyperCard stacks (HyperCard had native AppleScript support of course) that worked as remote control panels and such.
Apple Events were also emitted by applications when you were running them, so you could make an AppleScript simply by hitting "record", acting out what you wanted to do, and then edit the script to add any kind of interactivity/customization you wanted to do.
Apple Events were supported by a whole scripting subsystem in System 7 called the Open Scripting Architecture, so you didn't need to use AppleScript's own language; you could plug in other languages. Third-party companies made Python and JavaScript language plugins so you could write (or record) scripts in those languages, and any app that wanted to add macro features would instantly support them. AppleScript itself was even ported to other human languages - Apple made a French version to demonstrate the capability.
“the Macintosh-Microsoft monopoly” very funny statement circa 1994… just three years before the infamous “Pray” issue of Wired. https://www.wired.com/1997/06/apple-3/
I've always been kind of amazed by nostalgia for the Amiga. I was a young computer user in this era, and — despite having heard of the apparent media superiority of the Amiga — had never even heard of someone seeing one, let alone own one. I felt like more people claimed to have portals to Narnia in their closets than had Amigas.
> had never even heard of someone seeing one, let alone own one.
Are you in the US maybe? Most of the Amiga market was in Europe (though here too it varied greatly by country). In my school classes alone there were several Amiga owners.
In the US Tramiel managed to burn Commodore's computer shop dealer network to the ground with his pricing shenanigans and deal with Kmart to bypass computer stores for the Commodore 64. As a result, when the Amiga was too expensive to sell well in budget outlets, Commodore US basically had to rebuild their dealer network from the ashes that Tramiel left on the way to buy Atari.
Meanwhile, in Europe, most Commodore subsidiaries grasped the opportunity to push the Amiga as a games machine starting with the Amiga 500, while Commodore US kept pushing it as a professional machine.
You see this difference very clearly in the computer press - where the US Amiga magazines are all serious, while e.g. in the UK the most famous Amiga tie-in was a bundle with the game for the 1989 Batman movie[1]. While many of us also used it for non-gaming stuff, it was games that sustained the bulk of Amiga sales, to the point of e.g. putting Commodore UK in a position where they considered a management buyout of Commodore International when their parent company went bankrupt...
Apple appeals to a totally different customer. They want something they needn't fuss with. The other Apple customer is the developer who's forced into ownership so that he/she can sell to that first Apple customer.
Edit: until this became the Apple way, Apple was on life support and headed for hospice. The market chose the appliance like computer and explicitly rejected tinkering.
Many people don't care about anti-monopoly regulations, but they are no joke.
IBM and Intel dealt with anti-monopoly regulations for decades and managed to avoid AT&T/GM-like issues (extremely serious trouble, in simple language).
People usually think IBM distributed their software as public domain, opened up the S/360, and allowed clones because they were good, but in reality they used all means to avoid a 100% market share: they intentionally made hidden moves to support semi-competitors like AMD in the 1980s, so that anti-monopoly regulators didn't have enough grounds for actions like those taken against AT&T/GM.
In the early 1980s, Commodore had a near-monopoly on the home computer market (the IBM PC was at first an expensive business machine, and the Macintosh was even more expensive when it first appeared). And I have seen the same behavior from other businesses: they act exactly the same way when they get close to some bar (of profits, or of gross budget, or of market share) that they don't want to hit, because they feel comfortable enough staying under it.
Obligatory mention of the famous "Deathbed Vigil" video shot by Dave Haynie in 1994, in which he enters the almost abandoned Commodore HQ in West Chester, PA with a camera.
The Amiga was like a dream machine when it hit the market. Most people I knew had C64s at the time, and the Amiga was just light years ahead in terms of capabilities. A few people I knew had PCs, and the Amiga absolutely left them in the dust for years; they only really seemed to catch up around 1993, once Doom hit and VGA/SVGA cards became somewhat more affordable - roughly eight years after the Amiga was released.
All of that being said, I lived in a part of the world in the 80s/90s where PC tech was very expensive at the time, and the humble Amiga was very reasonably priced in comparison; it had just enough wow factor initially to make you believe, with lots of custom chips to support that ambition. It was the sort of computer that was able to cross over into almost every space. If you lived in the US, you were likely trying to decide between a Tandy PC, an Apple, or a console, which were way more affordable on home soil and already had a large catalogue of titles, mainstream developer support, and business use (for the PC).
Where I think things went wrong for the Amiga, aside from Irving Gould and his rather hopeless management squad, was that the Amiga was marketed badly and poorly positioned in the US, with Commodore having a somewhat sketchy relationship with smaller computer stores across the country, which almost guaranteed it got a bit less sunshine compared to other devices in store (thanks in part to Jack Tramiel and his aggressive price-cutting tactics with the C64 in the early 80's). Europe, Australia, and South Africa all had brilliant marketing and distribution in comparison, and it was thanks to these markets that the machine endured as long as it did.
The Amiga however ended up fighting too many battles on too many fronts and had very poor strategic direction in general. They needed R&D badly, and a decent amount of it, to maintain a hardware advantage at a low-cost price point; instead they got rid of many of their great hardware engineers and specialists when their initial project had finished so they could save a buck, and ultimately ceded their talent to the competition...
They should've had a VGA-or-better machine out to market by 1990, which they didn't, and that was a major loss in terms of the hardware game. PAL and NTSC Amiga machines were also different beasts, with the PAL machines coming out on top in terms of resolution, compatibility, and developer support in general.
But its real losing hands (and there were a few) were:
1. Too many frequent revisions of its main product, each superseded shortly after - Amiga 500+ or Amiga 600, anyone?
2. No cartridge or card slot while fighting a battle against consoles for market share, and not having the foresight to build a low-cost standardized adapter that could've been used as one (via the almost-never-used SCSI connector).
3. Releasing the CD32 console, but not releasing an Amiga 1200 CD drive or the same capabilities across all their home desktops (i.e. the Akiko chip and CD32 firmware).
4. Releasing an upgraded version of Workbench / the Kickstart ROM which killed compatibility with a third of the Amiga's back catalogue of software titles (that was bright).
5. Having different resolutions and CPU speeds between countries for the same product - believe it or not, there's a way to decouple video timing from the CPU frequency, and to have resolutions that are complementary across PAL and NTSC regions (just ask Nintendo and Sega); even more amazingly, if you just add a spot of frame skipping to your NTSC title it can work just fine in PAL at full speed.
6. RGB 15kHz is great, but it wasn't properly implemented to support higher resolutions, so everything looked super flickery when you went high-res - it was basically a feature you couldn't use, or had to suffer through if you did (until the Amber chip appeared in the A3000 / A4000 range).
7. The HAM mode was revolutionary but extremely difficult to use except for the odd digitized image - another feature people couldn't really use, and it was never made easier to use...
8. Chip RAM should've been the only RAM in the system, with a faster bus to support normal transfer speeds across memory. As it was, it was very limited.
9. No upgrades to the number of sound channels or their quality/frequency.
and lastly...
10. Commodore's main bread and butter with the Amiga was the desktop hobbyist home user - this segment should've been its No. 1 priority and been given first-class treatment all the way. The UK and German offices understood this, but not the US, with the final nail in the coffin being the poor launch sales of the CD32 in the US (it went great in the UK). This was compounded by a lack of killer titles at launch (Street Fighter 2, Mortal Kombat) and general apathy toward the Commodore brand in the US after years of neglect and bad decisions.
The beautiful Amiga died because of idiocy and neglect - Commodore was doomed to die, it was just a rotten shame that the Amiga computer line went with it.