''I've been in retailing 30 years and I have never seen any category of goods get on a self-destruct pattern like this,'' said Everett Purdy, senior vice president of merchandising for Service Merchandise, which sells home computers through its catalogs and showrooms. Since fall 1982, T.I.'s list price has tumbled from $400 to less than $100.
That price plunge turned an unaffordable extravagance into something I could actually buy and start using every day. It wasn't just the computers that became cheaper, but the peripherals (disk drives, modems, printers) and software as well. Suddenly, a home system that was never going to happen could be picked up piecemeal at a fraction of the price.
I wonder how many programmers owe their current careers to this early shakeout and price collapse.
I for one learned to program on a TI 99/4A, which I suspect my parents chose based on price. I didn't really choose a career in software; it just grew out of summer programming jobs I had during college, so it's reasonable to say I at least partially owe my career to the prices of that time.
Same: I was offered a choice between the TI and the Vic-20, the two cheapest options you could find in a store. The TI seemed better for basic programming, though the Vic probably had more games available.
$100 in 1983 is $253.90 in 2018[0]. Here's a chromebook with 2GB RAM and 16GB storage for $116[1]. You can probably get a used one on craigslist for less.
$400 in 1983 is $1,015.60 in 2018. You can buy the lowest spec'd version of the older macbook air for $1000, or the lowest spec'd version of the new one for $1200[2].
To be fair, prices dropping that low is a relatively recent phenomenon. A fully working and useful general-purpose computer (including display) was never less than about $500-700 for most of the 90s and 00s in my experience.
It's funny how you round off precise Mac prices in a sensible way but insist on quoting much much much more fuzzy inflation calculator output with ludicrous 1 cent precision.
Possibly I'm being downvoted for snark, which is reasonable and I apologise. However I do think I am making an important point, "$400 in 1983 is $1,015.60 in 2018" is basically an innumerate statement. Multiple people make the same mistake in this thread, it seems that people like these inflation calculators and I suppose the fault lies mainly with them for not rounding their output sensibly. $400 in 1983 dollars very (very) roughly corresponds to $1000 of purchasing power today. The dramatic changes in society since 1983 mean that a representative basket of goods has changed dramatically (no Netflix subscription in 1983 for example). So you can only do a very rough calculation. Some things have gotten dramatically cheaper (international travel) some things are barely comparable (advanced healthcare). You somehow have to weigh these things fairly. Basically it's apples versus oranges all the way down. To me "$400 in 1983 is $1,015.60 in 2018" is almost a canonical example of pure and unadulterated nonsense. And yes, I have got a bee in my bonnet on this particular issue :- )
Indeed, thanks. My real beef is with the silly precision but I got distracted explaining (probably unnecessarily) exactly why it can only ever be a very approximate estimate.
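For what it's worth, the whole conversion is just a CPI ratio, and a quick sketch makes the rounding point concrete. The CPI values below are approximate annual averages (my assumption, good to maybe three significant figures), so the output is rounded to the nearest $50 to match:

    # Rough CPI-based inflation adjustment -- a sketch, not an authoritative figure.
    CPI_1983 = 99.6    # approximate US CPI-U annual average for 1983
    CPI_2018 = 251.1   # approximate US CPI-U annual average for 2018

    def to_2018_dollars(amount_1983):
        """Convert 1983 dollars to 2018 dollars via the CPI ratio."""
        return amount_1983 * CPI_2018 / CPI_1983

    # Round to the nearest $50 -- anything finer is spurious precision.
    for amount in (100, 400, 600):
        print(amount, "->", round(to_2018_dollars(amount) / 50) * 50)
    # 100 -> 250, 400 -> 1000, 600 -> 1500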
> A raspberry pi is $25 and far exceeds the home computer of the 80s and most of the home computers sold in the 90s.
But what Moore's law giveth, web developers taketh away. If you want to use a computer to participate in the modern society, you need much more than a Raspberry Pi, due to web bloat.
This first struck me in 2008. You could get a lightweight Linux distro running on just about anything. People'd spend all this time arguing over which windowing system used fewer resources. But, it never mattered. None of the windowing systems performed more slowly than a standard web browser on a modern webpage. Worse, most of the bloat is stuff the user explicitly doesn't want: tracking, advertisements, even more tracking.
It's easy (and trendy) to point the blame at web developers and businesses who bloat their webapp with advertisements and trackers, but for many of those businesses, that's the only way they would make money.
There are many things consumers are willing to pay for with their time and patience, but not their cash. As it turns out many of those things are web "services". If they were truly valuable on the same level as say, food, people would gladly pay cash for the service.
YouTube is a great example for me personally. If you told me I had to pay a monthly subscription of $10, I'd stop watching YouTube instantly. But since it's "free" I'll sit through a few ads. I don't mind paying with my time.
Plenty of people still write travel blogs and think pieces without getting a salary for it. The thing that drives me crazy is Twitter / Facebook etc. showing me content produced for free and STILL putting it next to an ad.
Your point is taken, and I agree that very many folks are not being reasonable about what they want. The money has to come from somewhere.
That said, I think I'd be perfectly happy if most websites went under. What we're getting out of it (novelty, entertainment) isn't worth the cost (attention span, loss of privacy) in most cases.
I think that’s kind of flippant. The reality is that it’s very difficult to apply for jobs or even do banking effectively without internet access and a capable browser. I think this is what the parent means by “engage with society”.
Although, I think an RPi is capable, even if it would be frustratingly slow.
Flippant or not, it's a good demonstration of how a prerequisite of participating in a Twitter-and-Facebook-driven modern society is using a computer that's powerful enough to handle all the garbage those two companies insist upon running on said computer whenever you so much as think of participating in said society.
The World Wide Web would be a much better place if web developers/designers showed even the slightest bit of restraint when it comes to shoving arbitrary Turing-complete code down my throat. Statistically-speaking, 95+% of it doesn't benefit me in the slightest, and there's a special place in hell for those who write websites such that I have to put up with that 95% that's garbage just to use the site at all.
Loading 10 or 20 or even 1000 140-character tweets should be possible even on an old DOS PC with dialup, and yet even 10-year-old PCs with multiple orders of magnitude more processing power and network speed/bandwidth struggle with that.
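As a rough back-of-the-envelope check (payload size and dialup throughput here are my own ballpark assumptions, ignoring markup, scripts, and trackers entirely):

    # Raw text of 1000 tweets over 56k dialup, best case.
    tweets = 1000
    bytes_per_tweet = 140 + 60            # 140 chars of text plus generous per-item framing
    payload_bytes = tweets * bytes_per_tweet

    dialup_bytes_per_sec = 56_000 / 8     # ~7 KB/s in the best case
    print(payload_bytes / 1024, "KB")                 # ~195 KB
    print(payload_bytes / dialup_bytes_per_sec, "s")  # ~29 seconds

The text itself is trivial; it's everything shipped alongside it that needs a modern machine.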
At the very minimum, using the sites of your local and national government, and your bank. Normal levels of participation involve using e-mail, search engines, information sites, reading news, buying things on-line and social media.
Most of those sites are incredibly bloated. Even bank dashboards have become increasingly heavy (for near zero value added).
I am taking his statement in good faith. The "homebrew" application I created runs the exact same frameworks (JavaScript with React/Flux/etc. packaged with Webpack and calling Ajax-based services inside of a browser based on Chromium) that any other modern web application would use.
>A raspberry pi is $25 and far exceeds the home computer of the 80s and most of the home computers sold in the 90s.
That would be relevant if the metric was "cpu power of computer sold in the 80s".
There were some $100 (or even $25) pieces of hardware far more powerful than ENIAC in 1980 as well.
But the relevant metric is "cpu power appropriate for contemporary uses and programs" -- and in 2018 it means what people in 2018 consider a home computer.
>If your goal is just learning how computers work, you have far cheaper options today.
Why would that be your goal? That's only of interest to programmers and tinkerers, not 95% of the population.
In 1983 “learning how computers work” was a huge goal for many buyers. Parents bought computers so that kids could learn to use them, which often meant learning to program. (BTW, thank you, mom and dad).
ENIAC was only about 500 FLOPS and after an upgrade only had 100 words of RAM. So a 6502, released in 1975, was a ~1,000x faster 8-bit CPU for $25; granted, a very different architecture, but easily much faster even if you were emulating an ENIAC. For reference, a Speak & Spell sold for $50 back then and had I/O but was not programmable.
Given that ENIAC didn't have a monitor (or keyboard for that matter) you just needed a very slow CPU and some means of input and some lights for output (a little soldering could solve that).
The 6502 (available even earlier than 1980) cost $25 -- and was used (e.g. in C64 well into the mid-80s and even today).
" If your goal is just to work on computers, you have far cheaper options today. "
One of my 2 desktops at work is a 2006 Mac mini (G4 7450). It can email, and view PDFs, and edit office documents. It has TenFourFox. And it can SSH. Yeah, mostly email and SSH...
They would be $248 adjusted for inflation. But those tiny home computers came without a screen, hard disk, networking, a real OS, or pretty much anything we take for granted today. That $400 laptop packs in a ton of tech.
> Many of them could just use your TV as display though, which went away pretty quick in the PC era.
Composite (and, later on, S-Video) outputs were common on third-party CGA, EGA, and (S)VGA cards (not sure if official IBM ones had them, but third-party cards were more common). Also, I think that Tandy's quite popular proprietary graphics did. And VGA-to-composite, component, and S-Video adapters were all readily available, too.
And later, this shifted to HDMI being common for both video cards and TVs; the ability to use your TV as a PC display never went away, though TVs couldn't handle (until the HD era) the higher-resolution modes used for text and later graphics (though it wasn't until SVGA that that became an issue for most games, which tended to favor higher color depth over higher resolution).
I had quite a few VGA video cards over that period of time and I think I only ever had one each that could do composite or svideo. I think at best it's something you could go out of your way for if you wanted but I don't remember it ever being the default. And definitely not on any integrated graphics on a motherboard that I can remember in the 386 to DVI/HDMI era.
There was a period where a lot of cards were sold with "TV out" as a bullet point-- usually a weird little port that connected to a breakout cable to provide S-video and/or component and composite. This seemed to be most common in the late AGP and early PCI-E era -- Geforce 2/3/4/FX/6, Radeon 8xxx/9xxx cards.
Before that, you could either get a "scan converter" which would convert VGA into composite video, or a specialty monitor, which was typically a 27" or 32" CRT TV capable of syncing to 640x480 VGA.
When flat-panel TVs started coming out they typically had a VGA or DVI port, and shortly thereafter HDMI, so it became moot.
Keep in mind here that the topic is cheap computers and once you get into scan converters and agp and pci-e gpus, let alone special monitors, we're well outside that realm.
Which annoys me to no end. I already have a Fire TV, a Chromecast, and multiple actual PCs. An additional computer that I can't replace or upgrade is useless to me, and it's even more frustrating when it gets in the way of using the TV for its intended purpose.
I don't understand why it's so hard for companies to build 4k dumb TVs. If anything it should be easier and cheaper to not include a bunch of smarts, right?
Well, to be fair $100 in 1980 is worth over $300 even if you believe inflation hasn't been systematically underestimated by CPI over the last 38 years.
I learned to program on a used Pentium that I had bought from a friend and a copy of Delphi 3 that was included on a CD that had shipped with a computer magazine 2 years earlier...
Neither would have been possible if the price had been as high as it was back in the day.
The TI wasn't the big winner in the home/business PC market, and as the article says, there was a massive shake-out in the PC industry going on at the time.
DEC could have completely owned the PC market. They had the microcomputer (LSI-11), excellent software for it, an upgrade path to more powerful machines, and legions of loyal "DEC-heads" just itching to buy one.
DEC fiddly-farted around for years, and finally came up with the Rainbow PC. Not only did it have nothing to do with the 11, its compatibility with the IBM PC was deliberately crippled (you had to do things like buy floppy disks only from DEC). The DEC-heads I knew just laughed in disgust at this, and abandoned their loyalty to DEC.
The Rainbow fiasco marked the end of DEC. It's pretty sad, what might have been. The 11 was a fantastic machine.
For the first two years of my life as a programmer (1996-1998), I only had access to these PDP-11 clones. Jumped straight to Pentium II. Never had a sharper upgrade in performance.
Thanks for creating Empire. I played the heck out of it on the Amiga in my teens. It really opened my eyes to what computers could do w.r.t. visualizing complex systems. I consider it up there with Conway's Game of Life as an instantiation of complexity.
That was not really a crisis, that was a shake-out. The article really focuses on the Texas Instruments computer. They basically bungled it: everybody knew that software was key to having a popular computer, and yet they did not take steps to make sure there was lots of software available for it... indeed, they discouraged it.
Amazingly, years later IBM made the same mistake with OS/2, but more so.
Some well-known computer columnist, I think it was Jerry Pournelle in his BYTE column, wrote of attending a trade show when Windows 3 was still king, where IBM was showing OS/2 Warp and Microsoft was showing Windows 95. OS/2 was out at that point, but Windows 95 was still in beta.
He went to the IBM booth, told them he was interested in programming for OS/2 Warp, and asked what he had to do. They gave him a bunch of forms to fill out to apply for the developer program. The forms asked for details of his business and analysis of why he thinks it will make money with OS/2. If they were satisfied with this, they would allow him to buy the SDK.
Then he went to the Microsoft booth and asked the same question about Windows 95. They handed him the SDK right there, no charge and no strings attached.
That was when, he said in his column, he knew Windows 95 would be the successor to Windows 3, and OS/2 Warp would be at best a niche OS.
I said IBM made the same mistake as TI, but more so. The more so is because IBM also neglected promoting OS/2 to consumers, and they neglected hardware support, so even if it had all the software that you cared about, it was hard to discover that OS/2 was an option, and if you did it could be a nightmare getting it to work with readily available hardware. By 1996, Linux was easier to get working than OS/2 with most hardware sold at the major retail computer stores (CompUSA, Computer City).
In '83 I was a 17-year-old wise-ass with a dippity-doo mohawk and video games I'd authored in every Sears and K-Mart nationwide. My best friend did the contacting of computer stores, and one of them turned us onto his brother, a distributor of games and toys to Sears and K-Mart. Our best seller was a Vic-20 game called "GraveCave". Through that experience I became a beta tester for the original pre-release Mac and backdoored my way into Harvard, working with Mandelbrot and quite a bit more. Ah, the risks of youth...
> The history of the personal computer business, brief as it is, has shown that the successful machines are the ones that have the most and best software available for them. It has also shown that no single company can write all the software itself. It must take advantage of the cottage industry of programmers.
Jack Tramiel truly was a business genius whose legacy is unfortunately being written out of history. Only the winners get to write history in their own narrative, but Jack is the forgotten hero who was ruthless with competitors.
I'm glad to have met him before he died. He was at the CHM with Woz and many other luminaries were in the audience like Al Alcorn and Bil Herd. My PET 2001's manual is autographed by him and Leonard.
There's lots to criticize re Tramiel's performance at Atari, especially on the console side where he sat on the already-finished 7800 for two years while letting Nintendo redefine the marketplace. But Atari had been failing in many ways up to that point (mostly management failures), and all of those chickens came home to roost in 1983. What Tramiel took over in 84 was a company that had already fallen, and was on the brink of utter collapse.
They drove away many of their best software developers, forming competitive software companies such as Activision and Imagic, and then spent resources trying and failing to sue them out of existence. Also departing were hardware folks like Jay Miner and Joe Decuir, who would be major contributors to the Amiga, an opportunity that Atari would later have in their hands and throw away. They completely muffed the next-gen console release with the Atari 5200, which was basically nothing more than a repackaged Atari home computer (already 3 years old at the time of the 5200's release), crippled by a terrible analog joystick that didn't self-center and was prone to breaking. They rushed VCS games to market assuming they would print money regardless of quality issues. After 9 months of raking in over $300m in profits in 1982, they ineptly handled the warning signs of Christmas 1982, leading to over half a billion in losses in 1983.
There was no way the crash of 1983 wasn't going to hurt Atari, but IMO a company which had been better run could have survived. As it was, without somebody that had Tramiel's level of discipline at the helm, Atari probably would have died in 84 or 85.
He performed well enough at Atari: turned the company around, got a 16-bit machine into production, ran several good years before the PC standard swept the board clear for everyone. Muffed the console side, granted, but I don't think he ever claimed to know consoles.
The striking part about that article is that there is zero discussion of technical merit. I've never seen nor used a TI 99/4A, but the stories I read weren't too kind.
In my neck of the woods (Denmark), around 1978 your choices were expensive pre-built Z80 machines, like the Luxor ABC-80, or Danish niche computers (some of which were actually amazing for the time). I went with a British kit, the Z80-based Nascom 2.
Roughly a year later, the market had changed and 6502-based options were abundant, with the BBC, VIC-20, and Commodore 64 taking the lead (the ZX80, ZX81, and ZX Spectrum were popular too). Later the Amiga arrived with its own following.
All this to say that the world was such a mess of an abundance of incompatible computers that when the IBM PC came out, the main attraction was the belief that, because of I.B.M., this would become popular enough that there would be a critical mass of great and actually useful software.
I sometimes try to imagine how things would have played differently had IBM abstained from the "hobby market".
>Coleco's Adam, which will sell for $600, includes 80,000 characters of internal memory, a daisy-wheel printer, a tape storage device, game joysticks, a word-processing program and a game. The package will allow people at home to use the computer in place of a typewriter as well as for playing games.
$600 in 1983 = $1,520.72 in 2018. iPhone XS Max w 512GB: $1,449.
So a well-specced but not top-end personal computer in 1983 costs about the same as a similarly specced computer today, and the same as a high-end phone that effectively replaces computers for many people? It's kind of underwhelming.
They are only "comparable" on a relative scale. That's an interesting economic observation about consumers' willingness to spend on "technology". The progress in computing is not "underwhelming" -- a modern smartphone is not just a computer, it's also a telephone, a camera, a movie decoder, a video screen, an audio/video recorder, an audio player, a GPS, an accelerometer, a local wireless communicator, and it hosts incredibly advanced applications, while also being battery-powered, pocket-portable, water resistant, shock resistant, with near-zero electricity cost, noise emissions, and heat emissions.
Yes, the TI-99/4A was 16-bit, but it only had 256 bytes of (static) RAM connected to the CPU bus! The other 16K of DRAM was behind the video processor, so the CPU had to read/write it sequentially one byte at a time. There wasn't even a stack for the CPU.
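For anyone who hasn't seen this arrangement: the CPU doesn't address the 16K directly at all; it loads an address into the video chip and then pushes or pulls bytes one at a time through a data port that auto-increments. A rough Python sketch of the access pattern (class and port semantics are simplified and illustrative, not the actual TMS9918 register interface):

    # Simplified model of VDP-mediated RAM on a TMS9918-style video chip.
    class VDP:
        def __init__(self):
            self.vram = bytearray(16 * 1024)   # the 16K DRAM behind the video processor
            self.addr = 0

        def set_address(self, addr):
            self.addr = addr & 0x3FFF              # 14-bit VRAM address

        def write_data(self, byte):
            self.vram[self.addr] = byte & 0xFF
            self.addr = (self.addr + 1) & 0x3FFF   # auto-increment after each access

        def read_data(self):
            byte = self.vram[self.addr]
            self.addr = (self.addr + 1) & 0x3FFF
            return byte

    # Copying a buffer into video RAM: strictly sequential, one byte per port access.
    vdp = VDP()
    vdp.set_address(0x0000)
    for b in b"HELLO":
        vdp.write_data(b)

Which is why, despite the 16-bit CPU, bulk access to most of the machine's memory was bottlenecked on that one byte-wide port.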
I wasn’t alive then but always had the impression that “home computers” like those described were much more popular in Europe and Japan, while PCs became the dominant platform in the US. Is this why?
Your impression is wrong.
8-bit home computers were massive in the U.S. The main differences between the U.S. and Europe were:
- Europe was big into Acorn and Sinclair. In the U.S., Sinclair was an also-ran under the Timex brand, and was widely seen as a toy; Acorn was unknown.
- Tape drives were more common in Europe, while U.S. consumers almost always had a disk drive, or both.
IBM becoming big in the home computer space didn’t happen until the 8-bit Commodore and early Apple machines were already long in the tooth. When it came time for people who owned Commodore 64’s and TI99/4A’s to upgrade, the choices were often Amiga, Mac, or IBM.
IBM was an easy decision for many because they were familiar with the machines from work, and because the notion of doing more office-like things at home was becoming popular. It was about this time that the home “computer room” evolved into the “home office.”
Acorn and Sinclair were mostly UK. Commodore did better in most other places in Western Europe, with some local variation, e.g. pockets of Amstrad, and I think Apple did OK in France.
Growing up in Norway I never saw an Acorn machine in person until someone brought some RiscOS machine to a party in '92. One unfortunate kid in my class had a Spectrum. Other than that I rarely saw non-Commodore machines outside of computer stores until PCs started being common, though Atari STs did also gain a foothold.
Interestingly I also never saw an Apple machine in person until the Macintosh - the pre-Mac Apple machines were virtually unheard of in most of Europe.
I really doubt that unless you stretch the definition of "very successful" quite far. Maybe they did ok by revenue given how extremely expensive it was. The Apple II sold poorly enough outside the US that most markets didn't even get localised keyboards.
Most UK accounts of the era at most mention the Apple II in passing among wider coverage of the "second tier" machines like Amstrad CPC, Dragon, and even Oric.
Apple II over its lifetime sold about as much as the ZX Spectrum did worldwide, but the vast majority of Apple II sales appears to have been in North America.
In terms of proportion of the business market, possibly - I don't know. But certainly not enough to make Apple sales in the overall UK market significant vs. Commodore and Sinclair. There simply weren't enough Apple II machines manufactured for that to be the case.
I'd say the UK was into Acorn and Sinclair, not (Western-)Europe as a whole. The situation was very different per country (e.g. the Acorn Atom was quite popular in the Netherlands, but AFAIK nowhere else on the continent). In (Western-)Germany it was mainly Commodore (C64 and Amiga), and apparently(?) Amstrad CPC were quite popular in France and Spain.
Eastern Europe was either ZX Spectrum clones, or "domestic designs", mostly based on Z80 CPU clones, but also some 6502 based machines (e.g. an Apple II clone) in Bulgaria.
I remember "studies," as well. But I remember the evolution of the supplemental room in American houses going
Library → Study → Den → Play room → Computer room → Home office
Though play room was sometimes a different space entirely, like in an attic. Den → Computer room came naturally because the computer enthusiasts were often men, for whom the den was intended.
In Europe you had Clive Sinclair who produced the zx80, zx81 and, critically, the ZX Spectrum.
Clive wasn't focused on building the best computer, he was focused on building the cheapest. He made home computers mass market. The C64 was more than twice as expensive as the Spectrum. _Anyone_ could own a Spectrum.
Almost exactly the same thing was said in the US between the Apple II family and the Commodore. _Anyone_ could own a Commodore. My folks couldn't afford a IIe, but they could get a 64 and all the trimmings.
1. prices for computers were lower in the US than in the UK
2. Americans were wealthier than Europeans in the 70s and early 80s (the gap closed the further from WW2 we got)
That "double"-priced C64 probably set an American family back as much as a Spectrum set back a family in the UK, in terms of their respective disposable incomes.
And the C64 price quickly sank like a rock once the VIC-20 left the market. The T/S 1000 (the US ZX-81), its contemporary, was seen as cheap junk. I remember my parents being critical of the keyboard and the construction, and I don't think they were atypical. The T/S 2068 was better than the 1000 in every respect, but had compatibility issues with the Spectrum, and Timex's computer division had already suffered from the poor perception of their earlier home computers.
I'm sure it gave an option to some that otherwise couldn't afford one, but the C64 outsold the Spectrum several times over, and the VIC-20 before it was the first home computer to reach sales of 1m units, so I'm not sure I'd argue it was Sinclair that made computers mass market.
Official sales had the C64 outselling the Spectrum 3 to 1 'worldwide', but that ignores licensed clones like the Timex and, basically, the entire Eastern European computer scene based on cloned ZX Spectrums.
Timex only claims to have sold about 600k units, so while not insignificant, it didn't really move the needle that much.
For the Eastern European markets it really varies greatly by country. Commodore dumped hundreds of thousands of C16/C116 and Plus/4's in the Eastern European markets from '86 onwards, for example (largely because they'd totally flopped) - especially in Hungary, where many Plus/4's ended up in schools, and a substantial proportion of software (quite likely a majority of titles) for those machines came out of Eastern Europe. This was a point where most of the Spectrum clones didn't even exist yet.
The Spectrum clones' longevity in Eastern Europe, with new models appearing well into the 90s, was remarkable, though, and certainly served to bridge a gap when Western computer companies had largely ditched their lowest-end 8-bit machines.
They were huge in Australia also - I recall visiting a 'computer show' in 1981 in what was then a city of 32k people, and the next year I convinced my Dad we ... ahem ... I needed a TRS-80 Color Computer. Putting aside the huge interest in Commodore computers, at that stage Tandy Electronics (the Australian version of Radio Shack) had a strong presence in the market. You could go along to 'computer courses' at my local Tandy store, and they had lots of stock of the various TRS-80 business and home models; the staff working there had good knowledge of the machines, and would even do RAM upgrades etc. There was even local development of various clones (eg the Dick Smith System 80, which was a TRS-80 Model 1 clone). The country town I lived in had newsagents that imported all the 'known' magazines (Byte) as well as specialist Color Computer publications. It became a monthly ritual for my Dad to drive 10-year-old me around to collect all my 'necessary' magazines. Looking back, I was quite lucky in terms of timing, as getting my hands on a PC I could program in BASIC and assembly, and also having access to worldwide print resources, meant I already had a wider outlook than the employment opportunities that were available locally (which really came down to working in the local steel production facility). It was quite a good time to be alive, and to be discovering computers and computing, when I think about it now.
Well, people wanted the same computers at home that they used at work, at least many did. For a while, when businesses were using computers like the TRS-80 line, people would buy them for home also. But when the business world standardized on the intel-based PC, that's what drove a lot of people to buy those for their homes also. So they could do work at home, and use software they were familiar with.
It also just seemed pretty obvious (perhaps modulo even pre-Mac Apple) that the "PC" was just where the puck was headed in terms of software, etc. I was shopping for my first computer ~1982/3 and I started out looking at a whole bunch of different options (mostly various Z80 systems) but by the time I bought one (in probably 1983), it was pretty obvious to me then that an IBM PC clone was the way to go.
As you suggest, at least part of the influence was that, although I had used some Apples at work, we had switched to using an IBM PC which probably colored my thinking. Although I used them some in grad school later, I never got into Macs until relatively recently--and that was for basically work related reasons as well. (We basically don't use Windows and I didn't want to use solely Linux for presentations, etc.)
Retail price? Sure, but a number of large US conglomerates (the early PC era was also the height, or maybe just the tail end, of the period of big unfocussed US conglomerates) built clones in house and offered steep employee discounts across the whole conglomerate, not just the division that made and sold PCs.
As I recall, it varied. There were certainly relatively cheap Z80 and other machines like the Commodore 64. But I don't remember systems like the Apple III, Epson's machines, etc. being vastly different in price given the same floppy-only configuration.
The first "affordable" PC clones (in Europe at least) didn't appear until 1986, with the Amstrad PC1512 (apparently also marketed in the US as the PC6400):
https://en.wikipedia.org/wiki/PC1512
Much earlier in the US. Most notably Compaq (founded in 1982). But many others. The one I bought in 1983 or so was from a company called Corona. I think Eagle was another.
With hindsight, this article has all the pieces for understanding what happened with home computers but didn't quite put it together. The article is correct that software was going to drive computer sales and that you had to depend on what at the time was a cottage industry. They even mention IBM being about to enter the market, but the article doesn't seem to be aware of or recognize the importance of the PC architecture in providing a common platform for software creators to target.
Maybe because it wasn’t the first to do so. There were thousands of pieces of software available that would run on all different CP/M machines from lots of different manufacturers.
Usually the only difference was the disk format, and in those cases you could often order the software on the disk type of your choice.
If you had a real outlier of a system, there were often equivalent packages written in BASIC or LISP or PASCAL that you could buy the source code for to run on your machine.
People today would laugh at the notion of paying $1,500 for an accounting program delivered in BASIC. But that’s how it was.
I learned to program in BASIC on T.I. 99/4A. The computers themselves were not expensive, but the peripherals were. It seems insane now, but we used them by hooking them to a TV.
Even back then it was a niche compared to the Commodore VIC-20/64.
I think the real loser from that time frame wasn't TI, but Commodore, who was in a market-leading position but ultimately lost to the PC and Mac.
Although the prices of the computers and the extreme price reductions seem interesting, I think even more so are the companies and brand names that have disappeared from the "consumer market" due to saturating it (Texas Instruments and Atari). I think it is very interesting that the desktop (personal computer) exploded over the next couple of decades, and yet the overages in supply actually sank these market leaders to the point where they never recovered.
I think properly forecasting market size and growing your inventory, surpluses, and work-in-progress properly while pouring extra engineering capacity into research and development is a delicate balancing act which we can all learn from.
Remember the Computer Shopper publication from Ziff Davis? It was a monthly, 3-inch-thick magazine full of vendors selling beige-box computers that have now all been long forgotten.
Oh yes I remember these. A family friend bought one to run his business on, clearly happy that he'd got a bargain. It was a piece of shit and he was forever running into compatibility problems with software.
> Referring to the $100 rebate that had helped make the T.I. home computer popular, Mr. Cosby joked about how easy it was to get people to buy computers if you paid them $100 to do it.
It makes sense for Sony, Nintendo, and Amazon to sell their devices at a loss, making up for it later by selling digital games and books. The loss is an acceptable cost for a bigger distribution platform. But what reason did T.I. have to pursue that strategy without any footprint in software? Can anyone help me understand this?
TI did have a footprint in software, which is how they hoped to survive the endgame selling computers at a loss. But that created a dilemma. If they went ahead with a Nintendo-style crushing of third-party software developers, the Vic-20 would have a much larger software catalog and be a better value proposition. If they went the Commodore route and welcomed third-party developers, little of the software revenue would go to TI. They wavered back-and-forth on this a couple of times but never found an answer that worked for them.
Clearing the channel of product 2.0 before flooding the channel with product 3.0? If the market is moving faster than you are, then prices of yesterday's technology can only drop over time, so cannibalizing future sales (which would happen at a $200 discount) with a $100 discount now is a good idea.
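A toy illustration of that logic, with made-up numbers (the list price, rebate, and expected price drop are all hypothetical):

    # If the street price is going to fall by $200 soon anyway, a $100 rebate
    # today nets more per unit than waiting and selling at tomorrow's price.
    list_price = 400
    rebate_now = 100
    expected_price_drop = 200

    net_if_rebate_now = list_price - rebate_now         # $300 per unit today
    net_if_wait = list_price - expected_price_drop      # $200 per unit later

    print(net_if_rebate_now, net_if_wait)               # 300 200 -> clear the channel now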
Your question is clear in an abstract sense, but AFAIK Nintendo has never sold a console at a loss even on launch day, and Amazon's tablet products sell for more than the profitable price of imported semi-legit Chinese tablets. A Kindle Fire HD 10 (which I'm considering getting to use as a big-screen Kindle appliance) is well over $100 from Amazon and a seemingly identical NeuTab is $70, so Amazon is likely making about $50 off each Kindle. Sony does sell at a loss, which makes them go insane over unlicensed homebrew software displacing their profitable software.
At the time there were still a lot of people who wrote their own stuff because the majority of purchasers were still enthusiasts. I think TI failed to foresee that the market was changing to people who were primarily users than tinkerers, and ignored the need for a robust software library at their peril.
That may not have been the strategy they wanted to take initially, but they were undercut severely on price by Commodore's VIC-20, and it may have been seen as the only option.
I had a Timex Sinclair 1000, and then bought a Commodore 64. I used a VIC-20, and for the life of me, I cannot see how a VIC-20 compares to a Texas Instruments 99/4A. I honestly would have preferred my Timex over a VIC-20 just because the 20-character screen width was super annoying. How did you live with it?
I used to feel the same way. But it’s important to remember that the VIC 20 wasn’t so much a contemporary of the 99/4A as it was its predecessor.
The VIC was somewhat out of phase with the rest of the computer industry at the time. It was more akin to being a step between CP/M machines and the full-on 1980’s home computer scene.
Amazingly, people are still producing new VIC 20 carts today. And the quality of the games is quite impressive.
Dad liked the TI 99/4A so much he bought 2 of them, from Service Merchandise no less. It had a 16-bit processor and a really swell speech synth module (golly gee, it says "good shot, pilot" when you zap an alien!).
Personal computer prices were in a meteoric descent; just a few years prior, Radio Shack had a catalog of them starting well over $3,000.
I have one of the original TI PCs, with an 8086. My father purchased it around 81-82. We did have a Commodore 64 as well, and indeed it was far more capable, convenient and user friendly.
I was about 14 at the time and had to save my own money to buy a computer. I bought a Timex, and then a Commodore 64. I often wished for the Texas Instruments, because it looked more like a grown up or professional computer, while the Commodore 64 seemed to emphasize games. In particular, I liked the Texas Instruments keyboard a lot better than the Commodore. What was your experience and perceptions? Did you like the Commodore more because it had better games?
First off, I found the BASIC environment of the Commodore 64 much more enticing than MS-DOS 2. It was my initial hacking environment.
Then, we did have a lot better software on the C64, such as Logo and indeed many games, and all of the hardware capabilities were far superior, such as the graphics and sound. The TI had an ASCII display only… I don’t even think it had CGA, which is far inferior to the Commodore and was all the PC world had to offer for years. Even EGA was not as good as the Commodore (one year my friend got a PCjr for Christmas...). It wasn’t until the PS/2 came out that I was excited about having a PC.
Whenever I read things like this I grow sad that this generation threw out all their computers and traded them in for smartphones, which aren’t really computing at all in my opinion. They are more like watching tv, something we already did too much of.
I first got interested in computers and eventually programming by playing with custom ROMs for my Windows Mobile phone when I was in college, circa 2006. If it wasn't for smartphones, I never would have taken a networking class (I was a radio/TV major) and realized I was very interested in what I could make a computer do (I'm a software engineer now).
I think that the kids-these-days who aren't hacking their smartphones are the same kind of people who didn't do much more than watch TV in years past.
I got interested in computers because I wanted something to play games on, that's all I cared about... and my cousin wisely told my parents to get me an Amiga 500 instead of what I wanted. It came with a lot of manuals which I devoured, and though I didn't program much on the Amiga, I owe a lot to all that (most importantly a point of comparison -- it's not like I haven't experienced all the stuff that came later, and I know I missed out on an even more "empowered" time in the 80s; I barely caught the tail end of a golden age).
Sure, most kids who had an Amiga didn't get that deeply into it; e.g. a friend of mine who also had an Amiga never seems to use his computer for much more than playing games and watching movies to this day. But for anyone interested in diving deeper, the doors were wide open, and as encouraging and honest to newbies as humanly possible. Even my non-technical kid friends knew how to load up a neat module in ProTracker. None of that focus on quality and openness for the sake of quality and doing the right thing was to the detriment of the "casuals", and it was the polar opposite of a world of stuff like minimized JS that does nothing interesting, smartphone apps that half-ass what a website could do, and games that just rehash and dumb down the same shit to keep people consuming.
The shift from tools and variety to mere consuming and fads is rather crass, come to think about it.
> "I think that the kids-these-days who aren't hacking their smartphones are the same kind of people who didn't do much more than watch TV in years past."
True. But most people never took the TV to the toilet with them. Yes, it's the same type of person. But today there are no norms to discourage usage (e.g., most people didn't watch TV at work) and thus ever less self-regulation.
Note: I'm not longing for the good ol' days. Just wanting to add some clarity to the comparison.
This generation didn't throw out their equivalent to the TI home computer. The same kind of people who were inclined to buy a TI in '83 are buying the Raspberry Pi today. More computers are sold today, to a wider variety of people, than were ever sold in 1983.
The big difference is that in the 1980s, we had no choice. If I wanted to play games I had to learn how to use this new TI99/4A. Back then, cartridges were limited in number and my time was kind of infinite (or so it seemed, looking back ;) ). So I started to borrow books and copy listings, then started to understand BASIC and finally tried to adapt the gameplay, etc...
I see my kid today; he is absolutely hooked on video games. Minecraft and Fortnite are already richer in variety than all of the games of my childhood combined!
He does some stuff on Scratch and loves it, so I tried Python with some super simple 2D game engines... This has been a repeated failure. He has zero patience left to try out stuff, learn the syntax or tinker with it.
But I understand; it is really easy to get instant satisfaction from the baked-in stuff. Oh look, a new challenge to do on Fortnite!
I don't buy it - your kid may just not be into programming as much as you were. I was born in '83, and in the 90s (when I got into computers) QBasic was already well buried in some obscure directory inside DOS (or later the Windows 95 installation CD), and with all the piracy there was no shortage of games around.
I still got into programming as a kid & have been doing it for a living since I was 18. And I had plenty of friends (most!) who just wanted to play the games, as well as those that got into programming to make their own.
I’m curious about people who learned to program as children. I was an intensely curious kid too, and would quite happily dig around the Windows 3.1 OS hoping to uncover something cool. All I really got out of it was a knowledge of the computer’s file system, though.
In my early teens I had access to the internet, and managed to find a C++ tutorial online. I learned to print text to the command line, but not much more than that.
It took until my late 20s that I finally had the resources and mental stamina to start learning it for real.
MS languages had great "online" (i.e. integrated in the IDE) help, you could learn programming from the QBasic/QuickBasic integrated help alone. Visual Basic also had great integrated help.
We also had Logo classes at primary school, but I never "got" that you can actually program with Logo; it seemed like a silly drawing program to me at the time.
Using Swift Playgrounds, Pythonista, Continuous and similar apps is not much different than booting up my Timex 2068 and coding away in the realm of Timex Basic.
The first is that those old systems were so limited you could either write BASIC or machine code. There were no distractions, no other apps, and no social media - except paper magazines. So 100% of your computer time was devoted to typing in code, learning about code, or writing your own code.
The other reason is that in the UK at least, a lot of one-man shops produced an entire games industry around the Sinclair (Timex) ZX81, Spectrum, and other 8-bits of the day.
The business had all of the magic characteristics of an open market: minimal cost of entry, minimal cost of marketing (through cheap ads in paper magazines), low-cost distribution, and an enthusiastic and largely untapped potential customer base.
No platform today has those characteristics. Web "apps" probably come closest, which is why we're talking about this on HN. But a lot of 8-bit businesses and careers were started without the corporate and financial cruft that surrounds modern start-ups.
The large majority of those UK professional shops were using 16-bit machines for coding those 8-bit games, deploying them to the target machines over connection cables.
Coding on the machines themselves was more of an indie kind of thing.
Plenty of stories about it in Retro Gaming Magazine.
Most of my friends back in the 8-bit days never learned to code anything beyond LOAD "", as they had plenty of games to choose from at the local bootleg shop.
I don't think it's really that different. I only knew a few other kids in the 80's that had a PC. It seems similar with my kids. They are among a few in their school that have an interest in things like Rpi, Arduino, ESP8266, etc.
There are two types of computing today (2019) - producing and consuming.
I can't cite the correct reference, but it's something along the lines that when the masses do more of one thing than the other, something will suffer badly.
"this generation" is buying a lot more computers (the traditional pc/mac kind) than their 1983 equivalents, even accounting for the greater world population today.
Removing the expectation that a computer would ship with a compiler onboard, ready for use, was a big mistake. We lost so much control over our devices when we passively allowed the computer companies to take that power from our devices ..
The TI-99/4A didn't ship with a compiler. Neither did the Atari 400 or 800, the Commodore VIC-20 & 64, or for that matter, the Apple II. Sure, the Apple came with a BASIC interpreter, as did several of the others, but not a compiler and you could usually purchase an assembler. This isn't something we "lost," it's something we didn't have.
Yes, those things did come with the means onboard by which software could be created. It wasn't a compiler, if you want to be pedantic about it, but no computer shipped in the 80's ever survived unless it had the means onboard to support development of software for it. BASIC was a very effective means by which to write software in those days...
We do not have that capability with the iDevices. Nor do we really have it with workstation devices (except of course Linux). Manufacturers don't consider it important to breed a generation of users who can be developers with little more than the initial shipment of software - instead the base has been split, such that more profit can be gained, and more control over an elite market, can be applied.
Developers are the tip of the spear for OS vendors. There is a great deal of control over this end of things, for no reason other than to make it possible to have a separated distinction between developer and user. I believe this distinction is having a net deleterious effect on computing ..
There was a very expensive compiler for the Apple ][. Apple Pascal, based on UCSD Pascal, had its own operating system, released in 1979. I ran into it in 1983 in my very first computer class (8th grade).
https://apple2history.org/history/ah17/#01
You know this already, but smartphones are just computers that are small and portable. That's the only difference. Many of the operating systems are similar or based on those you'd use on a desktop or laptop computer. There's a large amount of software available on both mobile and traditional computer operating systems.
TV is non-interactive, smartphones allow a great deal of interactivity.
Besides, I don't think watching a fair amount of TV is a bad thing in the first place. It's a creative medium, you're consuming art (okay, maybe a lot of it isn't very good, but it's art nonetheless).
Smartphones aren't nearly as good for being productive as desktops and laptops, though they're pretty good at consuming content. I think this shift away from desktops and laptops indicates a shift away from productivity.
Yes, a lot of people watched too much TV, but at least they had the capability of doing things like image editing, building spreadsheets, etc. Smartphones can do those things, but it's so awkward that it's more likely that you're not going to bother, so you're more likely to just consume content instead of create it if you don't own a desktop or a laptop.
Personally, I'm not really nostalgic over traditional computers, but I am a little nostalgic for stuff created by hand instead of filled in with a template.
This. If we had a smartphone with a proper USB connection (one that doesn't stop working after a few years), and the possibility to connect it to a keyboard, screen, and mouse, I'd do pretty much everything on my phone. Instead, I can't even browse normally: I get a dumbed-down version of most sites.
That's an interesting word choice and something tells me "productivity" is highly subjective.
For example, can you code on a smartphone? Yeah, but not as productively as on a desktop.
Can I manage my sales team's day to day from my smart phone? Absolutely and that might be a great choice as managing sales probably doesn't lend to being in one place for very long.
Smartphone no, but I’ve written thousands of lines of code and hundreds of thousands of words of material on an iPad, because it’s what I have with me on the train. Context is everything. I have written a few small programs on my phone too. Google Docs makes writing seamless with the desktop, and I can ftp my code out of Pythonista. Probably not a typical setup, but I find it very productive. They’re just tools.
I’m reading this thread and replying from my phone while on the bus. Try that with a desktop or even a laptop. I consume a huge amount of educational and special interest material from YouTube that wouldn’t exist let alone be accessible in the age of TV.
Mobile devices do have some negative consequences, sure, but they also enable engagement, contribution, self improvement and creativity. My kids watch a lot of YouTube, but also use their iPads to do homework and sketch and draw.
These are fantastic tools. How we use them is up to us.
I agree. To me a computer is a machine under a persons control and I believe if you own something you should have full control. This is why we have a jailbreak scene. People want control.