The Sad Saga of Silicon Graphics (1997) (businessweek.com)
138 points by gdubs on Oct 16, 2014 | 99 comments



From what I saw as a humble member of technical staff, the place was packed with really smart technical people, some of the sharpest people I ever worked with. One thing the management did well was to recruit and retain a very high proportion of really good people in technical staff and the first line of management. The result was some awesome (for its time) hardware and great software (IRIX).

Top management was apparently locked into their original winning formula: build high-end equipment and sell it to the "professional" market that needed its capabilities. It was apparent to anyone technically aware that narrow-margin PC hardware was rapidly eroding the performance difference that justified the higher-margin SGI gear.

I seem to recall that an in-house project to build an SGI graphics card for the PCI bus was scuttled, with the result that several of the people who had worked on it left to form a PC graphics card company.


Strangely, I worked at a place that did batch data processing on IRIX in the mid-to-late 90s. While we didn't even use the graphics capability of the machines, I did notice, at least in '95, that they were a hell of a lot faster than PCs (although the DEC Alpha I used at a previous job sure gave them a run for their money).

I don't know what the price difference was between the servers, but in '99 we got new servers from IBM. We learned from personal experience that the way to pronounce IBM-flavored Unix is "aches" (AIX). I missed IRIX after that changeover, regardless of the hardware price-performance ratio. I also missed having a workstation that ran the same architecture as the prod server. Sharing a dev server sucks, as there's always some jerk who will fill up the disk.


To found 3dfx? http://en.wikipedia.org/wiki/3dfx_Interactive - which in turn went from number 1 player to bust in about 4 years.


I was at SGI via a software subsidiary from 1996-2004, so I caught the peak and then rode the long decline down. I remember going to SIGGRAPH in, I think, 1998 and seeing nothing but SGI machines in every booth on the show floor.

One thing I recall is that at one point everyone at the company was told to read "The Innovator's Dilemma", which is of course the tale of large dominant companies having their market share eaten from below. Some people say SGI didn't see it coming. I think what was interesting is that they did see it coming, but structural issues still prevented any effective action (for example, salesforce commissions continued to encourage sales of large systems rather than the low-end machines they really needed to push at that point).

There was also an internal news group called sgi.bad-attitude where you got to find out what was really going on in the company. I believe the idea was brought over from Netscape via Jim Clark, but it was impressive to have a forum where you could say just about anything and not be censored. That is where I learned things were over long before it became apparent to the outside world, or even much of the company.


Re: sgi.bad-attitude. Yes, Netscape did have an mcom.bad-attitude, but the order is reversed: it was brought over from SGI to Netscape (or Mosaic Communications, as it was originally known; we never did manage to get mcom fully removed from everything). The six-week sabbatical after 4 years was another nifty SGI attribute that made its way to Netscape as well.


> Some people say SGI didn't see it coming. I think what was interesting is that they did see it coming, but structural issues still prevented any effective action (for example, salesforce commissions continued to encourage sales of large systems rather than the low-end machines they really needed to push at that point).

It's interesting how knowledge of a (potential) problem often does not protect against that problem. As a psychologist, it's one of my biggest fascinations: how people who work with addicts can become addicts themselves, how people who know the 'evils' of racism are still susceptible to it, or how professional psychologists fail to deal with their own issues.


And in this (and many other cases), how do you shift your business to something that is fundamentally different in a lot of ways, has a very different cost structure, and may not especially align with the particular competencies developed over the years?

Reading about Kodak and Fujifilm is interesting: http://www.economist.com/node/21542796. Both diversified; Fujifilm just did so more successfully. And, interestingly, Fujifilm arguably did so not by focusing on being a photo company in a digital age (though their mirrorless interchangeable-lens cameras are nice products) but by applying their film-chemistry expertise elsewhere.

Hopefully you have an advantage over starting fresh: a good revenue cash cow. But if that revenue declines rapidly, your legacy base may be more burden than help, even if everything is managed perfectly.


Reminds me of Buffett (1985 Letter to Shareholders)

>“A horse that can count to ten is a remarkable horse - not a remarkable mathematician.” Likewise, a textile company that allocates capital brilliantly within its industry is a remarkable textile company - but not a remarkable business.

And similarly I guess for Kodak and SGI reallocating capital to related projects.


I've always been fascinated by this as well. I've known several cardiothoracic surgeons who were heavy smokers. Same thing with nurses.

You'd think if anybody would know first hand the dangers of smoking it would be these folks, but nope, they all still smoked heavily regardless.


> how people who are professional psychologists do not deal with their own issues

My dad owned a car mechanic's shop. One of the truisms in that industry is that mechanics often have cars in the worst shape, because they're so busy taking care of other people's cars that they just don't have the motivation or time to deal with the problems of their own. Perhaps a similar phenomenon?


Another name for this is "vocational irony", which is a form of situational irony.

http://tvtropes.org/pmwiki/pmwiki.php/Main/TheCobblersChildr...


Maybe in the desire to care for others more than themselves.


Speaking of disruption, what's funny is the extent to which SGI was done in by id Software.


It's an old and standard story: High end, niche, special computer hardware is soon overtaken by low end, broad market, common computer hardware.

Why? Because the low end has so many more customers that, with Moore's law and the like, the much greater investment in the low-end, broad-market, common hardware wins against the much smaller investment in the high end.

Or, as was explained to me once (maybe as no more than a hypothetical example): invent a Lisp chip. Great. When running Lisp, it beats the pants off all the general-purpose processor chips. But just wait until the general-purpose chips are a factor of 10 faster, and then lose out, because the Lisp chip's small volumes can't keep up with, say, Intel heading for 7 nm or some such.

Besides, for graphics, where do you get the big business volumes: from high-end engineering workstations for high-end engineering, or from millions of bored boys not yet old enough to shave who just love playing video games?


Haha, yeah, reading through the article (so interesting!) I just kept wondering why they stuck so closely to high-end workstations, although these days that's what I'm more interested in.

Growing up, I was one of those bored boys. I grew up on Nintendo and started getting into the Quake series when my family got a Windows 95 PC with a Diamond graphics card (that or a RIVA TNT; I know we at least had the Diamond card later on). The release of the Quake III demo was the beginning of intense gaming addictions for me that trailed into MMOs. Also, ironically, my dad got us @Home cable internet (which, I learned from the article, former SGI president Jermoluk was involved with!), and that definitely helped foster my addictions.

Another interesting tidbit: one of the main reasons I chose the university I went to was that I heard about people doing internships at Nvidia. I was lucky enough to do mine there (I was/am studying computer engineering), and, in yet more irony, I worked on a technical marketing team (benchmarking workstation graphics) with a bunch of former SGI employees! I remembered hearing a lot about their former glory, but never knew nearly as much as I've found out from this article and the comments. Fascinating.

These days I have actually moved on to doing research on high performance computing, and I might be looking for a job with close ties to Intel (IMFT~), haha. If only I could just acquire more motivation. Growing up as a bored boy playing video games has ruined me xD


See also: the iPhone vs Android phones.


Interesting. Perhaps this is why Apple appears to be focusing even more than before on being a high-end consumer-product company rather than relying on its vertical integration and solid hardware.

There might not be room for an SGI in a world of rapidly improving, cheap, low-end technology, but there might always remain a place for a 'BMW/Porsche' company.


Flight simulators. Back in the 1990s, McDonnell-Douglas in St. Louis made the local sales rep the top salesman for SGI with one purchase in one year.


Yes, flight simulators for the US DoD were long one of the main drivers of high end (not so high considering what we have now) digital graphics.


I was an intern at a CAD company back in the late 90s, and saw how in a short period of time SGI workstations had been replaced by Windows NT machines with better and faster graphics cards.

You could tell the SGIs were nice, with a good solid build, but their time was over. They were way too expensive. I think we even leased them because buying them outright was cost-prohibitive.

But I did have fun discovering by accident the file manager from Jurassic Park ( http://fsv.sourceforge.net/ ) on it. For whatever reason, I remember being really excited about that.


It was a surprisingly fast transition. I was working on physics engines for animation back then. In 1997, I visited Sony Pictures Imageworks in LA, where they had about 90% SGI workstations, 10% NT machines. Two years later, the ratio had reversed.

What killed SGI workstations was that the gamer graphics boards caught up with the high end. There used to be a high-end graphics market: SGI, Evans & Sutherland, Dynamic Pictures, Fujitsu, Lockheed. It was killed when the low end got good.

SGI might have survived that, but they made a number of bad decisions, including a sale/leaseback real estate deal with Goldman Sachs which locked them into long-term leases at the worst time in the market. SGI tried selling a PC that ran Windows NT, for something like $12,000. I told their sales guy that wasn't going to fly. It didn't.

Most of Google's buildings in Mountain View are old SGI buildings. The Computer Museum was SGI's Digital Studio division, which never accomplished much other than installing networks for some animation studios.


I still remember back in 1999 when an animator at a studio I worked at showed up with a brand-new GeForce card he'd just bought. We grabbed one of our NT machines, popped out whatever high-end card was in it, put in the GeForce, and started up Maya. Watching Maya run well enough to get real work done on a card that cost a fraction of what a "real" graphics card cost was pretty amazing. Hell, for many of our common workloads there was no real difference in performance at all. Combine that with the recently released Pentium III CPUs and you could build yourself a 3D graphics workstation out of cheap commodity parts that could really hold its own.


Yep. I have an SGI Indigo downstairs for just that reason. Our mechanical engineers stopped using them to run Pro/E and switched to NT boxes. A couple years later the Indigos were in the trash.


In 2000 I visited a company that designs and produces car parts. They had some SGI workstations running, as far as I remember, CATIA v4 (3D CAD) with big CRT monitors showing a blue background, and some big tower PCs running WinNT 4 or 2000 with CATIA v5. They told me that the car manufacturer BMW still used the older CATIA v4, so for compatibility reasons they used both software versions.


CATIA is the product I was going to mention.

http://en.wikipedia.org/wiki/CATIA

It was ported to Windows in 1998 but anyone who knew anything could see that coming.


Actually, CATIA V4 was never ported to Windows. The next version CATIA V5 (and now V6) had Windows as one of its supported platforms.


I hope that upon discovering it, you said aloud, "It's a Unix system, I know this!"


I was working at a company that bought an SGI Challenge XL (I think for well over $100k). A year or two after they bought it, we wanted to upgrade to a faster Ethernet interface, but that would have cost us around $3,000. Instead we spent less than $10k on a Sun-clone server with a NetApp filer, a combination that beat the SGI several times over by almost any measure.


This article does not even scratch the surface.

Sorry, but Win NT had nothing to do with it.

Sale/leaseback of the campus to Goldman, who then turned around and sold it to Google for 3x what they paid for it.

2 bankruptcies. You don't even want to know what went on there.

The whole thing was a giant train wreck.


A minor bit of the story that postdates this 1997 article: in the late '90s, SGI's strategy to recapture the high-end workstation market was to hitch their horse to Intel's new Itanium architecture, which they thought would let them position their boxes as a cut above commodity x86 workstations in performance. That strategy might or might not have worked had Itanium actually delivered x86-beating performance; since Intel never shipped a performant Itanium, it definitely didn't work.


Can you say Vermel? VRML? That's Virtual Reality Markup Language. SGI was the champion of VRML, with Sony a somewhat distant second. It was really cool. Imagine 3D multimedia environments in a web browser. You could download 3D clip art for free. :D While SGI optimized VRML for the O2, it worked acceptably on a (cough) Windows machine. For a while, SGI produced two VRML animations a week featuring "Floops".

I was into VRML. It was great. We even had a couple SGI reps visit our company, demonstrating VRML and the O2.

Then one day they dropped support of VRML. That was the day I walked.


Hehe, it's weird and sad that this was too much, too early. Computers, OSes and browsers couldn't do much with VRML at the time, even though it's almost a fully reactive, JS-scriptable 3D DOM. Also, the web wasn't as mainstream yet, so VRML stayed in the fog (I only got to use it in college, and only because I was a CGI fan).


Remember 3DML/Flatland Rover? I used to play with it constantly as a kid - it was a much simpler, block-based version of VRML. Similar in some ways to Minecraft, even, but more programming-oriented.

WebGL being where it is today, I bet something like 3DML could come back in a big way.


SGI's biggest mistake was selling Cray Research's business systems division to Sun. Those machines became Sun's E10k line and got Sun a ton of business outside the desktop workstation market. It is probably what kept Sun around so long, and why Oracle wanted to buy them.

http://articles.baltimoresun.com/1996-05-18/business/1996139...


SGI wouldn't have been able to manage that business as well as Sun did; at least the E10K was successful somewhere.


This is absolutely true. SGI was trying to decide between shutting down Cray's Business Systems Division entirely or shunting it off to Sun -- who was really the only possible buyer (their product was based on SPARC and Solaris, even at Cray). So SGI wasn't going to make a nickel off of that product, and as such, it remains one of the best acquisitions in the history of the industry: purchased for less than $10M (I'm not sure the exact figure was ever disclosed), that product line did $1.2B in top line in its first year at Sun -- and it's hard not to have fond memories of that year. (Speaking personally, debugging a nasty performance problem on an E10K in December of 1997 helped to directly inspire DTrace[1]; I will always remember that machine fondly, despite its many quirks.)

[1] http://queue.acm.org/detail.cfm?id=1117401


Right. Selling to Sun was the right thing for getting some nice hardware out there. Just saying, it gave Sun quite a competitive edge -- probably a bad idea from a business perspective. I was at Sun in that timeframe too, but not working on the E10Ks :-)


As if it helped Sun (in the long run, I mean).


Back in the day, we used to laugh at how awful IRIX was (a colleague who used an SGI box to run an F-18 flight simulator said IRIX leaked 512MB of RAM a day at idle -- back when 512MB of RAM was more than any desktop computer had). We had SGI sales reps visit our office, and just getting their presentation software to work (on their own hardware) was a major production. Essentially, once 3D acceleration went mainstream the writing was on the wall: within 3 years of the first 3dfx video cards, a game box had more graphics horsepower than tens of thousands of dollars worth of SGI junk, and 3DS Max was enormously more productive for most things than Power Animator/Maya (Maya has since improved a lot).

The company I worked at in 1999 did the special effects for season one of Farscape on SGI. They were underbid on season two by a factor of two by Animal Logic (across the street) doing everything on fast PCs running 3DS Max.

Edit: and when the writing was on the wall, SGI decided to go downmarket and fight Apple on its home turf -- porting its software to NT and competing against PC workstation vendors already running AutoCAD and 3DS Max. This allowed everyone to see the emperor had no clothes -- SGI's NT boxes were half as fast and twice as expensive as name brand competitors, and they were charging $20k for a software license.


> a game box had more graphics horsepower than tens of thousands of dollars worth of SGI junk

Interestingly, at least one of those game boxes (Nintendo 64) was actually powered by SGI technology. [1]

1: http://en.wikipedia.org/wiki/RCP_(chip)


My first full-time job out of college was as an engineer at SGI in Mountain View. This is now Google's campus!

It was the summer of 1999. You can imagine how it went.

In retrospect, there _was_ a silver lining: https://davepeck.org/2009/02/11/the-luckiest-bad-luck/


This is a really interesting article. SGI used to be a king of the Bay Area, and you can still see the history if you look around. Some of Google's signs outside the former SGI buildings on its Mountain View campus are purple -- supposedly either because they were never changed from the SGI purple, or because the signs were changed but the purple stayed as a nod to SGI's legacy.


Do you guys remember the movie Disclosure, from 1994? It was the coolest SGI advertisement ever. I actually watched it again last year, and it's still cool! Exactly at that time we had both Macs and SGI workstations at the university, and the Macs didn't even come close.


Strange fact: Google's headquarters in Mountain View is the old campus of Silicon Graphics. When I was there in 2007, you could still see traces of the old SGI logo in certain places.


Related strange fact: SGI sold Cray Research's business division to Sun, Oracle bought Sun, and now Facebook's headquarters is the old Sun campus.


It's interesting but not really that strange. How many buildings/campuses in the Bay Area do you think could house companies the size of Google, Facebook, Sun, or SGI?

Random guess: less than 50.


Interesting that Linux doesn't come up once in the article, and that it is the equally doomed Sun that is perceived as the competitor (and WinNT).


In 1997 Linux didn't support SMP (multiple CPUs) yet, while Irix (SGI), Solaris (Sun) and Windows NT all did. So it wasn't a good fit at the time for server and high-end workstation installations.

Sun had the lion's share of Internet server installations at the time, but Microsoft's IIS was beginning to nip at their heels. Linux was what we played with at home, because it was "close enough" to what we worked on at the office, but didn't require us to spend thousands on hardware.


Support for SMP in the kernel was introduced in version 2.0, which was released in June 1996. It wasn't that mature, but it was definitely there. It was only by 1999-2000, with the version 2.2 improvements and a bunch of consumer-level dual-CPU boards like the Abit BP6, that Linux SMP really started becoming popular and usable for workstations.


At the time, Sun was the competitor in the engineering workstation market. Later on, Sun was itself eaten. There were other bit players: DEC, HP.

It used to be well-accepted to drop $18K on a baseline engineering workstation + monitor (1995-98, say -- I just checked an old quote for a Sparc Ultra that sat on my desk).

It must be said that the graphics and floating-point performance of those systems well outpaced high-end PCs running Linux.


In 1997, GNU/Linux distributions were still just something to play with at home.

The 1.0.9 kernel released in 1995 was the first to support IDE-ATAPI controllers. Only rich people had SCSI at home.

Getting properly working 2D graphics was a black art in itself, let alone 2D-accelerated graphics or any kind of 3D support that could rival graphics workstations.


Red Hat 5.0 was released in 1997. It was pretty usable; that's pretty much when the university I was at started switching from Sun Xterms to PCs running Linux. There was usable 3D graphics shortly after that.


The only usable 3D graphics that could compete with graphics workstations were provided by a company that specialized in supporting specific graphics cards.

The company had a name starting with Xi-something, and I think the cards were still the kind where you had to pair a 2D card with a 3D card.


Yes indeed. But they were cheap... And worked. We had quite a few. I can't remember the name either. Then Nvidia came along.


Here's a chart I found while looking for info about the first video cards I used in the late 90s (couldn't find Diamond on it though, besides a mention of merging with S3...):

http://www.vgamuseum.info/index.php/history-tree

I see IXMICRO and MXIC (Macronix) on there.


Eh?

WinNT had the last laugh over all of them.


To my recollection, WinNT didn't acquire much of the unix server base at all - that pretty much migrated over to Linux, with the exception of massively parallel machines.


But WinNT did end up taking over most of the workstation market from SGI and Sun.


For Jermoluk's part, he says he does not recall the incident. ''Am I guilty of getting drunk a few times?'' he asks. ''Sure. Probably inappropriately at times? Yeah. Did I party hard? Sure.'' But he also adds: ''Was I leading a wild life? No. I was working too hard, man.'' Even Jermoluk's critics say this never impaired his job performance, but such overindulgence by the company's No.2 rattled some employees' confidence. ''People aren't used to seeing the president get drunk,'' says a former executive.

It's funny to think this was pretty common during the first dot-com boom and bust years. A few years back, an executive at Best Buy told me about how the bust changed a lot of attitudes about how executives should act in front of company employees.


In retrospect: thank you, SGI, for the XFS filesystem.

http://xfs.org/


It's funny how this article bemoans the state of Apple. I suppose at the time, after languishing under the Pepsi salesman for a decade, they were in trouble. Now Apple is pretty much the primary source of graphics workstations running Unix.


Ahhh IRIX.

Best computing experience I've ever had. R.I.P. SGI.


"Now, McCracken and Ewald have a chance to avoid Apple's fate." Since I was there at Apple while SGI was imploding, it's fun to read what people said then and see what happened. It wasn't fun at the time of course.


I keep an O2 on a shelf in my man cave. Wish I'd snagged an Origin 2000 rack when they were plentiful and cheap-ish on eBay about 10 years ago.

I'd always wondered how SGI missed the Internet wave relative to Sun. Around the time of this article I started working for a web hosting company that was, aside from my team of IIS upstarts, entirely run on SGI. Prior to that I'd worked for a content company that was running on Tandem-badged SGI systems. And before that I was with an ISP which was a motley mix of Sun, BSD, and Linux but had just purchased a pair of Onyx systems to handle Usenet services.


AOL had vast numbers of SGI machines of all types, both servers and workstations.


Their industrial design was so gorgeous it makes me cry. I have an Octane2 at home, and what I wouldn't give for that kind of build quality, design, and integration in a modern workstation.


Does it still operate?

Mine is not booting -- I don't remember the POST, but the fans do turn on. I tried reseating the RAM the last time I was bored and pulled it out of the closet.

I would really like to see the IRIX UI brought up to modern standards. I almost feel like I'm looking at it when I look at Launchpad in OS X.


There is a project called 5dwm, which was an attempt to get an IRIX-like UI on Linux. I tried it once and it got me all nostalgic. I don't think it is being maintained or updated, though.


I had a few kids in college talking about how nice their Apple whateverbooks were...I'd just laugh and mention my Indigo2 and O2. Scrubs.


IRIX was how I got hooked on zsh. There was something beautiful about the toaster-like O2 that I had on my desk. That machine had some rather impressive graphics abilities (for the time) and was a pleasure to use. It was similar to modern-day Apple products in its beauty.


I was always jealous of those workstations.

To this day, whenever I set up a new desktop I always make the window titles black italics on a grey stripe. Just because that was so cool.


Repeated with every new tech platform, e.g. BlackBerry and smartphones.

In this case, PCs with GPUs, because they were an order of magnitude cheaper than graphics workstations.


In 1996-97, I was on a junior high team doing virtual stock investing via mail-in Scantrons. We were terrible stock pickers. The stock I lobbied for: SGI. Singly the worst investment choice on the team. In 2000-ish, I joined some Yahoo portfolio challenge, threw the initial pool into GeoCities, and jumped into a high-tier spot.

Junior year of college, someone in the dorm ran a portfolio challenge, and my roommate and I slowly realized the way to win was volatility. I eventually found an earnings-report calendar on Yahoo that was essentially our Rosetta stone for winning, because the guy's system let people buy in after hours at market-close prices. Wake up a few minutes before market open, search the earnings calendar for stocks up in after-hours trading, and place your entire portfolio in one with a huge earnings surprise. I think someone else was front-running the guy's system by using a feed 5-10 minutes ahead of the Yahoo API he was pulling from, making a series of 1-percent-per-five-minutes trades.

In retrospect, these sorts of 'whoever has the most in 3 months wins!' competitions teach novices how to gamble and cheat, rather than allocate assets prudently. These days my retirement portfolio sits in low cost index funds.


Or go pro. People make hundreds of billions of dollars every year 'cheating' in the stock market... Some of it's even completely legal. Find a hedge fund that's a little slow to update their purchase price based on market changes and eat them alive. Figure out when large trades are going to happen and remove liquidity at the right time, etc.


> Find a hedge fund that's a little slow to update their purchase price based on market changes

This doesn't even really make sense. What's the "purchase price" of a hedge fund? Getting money into and out of hedge funds is a long process, not amenable to latency arbitrage.

Do you mean mutual funds or ETFs instead of hedge funds?

In the case of mutual funds, there used to be mutual funds with European holdings that used European closing prices when trading in or out of the fund at the US close, allowing arbitrage based on after-hours trading. However, that bug got fixed decades ago.

In the case of ETFs, they are priced by the market, so there's no such thing as the ETF itself being slow to update its price.

EDIT: Actually, I think instead of "their purchase price" you mean "their order limits". In most major equity exchanges, one doesn't know who's on the other side of the trades, so one wouldn't "find a hedge fund" to trade against, but rather "find a pattern of trades" to trade against. One typically wouldn't know if that pattern represented a hedge fund, a bank, etc., or maybe several different market participants that in aggregate produced an exploitable trading pattern.


Yeah, that could have been more clear. If a large fund wants to maintain, say, 5% Apple stock and 5% IBM in its portfolio, and consistently rebalances on a set interval after market changes, or even at the same time every day, you could discover the pattern and exploit the behavior.

Ex: in the next half second they're going to go from selling at 15 to buying at 16; that's almost free money.


How would you "figure out when large trades are going to happen" without violating Rule 10b-5?



In spring 1998, my 6th grade class did a virtual stock investing game (using prices from the newspaper business page, with ability to change our picks once a week or so). My big picks were Dell and Lucent Technologies, and my portfolio more than doubled in the course of 2–3 months, far outpacing the second best student, who gained like 15%. Both stocks crashed within a few years after.


In high school, my team invested our fake money in our Economics class in Amazon.com, right before the stock split. The three of us couldn't have been bigger slackers (we all read and watched movies), but we won the trip to NYC to do all the things you do in an Economics Class. The World Trade Center was nice to visit, in hindsight.


I had a similar project in my high school consumer finance class. Pick a stock and report on it.

I picked this no-name company that had just IPOed, because I had recently bought their 16K RAM card for my Apple ][. And their Olympic Decathlon game wasn't bad either.

Too bad I didn't have $1K lying around. I was saving for a Macintosh.



I am a bit confused; MSFT IPOed in 1986, no?


Yes they did.


When I was in high school (7 years ago) and doing a sim, I invested all my money in Netflix... I'm rich...


Could somebody please translate this using layman's terms?


Like many high schools, theirs had an educational program about investing. Usually this involves some company convincing the school to buy 'fake investing software' for teaching.

Usually that software isn't real-time. You tend to log in to some web app, pick some stocks, and at the end of the day your returns get updated.

The students with the highest return on investment usually win something; in my case it was my teacher smiling his ass off and congratulating me. In the above comment, they won a freaking trip to NYC, haha.

Anyway, what he's describing is that these usually aren't the best-written software packages. After all, the point is to emulate a stock exchange for 14-year-olds so they can do a project with fake stock-picking for 4 weeks. Quite often the web app trails the actual data, so people just go to the stock exchange website, see where the price is headed, then buy/sell accordingly on the fake exchange. Five minutes later, the fake exchange updates its numbers by pulling from the real exchange. It's as if you're betting on a baseball game that, in the real world, ended 5 minutes ago.

Tl;dr: kids cheated on educational stock-picking programs because they could query stock results before their software did.


By the way, the easiest way around this latency arbitrage in simulated trading games is to have all trades occur at the published closing auction price for the next closing auction after the trade is placed. A trade for IBM.N at 09:43 America/New_York or 15:55 America/New_York occurs at that day's closing price. A trade for IBM.N at 16:05 America/New_York occurs at the next day's closing price. Disallow placing or canceling orders within 5 minutes either side of the particular equity's market close in order to prevent latency arbitrage and reduce disputes.
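
Here's a minimal sketch of that rule in Python (purely illustrative: the 16:00 close and the closing_prices lookup are assumptions on my part, and weekend/holiday handling is omitted):

    from datetime import date, datetime, time, timedelta

    MARKET_CLOSE = time(16, 0)         # assumed 16:00 local close
    BLACKOUT = timedelta(minutes=5)    # no orders within 5 minutes of the close

    def execution_date(order_time: datetime) -> date:
        """Date of the closing auction at which an order placed at order_time fills."""
        close_dt = datetime.combine(order_time.date(), MARKET_CLOSE)
        # Reject orders in the +/- 5 minute window around the close, so nobody
        # can trade at a closing price they have effectively already observed.
        if abs(order_time - close_dt) < BLACKOUT:
            raise ValueError("no orders within 5 minutes of the close")
        if order_time < close_dt:
            return order_time.date()                  # fills at today's close
        return order_time.date() + timedelta(days=1)  # fills at the next close

    def fill_price(symbol, order_time, closing_prices):
        # closing_prices is a hypothetical {(symbol, date): price} lookup.
        return closing_prices[(symbol, execution_date(order_time))]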


Still got my SGI machine. I boot it up every now and then, just to have a wonder about what it would have been like if SGI had just pushed a bit harder, made a laptop, beaten Apple to the "coolest unix workstation vendor in the world" target, and so on. Frankly, I'd rather be typing this message on an SGI laptop than a MacbookPro, but that's history for you.


SGI was never going to make a cheap desktop, though.

Not that a loaded Mac was cheap, but a pretty basic yet usable Indy was still like $12,000. My high school 'won' one from the National Science Foundation.

PC workstations weren't close in power, and Windows was flaky, but you could buy 5 of them for every 1 SGI machine ($5,000 PCs vs. a $25,000 SGI).


IIRC they had a half-assed kick at it with the O2, but that was too little, too late.


There was a Sun laptop around then, although very expensive.


There were a couple of SGI Indy laptop prototypes in the works too, but I guess they didn't go anywhere. But .. just imagine this, all you althist'ers: what if SGI had been the one to release the tiBook, instead of Apple, and for a good price? Would we all be going into SGI stores and drooling these days? I bet that could've happened .. the spirit was there at SGI, even if the management wasn't. (It should also be noted that a lot of that spirit went elsewhere when SGI exploded: Transmeta, Nvidia, AMD, etc. Some good SGI talent even went to Apple! So .. there is a bit of SGI DNA all over the place these days, in fact ..)


An Alpha one too...

http://www.ebay.co.uk/itm/Tadpole-ALPHAbook-1-OpenVMS-Alpha-...

(Note: I'm not linked to that auction, and it's over. It just happened to be the best set of pictures I could readily find of one)


Was there? I know there were one or two companies making SPARC laptops that ran Solaris, but I don't think Sun ever made a laptop.


And I was there. And everyone knew who the girl was. And everyone was just too cocky and had parties on campus where toilet bowls would get broken.

I haven't read the article beyond seeing the names and dates. I'll look at it in the morning and tell you more.

EDIT: Had some time after all and just finished reading it. I can't add any more than was in the article, but I can verify about half of it from my own time there; I was gone by 1984.


I spoke with old-timers at Cray, and they were all shocked by the lax work ethic at SGI: people coming back from months of paid vacation and not being sure if they still worked there.


The Cray guys had their own funny culture as well, though. Cray was incredibly process-driven, whereas SGI was entirely seat-of-the-pants product-driven. Merging those two cultures was incredibly painful; ultimately the Cray guys won out and SGI became process-driven[1]. There's a really twisted article in Wired from after Cray got spun out, where they complained about all the money they got from SGI because it came with sandals and pets-at-work-allowed rules.

From the Cray culture (and some of the Cray employees) we got the IRIX release train, where we'd ship a new QA'd release every quarter. If a feature wasn't ready it just got pushed out to the next release, no big deal (sometimes this didn't work out; bizarrely one of the biggest screwups came from the CXFS guys who were working in the old Cray campus in Eagan, MN).

[1]: Though as I read Zero to One, it's interesting to see Thiel's perspective on the process-vs-product outlook. Definitely the product-driven SGI was the more optimistic one.


Could that be after a sabbatical? SGI had a six-week sabbatical after 4 years of work.



