How I Arrived at Sun as Employee #8 (twitter.com/aka_pugs)
290 points by hasheddan on May 3, 2022 | 122 comments



When we were undergrads, I remember looking over Tom's shoulder at some Unix code he was studying, and I worked up my courage to ask what "*cp++" meant. He said something about "pointers" and "post-incrementing", but I couldn't figure out what he was talking about; he was a true wizard. Hi, Tom!
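For anyone reading along who never got the follow-up explanation: "*cp++" parses as "*(cp++)" - it reads the character the pointer currently points at and then advances the pointer. A minimal sketch (not the code Tom was studying, just an illustration):

    #include <stdio.h>

    int main(void) {
        const char *cp = "wizard";
        while (*cp)             /* stop at the terminating '\0' */
            putchar(*cp++);     /* print the current char, then advance cp */
        putchar('\n');
        return 0;
    }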


I miss Sun so much. Nearly all the happiest geek-out years of my career were spent there. A great technology company all about the technology.

Today the so-called tech giants are not really tech companies. Advertising, online shopping, more advertising, toxic social networks, movies. None of them are selling technology as the actual product. Apple is the sole exception, but they're a locked-down, walled-garden consumer-products company. Contrast that with Sun, which was all about interoperable open systems.

I wish a company even remotely as cool as Sun still existed. I'd drop everything and go there right now.


I was #12386 at Sun - circa 1990 but I don’t remember the exact date. The day I showed up I was given a cardboard box of VME boards with a Sun 4/400 chassis, an Ethernet drop, and a SunOS tape, and left to myself to make it work. I think that might have been their way of hazing the new guy - and I’m sure I got nowhere without help. But it was a lot of fun.

While I’m there in memory lane and apropos of the pointer post-increment discussion - one of my first source check-ins there was when I “optimized” a bit of graphics code by rewriting it to use pointer arithmetic rather than array indices like the “idiot” that wrote it had. Then the “idiot” in question, who happened to have designed the chip the code was talking to, explained how his code took exactly so many cycles with such-and-such cache considerations, and then I learned how to revert a change in SCCS.


> But it was a lot of fun.

Indeed! On my first day I was given a non-working SPARC box and told to get the latest Solaris working (in the early days of Solaris, when many were on SunOS). So much fun!

Want to receive email? Make sure to configure sendmail correctly. And NFS to get to teamware. And so on. Never been so happy in a tech wonderland. Difficult to explain to a generation raised on third-party paid cloud services.

> Then the “idiot” in question, who happened to have designed the chip the code was talking to

The deep hardcore tech expertise across the board was so incredible at Sun. Nowhere before or after have I experienced that. Miss it a lot.


Thanks, Dave. You're no slouch yourself.


A wizard never uses the base pointer or the stack pointer the way the compiler tells him to: he moves the stack precisely as he means to!


Using the post-increment operator for pointer operations? Would never have imagined that. I've only seen it used in for loops for no reason at all, really.


It's common enough, like in a typical memcpy implementation:

    while(size--)
       *dst++ = *src++;


Or chaining it even further for copying a null-terminated string. The assignment operator returns the same value that was just stored, so it can be used as the conditional on a while loop.

    while(*dst++ = *src++);

Incredibly succinct, ridiculously powerful, and way too easy to overrun array bounds.
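Spelled out as a full function - a sketch of the classic K&R-style copy (the name my_strcpy is just for illustration), with the caveat made explicit:

    /* Copies bytes up to and including the terminating '\0'.
       dst must be large enough: nothing here checks bounds,
       which is exactly the overrun hazard noted above. */
    char *my_strcpy(char *dst, const char *src) {
        char *ret = dst;
        while ((*dst++ = *src++))   /* assignment yields the stored char; loop ends on '\0' */
            ;
        return ret;
    }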


Yep, if there is no #$00 ('\0') at the end of the string, the copy runs off the end of the buffer and an exploit may well ensue, especially if the attacker manages to modify the string first.


It's one place C shows its roots ....

    mov (r0)+, (r1)+

is a PDP-11 instruction, as is

    mov -(PC), -(PC)


...Or

  move.l (a0)+, (a1)+


I think those are 68k ops - PDP-11s didn't have special addressing registers (just general R0-R7) and MOV/MOVB opcodes


Of course they are MC68000 family assembler instructions, but the principle is exactly the same.


And yet it's from B, which was not designed for a machine with auto-increment.


It's very common, like the ABC of pointer arithmetic. Especially if one has coded in assembler first and then taken up C, it's the most natural thing to do.


Off topic: I wonder if Google will ever end up like Sun.

The pinnacle of engineering, but its golden goose dried up and the company was acquired for pennies on the dollar.

EDIT: why the downvotes? I genuinely don’t understand. Why not just comment your thoughts below instead?


No. Google uses the dark forces of advertising and free products. Sun was trying to sell products.


Except that the dark forces can change their allegiance. Product search is/was moving towards Amazon.

Hypothetically, if the 'Metaverse' or 'Web 3.0' becomes a reality, I'm not sure whether GOOG can do much there - as it is, they are poor at innovation.


Sun failed because something "better"[1] came along (Linux and good-enough commodity x86). There's no telling if something better than Google (the product) will arise in the future, which is why Google (the company) is doing its darndest to ensure they are the ones to invent it.

1. Or hard to profitably compete with. Additionally, the dot-com bust sealed Sun's fate.


Linux and good-enough commodity x86 are not better, nor will they ever be better, because one cannot build quality by polishing turds.

Sun Microsystems failed because their hardware pricing wasn't competitive with the x86-based PC bucket turds, which were "good enough": most people working in IT can't explain why Sun hardware was superior even though it vastly was, so it was like throwing pearls before the swine. We have lots of ignorant idiots in IT, because they were drawn to good salaries like moths to a flame.


Perhaps instead of calling your peers idiots, you might take the time to explain what made Sun so much better?


With a few very notable exceptions, most of these people are not my peers, and it is highly unlikely that they will ever manage to become my peers. People here who are my peers are not idiots, so my original statement did not and does not apply to them.

The question in general is, what is one doing on a web site which is supposed to be the crème de la crème of Silicon Valley and having to ask why Sun Microsystems was so much better?

My peer would be driven by relentless pursuit of perfection, being the best they can be at whatever they attempt: that would include researching what came before, and why it was so important when so many people say that it was important.

A large portion of people working at Sun Microsystems consisted of highly and formally educated scientists and engineers, inventors in an environment which not only encouraged but inspired innovation. They had people who were capable of designing advanced hardware and people who were capable of designing advanced software, combining both into vertically integrated systems.

For example: did you know that Sun UltraSPARC hardware has a crossbar switch at the core of its architecture? PC bucket turd servers still don't have that, relying instead on North and South "bridges", which are not much more than memory controllers, and certainly nowhere near a crossbar switch in design or capability. And that is just one of many, many examples of what made Sun Microsystems so great.


I don't know for sure, but I believe what made Sun's stuff so great (i.e. doesn't really ever stop working) was very tight software and hardware integration.


I disagree. With web and distributed computing I could do anything I wanted by putting together “turds.”

I think Sun failed because they thought that expensive, complicated hardware and software was required for business. And it was - for old-school business. But new Internet businesses meant lots of small distributed tasks that didn’t require anything that Sun made and could be done on el cheapo hardware and software.

It’s like those old people who complained that mp3 wasn’t as good audio as whatever. They’re right but it doesn’t matter.

Obviously, Sun hardware was better but it was like 100x the cost for 3x improvement. So everyone did the math and skipped Sun. And people sound funny complaining about turds.


I could do anything I wanted by putting together “turds.”

You could, but it wouldn't be very good. Perhaps you noticed, but many others share your opinion and have in fact done that exact thing, and the results are not very good, which to me personally is completely logical and understandable.

The above is not obvious to you?

It’s like those old people who complained that mp3 wasn’t as good audio as whatever. They’re right but it doesn’t matter.

Thank you for providing a good example of what I wrote earlier: throwing pearls before swine who are not capable of understanding them well enough to appreciate them or pick them over "good enough". IT is a shitty industry, and your example illustrates, in part, quite well why that is so. If I live to see my retirement, it will be one of the happiest days of my life because I won't be putting up with that kind of shit any more.


I put "better" in quotes because Linux-on-x86 wasn't the technologically better choice in the early '00s; but it was the much cheaper yet viable option and had better ROI for many SMEs. Sun lingered in organizations that had bigger budgets and the expertise to appreciate it, but Linux had the momentum; most of the new server-side work was Linux-first.

It wasn't just Sun that suffered, but all of big iron.


NeXT was very expensive, and better, but not better enough to overcome the price.

Sun was the same. Expensive, better, but better enough?

Apple is much the same. VR might take them, and Google, down.


Google is also crippled by employees who demand the firm follow the latest social justice whim (which has prevented them from entering big markets in the past, like defense and China). Sun had employees actively trying to build things, even if they were the wrong things.


I’m glad to see that all the people who disagreed disagree with your request for an explanation, not with the content of the message :facepalm:

Anyway, I think it’s because comparing Sun to Google feels uncharitable (towards Sun).


Because thinking of Google in those terms shows genuine ignorance: Google is an advertising company where the privacy of the users is the product to sell. Google could not be more anathema to Sun Microsystems or Silicon Graphics.

No company has ever reached the standards set forth by Sun Microsystems or Silicon Graphics.

Whether 0xide Computer will, remains to be seen. In the absence of working there, one can only hope.


Apple?

Or do you argue Apple doesn't count because they're not in the business computing space, or that they're essentially a consumer products company that also makes personal computers?


Yes, Apple doesn't count because they are in the consumer space.


> Why not just comment your thoughts below instead?

Cowards downvote.


The world is full of cowards. I try to adopt coward-compensating strategies (while at the same time working on increasing bravery).


Please don't complain about downvotes. If people see a downvoted post with merit, it will gather extra upvotes. When I saw it, it was not downvoted, but I downvoted it because of the complaint.


> why the downvotes? I genuinely don’t understand. Why not just comment your thoughts below instead?

Ok, sure:

I downvoted because you’re complaining about downvotes. Have a nice day!


Computers as a whole seem so boring nowadays. I get that we all have supercomputers in our pockets, and it should be amazing, but it's just dull.


Dull?

You can hang the equivalent of an eighties micro on anything you please and record things nobody has ever seen before, download the data en masse to your pocket supercomputer, sync that to an AI and make accurate predictions that can be used to affect real world outcomes.

You can talk to anyone on the planet with minimal charges.

You can take infinite numbers of pictures of everything at insane resolutions; backyard astronomy is unbelievable compared to the Sun era.

Computing has never been so exciting. If you think you missed out because you weren't in Silicon Valley in the 1980s, consider the amount of cool work being done everywhere but the valley.

I've been doing this since the 80s and it's never been better than now.


Altogether far too much computing happens in far-off mainframes, aka the cloud. Far too little computing can genuinely be engaged with. The potential has been coopted & occluded & we're left chained to the cave again, staring at megacompanies' shadow puppets.

It's worth admitting the formative years are often a lot of fun. Some come-down is expected.

But I also believe collective discontent slowly grows in this deeply consumerized computing climate & eventually we'll push for new terrain, grow tired of this technostratification.


Maybe, maybe not. I can get VC funding and in two months' time have a compute cluster with thousands of nodes processing terabits per second of data. Then when my funding runs out I can just shut it all down and not worry about what to do with all that stuff.

Compare that to before the cloud, where you'd need to secure DC space and sign multiple contracts with many vendors. Wait many quarters for all that equipment to land. Hire a small army of consultants and experts to configure all this stuff. Then when the PM team says, you know what, what about that new feature... we say no, as we just made this huge capex purchase.

The Cloud has democratized supercomputing for the masses. Truly amazing.


If you want to make a 3d rendering beyond 1990s Pixar's wildest dreams, the Cloud is fantastic for it. If you want to run protein folding simulations, or atmospheric modeling, it's fantastic. It's also effective if you want an always-on server for video analysis of home security cameras. Even if that could be done locally instead. As a developer, the Cloud is great, because you can rent a server farm instead of buying it. As a user, the Cloud is horrendous, because you are forced to rent software instead of buying it.

The Cloud hasn't democratized supercomputing. It has just expanded to a slightly larger oligarchy.


The Cloud hasn't democratized supercomputing.

I, as a hobbyist anywhere in the world and with no company affiliation, can set up a 100-CPU compute cluster, run my own software on it and then shut it down again, all for less than the price of a movie ticket. How is that not democratisation?


VMs as a service are a good tradeoff, since you control most of the software you run (except the hypervisor) and can move it to your local devices easily, which usually also have good hypervisors.

Of course, with VMs you still have the problem of having to be a sysadmin and setup, configure and secure software. I guess that is why SaaSS is so popular.


I don't see this in my work. I've been hanging microcontrollers off sheep for the last 5 years. Sure, some of it is cloud based, but a lot of it is visceral, tough, hard-to-write code that is very satisfying.


Your work isn't cloud but it sounds equally industrial, equally as much a rebuff of personal computing. Your work is reserved for & explorable only by a very, very, very few.

Many, many people on this site have great & wonderful experiences with computing, are greatly empowered & do good things. But rarely will our efforts have a tangible effect on computing itself, affect the way anyone else computes. The general initiative of computing has somewhat stalled, become overexploited for narrow & controlled ends.


That's only an issue if you structure your computing activities around network services. The computer remains as a powerful tool to do whatever you want. Just because the bulk of the public wants to do things mediated by a network doesn't mean that you are limited to the same.


And still, printing a web page is a real challenge. And if you want to compile a program you find out that it needs 30 libraries. Some things are better. Most things are worse. I have a text editor on my Android phone. It cannot do search and replace.


Printing was solved a long time ago. Do you remember having to search for/install printer drivers?

Even after a few years, I'm still amazed that I can print from my phone to a wireless printer I have at home, and how easy it was to set up.


>Do you remember having to search for/install printer drivers?

..yes? I need to fight with weird driver installers every time I connect a new device to my Brother printer. (And then you set up the scanner and printing stops working, arghh...)


My mom’s printer is basically carefree.

$200 and she has a scanner/copier/printer, and I pay about $60 a year and she automatically gets paper and ink from HP. Most of what she does is from an iPad, which is setup-free. The laptop required some minor setup.


$60 a year is great. She doesn't print much.

My mother's HP stops printing completely if one color is down.

She only prints B/W.

(Personally I think printing companies shot themselves in the foot with chipped cartridges. Instead of printing out willy-nilly, we, most of America, only print out what's absolutely necessary. It's a business move on their part I will never understand. They tried to amend their blunder with Eco, but it's a bit too late. We got used to not printing. "Here's a picture of Billy Sue riding her brother's bike." Of course, we don't print it out like in the 2000s and hang it on the fridge.)


That's my experience. I have some cheap Samsung laser printer connected to my network, and a page comes out a few seconds later whenever I pick "print". It reminds me that it exists when I open the paper tray to steal a sheet of paper for notes and the rollers turn on when the tray is closed. Can't complain.


I don't have any problems printing web pages. Right click, print, done.

Non-trivial programs have always required a lot of libraries. At least now you don't have to wait 3 weeks for a tape to show up so you can continue working.

Vim for Android is a thing [0]

[0] https://play.google.com/store/apps/details?id=com.droidvim&h...



I print articles/pages to PDF all the time to read them on my e-ink. They come out readable only about 30% of the time, thanks to multiple layers of banners and popups on the average site these days. It's annoying that even (and especially!) Medium/Wordpress etc can't be bothered to exclude them from @media print. I usually end up editing the site via devtools before printing, but that's not always an option (eg. in third party mobile apps for pdf export).

Make your websites printable, darnit. It's very little effort to get 90% there.


Yes, please make your sites printable, with no popups or ads or other garbage!

Also, somebody make an extension so that every webpage renders with its print style sheet, so I don’t need to see the popups or ads or other garbage.


To me, one exciting motivation today in computing is rethinking and simplifying systems.

https://youtu.be/k0uE_chSnV8 https://youtu.be/kZRE7HIO3vk


Nice, Jonathan Blow. Although I may not understand the underlying philosophy behind his arguments, he's inspired me to challenge ideas that are sometimes taken for granted. I'd definitely be interested in working with operating systems or even systems theory, although I haven't put much thought into it.


You weren't paying attention from '91-'95 or so, then. You'd run into Sun, SGI, OS/2, Windows 3.1, Mac, NeXT, and DOS everywhere you looked, and might have heard about Linux. New Netscape versions were released nightly. There was no data mining, or ads.


Hacking on my Commodore 64 in the 80s was much more fun than the web app of the week I'm doing now.


Get into hardware. Pick up a microcontroller; it's all restricted memory, C or assembly, awful tools, big challenges.

It's just as satisfying as an 80s micro and a lot more useful.


Cheap microcontrollers are beefy these days. You can run Python on them: plug one into your workstation, open "code.py" on the drive, write your program, enjoy. Yes, there's a REPL too!


> Dull?

Yes, dull. Technology has been made into a walled-garden consumer product. Hosted on remote servers ("cloud"), tightly locked away, non-interoperable, and no serviceable parts inside, as the sticker says. Dull.


Not for me. Anyone can create the future outside of those walled gardens. The price of entry has never been lower.


Hm maybe you should try Linux.


Been using Linux since late 1993.

But I'm talking about the tech industry and products.


Maybe that's the problem right there.

EVERYBODY is using some Unix these days (Android, Linux, iOS, macOS), so it's not cool and nerdy anymore; it's your mom, uncle, grandparents... it's dull.


or maybe like Linux From Scratch, to spice things up :)


Or FreeBSD.


Computing could be more exciting than it's ever been - were it not for the walled gardens being built all around.


For me, I imagine part of the excitement back then was applying the art of computer science every day at your job. When things are taken care of for you, they become boring. Not because those things are inherently boring, but they're boring because you're no longer thinking about them. You can't get excited about things you aren't thinking about. Many jobs transitioned from surgeons with scalpels to supergluing prefab parts together. Which one sounds more exciting to you? I'd rather be a surgeon.


C'mon over to databases. Computer science is alive and well here.


Computing is less fun exactly because all this is commonplace.


Yes, but the question then becomes, is it more fun and exciting to use those things, or was it more fun and exciting to build the underlying technologies and products?


I agree with you fully. It's just that tech stopped giving the same dopamine boost as social platforms nowadays.


Around about as long. It's been reined in and put to work for real. Whips don't hurt much, for now.


It feels boring because, for the most part, we have solved the most common problems that we needed computers for. The idea of watching movies, ordering groceries/food, or talking to a friend or family member 8000 miles away in a second were all interesting and fancy, and now it is part of life. So we get bored. Today's personal computers pretty much do everything we need in terms of basic necessities. I was driving earlier today and realized the power of Maps on our phones. Imagine back in the day when you needed to print MapQuest directions (remember?) or use plain old maps to figure out how to get to the Interstate from this new place you just visited.

Things like VR (my opinion) have so far been disappointments for the most part, even though we keep trying to innovate. Also, remember 3D TVs?

I guess we need things like teleportation, real robots (not like the one in the Rocky IV movie) and some cooler stuff to make it interesting again.


VR is not a disappointment imo. A lot of people are having a lot of fun with VR chat. The hardware is just still very expensive and has the huge requirement of needing a 3x3m space to play in. The average person doesn't have an empty VR zone ready just to play specific types of games in. 3D TVs only failed because they never worked out how to do it properly without glasses. People will accept a clunky solution for the early adopter version but once it became clear there was no path forward, they just become discontinued.


I haven’t spent much time in VR Chat, but I have in Supernatural.

You’re absolutely correct about the space. I think this is a very under-appreciated concern for VR.

I believe people will want / need an entire new type of room. In a lot of housing, the dimensions won’t accommodate other uses.

A person won’t only need to be wealthy to buy, say, Apple’s $3000 headset; they will need to be even more so to have free use of a space that lets them enjoy the best experiences.


Which is why the existence of $300 standalone headsets is great for democratising access - and hopefully that price will come down over time.


I like the ring of that but housing with an extra room seems unlikely to become affordable, ever.


I'm biased because I've been working on AR/VR at Meta/Oculus for the last 4 years, but I'm curious what you've found disappointing and whether you've tried the latest VR software and hardware.

As far as gaming goes, Half-Life: Alyx and Resident Evil 4 are superb AAA experiences. As for hardware, Quest 2 and Valve Index are both great at what they promise for their respective price points.


I'm curious, in your field have you had a chance to try the Varjo Aero yet? It looks stunning.


I haven't tried the Aero but I tried another Varjo model a while back (not sure which), and the visual clarity was magnificent. I'm really glad there is room in the AR/VR space for devices at varying price points and levels of quality.


I remember taping printed instructions to my motorcycle tank to drive from NC to OH. It rained. That was good times.


VR is incredible, and it's only going to get better. Meta's stuff has improved by leaps and bounds in the past several years alone. See what happens in another 5, then another 5.


I think there was a lot more novelty and shock/awe in amazing new things when we didn't have YouTube channels and social media immediately poring over every detail of every new thing.

For instance, when id first released DOOM to the Internet and BBSes, it was something you discovered on your own and among your local community of people who had network access and a PC capable of running it; there weren't hundreds of YouTube channels posting full 1080p playthroughs of the entire game, etc.


Selective ignorance might be something worth cultivating.

When I was young (middle/high school), I devoured print strategy guides for video games and printed (hah) source code in library books. It was hard to come by, and just as nuanced/wrong as solutions/strategies/source you now find on the web/GitHub/etc.

In the last few years, I've stopped indulging in play-throughs, guides, or other experiences-on-rails. My sense of awe at discovering things has increased dramatically as a result. Not quite where it was thirty years ago, but getting there.


Exactly, the novelty of finding something out for yourself. And lots of times the anticipation of waiting for something was itself so much fun. As you mentioned, with everything being reviewed to the minutest detail, all of that novelty is gone now.


Software engineering is no longer considered engineering these days. People mash abstractions together without even trying to understand what's beneath those abstractions. Knowing what you're actually doing no longer seems to be a big deal in software development.


Engineering is trying to understand things and learn from mistakes. SW engineering is like financial engineering: trying to get profits fast.


Anecdotally that tracks, because all of my (very good!) CS professors in undergrad were retired engineering-engineers, with only a single retired software engineer that I can recall (who taught the course focused on real-world SWE/project management).


This is why I've given up on working anywhere where venture capital is even remotely involved. I want to make the world a better place and empower people, not make rich people richer while they do nothing of value.


Ha, you sound as jaded as I do. I've come to learn that we will work with Google-search programmers as long as businesses fool themselves that outsourcing saves money.


> Computers as a whole seem so boring nowadays.

For most of us everything becomes less exciting as we get older compared to our youth. Our minds are less responsive to novelty and there’s less novelty to be had. Kids still get enthralled by computers these days.


Sometimes I think this too.

I think part of the joy (for me) used to be in the deep understanding of the hardware and making something out of what felt like nothing.

If I’m doing something with a load of libraries and feel like a plumber more than a programmer then I find that excitement hard to come by.

Playing with modern-day microcontrollers, SoC stuff, or building software with hard real-time requirements is still a lot of fun - I just don’t get to do it so much these days.


Computers are boring, but systems-on-chips are kind of interesting again now that you need all this compute for AI.

People have made huge mistakes waiting for workloads that have very predictable computation; now with AI we have a need for coprocessors again.


I'll share a controversial opinion: I think computers were always boring, even if you worked for Sun Microsystems, yes even in 1981. In 1981, the Intel 8086 had been on the market for 3 years, and the 68000 had been on the market for 2 years. There were probably a hundred startup companies trying to make a workstation to compete with Sun. If you went to work for one of those companies, you crashed and burned or were acquired, and got to watch the project you worked so hard on become abandoned.


One of the things that drew me to Urbit is that it feels like when I first started playing with Linux. It's fun, but in a way that tackles some real underlying issues with the modern web/computing stack.

The ideas are interesting and whatever happens it's definitely not boring imo: http://moronlab.blogspot.com/2010/01/urbit-functional-progra...


It is the paradox of choice, or abundance. I remember that in the early 90s getting a bootleg copy of Turbo C for my PC was something that I eagerly anticipated for the weeks it took me to finally get the disks. Heck, I was excited to install Windows 3.1. To be frank, even using debug.com to execute some small snippets of assembly code was terribly exciting, because there was not much else to do.


Right, because microcomputers have had decades of investment and development, when in 1982 they were like 15 years old (microprocessors anyway (HN, correct me)).

Now the excitement is probably in something that new or newer: drones, recent ML, AR/VR/XR, wearables, or something that I don't even know about yet. It's gotta be something where most people think it won't even work.


15 years in 1982 is casting a pretty wide net for actual microcomputers. The Altair 8800 kit debuted in Popular Electronics in 1975 and the Apple II came out a couple of years later. But I was an engineering undergrad in the late 70s and I knew no one who owned a personal computer. They didn't even become geek things beyond a very limited number of people until the early 80s.


Computers have not become boring. You have become old. Kids these days are having more fun than ever with today's computers.


I think once the VR goggles get more portable, hopefully the size of contact lenses one day, it will make things super exciting once again. I just hope it happens in the next 20 years.


Let me guess, computers were a lot more interesting and amazing when you were a teenager and into your early 20s?


It’s not dull really but I find it laborious and sometimes stupid. The sense of achievement has been sucked away by deep rabbit holes of research required to decipher even the simplest tasks. The retention and value of knowledge has declined due to the transient nature of everything too. Every day is a catching up session for something which is either horrible or will be replaced before you’ve finished the job.

I reserve the label stupid specifically for templated YAML which makes me want to strangle people fairly quickly.



The author has had some great experiences. I was struck by this post a few down from the Sun one:

>"Sometime in late '80 I moved over to Amdahl's architecture group to work on data communications - X.25, SNA, etc. But that work wasn't too satisfying."

I believe X.25 would have been pretty cutting edge in 1980. A bit of trivia regarding this protocol: AOL's network was built on X.25, and many ATM machines still use X.25. If you look in the back of any of those standalone crappy off-brand ATM machines you will see they are connected to an RJ-11 phone jack. Minitel in France used X.25 as well. It's almost hard to get your head around those packet-switched networks today with our ubiquitous Ethernet. If you're curious:

https://www.historyofdomains.com/x25/


There are a few things left over from those networking protocols that are still common:

- X.509 for certificates

- parts of LDAP

- the 7-layer networking model, which at least made sense with the protocols it was modelling before it was jammed onto TCP/IP networking.

I think we should be thankful the list is relatively short.


That whole network stack was based on a simple premise from the telco world ... that you should charge by the byte for a virtual end-to-end connection (just like we used to charge for a phone call - N bytes and M bytes/min).

IP sort of busted that whole model - the idea that you could (should!) just drop packets when things get tough ran against it - but that made TCP/IP easier to implement, perform better, and, most importantly, move faster and be adopted as the standard we use today.


Forward Error Correction and ASN.1 as well. I think it's kind of impressive, though, that those things survive more than 40 years later.


That’s some great stuff! Thanks for sharing. One thing that really stuck out was that the startup boom/bust cycle was already in full swing by the early '80s.


As robotresearcher implied, the startup boom/bust cycle has existed as long as capitalism has.

In the past 200 years:

* Railroads, 1830-1850

* Telephones, 1880-1920

* Automobiles, 1890-1930

* Radio, 1920-1950

* Aircraft, 1910-1950

* Airlines, 1930-1985

* Television, 1945-1960

* Computers, 1950-1990

* Internet, 1995-present

The computer cycle can be subdivided into

* Mainframes (1945-1965)

* Minicomputers (1965-1980)

* Microcomputers (1975-1990)

And/or

* Vacuum tubes (1945-1955)

* Transistors (1955-1975)

* Microprocessor (1975-1990)


Slight correction:

* Internet, 1995-2010

* Mobile, 2010-present


You are correct; thank you for the addendum.

Speaking of mobile, I could also have listed

* Networks (1990-2010)

Encompassing handset and equipment makers, cellular/cable/broadband networks, and ancillary entities like towers. Obviously there is overlap with Internet, but I think the two categories are separate enough, akin to the "Tech", "Media", and "Telecom" buckets of "TMT".

The significance of the Mobile startup cycle starting exactly when the Internet and Networks cycles ended is left as an exercise for the reader.

Two more cycles I omitted are

* Newspapers (1850-1900)

* Film (1920-1950)


Of which century?


One other thing. 20-something years ago I kept getting in trouble at work, because every night I'd take my G3 laptop home and get a dual-boot Linux/OS8 thing going.

Then 10 years ago I'd root all my Android phones and run spectacular roms.

Now I'm paid to keep a Linux network running, and I use my phone as-is.


I used Unix for the first time in September 1984 on a Sun 120 workstation. It had a huge monochrome monitor and an optical mouse. I was so entranced by this amazing machine that I spent nights and weekends learning C and trying to program it. Was a wonderful experience. Nice memories!


Pre-Solaris? Amazing. Did a lot with SunOS, on Sun-1 and Sun-2. SunView was OK, coded some nice things in it, helped PhD art students at the Slade.

Post-Solaris? If I wanted System V I would have asked you for it.

The 68000 was an amazing chip. SPARC was good too, but what really kicked Sun off was the 68xxx series, and BSD.


If this doesn't convince you to join a startup, nothing will.


TIL of Sun co-founder Andreas Maria Maximilian Freiherr von Mauchenheim genannt Bechtolsheim's most awesome full name!



