Dennis Ritchie Home Page (2006) (bell-labs.com)
223 points by mehdix on Jan 30, 2022 | 80 comments



I worked at Bell Labs from 2012 to 2018; unfortunately, by then it was a shadow of its former self. The MBAs had taken over.

While there, Weldon tried to pivot the place from being 'like Twitter' to an incubator to being like a startup. He got an Apple Watch, so we were going to be a wearables lab. He threw together a book on the supposed future of networks and conspired to get it to #1 on the NYT list by forcing us all to buy it.

There were endless reorganisations. Disastrous leads were parachuted in to wreck groups. Weldon played favourites to an alarming degree, only to turn on them when they didn't deliver on the aforementioned vapid promises.

We had endless managers holding endless meetings telling the smart people what they should be doing and not listening to what the smart people wanted to do or could do. All-hands meetings told us how great everything was while they were letting a third of us go.

They engaged in vanity projects like putting a 4G network on the moon, got us to build endless fake demos for technologies that didn't exist, and then acted surprised when told said technology didn't exist.

The funny thing was that the old timers over in Murray Hill just ignored all of this and continued to work away on their research, untouchable, like some prize zoo exhibit.

It was a place full of the most wonderfully intelligent people, managed by fools who wasted fortunes on flashy demos rather than let the smart people take the time to build something.


Makes me think of Buffett’s saying that goes something like: “I invest in companies that could be successfully run by monkeys because eventually they will be.”

I don’t know how an organization avoids this fate, but it does seem to eventually come for all enterprises.

IBM, Bell Labs, GE; who else should be on that list?


From its history since the McDonnell Douglas merger, you could argue Boeing should be on this list.

It's the inevitable result of prioritizing profit rather than prioritizing the creation of quality products that the people working at a company can be proud of.

That's why you see companies fall apart when the founders leave. They're replaced by people who prioritize profit for the company and wealth for themselves rather than product quality so the company ends up rudderless.


>That's why you see companies fall apart when the founders leave. They're replaced by people who prioritize profit for the company and wealth for themselves rather than product quality so the company ends up rudderless.

Which ironically fucks over profit anyways, just about every time. I think it's more of a personal greed thing combined with incompetence, on top of perverse incentives that come with being a publicly traded company.

The quarterly profit model all but ensures a societal-scale myopia in the end.


Unless the founder of the company builds a culture that is designed to survive without them at the center, the company will eventually fail without them there.

One of my former employers was like this - the founder was a brilliant engineer, who installed good financial controls, and was himself decent at the business end.

But he was a micromanager, and he never sought to create replacements for himself inside the company; he also tended to penalize people who stepped out of line. So once he was out of the picture (he sold out), we didn't really have the right person to run the company, or the right culture to just install a generic manager with industry expertise. It was instead a company that had been formed around the founder's personal needs.

That company was purchased by a multinational and is now being systematically dismantled.


Google has been on this list for the past decade, as demonstrated by the Google+ fiasco of 2011. Same for Microsoft in the post-Gates era and Apple in the post-Jobs era. As for Facebook, the vapid metaverse hype shows that it has well and truly jumped the shark.


I would argue that Apple is doing fairly well in the post-Jobs era, probably much better than anyone expected. Microsoft seems to be making some steps forward here and there, improving its reputation among developers and end users (although they still make their fair share of missteps).


Microsoft will always be a pathetic company


Apple has done some pretty ugly stuff related to the App Store monopoly and fighting the right to repair, but having monopolistic tendencies has always been baked into a company that wants you to run both its hardware and software, and that is not the same thing as being MBAized (i.e. greed/evil and technical prowess can be orthogonal). Their recent processor success proves they are not another IBM.


> IBM, Bell Labs, GE; who else should be on that list?

Intel felt quite a bit like IBM during my time there circa 2010.

I wasn't familiar with this quote, but wow, was that my experience at large tech orgs. I wish I had known this about 15 years ago, as perhaps I would have done a better job picking orgs to work at early in my career instead of being so frustrated.


Kodak - can you believe they used to be a defense contractor (cameras or lenses for spy satellites, among other things)?


AFAIK Kodak's strength was film, not cameras or lenses.


They also made cameras. Ever hear of the wildly popular Brownie? Anyway, their film expertise does not take away from their involvement in cameras for spy satellites in the 1960s.


please define "strength" (?)


expertise


The Peter principle.


Boeing


This sounds disturbingly similar to my time at T-Systems as a trainee.

Brilliant tech people getting managed to death by clueless MBAs. It was a great place to learn, though.


> conspired to get it to the NYT #1 by forcing us all to buy [his book]

How many people worked at Bell Labs at the time? It looks to be a little over 600 now; in relation to reaching #1 on the NYT (best-seller list?), having fewer than 1,000 employees all buy a copy seems to be a relatively ineffective strategy.


If it was a hardback then that could make a big difference to sales.


There were considerably more people when you account for the various European sites and other staff.


Bell Labs was the original mafia though. Rob Pike, Dennis Ritchie, Ken Thompson...


I'm assuming Bell Labs no longer has the financial backing of a monopoly like AT&T, as it had 40 years ago?


It was cleaved off AT&T as part of Lucent and then merged with Alcatel. Part of the problem there was that it was never clear if Lucent merged with Alcatel or if Alcatel acquired Lucent. The whole company was pretty dysfunctional but nothing too unusual there.

Nokia came along, flush with MS money from the sale of their handset business, to buy Alcatel-Lucent and got Bell Labs for free. Neither company was doing particularly well flogging network gear, and together they didn't do much better.


I suppose even if they had the funding, the company itself is in its value-extraction phase.


They are now owned by Nokia.


> and getting us to build endless fake demos for technologies that didn’t exist

But of course researchers are to blame too, if they lend themselves to doing stuff like this.


True, it was all some people did… and some continue to do after our site was shut down. They know nothing else.


Dennis was 30 when he published the first Unix software manual, according to his homepage: https://www.bell-labs.com/usr/dmr/www/1stEdman.html

It’s interesting to wonder which 30-year-old’s project today might have as much impact.

You may think it’s impossible, but 40 years is a long time. People always underestimate the impact of decades, and overestimate the impact of years.


> People always underestimate the impact of decades, and overestimate the impact of years.

Sometimes cited as Gates' Law, and also attributed to Arthur C. Clarke, Tony Robbins, or Peter Drucker. But they may have all gotten the idea from Roy Amara.

https://fs.blog/gates-law/


Thank you! I’d always wondered if it was a Gates original, and what the history was.


It reminds me of the famous Steve Jobs quote, "Good artists borrow, great artists steal."

Apparently Steve stole that quote from Pablo Picasso, who borrowed it from Igor Stravinsky, who lifted it from T.S. Eliot.

https://www.uvu.edu/arts/applause/posts/stealing.html


TIL, he never received his PhD because he didn't provide a bound copy of his thesis to the Harvard library. https://computerhistory.org/blog/discovering-dennis-ritchies...


I worked for Lucent Technologies in Moscow, Russia, from 2000 to 2004. I remember my feelings when I looked up Dennis Ritchie in PeopleSoft, the corporate directory - hey, I work at the same company as the man who invented Unix and C! Lucent was a great place to work, even in Russia. :)


I was working on an annotated (unofficial) edition of K&R updated to the latest C standards, with commentary like the Lions book on UNIX, completely typeset in LaTeX. Sadly, I don't think it will ever see the light of day due to copyright.

I had some correspondence with DMR in my early college days. It would have been an ideal tribute.


Maybe reach out to Kernighan?


Our initial request was forwarded to Pearson and their lawyer responded with a heavy-handed threat.


If you’re interested, I’d try to get ahold of an executive at Pearson. Some big corp lawyer has 0 concept of the significance of your work and doesn’t have the authority to green light anything anyway.


I was in 6th grade when I found "The C Programming Language" on the floor in my friend's house. I picked it up, took it home, and read it cover to cover. I didn't have a computer then, but I was absolutely sure about what I wanted to do in my life.

Thank you, Mr. Ritchie and Mr. Kernighan for opening the world of computer science for me.


I was lent a copy of K&R by an English teacher¹ in my high school (this was 1984ish). I still remember the smell of coffee and nicotine that was imbued in its pages and any time I deal with C code, the sense memory comes back to me.

For a while, under the influence of K&R and The TeXbook, I contemplated going to Stanford to study computer science and then working at Bell Labs. I did neither.

⸻⸻⸻

1. About ten years ago, I decided to try to reach out to him to thank him and comment on how our paths were roughly the inverse of each other (he had a degree in computer science but ended up teaching high school English; I had a degree in English and ended up programming computers), and I discovered that he had died a few months before. Whenever possible, get in touch with those who influenced you early on, even if just to say hi and thanks.


I ordered it via inter-library loan in 1991 to rural Oregon. I had recently learned 6502 assembly language, so pointers seemed "obvious". A few years later, in CS101, I had such an instinctive feel for them that I could hardly explain them to my fellow students.

Thank you K&R.


> so pointers seemed "obvious"

Indeed - having come from the low-level route, I never really understood why people get so confused by them.
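
For anyone wondering what makes pointers "obvious" after assembly, here is a minimal C sketch of the idea (variable names purely illustrative): a pointer is just an address you store and then read or write through, much like indirect addressing on a 6502.

    #include <stdio.h>

    int main(void) {
        int x = 42;
        int *p = &x;   /* p holds the address of x, like a zero-page pointer */
        *p = 7;        /* write through the address: an "indirect" store     */
        printf("x = %d, &x = %p\n", x, (void *)p);
        return 0;
    }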


I can remember writing a large Pascal program in the 80s and really wishing I had function pointers available so I could pass in a reference to a function. I look back on that as an autodidact programmer and realize that I had some vague instinctual notion of stuff that would become commonplace as OO and functional paradigms took over.
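
For readers who never hit this wall, here is a minimal C sketch of the feature being wished for: passing a function by pointer so another routine can call it. The names square and apply are hypothetical, just for illustration.

    #include <stdio.h>

    static int square(int x) { return x * x; }

    /* apply f to each element of a; f is passed as a function pointer */
    static void apply(int *a, int n, int (*f)(int)) {
        for (int i = 0; i < n; i++)
            a[i] = f(a[i]);
    }

    int main(void) {
        int a[] = {1, 2, 3, 4};
        apply(a, 4, square);   /* the "reference to a function" in question */
        for (int i = 0; i < 4; i++)
            printf("%d ", a[i]);
        putchar('\n');
        return 0;
    }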


Surprising -- I picked up the same book and also read it cover to cover, and wondered over and over what kind of thinking leads to the small assembly-ish idioms and quirky character-IO definitions. "Structured Programming" was obvious to me, and using that design to build non-trivial programs was very compelling, but the constant emphasis on small, tricky ways to move around a character seemed driven by some intense factory-of-machine-parts thinking, not clean abstractions or consistent naming or human-readable coding. I immediately wanted to try this "big phone network" core OS language on my portable home computer, with apparently one one-hundred-thousandth of the capacity. Other home computer companies were publishing C compilers rapidly with lots of feature tradeoffs, so there was no question that C was the thing to use for me. Not good design at all though -- driven totally by machine requirements.
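
For context, the sort of small character-shuffling idiom being described is the classic K&R copy loop, where the assignment happens inside the loop condition and c is declared int so EOF can be distinguished from any character:

    #include <stdio.h>

    /* The K&R file-copy idiom: read characters until EOF, echoing each one.
       c must be int, not char, so the EOF value is representable. */
    int main(void) {
        int c;
        while ((c = getchar()) != EOF)
            putchar(c);
        return 0;
    }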


I recently worked to update a Linux-based system that was originally built by a team that had previously implemented the same product on a microcontroller-based system. The Linux drivers are obviously direct ports of the old subsystems, without any apparent effort to understand or leverage existing kernel drivers or subsystems that could have simplified (or outright replaced) their custom functionality. It is unholy.

Now, this might sound absurd by the standards of today (because it is), but this was the transition that every programmer had to make back when high level languages were introduced. It takes time to adapt to a paradigm shift, so it hardly seems surprising when vestiges of the “old ways” can be seen peeking through the curtains of the new abstraction.


The ease with which I was able to read this page has me wondering how much of the trouble I sometimes have with focus and attention has to do with modern web design, where pages are littered with elements unrelated to the text (not to mention ads). It's hard to beat the readability of black text on a white background with a few <p>'s.


A titan among humans and co-creator of C and UNIX; a statue should be erected to this man.


Theoretically, how would one build a statue of Dennis?

Yes, I’m serious. If it costs less than $10k, I’d love to make a statue of Dennis and put it somewhere in my house. Partly to show off my excellent sensibilities in artistic taste on my otherwise barren walls.

But mostly I just realized I have no idea how statues are made circa 2022, and it sounds fun{,ny}.


The art of marble or bronze sculpting is still around, although much rarer than it used to be. It's a trade like any other, requiring years of study to achieve a high level of competence (notwithstanding the trend of throwing a bunch of random metal pieces together and calling it art - true art requires creativity and skill; one or the other does not suffice).

Many university art programs have a sculpture department. Student and faculty artists will make works on commission, although it's hard to find good figurative art among the sea of abstract political B.S. If you're really serious, Italy is the place for the best artisans, as it has been since the Renaissance[1].

There are still some old-school sculptors around [2] in America who take the craft seriously.

[1] http://www.spartacopalla-scultore.it/english.html
[2] https://corneliussullivan.com/


Find a local sculptor you like and commission a work! Your budget won't get you somebody famous, but it's enough.

Maybe consider a bust rather than a life size full body sculpture. I think that a bust of Dennis Ritchie would be a pretty awesome quirk in somebody's house. Engineering heroes aren't normally celebrated like that.


Check out Veijo Rönkkönen. A modern sculptor.

https://www.atlasobscura.com/places/veijo-roenkkoenen-sculpt...


This artist died in 2010, so he won't be available for commissions XD. However, it's a cool aesthetic and it's worth checking out the article just to see it.

Practically speaking, if you actually want to find a sculptor to commission a work, search for a local sculptor's guild and check out local galleries or exhibitions. Many works at exhibitions will be available for sale and will have prices listed, which will give you some idea about cost.

Plus, visiting local galleries if you haven't done it before is a fun adventure!


I mean a statue-homage in his city or town. I'm not even American but would donate to this cause.


Certainly. I would too. But wouldn’t it be cool to go down to your laundry room or wherever and see a bigass Dennis Ritchie statue?

I suppose I could put it in the yard, facing a neighbor’s window. Then we’d be able to dress Dennis for Halloween and Christmas too.

But for real, is it completely impractical to want your own statue of someone? Rich people do it, and it’s been a few centuries, so I bet technology has worked its usual magic on the price...

EDIT: in Germany you can get a 3D-printed 10-inch figurine for $400: https://doob3d.com/ so this is possible in principle.

Other refs: https://www.quora.com/How-much-does-it-cost-to-make-a-life-s...


Requiring something weatherproof which can be displayed outside and not degrade quickly when exposed to the elements will dramatically increase the cost. Can you make your artistic point with something which can only be displayed inside?


How about designing a DR balloon instead? Mass production would spread the costs.


It's a shame that the Americans don't have a culture of putting luminaries (other than presidents obviously) on their bank notes; it's a wonderful and far-reaching way of celebrating a person's contribution to the culture. Even so, Ritchie might be considered a little niche for such an accolade, but it's a nice thought experiment nevertheless.


His recognition level is niche but his contributions aren't. His language (C) and his OS, both literally (UNIX->BSD->macOS/iOS) and in design (Linux/Android), power a huge part of civilization.


I think the best monument to DR is the software billions of us use every day. Best thing we can do is make more people aware of his work.


When I was 14, I emailed him to thank him for his work. Years later he humbly responded and said he was surprised that C is still around.


Did he reply to that email years later, or were these two separate threads?


He replied to that original email years later


Somewhere on the Internet exists a Plan 9 press release with my name on it. When I had a chance to corner a few of Dennis's colleagues at Mobile World Congress (they were, at that time, part of Alcatel-Lucent), I asked them for their memories of working alongside him.

Please excuse the audio, I was working with a FlipCam and just focused on capturing what I could for the history books.

https://www.youtube.com/watch?v=NE4ZRPwbNhA&t=141s


I sometimes think modern culture has lost a grip on the past. Much has gone before us and much will go after us.

I find reading the Unix Manual comforting as it reminds me where we have come from. It was written a handful of weeks after I was born, and I am still using its commands fifty years later.


Even though I never met or knew Ritchie, I still feel bad when I read about his death. When I was a teen I would go down rabbit holes reading about C and Unix, and would read about all the design decisions he made and the rationales for them. A true loss for the programming world.

The concepts of minimalism and modularity are being thrown away, and it shows in the performance and stability of new software. It's a shame we have to learn the same lessons over and over.


>Even though I never met or knew Ritchie, I still feel bad when I read about his death. When I was a teen I would go down rabbit holes reading about C and Unix, and would read about all the design decisions he made and the rationales for them. A true loss for the programming world.

Sounds a lot like me; even now I still read docs related to old-school UNIX and C.

>The concepts of minimalism and modularity are being thrown away, and it shows in the performance and stability of new software. It's a shame we have to learn the same lessons over and over.

Why, though? Is coding huge monolithic software easier than creating a set of modular and simple tools?


Yes, creating a monolith is usually easier and faster. It is also the wrong thing to do, more often than not. As they grow and mature, properly designed modular systems can be easier to debug, maintain, test, deploy, and document.


On the other hand, hardware has continued to expand to support the bloat in software. I do agree computer software design could be better, thanks Electron. My most used piece of software is written in Java. I also spend a lot of time in Electron. My terminal is written in D (Tilix). I guess my point is, it could be better, but there is no incentive to make it leaner so no one will try. Nothing is really stopping anyone from running old school Linux software though. I know guys who run FVWM and really minimalist configurations of their Linux systems.


That sounds interesting. Could you link some sources where I can read more about Ritchie's design decision for C and Unix?


From Wikipedia: "News of Ritchie's death was largely overshadowed by the media coverage of the death of Apple co-founder Steve Jobs, which occurred the week before."

NOW I'm sad.

RIP Ritchie, your place in computing heaven is secured.


HN has a weak spot for marketroids like Jobs or Musk. It is sad that the people being praised are not the ones doing the work, just the ones presenting it. For me, Dennis Ritchie, Ken Thompson, Brian Kernighan, and Alfred Aho are real people who built something. The others (Musk, Jobs) are only opportunists with a big mouth.


Yeah, it really frustrates me that this site's startup culture worships people who are veritably garbage individuals as opposed to recognizing the people who, you know, actually built the stuff in the first place. I struggle imagining a world where so-called hackers respect Jobs more than Wozniak or Dennis Ritchie, but here we are...


HN had the black bar at the top when Dennis Ritchie died, and a thread about it.


The first book I bought for programming was C Programming Language. Freshman year, 2004, for EECS 10 at UCI. To my delight, it remains part of the curriculum of the course, 17 years later with the same instructor too:

https://newport.eecs.uci.edu/~doemer/f19_eecs10/syllabus.htm...


There are many Bell Labs these days: https://www.bell-labs.com/about/locations/


None will ever match the legendary group coming out of Murray Hill in the 60s and 70s.


Lots of HTML Energy here!


What’s not to love?! Loads fast. No pop-ups and scripts to get in the way. No Reader mode needed.


R.I.P Legend


Their works still live among us; if anyone looked closely enough, they would still see their names all over them.




