
Of course, engineering the Bell System led to a few innovations and societal changes, too.

https://en.wikipedia.org/wiki/Claude_Shannon




Can you be more specific? I’m familiar with Shannon’s work, obviously, but what societal changes happened as a result of engineering the Bell system specifically?


Shannon's great contribution to the Bell System was that he figured out how to reduce the number of relays in a fully-connected toll office from O(N^2) to O(N log N).[1] After that, they let him work on whatever he wanted.

[1] https://archive.org/details/bstj29-3-343
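
Not from the paper, but to make the asymptotics concrete, here's a quick back-of-the-envelope sketch (the constants in Shannon's actual construction differ; this just shows how fast N^2 pulls away from N log N; needs -lm):

    #include <math.h>
    #include <stdio.h>

    /* Illustrative only: compare the growth of N^2 crosspoints (a full
       crossbar) with N*log2(N) (a multi-stage design). The point is the
       ratio, not the exact relay counts from Shannon's paper. */
    int main(void) {
        for (int n = 100; n <= 100000; n *= 10) {
            double full   = (double)n * (double)n;
            double staged = (double)n * log2((double)n);
            printf("N=%6d  N^2=%.0f  N*log2(N)=%.0f  ratio=%.0fx\n",
                   n, full, staged, full / staged);
        }
        return 0;
    }

By N = 100,000 lines the full-crossbar count is already thousands of times larger, which is the whole ballgame for a toll office.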


> great contribution

One of his great contributions, I would argue; information theory was another, along with his work on secure telecommunications.

Early work on switching networks (MS Thesis):

https://dspace.mit.edu/handle/1721.1/11173

Seminal work on information theory:

https://en.wikipedia.org/wiki/A_Mathematical_Theory_of_Commu...

> they let him work on whatever he wanted

UNIX was written by some guys in the same organization. I wonder if one of them thought, "Oh sure, Shannon gets to work on what he wants, why can't we work on the future of a global inter-net? Why do we have to hide it as a text processing system?"

My management here, apparently, is a crowd-sourced mob trying to silence me by clicktivism. Shannon and K&R had it easy, IMO.


Unix's involvement with the development of the Internet was mainly through BSD, which was a UC Berkeley joint, not Bell Labs.


Actually, no. The UC Berkeley TCP/IP implementation was not the first. It was more like the fifth. But it was the first for UNIX that was given away to universities for free. Here's the pricing on a pre-BSD implementation of TCP/IP called UNET.[1] $7,300 for the first CPU, and $4,300 for each additional CPU. We had this running at the aerospace company on pure Bell Labs UNIX V7 years before BSD.

Much of what happened in the early days of UNIX was driven by licensing cost. That's a long story well documented elsewhere. Licensing cost is why Linux exists.

[1] https://archive.org/details/bitsavers_3Com3ComUN_1019199/pag...


But that doesn't refute the parent's point, does it? (If it has been edited since you wrote that, the version I see is "Unix's involvement with the development of the Internet was mainly through BSD, which was a UC Berkeley joint, not Bell Labs.")

They were responding to the statement:

> "why can't we [Kernighan, Ritchie, Thompson, other folks at Bell Labs] work on the the future of a global inter-net? Why do we have to hide it [Unix] as a text processing system?"

Whether or not the BSD TCP/IP implementation was the first or most influential, the point is that it wasn't the Bell Labs Unix folks driving Unix networking forward. UNET was from 3Com.


The Bell Labs people had their own approach: Datakit.[1] It was a circuit-switched network as seen by the user, but a packet switch inside the central-office switches. Bell Labs used it internally, and it was deployed in the 1980s.

Pure IP on the backbone was highly controversial at the time. The only reason pure IP works is cheap backbone bandwidth. We still don't have a good solution to datagram congestion in the middle of the network. Cheap backbone bandwidth didn't appear until the 1990s, as fiber brought long-distance data transfer costs way, way down. There was a real question in the 1980s and 1990s over whether IP would scale. A circuit-switched network, with phone numbers and phone bills, was something AT&T understood. Hence Datakit.

[1] https://dl.acm.org/doi/pdf/10.1145/1013879.802670


Interesting. I want to point out, though, that the document you link to is dated 1980, which is late in the development of the internet (ARPAnet): by then the network was 11 years old, and research on packet switching had been going on for 20 years. That is one reason I find it hard to believe that the Labs (or anyone at AT&T) contributed much to the development of the internet, as the great-grandparent implies when he imagines the Unix guys saying, "why can't we work on the future of a global inter-net? Why do we have to hide it as a text processing system?"

Yes, the early internet (ARPAnet) ran over lines leased from AT&T, but I have heard (though I have not been able to confirm it in written sources) that AT&T was required by judicial decree (at the end of an anti-trust case) to lease dedicated lines to anyone willing to pay, and that if AT&T hadn't been bound by this decree, they would probably have refused to cooperate with this ARPAnet thing.

I concede that after 1980 Unix was centrally instrumental to the growth of the internet/ARPAnet, but that was (again) not out of any deliberate policy by AT&T; it was (again) the result of a judicial decree. That decree forbade AT&T from entering the computer market (and, in exchange, IBM was forbidden from entering the telecommunications market), so when Bell Labs created Unix (in 1970), they gave it away to universities and research labs because it was not legally possible to sell it.

In 1980 (according to you, and I have no reason to doubt you) AT&T no longer felt bound by that particular decree, but by then Berkeley was giving away its version of Unix, or at least Berkeley had an old version of Unix from AT&T that came with the right to redistribute it and would soon start doing exactly that. Berkeley's "fork" of Unix is the one responsible for the great growth of the internet during the 1980s.

Specifically, even if an organization wanted Unix workstations for reasons other than their networking abilities, the ability to communicate over the internet came with the workstation for free, because most or all of the workstation OSes (certainly SunOS) were derived from Berkeley's open-source version of Unix (although of course they didn't call it "open-source" back then).


Unix gained a larger presence on the Internet because the DoD paid UC Berkeley to port the "DoD Internet" protocols (aka TCP/IP) to Unix after Digital announced the cancellation of the PDP-10 line.

Meanwhile, everyone and their pet dog Woofy was exploiting Unix's recent explosion in portability, and with it the source-level portability of applications, by using Unix as the OS for their products, because it made it easier to acquire applications for their platform.

With some Unix vendors (among others Sun, which arose from a project to build a "cheap Xerox Alto") providing Ethernet networking and quickly adopting the BSD sockets stack, TCP/IP on Unix exploded, though it still took years to become dominant.


Actually, the early ARPAnet was mostly DEC PDP-10 machines.[1] MIT, CMU, and Stanford were all on PDP-10 class machines. Xerox PARC built their own PDP-10 clone so they could get on. To actually be on the ARPANET, you had to have a computer with a special interface which talked to a nearby IMP node. Those were custom hardware for each brand of computer, and there were only a few types.

The long lines of the original ARPAnet were leased by the Defense Communications Agency on behalf of DARPA. ARPAnet was entirely owned by the Department of Defense, which had no problem renting a few more lines from AT&T.

AT&T was willing to lease point to point data lines to anybody. They were not cheap. I had to arrange for one from Palo Alto CA to Dearborn MI in the early 1980s, and it was very expensive.

Here's a good historical overview.[2]

[1] https://www.visualcapitalist.com/wp-content/uploads/2019/03/...

[2] https://www.researchgate.net/publication/329976721_ARPANET_a...


> The resulting units may be called binary digits, or more shortly, bits.

It's interesting to read this early use of “bit”, before the term became commonplace. The first publication to use “bit”, also by Shannon, was only a year prior.[0]

[0]: https://en.wikipedia.org/wiki/Bit#History


> At the close of the war, he [Shannon] prepared a classified memorandum for Bell Telephone Labs entitled "A Mathematical Theory of Cryptography", dated September 1945. A declassified version of this paper was published in 1949 as "Communication Theory of Secrecy Systems" in the Bell System Technical Journal.

https://en.wikipedia.org/wiki/Communication_Theory_of_Secrec...

> what societal changes happened as a result of engineering the Bell system specifically

I don't have that much time, but in general think about how I am even capable of communicating with you at all. Start with the "https://" at the beginning of most modern URLs.

UNIX, transistors, foundational information theory, "on and on till the break of dawn." If you want to become more familiar with Shannon's work and Bell systems, separately and together, try his master's thesis, followed by his Ph.D., ...

> obviously

I thought my original comment was obvious. At least we both seem to be familiar with the principles of:

https://en.wikipedia.org/wiki/Paul_Graham_(programmer)#/medi...

Thank you for helping me clarify my thoughts, and have a nice day.


Thank you! That was a super-helpful response. I wasn't asking because I disagreed, but because I didn't fully understand what you were getting at. Now I do. Thanks!


I wasn't trying to be snarky. Here is the seven-volume "A History of Engineering and Science in the Bell System":

https://searchworks.stanford.edu/view/1460436

The first volume (1875-1925) is over 1,000 pages. I'm telling you, Bell was an important organization with respect to modern society.


Volume 1 of this book can be read at archive.org:

https://archive.org/details/bellsystem_HistoryOfEngineeringA...

Another interesting book is "Engineering and operations in the Bell System", which can be borrowed at archive.org:

https://archive.org/details/engineeringopera0000attb


Though perhaps not strictly societal in its effect, let’s also not overlook the discovery of the cosmic microwave background…

https://en.wikipedia.org/wiki/Discovery_of_cosmic_microwave_...


> thought my original comment was obvious

Your comment is stronger without this. (From a fellow Shannon fan.)


Thanks for the feedback!

However, if you can discern "better" (i.e. 1 plain old bit of difference) by taking a few words off my posts on social media, you have "Beat the Shannon Limit". ;)


Hello,

Update: I thought of a way to express the parent comment here:

0 + 0 = 0, substituting literal values and dropping the units, for some convoluted overloading of the operator '+'. My TSH (thyroid test) came back from the lab without units this time, so I guess I'm modernizing.

0 dB S/N + 0 bytes originating information = 0 bytes transmitted (arguably error free to be fair).

Contrast this with P. Graham's comment on Twitter last year, "Why would I want anyone to fail?", which (I'm not a fanboi of Graham or Shannon or Turing, just an admirer of their work) was the most information transmitted to me over any medium, at any S/N, that year. Perhaps we should revisit the basis of Shannon's work in light of what we have learned from the Internet; Einstein wasn't afraid of arguing with Newton. :)

Bye.


Hmm... it took my brain several orders of magnitude longer to warm up than a 6L6 power tube.

AI suggested that I was being generous with my 0 dB S/N for social media; it should be -∞ dB. Good catch.
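
For anyone following along, the formula behind the 0 dB vs. -∞ dB distinction is the Shannon-Hartley capacity, C = B * log2(1 + S/N), with S/N as a linear power ratio. A minimal sketch (the 3 kHz figure is just a nominal voice-channel bandwidth I picked for illustration; needs -lm):

    #include <math.h>
    #include <stdio.h>

    /* Shannon-Hartley: C = B * log2(1 + S/N), with S/N given here in dB
       and converted to a linear power ratio. */
    static double capacity_bps(double bandwidth_hz, double snr_db) {
        double snr_linear = pow(10.0, snr_db / 10.0);
        return bandwidth_hz * log2(1.0 + snr_linear);
    }

    int main(void) {
        /* 0 dB is a ratio of 1, so a 3 kHz channel still carries ~3 kbit/s. */
        printf("0 dB:    %.0f bit/s\n", capacity_bps(3000.0, 0.0));
        /* As S/N heads toward -inf dB the ratio goes to 0, and so does C. */
        printf("-300 dB: %.9f bit/s\n", capacity_bps(3000.0, -300.0));
        return 0;
    }

So 0 dB was indeed generous: the channel still carries about 1 bit/s per hertz there; only as S/N falls toward -∞ dB does the capacity actually go to zero.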

It also didn't like my unit compatibility (reminding me of the utility of unit analysis), but remember that my '+' is overloaded--most programmers would probably write:

    int plus_ungood(int bytes, double SNR);
dropping the units. Of course, we programmers also add geometric points together without dividing, which is a mathematical no-no too.

I guess I'll put this on my TODO list for a fun project--"Relativistic Shannon: A Critique of Pure Reason on Social Media":

    Einstein wasn't afraid of arguing across time with Newton, nor should we be afraid of arguing with Shannon.  Arguing on social media, well that's a different thing altogether....[continues for 7000 pages]
I did learn what semantic density was without AI's help, postulating that it would be involved (viz. Graham above), using my alarmingly down-trending cognitive abilities.




