Hacker News
The legend of John von Neumann (1973) (lk.net)
96 points by tjaerv on Aug 22, 2014 | 65 comments



I wonder, if a person like this came along today, whether they would be allowed to have the same type of impact in the variety of different ways that John von Neumann did. Whether he was advancing economic theory or telling the military the most efficient altitude to detonate a bomb, it was all the same basic skill set. He was just a crazy smart and logical guy and he applied that to every field that he could. I can't think of any modern equivalent person who has had a similar impact in the number of disciplines that von Neumann did. Does this type of opportunity still exist, or do we now put too much focus on having people concentrate on a single area of expertise?


I think in many ways, in terms of exploring scientific and engineering realms, we're more conservative now. Personally, I'm doing a PhD in computer science, and I find it incredibly frustrating. Publishing papers is all about minute incremental improvements, and you absolutely must have numerical results to prove that you beat the competition. You can't publish negative results either; that kind of work is considered worthless.

I'm in compilers and programming languages. I really wanted to create my own language as part of my PhD thesis, but I was basically told that this would be unpublishable. I mean, how can you hope to numerically show that your language is better than everything else around? Plus, it's all been done already; nothing new can possibly be invented in that realm.

Things weren't always like this. In the 1970s, we created things like Smalltalk, ML and LISP, which had a tremendous impact on the programming world. People also had bold ideas about artificial intelligence and nuclear-powered spaceships. In the 70s, people were allowed to just explore ideas, in the hope that these ideas would lead to something good (and some did). Now it's much harder: you bring up an idea and people immediately try to shoot it down, ask you for proof that it will definitely work, and bring up the most asinine reasons why your idea will definitely fail.

Today, the exploration has been scaled down. It's not because the exploration failed; we invented many great things as a result of it. It's largely, IMO, because we live in different economic times. The USA is no longer in an economic boom; things are no longer expanding. There are cuts to scientific funding, cuts to education. People are being told not to be "wasteful". We live in a much more nearsighted world, in a sense. Being a dreamer isn't considered a virtue.


I've mentioned this a bunch already on HN, but I recommend reading "The Structure of Scientific Revolutions" by Thomas Kuhn. The book was published before most people had even heard the phrase "computer science," but I think that computer science still follows some of the patterns he talks about.

In Kuhn's view, "normal science" is mostly iterative and incremental. It is so because most people in the discipline agree on most of the big issues, and people are mostly refining those understandings. The periods when people don't agree on the big issues in a discipline are around the times of scientific revolutions: the solutions to such big issues are so different from previous approaches that accepting them requires a complete rethink of what the discipline is.

A lot of areas of CS are in the "normal science" part of that cycle, and I think compilers and languages are in there. (The biggest argument against that is concurrency and parallelism.) In the 60s and 70s, programming languages were new, and they changed computer science forever. We were exploring what these things could be.

I also recommend Cristina Videira Lopes's blog post, "The Evolution of CS Papers": http://tagide.com/blog/2014/02/the-evolution-of-cs-papers/


People are still creating new languages in academic contexts now -- including ones that go on to have influence outside the academy. Likewise, I'm sure even in the 1970s, someone's thesis advisor told them it had all been done.

You're trying to compare different times, but it's getting confused because you're actually comparing one incident now to an aggregate impression of the "1970s" (though, from the 3 specific examples cited, the "1970s" you are talking about are really something like 1958-1973).


To be fair, were any of Smalltalk, ML or LISP created as part of someone's thesis? I thought they were all academic (or academic/commercial) projects, but developed by academics at a more senior stage in their careers.

I'm pretty sure I've seen some recent theses that did describe the creation of languages, too. Though they had less of a theoretical computer science bent, and more of a focus on saying "this is a language for X problem domain".


Why do you want to create your own language? People create languages for many different purposes. What would be original and interesting about this language?

I am inclined to agree with your larger point that we have gotten too conservative, but I also think we are awash in programming languages. Before I would encourage someone to create a new language, I would want to see a good argument that none of the existing ones would suffice for some interesting purpose.


Implementing a new language was never as exciting a deal as many think. New languages happen more because of need than because of the properties of the language itself. For example, C was created because a language with those features was necessary for the UNIX project. Similarly for Smalltalk and the environment at PARC. Lisp is a little different because it started as theory that was eventually implemented.


The modern language that comes to mind is, of course, Haskell. And "in between" we have Standard ML. And I think one could argue that the Dylan revival project and Rust are various extensions of the idea of invent (academically), then implement (pragmatically). And the Racket ball is still rolling, of course.

So, I don't think it's right to say that things are all different now. It might be that the maturation of computer science into a "science" field feels a bit like it's taking the joy out of things. But I think if you look at stuff that was published earlier, there's a divide between rather conservative work founded in logic and discrete mathematics, and more exploratory work in what would now be considered "computer science". I'm not convinced all of that would really be considered "published research" though (as in qualifying for a PhD, etc.).

It's not like you'd be able to publish a study in medicine on the benefit of washing your hands before you deliver a baby, after you've done an autopsy -- we've already figured out a lot of the elegantly simple stuff.


I do think there is a big difference in academic computer science now versus then. Rob Pike has written about it from an operating systems perspective [1], and Richard Gabriel from a programming language perspective [2]. It seems that until the early 1990s, there were lots of projects focused on building "systems", i.e. big pieces of software which are in themselves practically useful; then there is a sharp shift, and academic research focuses on "theory" (in PL research, e.g. a type-safety proof for small core calculi, or a particular algorithm for program analysis).

People have always pursued "small" ideas. But I think the fact that we stopped writing "big" systems is a real change. (I guess the reason is that off-the-shelf operating systems and programming languages gradually got better, until it became impossible to "compete" with them. Cf. the Lua language, which got written basically because "western" languages were not easily available at the time.) My impression is that there is a lot less diversity of ideas now than there used to be, because everyone is incrementally improving the same set of OSes/languages.

[1] http://herpolhode.com/rob/utah2000.pdf [2] https://www.dreamsongs.com/Files/Incommensurability.pdf (see the section starting on page 10)


I'm not sure it's quite so clear cut; don't forget about http://vpri.org/ for example. Or OLPC (One Laptop per Child) with its assorted projects. Or the work on various unikernels on top of Xen (like MirageOS). Or Minix3. Or the Lively Kernel (http://www.lively-kernel.org/).

Or perhaps even Dragonfly BSD.

I'm not necessarily disagreeing with you; some of the links above might even support your point. I'm just not sure we've "stopped" with big, complete systems -- but the field as a whole has gotten much bigger -- and there's only so much hype to go around...


>all about minute incremental improvements

...how much of that is due to the fact that CS is now a much more mature field than it was 44 years ago? It seems like the fields of synthetic biology, biological and chemical computers, and self-replicating nanobots (hey, von Neumann again) would be more open to grand-scale ideas, and there are lots of exciting opportunities that haven't been explored.

http://en.wikipedia.org/wiki/Synthetic_biology

http://en.wikipedia.org/wiki/Biocomputer

http://phys.org/news/2014-01-slime-molds.html

http://en.wikipedia.org/wiki/Chemical_computer

http://en.wikipedia.org/wiki/Self-replicating_machine#von_Ne...


Lisp was defined in the 50s and implemented in the 60s.

https://en.wikipedia.org/wiki/Lisp_(programming_language)


True, however many other Lisps were developed in the 1970s (e.g. Maclisp), including Scheme, which came in 1975.


Not disagreeing with you necessarily, but the 70s were hardly an economic boom time in the USA either.


And von Neumann lived mostly through war and the 1930s financial crisis. I honestly think it's more a question of (lack of) will/confidence on the part of academia. I bet if he had really good ideas about new languages and really wanted to dig deep on that, he would be able to do so without starving.

Maryam Mirzakhani (this year's female Fields Medalist) commented that she's quite a slow thinker, and left a clear impression that it's more a matter of asking important questions and actually trying to solve them with perseverance. It's just far more secure to chase the low-hanging fruit.

http://www.simonsfoundation.org/quanta/20140812-a-tenacious-...

" Another notable and enviable trait of von Neumann's was his mathematical courage. If, in the middle of a search for a counterexample, an infinite series came up, with a lot of exponentials that had quadratic exponents, many mathematicians would start with a clean sheet of paper and look for another counterexample. Not Johnny! When that happened to him, he cheerfully said: "Oh, yes, a thetafunction...", and plowed ahead with the mountainous computations. He wasn't afraid of anything."

That's an inspiration!
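(An aside on the reference in that quote: an infinite series of exponentials whose exponents are quadratic in the summation index is, up to normalization, a Jacobi theta function. The standard definition is shown below purely for orientation; it is textbook notation, not something from the article:)

    % Jacobi theta function: note the quadratic exponent n^2,
    % which is what von Neumann is said to have recognized on sight.
    \[
      \vartheta(z;\tau) \;=\; \sum_{n=-\infty}^{\infty}
        e^{\pi i n^{2}\tau \,+\, 2\pi i n z},
      \qquad \mathrm{Im}\,\tau > 0
    \]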


My understanding is that during the war, the American government was throwing money around liberally, trying to do everything it could to outwit and outdo the enemy. This is part of what caused the post-war economic boom: money being distributed and more people getting a chance to prosper.


My understanding is that after the war, the rest of the world needed to buy stuff, but the U.S. was the only major country with an intact manufacturing base. So that's why the U.S. made 30.5% of the world's manufactured-goods exports in 1948, compared to 15.3% in 1938. See table XXIII on page 52 of:

http://unstats.un.org/unsd/trade/imts/Historical%20data%2019...


Makes one wonder what the NSA have been doing since the 80s. They're supposed to be the biggest employer of mathematicians, and since the 90s they've had their very own "war budget". Maybe in 2050 we'll say -- oh wow, look at all the great stuff that came out of the Islamophobia-fuelled war on the Middle East that the US spearheaded. Shame about the future ruined for half a billion people, but boy did they come up with some crazy stuff at the NSA!


> Makes one wonder what the NSA have been doing since the 80s. They're supposed to be the biggest employer of mathematicians, and since the 90s they've had their very own "war budget".

I wonder. Does anyone have recent sources indicating the NSA really might be the biggest employer of mathematicians? I'd also point to the Snowden and other leaks and attacks like Stuxnet - so far, everything revealed has been fairly humdrum in the sense that they are more or less what you'd expect if you threw a few billion dollars at known vulnerabilities in the Internet and current OSes. Portmapping entire countries' computers may be impressive in some respects, but not in the sense of beyond-cutting-edge cryptography/mathematics.


I'd point to the recent revelation/allegation, in the recent Wired interview with Snowden, that the NSA accidentally took down Syria's Internet connection, and to the general "vibe" of mismanagement that I get from Snowden's and Binney's accounts. That doesn't have to imply that the NSA doesn't also do interesting work beyond their practical attacks on global infrastructure. But it does seem clear that if they actually do interesting stuff (say, working on using multiple satellites to photograph the same area in order to extrapolate to higher-resolution images than what is possible from a single lens due to atmospheric effects -- or perhaps quantum or DNA/RNA computing, etc.), it is ridiculously highly classified, just like the sibling comment's example with RSA and GCHQ.


While the NSA has a huge budget, it's not infinite, and what gets one promoted and what analysts are boasting about in their slides gives you a definite idea of what the organization values. Do any of the leaks sound like there might be huge contingents of mathematicians off in a side corridor making breakthroughs? Not really.


We're more likely to say "what a huge burden the NSA was!". Major crypto breakthroughs created there (in either breaking something or creating something) would be kept secret for years, so it just bogs down innovation. GCHQ infamously invented public-key crypto, but no one was allowed to show it to outsiders for comment, so it just died there until it was reinvented a few years later.


OTOH, the times when Lisp (1958), Smalltalk (1969), and ML (1973 is what Wikipedia has for "first appeared in", but unlike Lisp and Smalltalk I can't find information on definition vs. release) were first defined all (except the last, which is on the cusp) precede the 1973 recession, and the preceding period actually was an economic boom time.


Just move to open source. There you are judged by your peers rather than by numerical results.


To be honest with you, I think this problem is less about open-mindedness and more about the increased depth in which we've studied most fields. There's been a huge increase in the number of academics in these fields, and a lot of the low-hanging fruit has already been picked.

There still are some polymaths, but it's hard for them to make as fundamental a contribution in fields that have already existed for a long(ish) time.


> There still are some polymaths, but it's hard for them to make as fundamental a contribution in fields that have already existed for a long(ish) time.

Yes, but it misses the fact that polymaths historically have solved that problem by creating new fields. Richard P. Feynman, as one example, lectured on nanomaterials and nanodevices decades before the technology existed to make his ideas practical. Einstein shaped relativity theory about four decades before there was any way to confirm (in detail) its theses or apply it to practical problems.


Fields today are too well defined for that.

Just try doing grad work by joining knowledge from two separate fields -- nobody ever understands what you're saying. You lose most of your time on the basics, and people are mostly unable to build on what you created.


Your comment reminds me of this, which was posted here on HN some time ago when it first came out: "The Last Days of the Polymath"

http://moreintelligentlife.com/content/edward-carr/last-days...


The opportunity still exists, but it is not in the obvious places, like computer science or physics. Great thinkers gravitate to fields that are starting right now, where there is a lot of basic work to do. Computer science is in many ways a mature field. Of course, there is always the possibility of something new and exciting appearing, as happens in physics and mathematics from time to time. We don't know what it is right now; it will only become clear 10 to 20 years later.


This is exactly right. I strongly disagree with any sentiment that's basically equivalent to "they had it easier back then". There are always fields which are too deep for the newbie to penetrate... that's why the newbie either invents a new field or works in a much younger field. I.e., great people are great because they found something basic and turned it into the software industry, not because they joined the software industry and sobbed about the fact that it's too late to make a change within it. In 2014 we also have Google. It's impossible to overstate how much of an advantage we have today. Learning is INCREDIBLY easy, and so while the fields are growing at a fast clip, so is the ability to instantly find the information we need to learn things.


I'm surprised this article doesn't mention his language abilities.

From the age of six, von Neumann was fluent in Latin and ancient Greek. He also read (and remembered completely) all the major works of antiquity.

As someone who's been struggling with Latin and Greek for over 10 years it completely breaks my spirit to know that for someone else it was so effortless.


What I find truly remarkable about polymaths is that they don't seem to be crippled in other regards. There is no trade-off. Actually, abilities seem to positively correlate rather than displace each other.

It seems like there's a free lunch somewhere, which is unusual for living beings -- you'd expect evolution not to have missed it.


Niels Bohr - Nobel Prize winner, quantum theoretician, Olympic athlete.

I think you might be onto something - I suspect it's that genius is not genetic, which indicates it might be nurture not nature - which means things might be really cool for humans in a hundred years.


Well, he only had one kid. That makes him a pretty lousy performer from evolution's point of view.


He did improve the evolutionary odds of most people on HN, so that counts for something.


The section on "work habits" is especially amazing. I know people who are very smart but also very lazy. They end up falling behind in the long run; may be a corollary to the Curse of the Gifted.


John von Neumann was an absolutely amazing human and scientist. Truly a genius, a polymath, and evidently quite an interesting party guest. The book Prisoner's Dilemma is ostensibly about game theory, but it's set within the context of von Neumann's life. Really quite fascinating; one of my favorite people to study.

http://www.amazon.com/Prisoners-Dilemma-William-Poundstone/d...


Not to rain on the parade, but such glorification of "heroes" is... just awkward. Wasn't there an article on Herbert's Dune on the HN front page, just yesterday?

In particular, von Neumann was an "amazing human" as long as you don't mind him advising the military to bomb Russia off the map, preemptively. What a wonderfully humane move.

"If you say why not bomb them tomorrow, I say why not today? If you say today at 5 o'clock, I say why not one o'clock?"

(and this is not just some cranky scientist rambling -- he was very high up in the military circles, an advisor/policy maker).

Another common target of cringeworthy worship: Feynman. All fun stories (and I'm aware saying this is probably not going to go down well on HN) until you read about the abuse of women, paid abortions, broken marriages of his colleagues...

They're all human.


"If you say why not bomb them tomorrow, I say why not today? If you say today at 5 o'clock, I say why not one o'clock?"

As a mathematician, I think it's probably worth pointing out the use of the implication. "If you are planning on bombing them tomorrow at 5pm, then why wait?" is not equivalent to "Russia should be bombed now."

And indeed, of all the logical fallacies I see on the Internet, this is one of the ones that personally bothers me the most. "A -> B" is *not* equivalent to "B". If I meant B, I would have simply said B. I suspect the same is true of von Neumann.
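(A minimal formal restatement of that point, using nothing beyond standard propositional logic -- the symbols A and B here are just placeholders, not anything from the quote:)

    % Valid (modus ponens): the conditional together with its antecedent
    % yields the consequent.
    \[
      A,\; A \to B \;\vdash\; B
    \]
    % Not valid: the conditional alone does not yield the consequent;
    % the assignment A = false, B = false makes A -> B true while B is false.
    \[
      A \to B \;\not\vdash\; B
    \]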


I agree with your sentiment overall, but I don't think it applies to von Neumann. The sentence right before that quote is: "With the Russians it is not a question of whether but of when." It seems generally accepted that he was in favor of destroying the Soviets before they gained the ability to retaliate, although I will admit that I can't find any more direct quotes about it.

It's hardly surprising either. During those few years when the US had the ability to devastate the USSR but not vice versa, it was a fairly common opinion. In addition to all the common reasons for advocating this, a Hungarian would also be influenced by the rather brutal domination of his homeland by the Soviets in that time.


Rain is good. No matter how smart you are, you can still be an idiot sometimes:

"Von Neumann knew that it was only a matter of time before the Soviet Union became a nuclear power. He predicted that were Russia allowed to build a nuclear arsenal, a war against the U.S. would be inevitable. He therefore recommended that the U.S. launch a nuclear strike at Moscow, destroying its enemy and becoming a dominant world power, so as to avoid a more destructive nuclear war later on."

http://cs.stanford.edu/people/eroberts/courses/soco/projects...


Nietzsche has interesting things to say about our need to tear down people who have great, unique accomplishments over their common weaknesses - calling this "slave morality". The weak need to feel superior to the great, so we redefine greatness to mean other things.


The context: slave vs. master morality. Democracy is a slave morality. (http://en.wikipedia.org/wiki/Master%E2%80%93slave_morality)

Personally, I think slave moralities have much to recommend them, as movements against elite domination.

Intellectual culture has its own hype, like the computer world. Some with loudspeakers try to deify a few "great thinkers", focusing on (certain narrow aspects of) their personalities that they wish others to emulate, and brushing aside others who don't fit their model, despite talent (like people with the wrong gender/race).

Let's take another well-known philosopher (Chomsky), who pointed out how humanity's masters try to turn "great" figures into villains, when they act with exemplary slave morality:

"Compare Russell and Einstein, two leading figures, roughly the same generation. They agreed on the grave dangers facing humanity, but chose different ways to respond. Einstein responded by living a very comfortable life in Princeton and dedicating himself to research that he loved, taking a few moments for an occasional oracular statement. Russell responded by leading demonstrations and getting himself dragged off by the cops, writing extensively on the problems of the day, organizing war crimes trials, etc. The result? Russell was and is reviled and condemned, Einstein is admired as a saint. Should that surprise us? Not at all."


That's not what is being discussed here -- what's at issue is his wonderful mathematical prowess, not his principles. Being good at one thing doesn't imply pristine ethics.

You also have to remember that those figures have pretty much everything they ever said scrutinized, so that tends to distort things -- I'm pretty sure everyone at one time or another made some unethical remarks, especially as a WWII refugee. Horrible things happened in Europe, and after that the harshness of Stalinism was also clear. I think it wouldn't have been hard to get caught up in "ultra-Americanism" at the time.


Can you please add some sources for the claim about Feynman? A quick search on the web for "feynman abuse" didn't give me relevant results.


Feynman's life is well-chronicled -- in its totality -- in Jim Gleick's biography "Genius." Feynman had a somewhat messy personal life, and Gleick covers the key points briefly but clearly.

For example (p. 277) "Bethe worried that Feynman was growing restless after four years at Cornell. There were entanglements with women: Feynman pursued them and then dropped them, or tried to, with increasingly public frustration -- so it seemed even to undergraduates, who knew him as the least professorial of professors."

If you get to p. 290, there's a quite astonishing section on Feynman's effort to work out the rules of flirting in a bar. You can still admire his science after reading the book. You'll be both charmed and troubled by his over-sized personality. But, hey, he was a complicated guy.


The famous quote is "His ex-wife reportedly testified that on several occasions when she unwittingly disturbed either his calculus or his drums he flew into a violent rage, during which time he attacked her, threw pieces of bric-a-brac about and smashed the furniture." That was from an FBI file (https://www.muckrock.com/foi/united-states-of-america-10/fbi...) on Feynman.

I don't know anything about the "paid abortion" thing.

It's tricky stuff... Feynman was (is?) a hero of mine, and I struggle to put this stuff in context.


"It's tricky stuff... Feynman was (is?) a hero of mine, and I struggle to put this stuff in context."

Heroes have feet of clay. The correct move is to adjust your concept of "hero" to include this.

The alternative, that you will only declare someone a "hero" if they are perfect in every way, yields a definition of "hero" with 0 instances, which is a useless definition.

If Feynman was abusive to his wife to some degree, well, plenty of people are abusive to their wives without also creating physics breakthroughs. It's OK to honor his ability to do that without having to think of him as perfect.


I half-agree with you, but isn't there an ethical line past which it's wrong to call someone a hero?


Isn't that something you should decide for yourself? Also, why do you assume there's a duality between hero and non-hero? The person did what they did. Simplifying it to the point of 'hero' or 'not hero' is overgeneralizing, I think. Acknowledge their achievements. Acknowledge their faults. Labels aren't a requirement.


I tend not to engage in much hero worship anymore, but if the question is essentially whether there's an ethical line past which we can't look up to someone, then I'd say it's of vital importance to know exactly what this "abuse" entailed. Did he physically beat his wife on multiple occasions, or did he throw an ash tray or plate at a wall on a few occasions?

I know a good number of people in relationships, both male and female, who have engaged in some variation of the second offense (throwing things or breaking things during their worst arguments with their spouse/partner). But it wasn't habitual and didn't extend to physical contact.

If that's "all" Feynman is accused of, then I don't think it's that big of a deal. I mean, he had a son and daughter who looked up to him, so I am a bit skeptical that he was a brutal wife beater (who tend not to draw the respect of their children).


To be honest, I don't really call very many people "heroes". I personally keep a pretty high bar for that. But that goes into an entirely different, and very large, discussion.


Shine a bright enough light and everyone is hiding cockroaches.


> It's tricky stuff... Feynman was (is?) a hero of mine, and I struggle to put this stuff in context.

The context is that there was no such thing as no-fault divorce in the United States until 1969, and as a result, it was not uncommon for people to invent phony stories about abuse and submit false testimony to the court.


I thought this was a rumour started by Feynman himself in order that the authorities would think twice about asking him to preside over something or other.

The 'his calculus or his drum playing' part is kind of a giveaway.


> Not to rain on the parade, but such glorification of "heroes" is... just awkward.

You need to stop putting words in my mouth. Never did I use the word "hero", never did I use the word "worship". Never did I suggest those things, either.


"John von Neumann invented the digital computer"

Mauchly and Eckert did, but von Neumann came along and stole the fame.

http://en.wikipedia.org/wiki/John_Mauchly#EDVAC

Eniac: The Triumphs and Tragedies of the World's First Computer, Scott McCartney

http://www.2think.org/eniac.shtml


I think it's sad that at the end of his life, he was really fearful of death. So fearful, in fact, that he sought refuge in religion, but supposedly, it didn't bring much comfort.

I always think of that when I ponder about my own struggles with life, the human condition, and mortality. I really don't want to be that fearful of my own impending death if I get something like cancer.


"Hungarian is not exactly a lingua franca" Hard to argue with that part. :)

Anyhow, a colleague of mine at PaineWebber sadly came down with schizophrenic delusions, and concluded that he and I were illegitimate children of Norbert Wiener, while Jack Grubman and Andy Kessler had been sired by John von Neumann. I objected strenuously; as somebody whose PhD thesis was a special case of the min-max theorem for zero-sum games, I thought it only fitting that I be on the von Neumann side of the ledger.
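(For context on the theorem being name-dropped: von Neumann's minimax theorem for finite two-player zero-sum games, in its standard textbook form -- the payoff matrix A and the strategy simplices below are generic notation, not anything from the comment:)

    % von Neumann's minimax theorem: A is the m x n payoff matrix, and
    % x, y range over mixed strategies (the probability simplices
    % Delta_m and Delta_n). The two orders of optimization give the
    % same value of the game.
    \[
      \max_{x \in \Delta_m} \, \min_{y \in \Delta_n} \, x^{\mathsf{T}} A\, y
      \;=\;
      \min_{y \in \Delta_n} \, \max_{x \in \Delta_m} \, x^{\mathsf{T}} A\, y
    \]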


Gödel did not prove that mathematics could not 'be proved consistent'. He proved that particular axiomatic systems cannot be both complete and consistent.


Presumably, the reference is to Goedel's second incompleteness theorem, rather than the first incompleteness theorem. (One might still quibble with phrasing it in this way, of course.)
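(For reference, the second incompleteness theorem is usually stated along the following lines -- this is the standard textbook formulation, not a quote from the article:)

    % Goedel's second incompleteness theorem, informally: if T is a
    % consistent, recursively axiomatizable theory interpreting enough
    % arithmetic, then T cannot prove the arithmetized statement of its
    % own consistency.
    \[
      T \;\not\vdash\; \mathrm{Con}(T)
    \]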


More than once I was studying something in completely unrelated fields (e.g. quantum mechanics, biology, game theory) when suddenly von Neumann (whom I first knew for his contributions to computer science) was mentioned, and I'd think, "What? This guy again?" After the third time I went to read his biography, because it couldn't be the same person. But yeah, it was.

https://en.wikipedia.org/wiki/John_von_Neumann


Yeah, the same happened to me. I believe it was the thermonuclear bomb and the von Neumann architecture. I thought that it couldn't be the same guy - two entirely unrelated fields - but it was. The guy was, in my opinion at least, the last true polymath.


Except they're not unrelated. Von Neumann needed more computing power than he had available, for studying bomb blasts. Not only did he and colleagues largely create digital computers, but he marshaled the resources of the US government and academia to get it done.

Dyson's Turing's Cathedral is a fascinating account of the development of digital computers. The early ones were amazingly physical, using wave-propagation delays in liquid mercury, and repeatedly repainting rows of dots on oscilloscope-style screens (Williams tubes), as storage.

https://duckduckgo.com/?q=turing%27s%20cathedral

https://en.wikipedia.org/wiki/Williams_tube

EDIT: s/Turning/Turing/


Wow. Really cool stuff. I'll be sure to check out the book then.



