
My Oxford Dictionary of Computer Science defines a computer as "a device or system that is capable of carrying out a sequence of operations in a distinctly and explicitly defined manner."

There is no de facto computer science orthodoxy, so let's take that definition as a shred of a candidate orthodoxy to think inside.

Let's go further back in time with this 'device'-oriented way of defining things. Through correspondence, we can see that Newton's contributions contain the https://en.wikipedia.org/wiki/Fundamental_theorem_of_calculu..., but this was an emergent process. The 'grunt work' of this process apparently had two camps, abacists and algorists, and he was more the former. https://en.wikipedia.org/wiki/Abacus#/media/File:Houghton_Ty...

The Oxford English Dictionary gives the origin of 'calculus' as "Mid 17th century: from Latin, literally 'small pebble (as used on an abacus)'."

It's easy to imagine a layperson watching Newton fiddle with his abacus and presuming he was doing whatever pragmatic bean counting was prevalent in his time. Most people wouldn't spend time pondering and inventing new usages of the abacus. So it's easy to imagine we'd do the same with Stephen.

As a device, the abacus doesn't conform to the initial definition of a computer (on its own it is neither a system nor a device that can carry out a sequence of operations).

The abacus seems much more like the contemporary 20-dollar TI scientific calculator: the human has to manually administer the sequence. The system of thought could be arithmetic, algebra, etc., but a contemporary vanilla calculus textbook does not describe a Turing-complete anything. Without some theory to bootstrap the idea of how recursion/looping works, it's going to be hard to even metaphorically mechanize the idea of sequencing.
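
To make that concrete, here's a minimal sketch (my own hypothetical example, in Python) of the difference between a human administering each step on a calculator and a looping construct carrying out the sequence itself:

    # Summing 1^2 + 2^2 + ... + 10^2.
    # "Calculator" style: the human keys in each operation by hand.
    total = 0
    total = total + 1 * 1
    total = total + 2 * 2
    # ...and so on, one keystroke sequence per term...

    # "Computer" style: a loop carries out the sequence of operations itself.
    total = sum(k * k for k in range(1, 11))
    print(total)  # 385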

Enter lambda calculus and Church numerals. With these systems, you can systematically represent mathematical ideas with whatever notation you want. Church goes the other way: instead of inventing new squiggles, he reduces the 'character set' for his notation and ends up with really long expressions, for example, the number 3: λf.λx.f (f (f x)). The algorists were unconstrained by ASCII and Unicode, so they may have balked at the aesthetics of these expressions, but it's not inherently great practice to have ad-hoc invention of notation. Wolfram's link to the history of mathematical notations starts to trail off, probably correlated with Church rendering the need for new squiggles optional.
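
For anyone who hasn't played with them, here's roughly what Church numerals look like transliterated into Python lambdas (a sketch, not Church's own notation):

    # A Church numeral n is a function that applies f to x, n times.
    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))

    three = succ(succ(succ(zero)))           # same shape as λf.λx.f (f (f x))

    # To read it back as an ordinary integer, hand it "add one" and 0.
    print(three(lambda k: k + 1)(0))         # -> 3

    # Addition is just composing applications of f: m of them, then n of them.
    add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
    print(add(three)(three)(lambda k: k + 1)(0))   # -> 6

Everything is built from one construct, function abstraction and application, which is the whole point: no new squiggles required.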

Contemporaneously (1930s), Turing contributes the idea of a Turing machine, but this is an abstraction; it doesn't actually carry out the sequence of operations. This is after the Enigma machine, but ultimately we don't spend much time fussing over who specifically invented the physical abacus, slide rule, digital calculator, or digital computer. It's an emergent process, and as a system we don't see a pool of folk communities of "algorists vs abacists" until sometime around the Homebrew Computer Club.

Because of this, it feels superfluously flippant that Stephen Wolfram is always coining "new this" and "Wolfram that" when we remember that Newton left others to coin "Newtonian physics." Personally, I can just mentally redact this aspect of his behavior.

So the question is: is Stephen actually talking about a new thing that exists? Is a "computational language" a thing? I'd say he's probably right that it's coming, but the Wolfram Language isn't quite yet that thing.

What he is pointing out here is the missed opportunity to have a vernacular, or a text in the semiotic sense, to depict the actual computational aspects of our calculations. Moving the calculator out of our/Newton's brain and into the device, as a uniform-ish human collective activity, wasn't really happening in significant numbers until the 70s (significant as defined by my metric of "the amount of people who used an abacus back in the day"). Once we let the computer do the task, we didn't force it to show its work. But if the computer carries out the operations, what is it we're still operating manually, and why do we continue to interact with it at all?

Well, we don't have a language to describe what we're doing and thinking in those gaps. We're rarely automating state diagrams and flowcharts from our code, much less starting by just painting out the state diagrams and flowcharts and letting the computer do the work.
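
As a hypothetical toy example (Python again, emitting Graphviz DOT text; nothing Wolfram-specific), going in that direction doesn't have to be much code: describe the state machine as data and generate the diagram from it, rather than drawing it by hand after the fact:

    # A toy state machine described as data: (state, event) -> next state.
    transitions = {
        ("idle",       "coin"): "ready",
        ("ready",      "push"): "dispensing",
        ("dispensing", "done"): "idle",
    }

    def to_dot(transitions, name="machine"):
        # Emit Graphviz DOT text; paste the output into any DOT renderer.
        lines = ["digraph %s {" % name]
        for (src, event), dst in transitions.items():
            lines.append('    "%s" -> "%s" [label="%s"];' % (src, dst, event))
        lines.append("}")
        return "\n".join(lines)

    print(to_dot(transitions))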

Perhaps Wolfram will set the stage for a de facto method toward this type of higher-order representation, aka "computational language." He's definitely trying, but Newton's notation didn't catch on either; it's up to you to decide whether the representation or the premise is more important.




> Newton coined 'Calculus,'

Minor point, but the term "calculus" was in use before Newton's "fluxions" to refer to various mathematical systems, just as we still have Lambda calculus, etc. Leibniz used the term in his approach, iirc.


Thanks for reviewing. Fixed the discrepancy, hopefully made it better.



