What We’ve Built Is a Computational Language (stephenwolfram.com)
74 points by techgipper on May 9, 2019 | hide | past | favorite | 61 comments



I downloaded a free trial of Mathematica about two weeks ago. I appreciate all of the built-in data sources and libraries, and the notebooks work well, but it is hard for me to go all-in on something that is not an open-source language.

I am a Lisp/Java/Python/Ruby/Haskell programmer, but the language I really wish I knew inside and out is Julia. I think that Julia is hackable like Lisp, but I just don't have the experience to do it. I would not be surprised if, in five years, Julia becomes an open system like Mathematica and the Wolfram Language, with lots of libraries supplying useful data, lots of language extensions, etc.: sort of what Python is today, but much more efficient and better for building very large systems.


Julia is amazing; it's the language I wish I had time for, too. You can literally have it print out the assembly for the code you write. I know C and the like can do this too, but Julia does it right within its little REPL tool. It is incredible.
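(Julia does this with REPL macros like @code_native. For comparison, a rough analogue in Python, sketched here with the standard dis module, shows the same kind of interactive introspection, though at the bytecode level rather than native assembly.)

```python
import dis

def axpy(a, x, y):
    # A small function to disassemble
    return a * x + y

# Prints the CPython bytecode for axpy, one instruction per line
dis.dis(axpy)
```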

Unfortunately, I love languages like Python and D the most: multi-paradigm, not crippled by one single programming approach like OO or functional. Just do what is best for you, and as you learn the language it gets way better.

I would pay good money (if I had good money) to see a lot more invested in D. It has so much more potential, IMHO. I would love to see a UI library in raw D that comes out of the box in the standard library, for example. Same with an out-of-the-box web server, like Go has. Lastly, maybe an editor like IDLE for D, coded fully in D with its own UI library, like Racket has. I also love Racket; it makes me feel like I can do anything, and indeed I can.

GUI programming is not emphasized enough in new languages, which is part of why Electron took off: when the UI is mostly just HTML and CSS, the sky is the limit, and you can reuse skills you already know! Qt even has some Electron-like stuff going on, which is neat.


Julia is a fantastic language and ecosystem, but I don't think it's the right thing to compare Mathematica against. As a language, Mathematica's computational model and strengths are closer to Lisp, IMHO. As a "computational system" their goals might be similar, but I can't fathom what Stephen Wolfram actually aims at, for Mathematica.

Julia, btw, is actually a language you could potentially know inside out. The discipline to keep the language simple enough that LLVM can generate fast code means that it's not hard to understand and there aren't many gotchas. On top of that, the community discussions (on Github issues and the Discourse forum) are very interesting and enlightening to read through.


If John McCarthy had said this about Lisp in the 1950s or 1960s, it would have probably been preserved as a seminal paper, a computer-science classic. Lisp literally was the first language for reasoning about computation itself, and it was for decades the go-to language for experimenting with new computational concepts, abstractions, and applications. As it is, it's the usual sort of puffery we've come to expect from Wolfram, a few decades late. This is particularly mordantly ironic because Wolfram owes much of his success to Lisp (Mathematica being more or less a successor to Macsyma), despite having once been the shithead teenager who disparaged Lisp as too slow for real symbolic mathematical programming. (see: https://groups.google.com/forum/#!original/comp.lang.lisp/BU...)


McCarthy: 'Recursive Functions of Symbolic Expressions and Their Computation by Machine':

'In this article, we first describe a formalism for defining functions recursively. We believe this formalism has advantages both as a programming language and as a vehicle for developing a theory of computation'

There is a lot of fluff in the Wolfram article:

'To be fair, the Wolfram Language is the sole example that exists of a full-scale computational language. But one gets a sense of magnitude from it. While the core of a standard programming language typically has perhaps a few tens of primitive functions built in, the Wolfram Language has more than 5600—with many of those individually representing major pieces of computational intelligence. And in its effort to be able to talk about the real world, the Wolfram Language also has millions of entities of all sorts built into it. And, yes, the Wolfram Language has had more than three decades of energetic, continuous development put into it.'

Just like other attempts at making vast amounts of world knowledge available for computation. See, for example, Cycorp's Cyc.


Why does he keep using the word "computational" like it's some unique thing and not literally the same thing as programming?


AFAICT, it's a confusingly aggrandizing term that does offer some new things, but should at most be considered a new subtype of programming language (rather than something different from programming languages altogether[0]—and I'd bet "new subtype" isn't even necessary). He mostly characterizes 'programming language' (as distinct from 'computational language') as telling the computer what to do, as if all previous programming languages were low-level imperative languages:

> With standard programming languages, we’ve had a way to talk about the low-level operation of computers. But with computational language, we now have a way to apply the computational paradigm directly to almost anything: we have a language and a notation for doing computational X, for basically any field “X” (from archaeology to zoology, and beyond).

From what I can tell, what actually characterizes his language is:

- It's pretty thoroughly declarative

- It offers nice ways of incorporating diverse data inputs/outputs into source code

- It seems to incorporate some very abstract notion of an entity that can be involved in a generic symbolic computation process, and maybe a way of defining collections of related entities (i.e. new 'domains')

- It seems unafraid of 'pluralism,' i.e. walking away from the elegance/mathiness of the previous item, and incorporating an encyclopedia-like collection of custom 'domains' into the language.

----

[0] He pretty clearly contradicts himself on whether it's actually a programming language or not. He knows he's not going to get away with saying that it's not technically a programming language, but then... also wants to insist that the term isn't adequate to describe it:

> Yes, it’s a computer language—a programming language. And it does—in a uniquely productive way, I might add—what standard programming languages do. But that’s only a very small part of the story. And what I’ve finally come to realize is that one should actually think of the Wolfram Language as an entirely different—and new—kind of thing: what one can call a computational language.


I once was outlining notes for a course I would like to give introducing computer programming. The idea was to have exercise and theory sessions interleaved, which left the first session, which has to be theoretical in order to introduce the thing, kind of pointless.

So I wanted to explain kind of what programming is, and what it means, and I went to the etymology and why we call it what it is. And it turns out that the etymology tells you that it's a throwaway word. It comes from the same root as a program at a play -- it is just a schedule of what a computer will do, when. The hard concept is not "programming", the hard concept is "computer". Indeed our failure to so far resolve something like P = NP can be traced to our ambiguity on what exactly a computer is.

So I wondered what I would say to kids who had never programmed before and decided that I would just be honest and say that I don't know exactly how to define computers, but that in our experience we can say that they are machines which are at least good at a few things: Math, Art, Control, Graphs. Maybe I had one or two more, or maybe that was it.

This actually changed the course vision directly. Before I was doing what came natural to me as someone with a heavy mathematics education: let's do fibonaccis and factorials and primes. But that's wrong. If you want to teach programming then you want to hit all of these more evenly, not one of them heavily. And if you want one thing that hits all of them directly, you would ask people to start programming games. After all, what is a game? It is a user-Controlled walk-through-a-Graph whose nodes contain Art that rewards or punishes the user for how they walked through that graph. Possibly math exists in the form of, say, physics engines and drawing and such: but it is more of a means to add some interesting features rather than a purpose in itself.
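(As an illustrative sketch, with node names and layout invented here, that definition of a game as a user-Controlled walk through a Graph of Art nodes that reward or punish fits in a few lines of Python:)

```python
# A minimal "game as a graph walk": nodes hold Art (text), edges are
# user-Controlled moves, and some terminal nodes reward or punish.
graph = {
    "cave": {"art": "A dark cave. Exits: north.", "moves": {"north": "fork"}},
    "fork": {"art": "A fork in the path.", "moves": {"left": "gold", "right": "pit"}},
    "gold": {"art": "You find gold!", "moves": {}},        # reward
    "pit":  {"art": "You fall into a pit.", "moves": {}},  # punishment
}

node = "cave"
for move in ["north", "left"]:   # stand-in for interactive user input
    node = graph[node]["moves"][move]
print(graph[node]["art"])        # the Art at the node the user walked to
```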

So I had to go back and rewrite a bunch of the follow-up sessions and now it is not in a 90%-done state but in like a 40%-done state, heh.

But yeah, if computations are just what computers do, then the only obvious division between computation and programming -- which Wolfram may have intended -- is that programming languages typically have to focus hard on the sequence of what is going to happen; technically one might say a computational language is less about what order things happen in and more about what result is wanted. Logic languages might be more "computational" than functional languages which might be more "computational" than imperative languages, on this account. And indeed as one goes down that path one sees less and less symbolic representation of expressions and more concrete focus on data structures and values.

Possibly what one needs after that is a sense of metadata and metalanguage: in the language I specify what my conditions of adequacy are for what I am looking for; with some metalanguage I might be able to help a computer figure out one way to get there.


I tend to think of the "programming" part as "planning", which I think fits the etymology. Planning is hard!


It's a bit domain-specific to describe as a programming language. The blog post didn't describe any control flow features, for example. I suppose "computational" because it is able to incorporate diverse computational algorithms. The unit-aware resolver seems like a spin on type systems.


> The blog post didn't describe any control flow features, for example.

Is there anything innovative to describe?


I really appreciate the amount of detail and scope in the Wolfram technology. However, I think the language design and the understanding of what makes a good language is quite flawed...

Just one example from the article: "While the core of a standard programming language typically has perhaps a few tens of primitive functions built in, the Wolfram Language has more than 5600—with many of those individually representing major pieces of computational intelligence"

Looking at those "primitives", I would rather interpret them as "functions" or even "classes" instead of primitives. And other languages do indeed have lots of them, arguably more than the Wolfram Language. Which leads to another problem with the language design: most of those functions are too restricted and too inextensible, IMO. Also, the tech is quite a closed system, which makes it hard to really build upon all the great work the Wolfram team puts into the technology.


The first sentence of the report defining the Scheme programming language:

"Programming languages should be designed not by piling feature on top of feature, but by removing the weaknesses and restrictions that make additional features appear necessary."


Imagine the impact on the world if this was FLOSS.


Stephen Wolfram claims (credibly, IMO) that none of it would have existed if it was FLOSS.


That is true and irrelevant.

It would also not exist without other free software. Though this too is true and irrelevant.


I can't figure out what he means by "computational language." It seems to only be defined in a circular way:

>So what is a computational language? It’s a language for expressing things in a computational way—and for capturing computational ways of thinking about things.

I'd love to see him explain what he means without further use of the word "computational" in his explanation.

As long as use of the language stays behind a paywall I doubt it is going to get wide adoption. But having seen some of his previously published examples of using the language, it does seem powerful.


When you read a great novel, you almost feel as if you're there, in your head. When I read your code, I know what the program does, but if it's elegant or even brilliant, I have no idea what it was like in your head the day you wrote it, or where that epiphany percolated from.

I think a computational language would emphasize how you perceive the program in your head, or at least convey reliably how other people do, AND do the thing (unlike state diagrams, flowcharts, UML, etc.)


He seems to operate on a very narrow definition of what a programming language is, as if symbolic languages and homoiconic languages and declarative languages and logic languages don't already exist.

Really any language used to specify a computation is a programming language, and so "computational language" and "programming language" are synonymous. Wolfram certainly knows this. His post is just marketing.


It's "computational" now?

He used to make a big thing about it being a "symbolic" programming language, which made more sense to me - a programming language that treats symbols as a first-class concept is indeed a rare beast. Only computer algebra systems really attempt this, and of those, few have really well-designed programming languages. Yes, Lisp has symbols, but you can't really do much that's interesting with them directly. (sqrt x) when x is unbound is an error, not the abstract concept of taking the square root of an unspecified value. The idea of taking something that would normally be a "compiler optimization", like sqrt(x^2) => abs(x), and making it available at runtime as part of the evaluation model, can be seen as an evolution of the powerful idea of including (eval) in the language.


>a programming language that treats symbols as a first-class concept is indeed a rare beast

JavaScript

> Only computer algebra systems really attempt this

There are a bunch of programming systems (incl. computer algebra systems) on top of Lisp, which use various types of symbolic expressions.


How does JavaScript qualify? x2 => ReferenceError: x is not defined, same as in any other non-symbolic language.

As for libraries: yes, there exist symbolic manipulation systems for Lisp, and JavaScript, and Python (notably SymPy). But none of them integrate seamlessly with the language, because there are fundamental conflicts with the language's default evaluation model. You need to design something with that in mind from scratch, and doing it in such a way as to make something elegant and composable, and not a huge mess, is really hard.
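(To make the contrast concrete, here is a small SymPy sketch. In SymPy, an unbound symbol is a value rather than an error, and the sqrt(x^2) => abs(x) rewrite mentioned upthread is available, but only through explicit calls like simplify rather than through the language's default evaluation, which is exactly the integration gap being described.)

```python
import sympy as sp

x = sp.Symbol('x')          # an unbound symbol: a value, not a NameError
expr = sp.sqrt(x**2)
print(expr)                 # stays symbolic: sqrt(x**2)

xr = sp.Symbol('x', real=True)
print(sp.simplify(sp.sqrt(xr**2)))   # the sqrt(x^2) => abs(x) rewrite: Abs(x)
```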


> How does javascript qualify?

It has symbols as a basic data type, slightly similar to Lisp.

> But none of them integrate seamlessly with the language, because there's fundamental conflicts with the language's default evaluation model

Lisp has symbols to implement these other evaluation models.

Every evaluation model will have some problems integrating other ones. Most of them have a particular purpose: algebra, theorem proving, predicate logics, planning, ...


Does anyone here have experience programming (or even playing with) Wolfram language?


Yeah, I've used it quite a bit and would consider myself an intermediate user. It's a very pleasant language to use (everything is an S-expression underneath, and computation happens via "term rewriting") once you get used to it, but its declarative nature makes it similar to e.g. Haskell/Lisp and quite different from e.g. Python/Matlab. It's basically unmatched for symbolic manipulation and algebra, so it's very popular among theoretical physicists (that's how I encountered it).

My only gripe with it is that it's not open/free (as in freedom), so the platform is limited by Wolfram's imagination (and different pieces don't always compose nicely, unlike say, in Julia) and I don't want to become dependent on a computing platform whose future is murky (bus factor of one, at least from what Wolfram takes credit for). Otherwise, I wouldn't mind paying for a personal use license and using it more.


Played with it during Advent of Code. It really disappointed me. The first 10 minutes of discovering the language were great--in theory it had so much potential between how errors automatically propagate up, the extensive library, and metaprogramming abilities of the language. The syntax is unconventional and to a certain degree off-putting (coming from a person who doesn't mind Erlang, Prolog, Lisp, Forth, Haskell, ...) but usable. The novelty starts to fade when you realize how poorly suited it is for general purpose programming. Despite having features for everything, it's hard to make them all connect. I spent hours trying to figure out how to use the DFS functionality in remarkably simple ways, for example. Maybe it was the documentation that failed me, but that doesn't really help sell the language either if that's the case. I think if Wolfram actually cared to "come down to earth" and think about users it could be a great language--the potential is definitely there--but right now it really looks like he jumped the gun with marketing and left out a lot of the important groundwork.


It really is well-designed and pleasant to use for the sort of programming that scientists and (real) engineers do. Not just symbolic math, but any kind of technical computing, done at least partly interactively, on a single powerful machine.

Stephen Wolfram actually is very good at designing a programming language and standard library (and he does personally oversee most of it). A lot of work has gone into picking the right set of functions to provide at the right level of abstraction.

Tools are provided to avoid a lot of bullshit: for example, there are just the functions "Import" and "Export", which mostly work automatically and can read and write all kinds of bizarre file formats. You don't need to go dig for an unmaintained library on pypi.

It's also probably the most visually pretty programming environment you can use, which ended up being more important to me than I would have expected.

The licensing cost doesn't seem like such a big downside to me, because it is probably worth $115/month for someone who would use it. Obviously much cheaper for academia.

The closed-source aspect is a bigger problem; if it doesn't do quite what you want, you really are out of luck, and there's no way to crack it open and make a small change or see what's happening. On the other hand, most of the target market (engineers and scientists who only write code to produce something else they want) would never do that anyway.

Support for machine learning used to be nonexistent but is now pretty good although a little more opaque to the end user than it ought to be. Somewhat separate from that, they have a set of Keras-esque neural network facilities which is extremely good in terms of design and usability.

As for the gee-whiz features, mostly they aren't that useful, unsurprisingly. The feature to translate inline natural language input to expressions of code (done with Ctrl-=) is surprisingly nice for small things, however, because it can handle dates and times really easily.


I like to play with the subset implemented in Mathics. It's not a bad language. As people have said, it is basically Lisp written in bracketed M-expressions rather than S-expressions.


Yes, I've taught with it in a multivariable calculus class. It's pretty easy to learn, and by the end of the semester the engaged students are doing amazing things. The notebooks are very useful, and will feel familiar to anyone who's used Jupyter notebooks. The language itself is what you would get if Python and all of the popular libraries were programmed by a single person with a common syntax and good documentation.

I probably wouldn't use it for "big data" analysis, but you can use it for just about anything else that is math or even slightly related to math- fluid dynamics, solving systems of equations, statistics, optimization, stock market charts, plotting values on a map, etc.

If you can get someone else to pay for it then it's absolutely worth the money- basically, if you're just a hobbyist then you probably don't need to buy it, but if you need to calculate integrals for your job then you need it.

I realize I've said mean things about Wolfram's attitude before, and that it isn't actually revolutionary, but it's still a great product. Kind of like how you can't really call an iPhone X revolutionary, but it's still really good if you can pay the sticker price.


A little bit, but fresh. Parroting some things too.

Pros:

(1) On the surface it's mostly an M-expression variant, but underneath is a term-rewriting system. This makes it, well, anything. Multi-paradigm is an understatement. Feels like a better layer of abstraction to float in than "code is data".

(2) That TRS has possibly the biggest rule set in existence. I doubt anyone but Stephen has a good idea of the breadth of it, but it just works.

(3) It's been around for 30 years (40, if you count SMP). Most notebooks from 30 years ago will run just fine on the latest version. This means it has good foundations. They're gonna add an LLVM compiler in there soon, and so far they make it seem like just another day at the office.

(4) Did I mention the giant rule set? It had a function for anything that came up on my plate, well integrated with everything else, and well documented (color-coded!), runnable examples and all.

(5) Notebooks and other facilities in the language/environment encourage exploration. I've barely begun working with it, but I've already used it thrice to solve some tricky problems. Porting that to whatever I use in production seemed trivial so far, once I grokked it with a .nb first.

(6) You can simply reach out to curated datasets, and they're really impressive. I feel like they covered the beginner level pretty well, as well as a lot of specialist areas, but there are holes in the "middle". This doesn't sound like an intractable problem, and it's only getting better. Playing with their neural net repository is something I look forward to.

(7) A lot of what would usually be boring boilerplate elsewhere (unit conversions, plotting parameters, etc.), here can be done via natural language input.

(8) Makes math look nice and readable.

Cons:

(a) Some areas grow neglected at times (e.g. graphs at the moment). But that sounds natural - this is a tree or a forest, not a flower.

(b) It looks usable in production as a live system, but you'd better prepare a budget.

(c) It's not open source. You'll be hard pressed to know what exactly is happening under your code. As I grow older, I care less and less.

(d) You have to wait for updates for fixes/features.

(e) I'm afraid that if you don't have at least some inclinations towards science (vs. "programming"), you're not going to enjoy it. This is not a tool to maximize CRUD app output.

Remarks:

(I) After watching some of Stephen's streams, I find it harder and harder to justify attacks on his ego. He sounds like a really smart and sensible guy, with the enthusiasm of a child and huge ideas that may or may not pan out.

(II) "Computational" thing he's pushing for makes perfect sense to me. At even rudimentary level, I feel like end-game for math notation (or music, for that matter) isn't some prescriptions of Royal Society of Mathematical Notation, but exactly something like Wolfram Language is. And on high level, "mining computational universe" for actual practical stuff also makes sense. But we'll see. Feels right but far away.


It's great for experimental mathematics. Not so much for software engineering. Easy to write, hard to read/maintain, poor performance.


I came across Mathematica via Wolfram Alpha while studying for a Maths degree. I was quite astonished by its capabilities, and have been playing with it ever since.

For context I work full time in an industry unrelated to software, and have kids and all the chaos that goes with them, so have relatively little time to devote to programming. I need something that 'just works'.

I've always felt that I should enjoy programming, and have had several false starts including HyperCard and Eiffel back in the day. But Mathematica / Wolfram Language is the first time programming has actually clicked.

There are various aspects of the Wolfram Language that make it work for me:

- The documentation is enormous, comprehensive, and even editable and executable (desktop install). I haven't seen anything that comes close in any other language (Racket would probably be a distant second). In the snatches of time I have, all the information I need can be found using the F1 key; I don't have to waste time going to Stack Exchange and asking others for help

- The concept of everything being an M-expression makes the language very logical. Sub-expressions can themselves be evaluated and understood, larger expressions built out of smaller ones, etc.

- Lots of syntactic sugar IMO makes the M-Expressions more readable than Lisp S-Expressions, while retaining their usefulness in making code understandable

- Very powerful pattern matching and structural operations on expressions, which are great tools for manipulating expressions and extracting code or data

- Strong support for functional programming, which I find to be more enjoyable than procedural programming

- The language is symbolic, which often allows you to 'play' with programs and understand how they will work in an abstract way before using with real data. As a toy example you can literally fold an abstract function with abstract expressions, e.g. inputting Fold[f,x,{a,b,c,d}] returns f[f[f[f[x,a],b],c],d].

- a huge standard library built in, all working in a way that is remarkably consistent for such a wide diversity of domains, and a language that has been in development for 30+ years

- interactive notebook programming allows me to document my notes and progress along with the code
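The Fold example a few bullets up has a direct analogue in most functional settings; here is a small Python sketch that mimics the symbolic result with strings standing in for unbound symbols:

```python
from functools import reduce

# Mimic Wolfram's Fold[f, x, {a, b, c, d}] symbolically: each step wraps
# the accumulator and the next element in a textual f[...] application.
result = reduce(lambda acc, v: f"f[{acc},{v}]", ["a", "b", "c", "d"], "x")
print(result)  # f[f[f[f[x,a],b],c],d]
```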

I did try a few FLOSS options, primarily for access to their communities and also ease of deployment (e.g. F#, Clojure, Java), but haven't found anything that comes close for my particular needs. I suspect that Racket would be the next best thing.

I think the Wolfram Language does suffer from the Lisp curse, in that it is sufficiently powerful and well documented that individual developers can go a long way without recourse to a community, which in turn hinders the establishment and growth of such communities.

I would strongly recommend spending some time learning the fundamentals of the language. My favourite resource is Paul Wellin's An Introduction to Programming with Mathematica (2013). I suspect a lot of the criticism of Mathematica comes from those who haven't learnt the fundamentals, and expect it to work like Python or Java etc, when in fact it is more like a cousin of Lisp.


I had never heard that explanation of the Lisp curse and it is very insightful; thank you!


It was great for solving integrals in high school calculus many years ago, not only providing the answers, but also showing the steps it took to get to them. Even in the 90s, it did pretty impressive symbolic math and plotting. This was, of course, before Wolfram discovered the web and started broadcasting his monster raving egomania.

I imagine it's still a pretty useful tool.


The funny thing about all this is that I think - despite all his bluster - that Wolfram is basically right about how uniquely powerful this stuff is in terms of making advanced symbolic programming with complex data accessible and visible, and in providing a rich-text notation to express computations. I think that Mathematica genuinely captures a piece of what it would mean to have a computer as a mind-amplifier, a tool rather than a mere appliance, one which is frictionless and accessible enough to be usable by everyday people. I think he's actually on to something (as opposed to merely being on something).

...The problem is that it's attached to the rest of the Wolfram language - a painfully awkward Lisp-ish thingy dependent on an expensive closed-source platform and a standard library that seems to contain everything you could ever possibly need to do ... but which is so intimidatingly enormous that you can't keep track of it, which turns every program into a trek through the documentation in case there's something there that already does what you need (and then trying to figure out how to plumb all these bits together.) You can do virtually anything with the Mathematica standard library ... which is good, because trying to actually write de novo code is despair-inducing.

And Steve doesn't seem to be able to recognize that Paragraph 1 is ultimately crippled for widespread utility by Paragraph 2.

I do hope that eventually someone manages to take the notational insights of the Wolfram language and apply them to some other platform. There's a vague vision in my head of something that combines aspects and insights of Wolfram/Mathematica, HyperCard, Jupyter, and Excel[1] to create a truly flexible and accessible end-user programming environment.

[1] people really underrate Excel as a programming environment, honestly. Yes, it's crippled and leads people to produce massive gross un-debuggable hellsheets ... but there's reasons (beyond just "it was the only usable software that could be run on office computers") that end users with a problem to solve keep turning to it despite those flaws. The combination of reactive programming, data-first visibility, no hidden state, decent approximations to structured programming by way of click-drag-and-copy-paste, and being able to reference variables and values without needing to name them has some kind of magic to it. There's quite a bit of interesting research on how to take something like Excel and turn it into a non-crippled programming environment - spreadsheet-defined functions (including recursion, lambdas, and higher-order functions natively in the spreadsheet environment, without having to drop into VBA!), dynamic arrays, an alternate computation-first textual view that exists simultaneously with the data-first spreadsheet view, first-class complex data structures ...


On [1], you would probably like Mesh Spreadsheet. Spreadsheet-defined functions (via IPC), lambdas, dynamic arrays, text pane for showing and editing sheet-as-code, editing tables as a first-class data structure, all portable and free and based on JS as the formula language. https://github.com/chrispsn/mesh


> but which is so intimidatingly enormous that you can't keep track of it, which turns every program into a trek through the documentation in case there's something there that already does what you need (and then trying to figure out how to plumb all these bits together.)

This hit the nail on the head for me. Thanks.


"But I wish it was free/open/FLOSS!"

I'll take usable documentation and 30+ years of continual improvements with widespread use in both academia and industry (including supporting a number of research scientists/mathematicians/programmers) over any busted piece of FLOSS ghostware any day.


I really like Mathematica and how fluid it is at handling abstract logic. However, I don't like that you end up with giant dense blocks of code with little visual outline to indicate the structure of how it's put together.


Absolutely one of the most fascinating articles I've ever read; cannot thank you enough for posting this


My Oxford Dictionary of Computer Science defines a computer as "a device or system that is capable of carrying out a sequence of operations in a distinctly and explicitly defined manner."

There is no de facto computer science orthodoxy, so let's take that definition as a shred of a candidate orthodoxy to think inside.

Let's go further back in time with this 'device'-oriented way of defining. Through correspondence, we can see Newton's contributions contain the https://en.wikipedia.org/wiki/Fundamental_theorem_of_calculu..., but this was an emergent process. The method behind the 'grunt work' of this process apparently had two camps, Abacists and Algorists, and he was more the former. https://en.wikipedia.org/wiki/Abacus#/media/File:Houghton_Ty...

The Oxford English Dictionary gives the origin of 'calculus' as "Mid 17th century: from Latin, literally ‘small pebble (as used on an abacus)’."

It's easy to imagine a layperson watching Newton fiddle with his abacus and presuming he was doing whatever pragmatic bean counting was prevalent in his time. Most people wouldn't spend time pondering and inventing new usages of the abacus. So it's easy to imagine we'd do the same with Stephen.

As a device, the abacus doesn't conform to the initial definition of a computer (on its own it is neither a system nor a device that can carry out a sequence of operations).

The abacus seems much more like the contemporary 20-dollar TI scientific calculator. The human has to manually administer the sequence. The system of thought could be arithmetic, algebra, etc., but a contemporary vanilla calculus textbook does not describe a Turing-complete anything. Without some theory to bootstrap the idea of how recursion/looping works, it's going to be hard to even metaphorically mechanize the idea of sequencing.

Enter lambda calculus and Church numerals. With these systems, you can systematically represent mathematical ideas with whatever notation you want. Church goes the other way: instead of inventing new squiggles, he reduces the 'character set' for his notation and ends up with really long expressions; for example, the number 3: λf.λx.f (f (f x)). The algorists were unconstrained by ASCII and Unicode, so they may have balked at the aesthetics of these expressions, but it's not inherently great practice to have ad-hoc invention of notation. Wolfram's link to the history of mathematical notations starts to trail off, probably correlated with Church rendering the need for new squiggles optional.
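That encoding is easy to play with in ordinary Python, since closures are lambda terms in disguise (a sketch, not a formal lambda-calculus evaluator):

```python
# Church numerals: the number n is a function applying f to x, n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

three = succ(succ(succ(zero)))  # λf.λx.f (f (f x))

def to_int(n):
    # Recover an ordinary int by applying "+1" to 0, n times.
    return n(lambda k: k + 1)(0)

print(to_int(three))  # 3
```

Addition, multiplication, and so on fall out as more function composition, which is the whole point: one tiny 'character set', arbitrarily expressive.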

Contemporaneously (1930s), Turing contributes the idea of a Turing machine, but this is an abstraction. It doesn't actually carry out the sequence of operations. This is after the Enigma machine, but ultimately we don't spend much time fussing over who specifically invented the physical abacus, slide rule, digital calculator, or digital computer. It's an emergent process, and as a system we don't see a pool of folk communities of "algorists vs abacists" until sometime around the Homebrew Computer Club.

Because of this, it feels superfluously flippant that Stephen Wolfram is always coining "new this" and "Wolfram that" when we think about Newton leaving others to coin "Newtonian physics." Personally I can just mentally redact this aspect of his behavior.

So the question is: is Stephen actually talking about a new thing that exists? Is a "computational language" a thing? I'd say he's probably right that it's coming, but the Wolfram Language isn't quite yet that thing.

What he is pointing out here is the missed opportunity to have a vernacular, or a text in the semiotic sense, to depict the actual computational aspects of our calculations. Moving the calculator out of our/Newton's brains and into the device, as a uniform-ish human collective activity, wasn't really happening in significant numbers until the 70s. Significant as defined by my metric of "the amount of people who used the abacus back in the day." Once we let the computer do the task, we didn't force it to show its work. But if the computer carries out the operations, what is it we're still operating manually, and why do we continue to interact with it at all?

Well, we don't have a language to describe what we're doing and thinking in those gaps, and we're rarely automating state diagrams and flowcharts from our code, much less starting by just painting out the state diagrams and flowcharts and letting the computer do the work.

Perhaps Wolfram will set the stage for a de facto method toward this type of higher-order representation, aka "computational language." He's definitely trying, but Newton's notations didn't catch on; it's up to you to decide whether the representation or the premise is more important.


> Newton coined 'Calculus,'

Minor point, but the term "calculus" was in use before Newton's "fluxions" to refer to various mathematical systems, just as we still have Lambda calculus, etc. Leibniz used the term in his approach, iirc.


Thanks for reviewing. Fixed the discrepancy, hopefully made it better.


It’s hard to think about Wolfram. His tone is so off-putting — he’s constantly discovering the glorious and singular capabilities of his own products — but for instance he did basically invent the now-ubiquitous idea of the computational notebook.

Sadly I don’t know a single person who has used Wolfram’s software extensively. I haven’t used it myself extensively. Is that because it’s not what it’s cracked up to be, or because it’s a closed world?


Frankly, I'm really, really tired of this being brought up in every single Wolfram-related thread.

Mathematica is used a lot in academia, and you'll often find it cited in Methods sections. I personally know a lot of incredibly talented and prolific physicists who swear by it and have used it to get very good work done.


It'll be easier to stop when the posts stop insisting on coining "new kinds of X".

The only innovation I see here is an unusual attention to constructing visual representations and using them as identifiers in code. Which is cool! But insisting it's new demeans the work of thousands of engineers who do similar and prerequisite work, and who make an effort to situate their work in the literature and industry rather than insisting on some kind of exceptionalism.

Arrogance should be called out in every instance, because it's not actually a route to the greatest impact.


I didn’t mean to say that nobody uses the software, just that I don’t know anybody who does. Which just makes it hard for me to get a handle on its value.

The question I’d ask those physicists is whether Mathematica does the kinds of things Wolfram claims for it / does things other programming languages can’t do. I wouldn’t be surprised if the answer was “yes,” I’m just not sure.


For some mathematical work, going from Mathematica to Python/Julia/Octave/Scilab would be a significant step backwards.

This is coming from a pythonista btw. To each their own. Python is a better scripting language, but Mathematica is a better analysis language for a lot of things. Python is catching up in a lot of ways though with Numpy and Tensorflow. I'd say Python's Tensorflow is a lot more mature than Mathematica's neural nets, but Mathematica's symbolic math seems to be world class.


I used Mathematica as a graduate physics student. It is an incredibly powerful equation solver and visualizer. I do not think I would easily be able to do what I did in Mathematica in another language. That said, I don't use it anymore.


Even Trump is less self-aggrandizing than Wolfram. It’s just hard to take him seriously

> It’s only recently that I’ve begun to properly internalize just how broad the implications of having a computational language really are—even though, ironically, I’ve spent much of my life engaged precisely in the consuming task of building the world’s only large-scale computational language.

I guess the rest of us will just need to muddle through with the (apparently) small-scale (non?)-computational languages that we use.


"I only now realize how great my contribution is"? Gag me.


>Even Trump is less self-aggrandizing than Wolfram.

I think that might be a little unfair. Wolfram's certainly more articulate in his egotism, but nobody matches The Donald for sheer volume:

"I understand things, I comprehend very well. Ok? Better than, I think, almost anybody."

https://www.youtube.com/watch?v=5GqJna9hpTE


Python is used more than Mathematica, but you never read a blog by Guido van Rossum claiming that he changed the world. It's the messiah complex that people find off-putting, not that he has put together a good product.

Mathematica/Wolfram Language is a perfectly fine programming language. It has an amazing standard library. It's definitely worth the money (if you're working in a field that benefits from it). But that's it. Don't call it a computational language. It isn't a new way of thinking. It isn't revolutionary, there is literally nothing in Mathematica that can't be done in another language just as easily (except the CAS engine is world leading). But somehow these mundane blog posts by Stephen Wolfram make it to the front page of HN every few months, so someone must be upvoting them.

So maybe his hype machine is working as intended: he is in charge of a successful company and I'm hiding anonymously behind a keyboard. So what if a few people don't like his attitude? He's more successful than most of us.


Technically that honor goes to a lesser known program known as Mathcad, which had "worksheets" (basically notebooks) in 1986.


That may be true, but in case anyone thinks the two are comparable:

I used Mathcad a bit in undergrad and my wife (civil engineer) used it extensively for all engineering courses for 4 years. One of her main professors wrote an entire civil engineering textbook in Mathcad and that was why they pushed it. It was a decent and cheap math tool, but Mathematica is orders of magnitude more mature from a language, functionality, graphics...really it beats Mathcad in everything except cost. The gulf between the two is like Windows 95 and Windows 10. Mathematica has support for neural networks, time series data, blockchain, 3D printing, running on Arduino, transpiling to C, Natural language processing, insanely detailed graphics primitives, web crawling...etc etc.

Matlab and Maple have done a better job keeping up, but Mathematica beats those easily as well in my opinion although they are somewhat different products.


> His tone is so off-putting

His writing does come across that way. He is self-confident, certainly. I have met him in person. He is actually a kind and generous person. He is also very likely to be the smartest person in the room most places he goes. I met him at a place where there were a lot of other math PhD's in the room (not me by any stretch) so maybe he felt he was among his people and was mixing more naturally. But in any case, my approach was to shut up and listen because I was more likely to learn something that way.


I've used Mathematica extensively, and it's very nice. It's better at symbolic algebra than Matlab and much more intuitive than Maxima. Its primary strength is computing purely symbolically with no numerical methods, for when you have no tolerance for numerical error.
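A toy stdlib-Python illustration of the exact-versus-floating-point distinction the parent is pointing at (this is just basic rational arithmetic, not a symbolic engine like Mathematica's):

```python
# Floating point accumulates rounding error; exact rationals do not.
from fractions import Fraction

print(0.1 + 0.2 == 0.3)  # False: binary floats can't represent 0.1 exactly

# The same sum done symbolically-exactly with rationals:
exact = Fraction(1, 10) + Fraction(2, 10)
print(exact == Fraction(3, 10))  # True: no rounding anywhere
```

A full CAS extends this "never round, keep the exact form" idea from rationals to radicals, transcendental constants, and whole unevaluated expressions, which is where the heavy lifting (and Mathematica's strength) lives.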


I've used it a bit and it is pretty powerful to say the least as they've been shoving functions into it to cover nearly every domain of computing for decades in a consistent manner.

The notebooks are great, but I find Jupyter notebooks to be good enough even if they aren't as good in many ways if you aren't skilled in markup.

One of their senior scientists (Matt Trout) has some insane blog posts that show off the power of the language. He has one on using Laplace transforms to hide an image of a goat. It is basically pages of math. It would take two or three times the code in Python, I bet.

Annoyingly, I wouldn't really use it in production as deployment looks painful and it limits the number of cores you can use. I usually use it as a super powerful prototyping tool.


When Nassim Nicholas Taleb posts stats stuff on Twitter, it is almost always Mathematica.


Yawn.


Can you please not post unsubstantive comments or shallow dismissals to HN?

https://news.ycombinator.com/newsguidelines.html



