
I find it amusing that many people (hackers, seemingly) concentrate on Wolfram's ego rather than on the actual thing at hand. I don't care about egos, lipstick color, or the pants the language creator wears. I care about the language, so let's talk about it:

From a long-time Mathematica user: this is not a language I like. I use it, but there's little to no fun in it. Many things aren't immediately clear, or become problematic if you stop using the language for a while. As an example: quick, what is the difference between Module[], Block[], and With[]? Even if you remember that With[] is a lexical scoping construct, Module[] and Block[] can be confusing.

It doesn't help that Mathematica is absolutely horrible for debugging code. I don't know why they put so little emphasis on their REPL, but it's way behind the times. Even mismatched parentheses can cause serious pain (and I hate the fact that things move around when I edit parentheses).

That said, I have a lot of respect for Mathematica as a complete system. It is incredibly useful; most mathematical tools you will ever need are there. It is also a great tool for exploratory data analysis. Have a pile of data you'd like to see patterns in? Load it up, mash it into shape with high-level functions, then quickly produce great-looking graphics. Nothing even comes close to the flexibility of Mathematica here.
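As a sketch of that workflow (the data here is synthetic, purely for illustration):

  In[1]:= (* "mash into shape": build noisy samples of a known signal *)
          data = Table[{x, Sin[x] + RandomReal[{-0.1, 0.1}]}, {x, 0., 10., 0.1}];

  In[2]:= (* one more line to get a publication-ready picture *)
          ListPlot[data, Joined -> True, PlotLabel -> "Noisy sine"]

Two lines from raw numbers to a finished plot, which is exactly the kind of turnaround that keeps people on Mathematica for exploratory work.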




Not sure if you just want commiseration and are having trouble remembering which keyword is which, so I apologize if I'm explaining too much. For me, though, deciding which to use is more about grappling with the difference between dynamic and lexical scoping than about which concept maps to which keyword. Just as knowing more about a person helps me remember their name, knowing more about a function helps me remember its name.

What helped me straighten it out was to evaluate the examples here with Trace: http://reference.wolfram.com/mathematica/tutorial/BlocksComp...

  In[1]:= m=i^2
  Out[1]= i^2

  In[2]:= Block[{i=a},i+m]
  Out[2]= a+a^2

  In[3]:= Module[{i=a},i+m]
  Out[3]= a+i^2

  In[4]:= Block[{i=a},i + m] // Trace
  Out[4]= {Block[{i=a},i+m],{i=a,a},{{i,a},{m,i^2,{i,a},a^2},a+a^2},a+a^2}

  In[5]:= Module[{i=a},i+m]//Trace
  Out[5]= {Module[{i=a},i+m],{i$5592=a,a},{{i$5592,a},{m,i^2},a+i^2},a+i^2}

It's a bit dense to read through, but with Block, "i" has been assigned the value "a" yet still remains in the form "i" throughout the evaluation; once "m" is replaced with "i^2", the "i" is seen and replaced with "a", which is normal dynamic scoping.

In Module, however, "i" is replaced by a temporary variable with the form "i$5592", which does not have a definition anywhere in Mathematica, so when "m" is evaluated, "i" is left alone, which is how lexical scoping acts.
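For completeness, With[], the lexical construct mentioned upthread, substitutes its bindings into the body before evaluation even starts, so no local symbol survives to be captured. A sketch using the same "m" and undefined "a" as above (from memory, so worth re-running in a fresh kernel):

  In[6]:= With[{i=a},i+m]
  Out[6]= a+i^2

The result happens to match Module's, but the mechanism differs: With performs a one-shot syntactic substitution (you can't assign to "i" inside the body), while Module's "i$5592" is a real, if temporary, symbol.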

I dislike forgetting parts of the languages I use. C# is full of things whose behavior I have to look up, as are JavaScript and Python, and, especially while I'm learning them, Clojure and ClojureScript.

But what's nice about this language is that these two concepts, Module and Block, while having different meanings, have the same form.


If this thing is actually great, I feel that the license is important. If it's too restrictive, what's the point? We're not living in the 80s.


Correct me if I'm wrong, but the ego of the language creators seems to shine through in at least this[0] reference page, though. The most important bit I'd like to get some clarification on:

> Long viewed as an important theoretical idea, functional programming finally became truly convenient and practical with the introduction of the Wolfram Language. Treating expressions as both symbolic data and the application of a function provides a uniquely powerful way to integrate structure and function—and an efficient, elegant representation of many common computations.

I see nothing on the reference page that could be said to make Wolfram's FP constructs more convenient or practical. They make it sound like FP was a long-lost dream until Wolfram came along and actually made it happen, while seemingly using the exact same constructs as everyone else.

What you do actually get from the reference page, though, is that they went the Haskell route and decided the FP stuff shouldn't be readable by anyone who hasn't used the language before. This is all fine and dandy, really; it's a model as good as any other, since it at least gains writability where it loses readability... They aren't doing anything new, from the looks of it, though. They seem to be doing precisely the opposite... in exactly the same way.

[0] - https://reference.wolfram.com/language/guide/FunctionalProgr...


Because ego blinds one to effective criticism. And the criticism many of us have with Wolfram products is that while smart in limited domains, they can be blindingly stupid in others (your various examples providing evidence for this). If he listened to and respected our complaints, maybe we'd have a better product. But no.


Not really. Stephen knows how poor our debugging facilities are (and I regularly tell him how bad the situation is) -- we've just had limited bandwidth to actually fix it properly to our satisfaction.

Luckily I have a totally revamped debugging and logging system that will put us ahead of the curve here (imagine DTrace at a program level). Probably in a point release of v10.

Same goes for multiple undo and retina support -- we know how embarrassing their absence is, and we've luckily been able to fix both for v10. I've been in design meetings where multiple undo bumped several other desirable features off our roadmap.


> will put us ahead of the curve here

Unless it's the same kind of curve Go is judged against (aka the curve of 30 years ago), that's a very tall order. Are you working on something better than a time-traveling debugger?

> imagine DTrace at a program level

so... DTrace?


I should have said DTrace for functional programs.


...which Lisp has had since before I was born


How about hashmaps, something like records, and other constructs for building large, long-lived applications?

I'd kill for hashmaps, some sort of records, and strong typing in Mathematica.



Awesome. Is this a new thing? I can't find it in the Mathematica 9 documentation.

So now just strong typing and something like records are left for me for general-purpose development.



They've existed in a significantly more powerful form as DownValues since Mathematica's release.
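You can see this directly: the obj["a"] = 1 style of definition shown downthread is stored as DownValues, and because the left-hand sides are patterns, they're strictly more general than a plain hashmap. A sketch (the printed HoldPattern/RuleDelayed form is from memory, so verify in a live kernel):

  In[1]:= obj["a"] = 1; obj["b"] = 2;

  In[2]:= DownValues[obj]
  Out[2]= {HoldPattern[obj["a"]] :> 1, HoldPattern[obj["b"]] :> 2}

  In[3]:= obj[_] = 0;   (* a default value, something a plain hashmap can't express *)

  In[4]:= obj["missing"]
  Out[4]= 0

Specific definitions are tried before the general pattern, so obj["a"] still returns 1.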


what's wrong with just using

    obj["a"] = 1;
    obj["b"] = 2;
    Print[obj["a"]]
or

    obj = { "a" -> 1, "b" -> 2 };
    Print["a" /. obj]

?


Because those are really poor substitutes for Associations, though one can make do.
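For anyone who hasn't seen them, the announced v10 Association syntax looks roughly like this (based on pre-release material, so details may shift before it ships):

  In[1]:= assoc = <|"a" -> 1, "b" -> 2|>;

  In[2]:= assoc["b"]
  Out[2]= 2

  In[3]:= Lookup[assoc, "c", 0]   (* lookup with a default for a missing key *)
  Out[3]= 0

Unlike the downvalue trick, an Association is a first-class value: you can pass it around, map over it, and nest it without inventing a new symbol each time.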


Having real undo sounds great. Also, thanks for fixing plot legends in Version 9. Because of that I now use MMA for publishable figures.


This is good to hear. But the problem is that the curve moves all the time, and Mathematica seems to be catching up very slowly.

As for the REPL and general editing, if I were you, I'd quickly go towards LightTable integration — it should be doable, and the effects could be spectacular.


That might work, but it's not clear how all the graphics, image processing, user interface, and dynamic interactivity stuff would work.

Still, it would be interesting to try. I have great enthusiasm for what Chris Granger is doing.


...and the ability to actually abort long computations? ;-)


Yes! At least a triple abort should kill the kernel! P.S. Hi again!



