Hey all! I've been working on this on-and-off for a few months, but it is sort of in a broken state right now on GitHub. I'm playing with some modifications to the charts and dependency graph, and the localStorage autosave/load is a bit glitchy.
Just saw Dan posted it now, and figured I should chime in (and push my last few commits).
Victor has some interesting ideas about interfaces, but his campaign to replace simple, powerful, and quick algebraic manipulation with slow, laborious, trial-and-error pushing of numbers around with a mouse is, I think, seriously misguided. Algebra, in addition to giving you an exact solution more quickly, can help you discover general formulas and reveal relationships that you can use again the next time the problem comes up. The scrubbing calculator approach means you have to start over from scratch each time. You're throwing away all the conceptual power of mathematics and trying to replace it with arithmetic.
I don't think this is inherent in the idea. You can absolutely make an interface somewhat like this that arrives at general solutions. Indeed, I think Bret would consider this most of the point. He covers it in "Up and Down the Ladder of Abstraction" [0].
If by algebra you mean "the transformation of one written notation to another by rule," that is only one way of finding general solutions to mathematical problems. There is a lot of writing about how people doing original math actually think, and many of them work in ways far removed from algebraic notation. Some seem to develop graphical models they picture in their heads; others report using other senses like sound and smell.
Freeman Dyson on Feynman: "The reason Dick's physics was so hard for ordinary people to grasp was that he did not use equations... Dick just wrote down the solutions out of his head without ever writing down the equations. He had a physical picture of the way things happen, and the pictures gave him the solutions directly with a minimum of calculation."
Einstein on his thinking: "The words of language, as they are written or spoken, do not seem to play any role in my mechanism of thought. The psychical entities which seem to serve as elements of thought are certain signs (symbols) and more or less clear images which can be voluntarily reproduced and combined... The above mentioned elements are, in my case, of visual and some muscular type."
The idea that "real math is bare symbols" isn't true. Not only isn't it true for doing groundbreaking math, it isn't true even for easy math. I would challenge you to teach children mathematics as a purely symbolic game, never showing them that numbers correspond to physical entities, or that derivatives correspond to tangents on a curve. Those of us who are "good" at algebra are good at it because we understand it in a way deeper than the symbols.
Yes, symbolic notation is a great form of communication and necessary for proof, but it is usually not the way to create the initial intuitive spark underlying a solution.
I would add to your references Hadamard's book [0], The Psychology of Invention in the Mathematical Field, where he talks about how ideas form in various mathematicians' heads.
I don't really disagree with you, it's just that Victor seems to be claiming too much for his new interfaces, and selling symbolic manipulation short. Of course all kinds of things need to be happening inside our heads before we start manipulating symbols. But Feynman's papers and books are full of equations and derivations, not just answers. Most of Victor's examples are actually counter-examples, because they are examples of problems where a little algebra would not only be faster and easier but would yield a simple formula that you could use later, the next time the problem comes up.
Take the book design example. What happens the next time you need to fit a similar graph on the page? If you had used algebra you could have written a little calculator in Python in which you could plug in the new numbers and which would return the solution. Victor's way requires you to treat each graph-fitting problem as a special case, even if they're all the same.
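To make the point concrete, here is a minimal sketch of the sort of "little calculator" I mean. The scenario and every number in it are hypothetical, not Victor's actual example: fitting n equal-width bars (with fixed gaps and margins) across a page. Do the algebra once and you get a formula you can reuse forever:

```python
def bar_width(page_width, margin, n_bars, gap):
    """Width of each bar so that n_bars bars exactly fill the page.

    Derived once, by algebra, from the layout constraint:
        page_width = 2*margin + n_bars*bar + (n_bars - 1)*gap
    """
    return (page_width - 2 * margin - (n_bars - 1) * gap) / n_bars

# The next time a similar graph comes up, just plug in the new numbers:
print(bar_width(page_width=600, margin=40, n_bars=10, gap=10))  # 43.0
```

With the scrubbing approach you would re-derive this by dragging numbers around for every new page size; with the formula, each new instance is a one-line function call.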
I'm not trying to be waspish, but are you suggesting we just be like Feynman? I will say I'm rather dubious we can achieve that with a simple interface change.
There is certainly merit and utility to graphical representation; we all use it via MATLAB, Python plus scientific packages, and so on. And, of course, the faster we can change something and see the result, the better. Seeing something change in real time is not an incremental improvement but a pretty fundamental one.
Yet this only really works in tiny cases. I recall Victor demonstrating scrubbing variables to change how a game reacts. Okay, but for the vast majority of the programming I do (and I do fairly numerical things in the computer vision line), I either don't see the utility or the practicality. I can't variable-scrub my way to a better algorithm.
I've seen, for example, graphical demonstrations of things like binary tree algorithms, sorting, and the like. It's momentarily interesting, but I can't claim that it really made any difference in my understanding of the subject. I think this is a case where people vary. I am not terribly visual, and most of Victor's arguments go over like a lead balloon with me. Yet, I cannot deny the excitement of others. I find it far more commodious to work with, for example, linear algebra. Certainly there are times when I am drawing 3D arrows and the like to work out a problem, but by-and-large the symbolic representation serves far better.
There is a reason we use text for communication on HN, and not pictures. It is not because we lack vision or foresight (no pun intended), but because it is the most expressive and flexible way to communicate our ideas. And, of course, sometimes we labor away, trying to communicate in words what would be so easy to illustrate with a single image. Neither is 'best'; we just use the best tool for the job.
None of that is to argue against one of your points - that math is more than symbolic representation. Once you are reduced to applying symbolic rules to an equation you are often quite lost, and that is often not taught in schools. But so much of math really cannot be done graphically at all, and yet we succeed. A lot of people (I think) stall out in math after the algebra/calc I, II, III level because beyond that there is often no recourse to graphics and physical representations; graphics can be as much a crutch as a help.
Anyway, visual thinking is certainly a form of thinking that should not be downplayed, but it is not how many brains work, nor is it even possible in many forms of important thought. (I leave savants to one side; we cannot base decisions on exceptional, unique people.)
I think you might be misunderstanding Victor a little, but then again your viewpoint has expanded mine, and I think there is more to be said about what his intentions are, which those of us in this sphere may be inherently misunderstanding.
What I think his point is, with all this re-thinking of the interface, is that the interface itself is a new language, one that has always been propelled forward by those willing to use it actively. His aim is an educational one: enlightenment of, and on the part of, the user.
Learning algebra your way may have been good for you, but learning algebra with interactive control may be good for others, too. My 6-year-old son has had much fun with worrydream.com, as a matter of fact, and I'm sure he, as a kid growing up with immense computing power, is going to be just as cognisant of the subject, through interfaces such as this, as any of us elders. I think Victor is an educator, mostly, and in that sense his position is immensely valuable. Kids who xbox, dude, is the market.
This is possible. It might be that the greatest benefit of these kinds of interfaces is in pedagogy. In any case, this work is clearly worth keeping an eye on, even if I don't buy all of Victor's arguments for its utility.
I also have children in grade school, who can scarcely believe that when I was their age, and in high school, and in college, actually having a computer in your house (never mind in your shirt pocket) was just science fiction. So I struggle with trying to figure out how valuable computers can be in helping to educate them, or whether they are more of a distraction.
I understand what you're saying about how to bring computers into perspective for the younger generations. So... my kids have a working 8-bit computer setup in their bedroom: my first 'real' computer, a 48K Oric-1. They learned to hack on that before they got to a PC. The 6-year-old regularly totes his OLPC along to the grandparents', just to run emulators of these old machines. There is something magnificent about being able to plot something in BASIC and see the results immediately. My kids, I think, got just as much fundamental understanding of what the computer was doing, because of this interface, as I did when I was younger. There is much value in not throwing things away.
It seems aimed at cases where someone would spend about as much time setting up the problem and solving algebraically as dragging the numbers around for a single instance.
Now let's talk about REPLs, where you have to type in your program every time you want to run it; surely it's faster to write your program into a file instead of starting from scratch each time.
It seems to me that in all of his examples the symbolic approach, or just the equivalent in your head, would be way faster than fiddling with the computer interface, even if you know you'll never see the problem again.
I'm not sure if your evocation of REPLs was ironic, but surely most people do what I do, and load a file into the REPL at the beginning of an interactive session.
Agreed. In this case it seems even worse because it plays with the way you typically denote units. I suppose it is best to think of the 'ignored' words as variable names instead of units. But... I have never seen things written that way before.
So... I have to confess I can't figure out how to get this to do anything. :( I'm guessing I'm supposed to be able to "scrub" (drag) a number to change its value. However, I can't even get it to calculate anything. Does this only work in Chrome?
In Photoshop you can increase/decrease a value with more control by holding Shift (+/- 10) or Option (+/- 0.1). Similar shortcuts might aid in finer value control without adding anything to the UI.
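As a sketch of that idea (the function names and step values here are my own illustration, borrowed from Photoshop's convention, not from Cruncher's actual code), the modifier-to-step-size mapping is just a tiny lookup:

```python
def scrub_step(shift=False, option=False):
    """Step size per scrub increment: Shift for coarse (10),
    Option for fine (0.1), plain drag for 1."""
    if shift:
        return 10.0
    if option:
        return 0.1
    return 1.0

def scrub(value, ticks, shift=False, option=False):
    """New value after dragging `ticks` increments with the given modifiers."""
    return value + ticks * scrub_step(shift=shift, option=option)

print(scrub(50, 3, shift=True))   # 80.0
print(scrub(50, 3, option=True))  # ~50.3
```

Since the modifier only changes the increment, not the interaction, it adds precision without adding anything visible to the UI.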
The version here works the OK-est, but it still needs work: http://www.omarrizwan.com/cruncher/
Hoping to get back to it sometime within the next few weeks.