Gödel, Escher, Bach covers this in great detail. Symbols are a strange thing in the human brain, and the book introduces several pseudo-languages that, when you really sit down and think about them, are no more ridiculous than the notation we use for mathematics outside of it.
My idea-book (is that a crazy-person thing?) is probably a lot like what you're describing. I use words for my main data, different arrows running between the sub-ideas, and sometimes more words attached to each arrow ("pub/subs", "queries", "happens once/happens always", etc.)
I think that people who believe visual programming could be a thing ought to sit down with APL for a few weekends, until they have the epiphany of "oh, the code is the same as the notation for what I'm actually doing." Then they'll see that notation and code are interlinked, and that each is basically useless without the other, in the same way that x--p--p---- is exactly as valid as 2 + 2 = 4 without context.
I get the allure of NOT having to write code, but it shouldn't be so difficult for people to realize that it's a ridiculous fantasy.
Write fibonacci(N) in ANY visual language, and tell me you couldn't have done it more easily, faster, and more coherently in Python or whatever. It's obvious.
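For concreteness, here's the kind of textual version I have in mind, a sketch of one straightforward iterative approach in Python (the name fibonacci and the 0-indexed convention are just the obvious choices, not anything canonical):

    # Iterative Fibonacci: returns the N-th Fibonacci number,
    # 0-indexed, so fibonacci(0) == 0, fibonacci(1) == 1, fibonacci(10) == 55.
    def fibonacci(n):
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b  # slide the window: (F(i), F(i+1)) -> (F(i+1), F(i+2))
        return a

Try drawing the box-and-arrow diagram for just that tuple swap and the loop, and the point makes itself.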