Author here! Go ahead and ask if you have any questions about internals or otherwise. Or e-mail me if you think of your question much later (e-mail in profile).
I'm surprised it made it here even during a Github outage.
>Python is Lisp with syntactic sugar and Lisp is Forth with syntactic sugar.
Those languages have nothing in common other than being pleasant to work with because there isn't too much boilerplate.
Python is dictionary-based, Lisp is list-based and Forth is stack-based. And that's just the base languages, without the fancy stuff like macros and continuations in Lisp, all the new syntactic sugar in Python (:=, etc.) and the bit tricks in Forth (-1 = true, 0 = false).
> Those languages have nothing in common other than being pleasant to work with...
That's where the author had an insight: that they (could) have everything in common - except for the syntax at a surface level.
Underneath, languages are just different ways to produce and operate on the same "stuff", data and algorithms - values, functions, lists, dictionaries, stacks.
So, to me it makes sense that there can be a Lisp-y flavor of Python, Pythonic flavor of Lisp, Forthy Python, and so on. It's a matter of parsing/compiling them into a common format, targeting the same interpreter or virtual machine.
That said, this Forth-Lisp-Python Continuum verges on mad genius, to have all these different syntaxes in one language.
It was interesting to see this project, since I've been working on a hobby language, whose syntax is infix-oriented like C (but everything is an expression). It gets parsed into a prefix-oriented Lisp-like syntax tree.
The interpreter is a "machine" based on Lisp's eval, with lambda, macro, tail-call optimization, etc. So, basically, the language is just syntactic sugar that compiles down to whatever the machine expects. I plan to implement another (much simpler) parser for a Lisp-flavor of the same language.
In the case of FLPContinuum, I see that the interpreter is a stack-based loop (https://github.com/asrp/flpc/blob/f47de76fcab6d2d2b1840e5376...). The actual implementation of the "machine" almost doesn't matter for the language(s) though, as long as the interface stays the same.
Forth and Lisp are distorted reflections of each other: Forth intends to make it easy to write programs with minimal abstractions, whereas Lisp intends to make it easy to write programs with any abstraction you want. They both, therefore, converged on the notion of having minimal syntax for opposite reasons: Forth eschews most syntax because syntax is an abstraction, Lisp because it isn't necessarily that specific programmer's abstraction.
Python, OTOH, is a single, "frozen" M-syntax for a Scheme-like Lisp descendant; if Lisp is a pantry of ingredients and a kitchen, Python is fast food, or a microwave dinner, albeit a relatively good example of such.
> is a single, "frozen" M-syntax for a Scheme-like Lisp descendant
Sounds very close to what Dylan is/was. Well, it still is, in the sense that there's a working, open source implementation (over at https://opendylan.org/), but it basically lacks the last 20 years of optimizations, which makes it, unfortunately, really slow. The ecosystem is also non-existent and very far from ergonomic. Still, it's a very interesting language - basically Scheme + CLOS in M-syntax (though it does have syntax-rules-like macros, so maybe not as "frozen" as Python here.)
At first glance, I love it! For years I programmed mainly in Cython, where you can float in-between Python and C, making each line as C-like or Python-like as you want, whichever suits the needs of the moment better, and can use packages/libraries from both. I liked that floaty feeling, but having 3 dimensions is a whole new ball-game.
I have to share what n-gate said about Cython, it's so funny and true:
...a horrible chimera of a programming language, wherein the drawbacks and limitations of Python are augmented by the drawbacks and limitations of C. The result is a language that introduces header files to Python and requires breathtaking amounts of boilerplate. The primary goal of Cython appears to be transforming the programming experience from "implementing a solution to a given problem" to "trying to guess when to turn off exception handling so that your code runs marginally faster."
Okay okay, I take issue with the last sentence of that (specifically the word "marginal"). The aim of Cython is to provide a smooth slope from "quickly writing comprehensible programs in Python" to "belaboring bits and mallocs in C to crush the performance of that Python crap", where the optimal Cython experience is "profile, identify a huge hotspot, and do that little bit in C"... and then brace yourself for that boilerplate, boy howdy.
These days I write a lot of C++ and use it through Cython wrappers. With both Python and C++ redefining themselves, each at a breakneck pace, I frequently hope that they'll converge to a common language. But then I remember that I know that devil, and its name is Cython.
This is really cool! I've actually been wondering for a while what a language that's both Lisp and something concatenative would look like, but I keep getting tangled up without actually solving it. I'm going to study this one.
If you want to read this, I'd suggest looking at the sources in the bootstrap sequence from the readme (boot.flpc, stage0.flpc, ...). Alongside, run some of the precompiled entries by hand by pasting them into the interpreter. Call `ps` once in a while to see the current state. Then for larger chunks of code invoke breakpoints by calling `debugger()` (in a `.flpc` file) or `debugger` (in a `.f` file, though this will mess up source position printing beyond this point).
Wow! That's wonderful! Forth and Lisp were my favorite languages a long, long time ago. Python is my current daily-use language, among others. I didn't even know they were connected.
which looks like Lisp if you move every open paren one token to the left and remove the commas
(sum (list_comp (lambda x (quote (multiply x x))) (range 10)))
To evaluate this, parameters are first recursively evaluated (in order) and then the function is called on the outer value. Let's ignore the lambda for the moment.
(sum (list_comp quoted_inner_func (range 10)))
results in the following function calls being made at execution time, _in this order_:
quoted_inner_func 10 range list_comp sum
Normally, you'd have to pass the correct parameters to each function. However, in Forth, we use a global parameter stack, so provided all the functions respect their inputs and outputs, running the above body would produce the desired result on the parameter stack!
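The trick above can be sketched in Python. This is not flpc's actual machine, just a toy model where every word pops its inputs from one shared stack and pushes its result back; `range_`, `list_comp` and `sum_` are hypothetical stack versions of the built-ins, and the quoted lambda is modeled as a plain Python callable for simplicity.

```python
# One global parameter stack shared by all words.
stack = []

def range_():                 # pops n, pushes [0, 1, ..., n-1]
    stack.append(list(range(stack.pop())))

def list_comp():              # pops the sequence, then the function
    seq, func = stack.pop(), stack.pop()
    stack.append([func(x) for x in seq])

def sum_():                   # pops a list, pushes its sum
    stack.append(sum(stack.pop()))

# Execute `quoted_inner_func 10 range list_comp sum` in order:
stack.append(lambda x: x * x)   # quoted_inner_func
stack.append(10)                # 10
range_()                        # range
list_comp()                     # list_comp
sum_()                          # sum

assert stack == [285]           # sum of squares of 0..9
```

Note that no word takes explicit arguments; the ordering of pushes and pops is the whole calling convention.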
They're not /really/ connected, at least historically. But the author sees a reasonable progression in ease of use, while maintaining a lot of flexibility. And the idea of starting with a forth-like system, building a lisp-like system with it, then putting a python-like syntax on top of it is very interesting.
> Eventually, the grammar itself will be modifiable at runtime
Wouldn't that make programs hard to understand, since you can't learn the syntax once and for all? It's like having a book with different pages written in 3 different languages: to understand it, you need to understand all those languages.
As a counterexample, imagine reading a book on mathematics that did not use any special notation, instead relying on natural-language descriptions of formulas and algorithms. Or imagine reading a book on molecular biology that didn't use any technical vocabulary or diagrams, instead relying on colloquial language and descriptions.
There is a place for specialized notation and vocabulary. Exactly where that line should be drawn in programming has been an ongoing conversation since 1953 when John Backus proposed the concepts that would lead to the development of Fortran.
Lisp lets you modify its syntax. You're making the standard argument about why that might be a bad idea. At least for Lisp, historically it worked pretty well. As new ideas came along, a lot of the good ones got incorporated into Lisp as syntax extensions.
Overloading in C++ is kind of similar. I'm not sure that worked out so well in practice.
It is actually not hard to read (at least in the sense of knowing what something will do when executed; getting the bigger picture takes more practice). The (base) syntax is just whitespace-delimited tokens, each representing a function call (or string, but we'll come to that later). So
foo bar baz
will call the 3 functions in order
foo()
bar()
baz()
All functions are nullary (with side effects; these side effects determine their "true" arity). There aren't really any special characters other than whitespace so
1 1 + print
just translates to
1()
1()
+()
print()
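This token-to-nullary-function dispatch can be sketched in a few lines of Python. The word dictionary below is a toy assumption, not the real flpc interpreter; the point is just that "arity" only exists through stack side effects.

```python
# Shared parameter stack; every word is a nullary function on it.
stack = []

words = {
    "1": lambda: stack.append(1),                       # a literal pushes itself
    "+": lambda: stack.append(stack.pop() + stack.pop()),
    "print": lambda: print(stack.pop()),
}

# Interpreting `1 1 + print`: look up each token and call it.
for token in "1 1 + print".split():
    words[token]()
```

Running this prints 2: the two literals push, `+` pops both and pushes their sum, and `print` pops and displays it.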
Function names do not have to start with a letter or be alphanumeric. I've named my functions so that those ending in a colon treat the next token to the right as a string instead of a function call. The [ function treats everything as strings until ] (that is, a single close square bracket as a token) and puts the functions in that body in a quote, effectively creating an anonymous function. So
[ foo bar baz ] bind: somename
defines a function. The equivalent in Python would be
def somename():
    foo()
    bar()
    baz()
And you can then call somename from later functions.
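A rough sketch of how `[ ... ] bind: somename` could be interpreted, under stated assumptions: `run`, `defined` and `builtins` are names I invented for illustration, and quotes are modeled as plain token lists rather than whatever flpc actually stores.

```python
stack, defined = [], {}

builtins = {
    "1": lambda: stack.append(1),
    "+": lambda: stack.append(stack.pop() + stack.pop()),
}

def run(tokens):
    tokens = iter(tokens)
    for tok in tokens:
        if tok == "[":                  # collect tokens as strings until ]
            body = []
            for t in tokens:
                if t == "]":
                    break
                body.append(t)
            stack.append(body)          # the quote (anonymous function)
        elif tok.endswith(":"):         # colon words take the next token
            arg = next(tokens)          # ... as a string argument
            if tok == "bind:":
                defined[arg] = stack.pop()
        elif tok in defined:
            run(defined[tok])           # replay the quoted body
        else:
            builtins[tok]()

run("[ 1 1 + ] bind: two".split())      # define `two`
run(["two"])                            # call it; leaves 2 on the stack
```

The quote is just deferred tokens, and `bind:` gives it a name; calling the name replays the body against the same global stack.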