The Namingless Programming Language (github.com/akalenuk)
108 points by thunderbong 12 months ago | 49 comments



It took me quite a while to understand the actual role of `_` here, and the description is quite unclear if you are not into this kind of esolang thing.

What's going on here is that there are actually two operations: one is spelled `_`, and the other is spelled as every other character. The program starts with an empty array of trees (what is called the "current branch" here); any character other than `_` appends itself to that array, while `_` takes the last element and does something with it accordingly. And that's a common theme for light-hearted esolangs: a single operation encodes a lot of semantics, so that there are in fact multiple super-operations described by that one operation. It is much harder to design a minimal esolang with no obvious super-operation structure.

One thing less obvious from the description is Turing completeness. I think it is not Turing complete, because the actual computation is only possible via `_`, but the number of `_` operations is fixed by the initial program. If `_` could be executed indefinitely, it would be relatively easy to construct a program that replicates itself (cf. quine) into the current branch, so that `_` keeps running the replicated program over and over. Alternatively, `_` could have a super-operation that executes `_` multiple times, for example by interpreting a particular branch as a program [1]. But as it stands, it doesn't seem to have any sort of looping. A simple fix would be to assume an infinite number of `_` operations following the input program.

[1] This sort of iteration is not very uncommon in esolangs. See, for example: https://esolangs.org/wiki/Muriel


Indeed: it looks like on each iteration of `execute()`, exactly one leaf node is popped from `left`, no more and no less. So since there's nothing that can add more nodes to `left` after the program is first loaded, there's no way to call `execute()` more times than the total length of the program.

Another fix I believe would be sufficient (and perhaps a bit more interesting): an operation to exchange `left` and `right`, so that the program is replaced with the data, and vice versa.
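
For concreteness, here is a minimal sketch (in Python, not the repository's actual code) of the model described above. The names `left` and `right` follow this thread, and the specific action taken by `_` is made up purely for illustration:

    def run(program: str) -> list:
        left = list(program)          # unread program, one leaf per character
        right = []                    # the "current branch" being built
        steps = 0
        while left:
            leaf = left.pop(0)        # exactly one leaf consumed per step
            if leaf != '_':
                right.append(leaf)    # any other character appends itself
            elif right:
                # made-up `_` action: wrap the top leaf in a sub-branch
                right.append([right.pop()])
            steps += 1
        # Nothing above ever appends to `left`, so `steps` can never exceed
        # len(program): without an extra rule there is no way to loop.
        return right

    print(run("ab_"))                 # ['a', ['b']]

In this model, the suggested exchange of `left` and `right` would let a program push a copy of itself as data and then feed it back in as code, which is exactly what the quine-style looping argument above needs.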


> This sort of iteration is not very uncommon in esolangs.

Or academic computability analysis. Raúl Rojas's paper on the Z3 comes to mind. A neat trick.


_ is eval

"The datastructure" is cons


`_` is weaker than eval. It doesn't allow for any sort of recursion, if I'm not mistaken.


I know that this is mostly for fun, and it looks cool. But on a serious note, as someone who's dabbled in J, stack languages (Forth and Factor), and tacit programming in Haskell, I'll say that naming is an awesome way of making programs more readable.

Trying hard to avoid names (by, say, eta reduction in Haskell) is a fun exercise that rarely leads to more understandable programs.


I think it is not just "a way" of making programs readable, but the main readability factor. Good names make a program readable; bad ones do the opposite.

Almost all time spent improving names is worth it. The only exception might be one-off scripts.


Completely agree. Reading another engineer's code recently, which had egregiously bad names, I had the following thought: writing software is like writing poetry. Finding the words that perfectly capture and express the essence of your thinking and the underlying model or theory behind your code is so important. In software, as in poetry, one is always on the lookout for that "mot juste".


Counterpoint: all of my one-off scripts (well, not all) have not actually been one-off.


I did not fully appreciate this until I had to edit a script written by a non-English speaker. It was absolutely incomprehensible until I ran all the variable names through Google translate. After that, piece of cake.


> The only exception might be one-off scripts.

Agreed, but if you're writing in J, Forth or Factor then it's hardly likely to make it into production anyway.


Agreed, and when it comes to Haskell, hlint pushing eta reduction like it's candy doesn't help.


Awesome effort but I also kind of hate stuff like this. I just feel like there are more constructive ways to waste time. But I guess everyone has a right to waste their time how they want.

Still tempted to use the flag link though. So maybe that's actually a compliment to the author in a weird way.


> I just feel like there are more constructive ways to waste time

From my own experience, working on "silly" projects like this, with severe artificial constraints for example, can be quite interesting and challenging and lead to lessons learned that can be applied to real projects later.

So for the author, it might have been quite constructive indeed.


A benefit of working with synthetic constraints in software is that it sparks creativity.

You can showcase the same programming techniques in a toy compiler as you would in a production-level system.

But in a production system you may never get that far, because you're drowning in practical matters instead of getting to the point of innovating.


“Constructive waste of time” is a contradiction in terms. Why would you call it waste if it's constructive?


I see the more creative esolangs like this as beautiful and strange to think and code in; a formal play on language design. Not everything needs to be practical or a learning experience.


If I can make a full-scale model of the Burj Khalifa in Minecraft survival, then I don't begrudge this guy doing his thing.

And hacker news is where I come to read about random projects like this.


it's art!


This language is for fun, but Unison [1], where names don't matter, is not.

[1] https://www.unison-lang.org/docs/the-big-idea/


Yeah, this always struck me as an anti-feature of Unison, and having listened to some podcasts and read some blog posts of the creators explaining the wonderful things it enables hasn't really changed my opinion.

Decoupling and abstraction via naming is fundamental to both computation and data storage.


> Decoupling and abstraction via naming is fundamental to both computation and data storage.

While I don't expect Unison to become the next big thing(TM), that's irrelevant to what Unison claims, in my understanding. You can always name any term, as in most other languages, and in fact you can have nominal types as well (cf. `unique type`). So you can distinguish multiple terms with otherwise structurally equivalent representations, and use names to refer to them.

The obvious follow-up question is: what is different, then? In conventional languages, when you define a function `f(x, y) = x + y`, `f` is engraved into some sort of namespace, like the symbol table of a static or shared library file. That obviously implies a possibility of collision, and a common remedy is to have multiple or nested namespaces to separate names, but that doesn't fully fix the issue because such separation is not automatic. In Unison, everything is first automatically named according to a hash of its implementation, and can then be aliased to other names, so collisions are impossible by default. Unison still has a type system, so I believe decoupling is still possible via multiple terms with the same type but different implementations (and thus different hashes).
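
A rough sketch of that idea in Python (real Unison hashes a normalized AST rather than raw source text, so alpha-equivalent definitions share an identity; the helpers `add_term` and `alias` here are made up):

    import hashlib

    codebase = {}   # hash -> implementation
    names = {}      # human-readable alias -> hash

    def add_term(source: str) -> str:
        # The canonical "name" of a definition is a hash of its implementation.
        h = hashlib.sha256(source.encode()).hexdigest()[:12]
        codebase[h] = source
        return h

    def alias(name: str, h: str) -> None:
        # Names are just labels attached after the fact; they don't identify terms.
        names[name] = h

    h1 = add_term("f x y = x + y")
    h2 = add_term("g x y = y * x")    # different implementation, different hash
    alias("add", h1)
    alias("mul", h2)
    print(h1 != h2)                   # True: definitions can't collide by name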


Decoupling and abstraction, not distinguishing.

The implementation is supposed to be hidden and changeable independently of the interface and independently of clients.

If I am coupled to the implementation, it's not an abstraction.


If that's your definition, I believe no one actually follows it, because there are a lot of ways for abstractions to leak implementation details that might not be fully captured by interfaces. A good example is performance. If you need a fast enough implementation behind the same interface, you are not doing abstraction by your definition, because the "fast enough" implementation is an implicit coupling factor.

More commonly, and in my opinion more reasonably, any abstraction involves a trade-off. Some abstractions can be quite leaky but darn easy to use. Others may be less leaky but difficult to use (or misuse, depending on your view). There is no non-leaky abstraction here. Once you've chosen a trade-off, and thus an abstraction, you are still not entirely free from coupling for the aforementioned reason, but you can enjoy some level of freedom thanks to the trade-off made. Unison is no different from other conventional languages in this sense.


"The operation is spelled _"

And pronounced underbar, as in where you need to drink yourself when programming in this language


Of course, programming languages without names of any kind are not new. Like this one, they tend to be esoteric. There's the Turing-machine-inspired Brainfuck, with its 8-symbol alphabet <>+-[].,. Some are based on Combinatory Logic, like Lazy K and Unlambda, and others on Lambda Calculus with de Bruijn indices, like binary lambda calculus and λDNA. And several others, like this one, are stack based.


Those who do not know Lisp, or in this case Lambda Calculus, are doomed to reinvent it.


Not really a programming language because there are no loops. Just a syntax for expressions that are evaluated using the stack of trees.

It goes to unnecessary lengths to avoid naming the output files themselves.

Also, there's a significant cop-out: the language allows you to operate on explicitly named files.

This makes it possible to have named "variables" by writing whatever you need to keep into named files.

The stack of trees is interesting.

Also, using _ to execute the top thing on the stack is a cool concept that avoids the need for quoting anything, because nothing executes by default. Although you still need to quote _ itself as U_ if you don't want it to execute, and the language doesn't have subroutines yet, so to make them sensible some quoting will probably still be necessary.

There's also a weirdness where strings are always compared as if they were numbers. At least according to the docs.


This could easily be designed to avoid the superfluous _ dud in the syntax.

It's like having to use an explicit call operator for everything: (call + (call * 2 2) (call * 3 3)). You do that for applicators other than call, like map or apply. Call is special, so you don't give it a name; it is named by the absence of a name.

You want the actual operators like ^ and H to be implicitly active. When you want ^ and H to just be the literal characters themselves, that's where some quoting syntax should be applied.
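
A toy sketch of that alternative, in Python. The meanings given to `^` and `H` here are invented, and `Q` is a made-up quote marker, not anything from the actual language:

    OPS = {
        '^': lambda s: s.append(s.pop() + s.pop()),  # pretend '^' joins the top two items
        'H': lambda s: print(s.pop()),               # pretend 'H' prints the top item
    }

    def run(program: str) -> list:
        stack, quoting = [], False
        for ch in program:
            if quoting:                 # previous char was the quote marker
                stack.append(ch)
                quoting = False
            elif ch == 'Q':             # made-up quote marker
                quoting = True
            elif ch in OPS:
                OPS[ch](stack)          # operators fire implicitly: no `_` needed
            else:
                stack.append(ch)
        return stack

    run("ab^H")       # prints "ba": the operators are active by default
    run("Q^QH")       # leaves the literal characters '^' and 'H' on the stack

The cost is exactly the trade described: the `_` dud goes away, but literal operator characters now need explicit quoting.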


> To avoid the naming problem for the programs you write, the language uses the name of the executable as the source code.

I was amused until I read this.

So much pain owes its existence to special characters in filenames.

Don't do it.



The stuff of nightmares!


Luckily this nameless language has escape sequences for special characters. And it regards "." as a terminator.

The author seems to have thought it through!


It's also a severe limitation on the length of programs.


I used to encounter a lot more esoteric languages than I do now (at one point I made a working Befunge compiler, which was considered impressive at the time). I don't know if I've just aged out of it, or if it's gone out of fashion, or what. But that's a bit of whimsy I definitely miss from the 90s and early 00s.


They're still going strong on the Code Golf Stack Exchange.


> Every node is either a leaf containing a char, or a branch containing a dynamic array of the data structures.

So it's basically Lisp, with either lists or atoms, at the char level.


If you didn't know about Lisp, how would you have described this?


Using monosyllabic grunts, punctuated by drooling.


In conclusion, naming is hard, not naming is harder.


How about some middle ground?

Keep naming types, but abandon the idea of variable names.


Missed opportunity to use V and W as single/double quote? :-)


Or maybe U and W.


This is kind of like Forth in concept, only much less useful.


I love the mnemonics.


> Naming is hard

I think this gets repeated a bit too often. I don't find naming difficult personally, and I think I'm not alone.

You can actually practice this by thinking of concepts, entities, their relations and the way they intertwine.



“I salute you sir!”

(aside) “You may fire when ready.”


It deserves a spot in Wikipedia's list of esoteric programming languages, next to classics like Brainfuck, LOLCODE, Rockstar, and Piet:

https://en.wikipedia.org/wiki/Esoteric_programming_language#...



