
Let me just point out some ideas that seem to have been forgotten:

- The idea that machines can exchange live objects, which can bring their "friends" along. (Before anyone says something about this not working - this is very similar to how we load JavaScript in the browser on nearly every page on the web. Except way more formalized and universal.)

- The idea that concurrency can be achieved by versioning object changes and reconciling different versions of reality when you need to. Pseudo-time, etc.

- The idea that objects that respond the same to the same messages are equivalent. Which extends to the notion that a node on the network can be modeled as an object and conversely a local object can be made a node on the network.
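A minimal Python sketch of that last idea, with names of my own invention (`LocalCounter`, `RemoteCounter`, a fake in-process "network"): the caller can't tell whether it's talking to a local object or a node, because both respond the same to the same messages.

```python
import json

class LocalCounter:
    """A plain local object that responds to messages."""
    def __init__(self):
        self.n = 0
    def send(self, message):
        if message == "increment":
            self.n += 1
            return self.n
        raise ValueError(f"does not understand {message!r}")

_server_state = {"n": 0}
def fake_network(payload):
    # Stand-in for a remote node; a real one would sit behind a socket.
    request = json.loads(payload)
    if request["msg"] == "increment":
        _server_state["n"] += 1
        return _server_state["n"]

class RemoteCounter:
    """Same message surface as LocalCounter, but the message travels."""
    def __init__(self, transport):
        self.transport = transport
    def send(self, message):
        return self.transport(json.dumps({"msg": message}))

# Equivalent, because they respond the same to the same messages:
for counter in (LocalCounter(), RemoteCounter(fake_network)):
    assert counter.send("increment") == 1
```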

...
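To caricature the pseudo-time idea in a few lines (the real machinery is far richer than this; `Versioned` and `reconcile` are toy names, and last-writer-wins is just the simplest possible reconciliation policy):

```python
class Versioned:
    """A value that remembers which version of reality wrote it."""
    def __init__(self):
        self.version = 0
        self.value = None

    def write(self, value, version):
        # Keep a write only if it is stamped with a newer version.
        if version > self.version:
            self.version = version
            self.value = value

def reconcile(a, b):
    # Merge two divergent replicas: the newer version wins on both sides.
    winner = a if a.version >= b.version else b
    for replica in (a, b):
        replica.write(winner.value, winner.version)

# Two replicas diverge concurrently, then reconcile when needed.
r1, r2 = Versioned(), Versioned()
r1.write("draft-from-node-1", 1)
r2.write("draft-from-node-2", 2)
reconcile(r1, r2)
assert r1.value == r2.value == "draft-from-node-2"
```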

Joe Armstrong had a lot of good intuitions when designing Erlang. It's a pity that after a brief surge in popularity Erlang/Elixir have been downgraded to "not hip enough" status.

...

Another thing. If you like Unix pipes, you have to love late-bound dynamically typed objects that communicate through message passing, because Unix pipes can be generalized as such. Yet I routinely see people who claim to like the former and hate the latter.

It's especially depressing how many people don't see the point of "message passing" as the fundamental building block of programming.

Just a couple of examples of why it matters:

- This progression: object interface -> inter-object protocol (extends to sequences and state) -> object communication language (allows objects/systems to derive protocols).

- If objects communicate through messages, then an object's interactions with its environment are also messages. Which means you can create virtualized environments for an object by restricting or manipulating those messages.
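As a rough sketch of that second point, assuming nothing beyond Python's attribute lookup (the `Sandbox` wrapper is a made-up name): wrap an object, intercept every message sent to it, and forward only the ones on an allow-list.

```python
import io

class Sandbox:
    """Wrap any object and intercept every message sent to it,
    forwarding only the messages on an allow-list."""
    def __init__(self, target, allowed):
        self._target = target
        self._allowed = allowed

    def __getattr__(self, name):
        # Called for any attribute not found on the Sandbox itself,
        # i.e. for every "message" the wrapped object might receive.
        if name not in self._allowed:
            raise PermissionError(f"message {name!r} blocked")
        return getattr(self._target, name)

f = Sandbox(io.StringIO(), allowed={"write", "getvalue"})
f.write("hello")
assert f.getvalue() == "hello"
try:
    f.close()  # not on the allow-list, so the environment rejects it
except PermissionError:
    pass
```

The same interception point can also rewrite or log messages instead of blocking them, which is the "manipulating" half of the claim.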




> Another thing. If you like Unix pipes, you have to love late-bound dynamically typed objects that communicate through message passing, because Unix pipes can be generalized as such. Yet I routinely see people who claim to like the former and hate the latter.

Sorry if this is a dumb question, but could you explain this a bit more?


The reason people still care about Unix pipes is that you can take a bunch of arbitrary commands and string them together in a way that produces some useful result. Moreover, you can configure each stage to do different things depending on what you need at the moment.

What stops you from stringing together a bunch of arbitrary objects to achieve the same effect? Well, a bunch of things:

1. Verbose syntax of the language.

2. The need to compile new code.

3. Early binding that stops you from calling things you don't know about before compiling code.

4. The fact that objects can't just take arbitrary method arguments.

#1 is self-imposed.

#2 is solved by JIT compilation and other techniques.

#3 is not a problem in dynamically typed languages.

Most importantly, #4 is explicitly what polymorphism in OOP was supposed to solve. Poly = many. Morphos = shapes. The idea that the object adapts to the messages you send it.
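In a dynamic language you can get a taste of this with a catch-all message handler, in the spirit of Smalltalk's doesNotUnderstand: (the `Flexible` class here is purely illustrative; a real object would dispatch on the message rather than echo it back):

```python
class Flexible:
    """Accepts any message with any arguments. __getattr__ fires for
    every name not already defined, so the object adapts its shape to
    whatever message is sent."""
    def __getattr__(self, message):
        def handler(*args, **kwargs):
            return (message, args, kwargs)
        return handler

obj = Flexible()
assert obj.resize(10, 20) == ("resize", (10, 20), {})
assert obj.paint(color="red") == ("paint", (), {"color": "red"})
```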

With this in mind, ls -a | grep "gear" can be interpreted as code that does the following:

1. Send the object representing 'ls' a message with -a argument. This will supposedly construct a list.

2. Send 'grep' object a message with "gear" argument. This will construct a configured grep instance.

3. Send response from #1 as a message (or stream of messages) to #2. This will generate a filtered list.

4. Send the object resulting from #3 a message that would request a console-friendly representation (e.g. a string with codes for color in console).

5. Display the result of #4.
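The five steps above could be sketched like this (Ls, Grep, and render are toy names of my own, with a hard-coded directory listing standing in for a real filesystem):

```python
class Ls:
    """Step 1: responds to a run message; '-a' includes dotfiles."""
    def __init__(self, entries):
        self.entries = entries
    def run(self, flag=None):
        if flag == "-a":
            return list(self.entries)
        return [e for e in self.entries if not e.startswith(".")]

class Grep:
    """Step 2: a grep instance configured with a pattern."""
    def __init__(self, pattern):
        self.pattern = pattern
    def filter(self, lines):
        # Step 3: consume the list/stream produced by step 1.
        return [line for line in lines if self.pattern in line]

def render(lines):
    # Step 4: a console-friendly representation of the result.
    return "\n".join(lines)

files = Ls([".git", "gears.txt", "notes.md"])
out = render(Grep("gear").filter(files.run("-a")))
print(out)  # Step 5: prints "gears.txt"
```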

This is (with some caveats) very close to how OOP was envisioned in the 70s and early 80s. Except pipelines are a very crude way of stringing things together. You can get way more sophisticated with objects. E.g.:

https://www.youtube.com/watch?v=I9LZ6TnSP40

https://www.youtube.com/watch?v=if72CFsF_SY


Thanks, I understand better now. These kinds of dataflow-OOP environments always look like magic to me.


Yeah, with all this microservices, serverless, lambda hype I can't help but feel this all was addressed, at least in the Java world, with EJB.


Yep, the complexity of deploying Docker and k8s versus uploading a WAR or EAR file into a JEE application server.

One of the nice things of being in Java and .NET lands since day one, and C++ since its kindergarten years, is seeing all the cool kids coming up with "totally new stuff", watching them re-learn what we already did, and eventually collecting some improvements as those stacks adopt what was actually new since the last reboot.

In the long run the turtle still seems a better option.


But EJB felt over-engineered even compared to today's stack. Maybe it was actually not, and just felt that way, of course.



