There are a lot of false premises in this, and a lot of things that don't follow from them.
> Maybe the most marvellous, utopian idea for software was Unix program design, in the mid-1970s. This is the idea that programs will all talk together via unstructured streams of text.
It was a great idea in the 1970s. But since then we've had OLE and COM and CORBA and GNOME and KDE, and now PowerShell, where you can load .NET assemblies and pass around structured objects in the same process. We've had Cocoa, which has exposed Smalltalk-style control of objects in applications since the 1980s.
This isn't a problem of technology. It's a problem of incentives. It's the same incentives that drive programs to try to look different from each other instead of fitting smoothly into a set of user interface guidelines. It requires a producer of software to find it beneficial to them to fit into the ecosystem around them.
> The promise of a computer is to reveal truth, through a seamless flow of information.
Seamless flow of information requires connecting the underlying semantic domains. Semantic domains rarely exactly match. Humans use enormous amounts of context and negotiation to construct language games in these situations.
> music was around forever but in the 16th century, if you made music, you made it for god, or the king.
This is empirically false. We have records of dance music, of broadside ballads that were sung in the streets, of the music people played at home. And there were lots of student drinking songs about being young and free.
I think the real solution to the author's angst is to go study the field more broadly and get out of whatever ghetto they are living in.
> This is the idea that programs will all talk together via unstructured streams of text.
Curious. To me this is the worst thing you could ever do. Talking via streams, as in `cat file.txt | grep ERROR | wc -l`, is cool. But you could do SOOO much more if programs actually output structured data streams. You could connect standalone applications in much the same way as visual scripting, where you plug inputs and outputs together and mix them with operators (think of Unreal Engine's Blueprints, just for command-line tooling).
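A minimal sketch of that idea, in Python rather than a real shell (the record fields and node names here are invented for illustration): each "node" consumes and produces structured records instead of raw bytes, so stages plug together like Blueprint nodes.

```python
# Each "node" is a function over structured records (dicts), not raw text.
# Wiring nodes together is then ordinary composition, the way a visual
# scripting graph wires outputs to inputs.

def parse_log(text):
    # source node: turn raw lines into records with named fields
    for line in text.splitlines():
        level, _, message = line.partition(": ")
        yield {"level": level, "message": message}

def keep(records, level):
    # filter node: the structured analogue of `grep ERROR`
    return (r for r in records if r["level"] == level)

def count(records):
    # sink node: the structured analogue of `wc -l`
    return sum(1 for _ in records)

log = "INFO: started\nERROR: disk full\nERROR: retry failed\nINFO: done"

# Equivalent of `cat file.txt | grep ERROR | wc -l`, except each stage
# sees named fields rather than bytes, so it could just as easily
# group, sort, or join on any field.
n = count(keep(parse_log(log), "ERROR"))
print(n)  # 2
```

The point is not this toy pipeline but that downstream stages never re-parse text: the "wire format" between nodes is already structured.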
It's a true shame that Linux never developed a well-defined CLI metaformat specifying exactly what parameters a program takes, their documentation, their completions, and what outputs the program produces for the parameters you provide. You could do true magic with all that information. Right now you kind of still can, but it is very brittle, a lot of work, and potentially breaks with each version increment.
I think it stems from the design failure to build your app around a CLI. Instead, you should build your app around an API and generate the CLI for that API. Then all the benefits of structured data streams and auto-exploring CLI shells come for free.
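A hedged sketch of that inversion, using only the Python standard library (the function name and flags are invented for illustration; real tools such as Google's python-fire take a similar approach): define the API as a plain function, then derive the CLI from its signature by introspection.

```python
import argparse
import inspect

# The API comes first: an ordinary function with typed parameters.
def resize(width: int, height: int, keep_aspect: bool = False) -> str:
    return f"resized to {width}x{height} (keep_aspect={keep_aspect})"

def cli_for(func):
    # Derive a CLI from the function's signature: every parameter becomes
    # a flag, its annotation becomes the argument type, and a default
    # makes it optional. Help and completion data could be generated
    # from the same metadata.
    parser = argparse.ArgumentParser(prog=func.__name__)
    for name, p in inspect.signature(func).parameters.items():
        required = p.default is inspect.Parameter.empty
        if p.annotation is bool:
            parser.add_argument(f"--{name}", action="store_true")
        else:
            parser.add_argument(f"--{name}", type=p.annotation,
                                required=required,
                                default=None if required else p.default)
    def run(argv):
        return func(**vars(parser.parse_args(argv)))
    return run

run = cli_for(resize)
print(run(["--width", "800", "--height", "600"]))
# resized to 800x600 (keep_aspect=False)
```

Because the CLI is generated, its flags, types, and help text can never drift out of sync with the API, and the same signature could just as well drive a GUI or an RPC endpoint.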
> Curious. To me this is the worst thing you could ever do. Talking via streams, as in `cat file.txt | grep ERROR | wc -l`, is cool. But you could do SOOO much more if programs actually output structured data streams.
A lot of people have had this thought over the decades, but it hasn't really happened -- PowerShell exists for Linux, but who's using it? The genius of the primitive representation (stringly typed tables) is that it has just enough structure to do interesting processing, but not enough to cause significant mental overhead in trying to understand, memorize, and reference the structure.
JSON is a case in point of the difficulty of adding more structure without wrecking immediacy of manipulation.
For anything with more than 1 level of nesting, I do stuff like
`blah | jq . | grep -C3 ERROR`
probably a lot more than I do
`blah | jq $SOME_EXPRESSION`
because it's just so much less mental overhead -- I don't have to think about indexing into some complex hierarchy and pulling out parts of it.
I'm not saying it's not possible to get out of this local optimum, but it appears to be a lot more subtle than many people seem to think. There may be a simple and elegant solution, but it seems it has so far escaped discovery. Almost five decades later, composing pipelines of weakly structured and typed bytes (which, by convention, are often line-separated tables, possibly with tab- or space-separated columns) is still the only high-level software-reuse-via-composition success story of the whole computing field.
Very few use PowerShell for Linux because it doesn't come pre-installed on a Linux box. Otherwise you can bet that people would be using it in large numbers. And yes, I would prefer your second "mental overhead" way, as it involves less typing. Unfortunately, PowerShell is more verbose than bash, not less.
Powershell is unfortunately not the shining example of a shell that best leverages structured/typed input/output succinctly.
But on Windows, sysadmins use PowerShell heavily. Nearly every IT department that manages Windows machines uses PowerShell.
> Very few use PowerShell for Linux because it doesn't come pre-installed on a Linux box
I don't buy that. On a GNU/Linux box, there are few things that are easier than installing a new shell; if you prefer a different shell than bash, it's two commands away. Bash does the job people expect it to do, and they would probably be _very_ alienated if they had to start messing around with .NET gubbins.
>And yes I would prefer your second "mental overhead" way as it involves less typing
Maybe for the first time you would. Maybe if you were to accomplish this specific thing. Anything else? Have fun diving into the manpages of your shell _and_ the programs you want to use, and you'd better hope they share a somewhat common approach to the implemented (object) datatypes, or good luck trying to get them to talk to each other.
>Powershell is unfortunately not the shining example of a shell that best leverages structured/typed input/output succinctly
I would just remove the last part, then agree with you: ">Powershell is unfortunately not the shining example of a shell"
> Nearly every IT department that manages windows machines uses Powershell
I mean, what other choice do they have there? cmd? Yeah right; if you want to lose your will to live, go for it.
>I don't buy that. On a GNU/Linux box, there are few things that are easier than installing a new shell; if you prefer a different shell than bash, it's two commands away.
When you are SSH'ing into one of 10k containers for a few commands, you will only use what is already there. Bash is there and works and that is what one will use 100% of the time. No one is going to permit Powershell to be bundled to satisfy personal preferences.
You're both moving the goalposts (if PowerShell were superior, I and countless other people would absolutely chsh to it for our accounts, since we're already not using bash anyway) and not making much sense. Many sysadmins spend a fair amount of time doing command-line stuff and/or writing shell scripts. If PowerShell offered significant enough benefits for either, of course at least some companies would standardize on it, just like your hypothetical company presumably standardized on using containers to run their 10k services rather than installing some custom k8s cluster to satisfy the whims of one individual infra guy.
When one has no control over the repositories used on build and service machines (they are locked down), nor over what goes into Docker images (only secured images are allowed, and good luck getting your custom tools in), one uses what is already present.
This is far more common than you think in enterprise corporations. I work at such a hypothetical one, which doesn't use k8s (it has yet to upgrade its native data center to cloud infrastructure).
If PowerShell were bundled by default in Linux distro LTS releases, a lot of sysadmins I know would start using it, since they are already familiar with it from Windows and write all their scripts in it.
> And yes I would prefer your second "mental overhead" way as it involves less typing.
1. It doesn't; just use zsh, and piping into grep becomes a single character, like so:
`alias -g G='|grep -P'`
2. Even apart from that I'm a bit sceptical that you can conjure up and type the necessary jq invocation in less time than you can type the fully spelled out grep line.
Not only that but a fixed object format also "forces" me to parse the data in a particular way. Think of representing a table in JSON. The developer of the producer will have to pick either row-major or column-major representation and then that is how all consumers will see it. If that's the wrong representation for my task I will need to do gymnastics to fix that. (Or there needs to be a transposition utility command.)
Obviously JSON is not suited for tabular data, but perhaps another format could be used. Ultimately, the user shouldn't have to care about JSON vs. tabular objects.
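To make the row-major vs. column-major point concrete, here is a small illustrative Python sketch (the field names are invented): the same table in both encodings, plus the transposition "gymnastics" a consumer needs when the producer picked the other one.

```python
# The same two-row table a producer might emit, encoded both ways.
row_major = [
    {"pid": 1, "cmd": "init"},
    {"pid": 42, "cmd": "sshd"},
]
col_major = {
    "pid": [1, 42],
    "cmd": ["init", "sshd"],
}

# A consumer who needs the other shape must transpose by hand: this is
# the "gymnastics" (or the transposition utility) described above.
def rows_to_cols(rows):
    # invert list-of-records into a dict of parallel columns
    return {k: [r[k] for r in rows] for k in rows[0]}

def cols_to_rows(cols):
    # invert parallel columns back into a list of records
    keys = list(cols)
    return [dict(zip(keys, vals)) for vals in zip(*cols.values())]

assert rows_to_cols(row_major) == col_major
assert cols_to_rows(col_major) == row_major
```

With unstructured text tables, by contrast, `awk` or `cut` reads either orientation the same way, which is part of why the stringly typed representation has survived.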
IMHO, I need both text streams and metaformat. YMMV.
Just as GUIs should be, but usually are not, gracefully and responsively scaled to user expertise, the developer experience should be, but usually is not, gracefully and responsively scaled to the appropriate level of scaffolding, fit for purpose to the requirements defining the problem space at hand. I need more representations and abstractions, not fewer.
Metaformats drag in their own logistical long tail, which in many use cases is wildly heavyweight for small problems. Demanding metaformats or APIs everywhere, as The Only Option, trades away the REPL-like accessibility of lesser scaffolding. API-first comes with its own non-trivial balls of string: version skew between caller and callee, argument parsing across versions, impedance mismatch with the kind of generated CLIs you envision and with other API interfaces, etc.
The current unstructured primitives on the CLI, composable into structured primitives presented as microservices or similar functions in a more DevOps-style landscape, make a pretty flexible toolbox; in my experience it helps mitigate some of the risks of the Big Design Up Front efforts from which structure tends to emerge. I think of it as REPL-in-the-large.
As I gained experience, I came to appreciate and tolerate the ragged-edge uncouthness of real-world solutions, and lost a lot of my fanatical puritanism that veered into astronaut architecture.
> But you could do SOOO much more if programs actually output structured data streams.
Like even treating code and data the same, and minimizing the syntax required so you're left with clean, parseable code and data. Maybe in some sort of tree, that is abstract. Where have I heard this idea before . . .
> I think it stems from the design failure to build your app around a CLI. Instead, you should build your app around an API and generate the CLI for that API.
Now this I am fully in favor of, and IMHO, it leads to much better code all around: you can then test via the API, build a GUI via the API, etc, etc, etc.
> yeah, it renames the require statements, when you rename a file.
??? IntelliJ has probably been doing this since day 1 (in 2001), and others have probably done so for far longer. Granted, this applies to Java, not JS, but still. Speaking of IntelliJ, it is completely absent from the author's pathetic history of IDEs, adding to the sense that he ought to get out of the ghetto he's living in.
That post literally expresses amazement at "basic refactoring ability". It would blow the author's mind to see what a proper IDE can do, if they got a chance to use one.
Maybe not mind blowing, but just a few examples from my (very normal) day:
I wrote a statement in Rider that used a class that didn't exist. So I hit Alt+Enter with my cursor there, and had it create me a class. Then I hit Alt+Enter on that class and had it move it to a separate file. Then I added the base class it should inherit from, hit Alt+Enter, and had it scaffold out all the methods I need to override. About fifteen or twenty seconds with a modern IDE and didn't require any of my intellectual capacity to actually execute.
I realized that another class in this multi-GB codebase had a typo in its name, and hit Shift+F6 to rename it. Typed in the correct name, and twiddled my thumbs for two or three minutes while it renamed every instance in the codebase.
Found a file that used a declaration style that's against our coding style. Hit Alt+Enter on one example, told Rider to configure the project to always prefer the other style and replace all examples in the file.
None of those are particularly magic, but having so many of them that are completely reliable a context menu away makes an enormous difference. Also with a recent file list popup and really excellent code navigation, I find that I don't keep a file list or tabs open at all. I just jump to symbols and toggle back and forth between the last couple of files.
Text streams are a wonderful idea in 2020 too. Exchanging opaque blobs (objects) requires too fine a fit. It is like a Kalashnikov (not very precise, but works everywhere) vs. some finicky contraption that is more efficient by some metric but can only be used without frustration in a sterile environment (e.g., every software version must be just so for it to work satisfactorily).
Or using an electronics analogy, text is wire and objects are connectors.
Sometimes all you need is to temporarily connect things together, the environment is benign, and your requirements are not demanding.
But Mouser.com shows over 2 000 000 entries (over 390 000 datasheets) in "Connectors". High current, high voltage, high frequency, environmental extremes, safety requirements, ...
There are all sorts of situations where crimping wires together won't do the job. Same with text. By the same token, there are all sorts of situations where text will do the job, but people are tempted to over-engineer things.
It is high cohesion vs. loose coupling (old concepts that are as useful in 2020 as ever). If something is too entangled, it should probably live inside the same program/system, where you can guarantee a tight fit between components.
I have to say, having started using PowerShell recently: it's better. It's frustrating because I have decades of muscle memory in ksh and bash, but that's not enough to keep me from recognizing that using the CLR to load components into the same process space, and being able to work with and pass objects around in that space from your shell, is clearly the right way forward.
He's not wrong. Look further down thread, you can find people pointing out examples of domains where there is a dearth of software, and a definite need to keep writing it. All due respect to the expertise shown in articles posted on HN (it's part of the reason I keep coming back), but the Dunning-Kruger effect hits hard. As Einstein put it, "the horizon of many people is a circle with a radius of zero. They call this their point of view."
I'm really sorry, but have you looked at all the pictures in the author's story? I came up with the same association and took madhadron's remark to be about those pictures, not the author.
If your first reflex is to be offended on someone else's behalf, maybe you should relax a little and try not to see hostility everywhere ¯\_(ツ)_/¯
Yes, I've looked at all the pictures in the author's story. And it never occurred to me to assume those are pictures of a ghetto the author lives in, as implied by the words "the real solution to the author's angst is to go study the field more broadly and get out of whatever ghetto they are living in" (emphasis mine).
But after I read your comment, I went back to the story and did a Google image search for a few (but not all) of those pictures. And guess what? They all come from other sources.
¯\_(ツ)_/¯
It's not like I was looking for hostility. I was reading the comment and nodding along and then the hostility came out of nowhere and smacked me in the face.
"Code ghetto" is actually a term I've heard bandied about before, usually used to denigrate an ecosystem someone doesn't like, but in this case I think it's wholly appropriate terminology[0], and I don't think the offense taken at it is justified.
God stop it with the Dunning–Kruger bollocks already. In my experience, people who refer to it are themselves rather shallow in their analysis of _any_ phenomenon at hand. It seems the thought form here is that anything even remotely falling out of your (very broad, apparently, huh) horizon is bound to be ridiculed and discarded by you as unintelligent nonsense. You're reading a poem, man. It's very clear from the reading that the author does not imply we have to stop writing _all_ software completely, might as well go back to the woods, etc. The poem deals with a very particular brand of software, or rather, software development in its own right; and it dissects the subject beautifully. Don't ever attribute anything you simply don't understand to stupidity. Now having read your comment I must say it's you who comes off as obnoxious, especially with them Kruger and Einstein references.
I'll stop using Dunning-Kruger when it becomes irrelevant. And no, I don't consider myself any less prone to it than other people, that's when I've found it most useful. And Dunning-Kruger is not "attributing stupidity" (talk about a shallow take), it's actually most insidious among those who are experts in one field, hence they've spent so much time there they have zero experience in other specialized domains (again, something I've recognized as one of my biggest weaknesses; thanks Dunning-Kruger!)
In my experience, people who get offended by Dunning-Kruger don't even realize they are prime examples of it.
>> This is empirically false. We have records of dance music, of broadside ballads that were sung in the streets, of the music people played at home. And there were lots of student drinking songs about being young and free.
We have recordings from the 16th century? Do you mean "recordings" as in sheet music for automatic pianos or organs etc?