hnlmorg's comments (Hacker News)

There are plenty of Lisp packages that support process composition. Also, Lisp's syntax better suits write-once environments like REPL shells than your average C-derived syntax.

Any examples? Searching for this is hard.

What would `foo | bar | baz` look like in a lisp?


Something like Clojure’s threading macro, probably:

(-> (foo) (bar) (baz))
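For context, the thread-first macro just rewrites the nested call at read time, so the data flows left to right much like a UNIX pipe (a sketch; `foo`, `bar` and `baz` are placeholder functions):

```clojure
;; (-> x foo bar baz) expands to (baz (bar (foo x))),
;; i.e. the output of each step feeds the next, like foo | bar | baz
(-> "some input" foo bar baz)
```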


There are shells out there that sit between PowerShell / Nu and the old guard like Bash / Zsh.

Ones that support data structures out-of-the-box (like any et al) but also work perfectly fine with existing UNIX commands (like Bash).



PowerShell seems to work fine calling a *nix executable, unless I'm missing something:

  > ls -t *.js | head -n 10 | foreach { echo $_ ; Get-ChildItem $_ | fl Name, Length }
  product_original.js

  Name   : product_original.js
  Length : 3353

  gulpfile.js

  Name   : gulpfile.js
  Length : 382


Have you tried any alternative shells? Bash and Zsh might be moving slowly but the non-POSIX shells are light years ahead of even Zsh.

A preference-based system would be almost impossible to maintain considering the practically infinite number of typefaces out there (thousands, plus more built every week).

Add to that, this feature also needs to be understandable to non-technical people who might never have viewed a page's source in their lives.

Much as a granular system would be more powerful and preferable for the HN community, having a dumb toggle makes a lot more sense from Firefox’s perspective.


I wasn't thinking granular control. It could work exactly like the existing preference with users picking a preferred font. That font is always set as the highest priority font and used for any characters it supports, but the page's custom fonts are still fallbacks and kept in the font stack for any characters missing from the user font.

This would be helpful for icon fonts, but also for user preferred fonts that don't cover all language character sets.


Oh I see. Is that not how it works already?

Apologies if I’ve completely misunderstood your comments


No problem at all! I haven't been by my desktop the last couple days but I do want to test it. I hope that's already how it works, that would make more sense to me than blocking all custom font loading (though that does have its purposes).

> Thankfully, icon fonts are declining rapidly in usage

I wouldn’t be so sure about that. Resources like FontAwesome are still used heavily in VuePress and similar documentation generators.


I have one project using them, mainly because SVGs can be a bit of a pain to use in XSLT.

edit: it may not be SVGs themselves. The icons on that site are repeated, so I would reach for an SVG sprite that can be reused, but I ran into issues getting that to work cross-browser in XSLT - Firefox was actually the problem, if I'm not mistaken.


There’s some seriously impressive work that’s gone into that shell

Ok. So if you're locked out of LinkedIn, now all you need to do is convince the hundreds of thousands of other LinkedIn users to change their mindset, instead of pleading with LinkedIn itself to restore your access.

I'm pretty sure the originals that defined this format did have examples and citations.

But I do agree that some of the later entries have felt a little lazy.


Right, I was about to comment that. One of the first ones I remember was this one, about addresses[1]; or this one, about names[2]. Both provide examples and information, which is the only thing making the whole article useful.

  [1]: https://www.mjt.me.uk/posts/falsehoods-programmers-believe-about-addresses/
  [2]: https://shinesolutions.com/2018/01/08/falsehoods-programmers-believe-about-names-with-examples/

I remember the address one. It was fantastic and I loved it. I wish they were all that helpful.

Ronnie might be the best, but I've always enjoyed Mark Williams' style of play more. The whole of the "Class of 92" are beyond special though.

Class of 92 are indeed very special. Williams is still doing amazingly well - just got to the final of the Saudi masters, and nearly clinched the deciding frame. Not bad for someone nearly 50 years old!

As someone who is working on modernizing the command line, I think it is entirely possible to strike a balance.

- the command line is just byte streams

Shells like Elvish, Nushell, PowerShell and my own shell, Murex, support typed pipelines, and that brings a great number of enhancements, like native support for JSON, CSV and other data formats. That means you can have a unified syntax for handling different file formats rather than remembering a dozen different tools and their specific idiosyncrasies, eg `jq`, `awk`, etc
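The unified-syntax idea can be sketched outside of any particular shell. In Python terms, a typed pipeline normalises every format into one structured shape first, so the same operations work on JSON and CSV alike (a minimal sketch, not any shell's actual API):

```python
import csv
import io
import json

def load(text: str, fmt: str) -> list[dict]:
    """Parse JSON or CSV into one common shape: a list of records."""
    if fmt == "json":
        return json.loads(text)
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(text)))
    raise ValueError(f"unsupported format: {fmt}")

def pick(records: list[dict], key: str) -> list:
    """One 'pipeline stage' that works regardless of the on-disk format."""
    return [r[key] for r in records]

json_src = '[{"name": "ada", "id": "1"}, {"name": "bob", "id": "2"}]'
csv_src = "name,id\nada,1\nbob,2\n"

# No jq-vs-awk split: the same operation applies to both sources.
assert pick(load(json_src, "json"), "name") == pick(load(csv_src, "csv"), "name")
```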

- readline is dated

We can do better than the typical shell interface. The popularity of tools like `fzf` and `starship` show that people do actually find modern TUIs useful

- rendered text is static

Let's say you view a JSON file in Firefox: you can expand and collapse specific branches of that JSON file. I'd love to be able to do that with JSON in the terminal. And not just JSON, tools like `tree` could benefit from that too.

Collapsible trees are a feature I'm working on in my terminal emulator, and it's completely optional: by default the tree renders the same way any other terminal would render that clump of text, except you can then optionally collapse branches, just like you can with code in an IDE or comments on HN.
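As a rough illustration of the folding behaviour described above (a minimal sketch; a real terminal emulator would drive `collapsed` from mouse input rather than a hard-coded set):

```python
import json

def render(node, collapsed, path="$", indent=0):
    """Render a JSON value as an indented tree, eliding collapsed branches.

    `collapsed` holds JSONPath-ish strings naming the folded branches.
    """
    pad = "  " * indent
    if path in collapsed:
        return [f"{pad}[+] ... ({len(node)} items)"]
    lines = []
    if isinstance(node, dict):
        for key, value in node.items():
            if isinstance(value, (dict, list)):
                lines.append(f"{pad}{key}:")
                lines.extend(render(value, collapsed, f"{path}.{key}", indent + 1))
            else:
                lines.append(f"{pad}{key}: {value!r}")
    elif isinstance(node, list):
        for i, value in enumerate(node):
            lines.extend(render(value, collapsed, f"{path}[{i}]", indent))
    else:
        lines.append(f"{pad}{node!r}")
    return lines

doc = json.loads('{"a": {"x": 1, "y": 2}, "b": 3}')
print("\n".join(render(doc, collapsed=set())))     # fully expanded
print("\n".join(render(doc, collapsed={"$.a"})))   # branch "a" folded away
```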

---

I love the command line. I've been using it for 30+ years and have written many tools for fellow CLI enthusiasts. But even I think that in an ideal world we'd move away from grid-based virtual teletype interfaces that communicate via raw bytes and in-band control codes, and instead switch to something more robust. But that's never going to happen. Heck, even something like job control (ie what happens when you hit ^z) is non-trivial to implement at the shell level. It requires multiple signals (each originating from a different sender) and three different hierarchies of process ID registration. Frankly, it's amazing any of this stuff still works. And that's before we touch on the plethora of competing standards for detecting which client (ie terminal emulator) is connected to that "virtual typewriter".
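The "in-band control codes" problem is easy to make concrete: formatting travels in the same byte stream as the text, so every consumer that wants the plain text has to parse it back out. A rough sketch (the regex only covers common CSI and OSC sequences, not the full ECMA-48 grammar):

```python
import re

# CSI sequences (ESC [ ... final byte) and OSC sequences
# (ESC ] ... terminated by BEL or ST) ride along with the text itself.
ANSI_ESCAPES = re.compile(
    r"\x1b\[[0-9;?]*[ -/]*[@-~]"            # CSI: ESC [ params final
    r"|\x1b\][^\x07\x1b]*(?:\x07|\x1b\\)"   # OSC: ESC ] text BEL/ST
)

def strip_ansi(raw: str) -> str:
    """Recover the plain text from a byte stream with in-band codes."""
    return ANSI_ESCAPES.sub("", raw)

colored = "\x1b[1;31merror:\x1b[0m disk full"
print(strip_ansi(colored))  # prints: error: disk full
```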


But how many of these new tools are robust when there's high latency? The advantage of good old POSIX shell is that it works on cheap routers, the latest HPC systems, the system next to me and the system on the literal other side of the world, and everything in between.

Being realistic - how often is that a problem? For some people working in specific domains, yes. But why should the large number of developers who _don't_ have those restrictions be held back by ensuring that your terminal works properly over 9600 baud?

> Being realistic - how often is that a problem?

I have 800ms RTTs on airplanes. Character-by-character editing is painful. Old-fashioned line-buffering makes the experience much better. It'd be nice not to give up this capability.
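Back-of-the-envelope with the figure above, the difference is stark (numbers are illustrative):

```python
# Cost of editing a 40-character command over an 800 ms RTT link when
# every keystroke waits for the remote echo, vs classic line buffering
# where the line is edited locally and sent in one go.
rtt_seconds = 0.8
chars = 40

char_at_a_time = chars * rtt_seconds  # each keystroke round-trips
line_buffered = 1 * rtt_seconds       # one round trip for the whole line

print(f"char-at-a-time: {char_at_a_time:.0f}s vs line-buffered: {line_buffered:.1f}s")
```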


How much time do you spend coding on airplanes on remote systems?

As a thought experiment: send your next email 1 character at a time by regular post.

One doesn't have to do something often for it to be an awful experience. We choose tools, in part, because of the worst case experience not only the average case.


We shouldn’t optimise for the least frequent use case at the expense of the most common use case. Reversing a car is awkward, but we don’t reverse the steering wheel so that you get to turn the right direction while reversing.

Correcting reversed steering only while backing up is entirely possible. Drive by wire allows it.

Car designers realize the user base knows the older UI and uses the older UI without issue. The "install base" of existing user skills is a compelling reason to not change.


Uh, I live in Australia, it's always a problem ;)

More seriously, unless you're doing stuff locally (in which case, go ham doing cool things on your system), the tools you use need to work for others, with their requirements and restrictions (some of which can be changed, but others which are inherent).


The problem is that we cater for the absolute lowest common denominator and completely eschew the vast vast majority of people who don’t require that feature. Of course there’s an xkcd for it [0]. I haven’t owned a device in the last decade with a screen resolution less than 1920x1080, and most devices are significantly higher these days. My primary work monitor is 27 inches with a 144hz adaptive refresh rate - these are available at the bottom end of the scale, and are widely available at the mid level of the market.

And yet, I’m forced to use tools that adhere to standards from when latency to disk was higher than my latency to a remote server these days, that can’t handle resizing, that can’t reliably handle Unicode input. Imagine if I gave you a car designed in 2024 that had a hand crank to start, but you could configure it to use a battery if you wanted to. That’s how I feel about the restrictions of terminals, shells and TUIs these days.

[0] https://xkcd.com/1172/


The strength of these tools is that they are the lowest common denominator. I don't need to worry about `ls` not fitting in an 80 character terminal because its devs "haven't owned a device in the last decade with a screen resolution less than 1920x1080". I don't need to worry about `find` not working because it can't resolve DNS. I know when I SSH into my router, my raspberry pi running pi hole or emulation, my laptop, and my server that my `#!/bin/sh` script works exactly the same.

It was so, so hard to get here. Imagine chaos so maddening that autotools was somehow an improvement.

> The problem is that we cater for the absolute lowest common denominator and completely eschew the vast vast majority of people who don’t require that feature.

Not for nothing, but for most people in the world computers and internet access are an unaffordable luxury. I'm typing this on a machine that cost me $3,700, and in some ways I'm sympathetic to what you're saying. But average world GDP per capita is something like $12,500. Electricity isn't free. Internet access isn't free. Before we start making arguments about catering to the 1% of people fortunate enough to use the fastest machines and networks, we should consider who our actions may close the door on.


> The strength of these tools is that they are the lowest common denominator.

I disagree - the strength is that they're standardised. The weakness is that those standards are ancient.

> Not for nothing, but for most people in the world computers and internet access are an unaffordable luxury.

Are we really doing "there's starving children in Africa"?

There's a vast, vast middle ground between a western developer with a stack of multi-thousand-dollar devices and low-latency SSH access, and someone in a rural village in Kenya using a 2G SIM card on a 25-year-old OS. The line shouldn't be "you need an ARM MacBook to run a terminal", but maybe, just maybe, we could realise that pretty much any device it's been possible to buy and run a shell on in the last 20 years is not restricted to being driven by control codes, and I'd wager that pretty much every device actually running a terminal emulator sold in the last 18 years (I'm going to draw the line at 2006, when the Core 2 Duo appeared) has an amount of hardware that just wasn't considered 35 years ago when these interfaces were designed.


> I disagree - the strength is that they're standardised. The weakness is that those standards are ancient.

Which standards do you have a problem with? Handling terminal escape codes is pretty easy; dozens of terminal emulators do it great. Are you saying it should be easier to build a TUI? It's already very easy: just use a library like Textualize or Bubbletea or rview (there are also dozens of these). I'm not saying every TUI app performs great, but maybe you remember an article that made it to the HN front page about how centering in web apps is impossible? Or how web apps can't both confine text to a reasonable center column of text and take full advantage of all the screen space when a browser window is fully maximized on a wide monitor? TUIs don't have a monopoly on layout weirdness is all I'm saying here.

> Are we really doing "there's starving children in africa"?

Feel free to correct me here, but it sounds like your point is that we should move the baseline from what was prevalent in the 70s to what was prevalent ~2006. I'd love to hear some examples of what you think that would enable beyond what we currently have, because thinking about it, I'm not coming up with too many that wouldn't really irritate me. Maybe tab completion could query external services because we have pretty fast networks and CDNs now, but I don't always have internet, and I'm not thrilled by the idea of my ISP knowing whenever I try to tab complete a Docker image or whatever. Maybe we could cache it, but I'm not wild about every tool I use keeping its own cache or history database. Maybe we could have big beautiful TUIs, in fact we already do, but I prefer to have a lot of splits in my terminal and therefore like it when things work fine at 80 columns.

So I can't think of significant improvements, but on the other hand I can easily envision this being a bigger barrier to expanding computer/internet use to more humans. Or to be maybe too frank about it, I'm completely fine with you being annoyed by yet another Unicode or layout bug if 1,000 more people--let's just say in rural Washington--get the chance to be charmed by Linux.


> Handling terminal escape codes is pretty easy; dozens of terminal emulators do it great

And yet some of the most common ones fail - here’s a thread about VSCode from this morning.

https://x.com/thdxr/status/1833727037074227603?s=46

> just use a library like Textualize or Bubbletea or rview (there are also dozens of these)

I’m more of a user of these TUI apps, so I don’t have control over what framework they use.

> Feel free to correct me here, but it sounds like your point is that we should move the baseline from what was prevalent in the 70s to what was prevalent ~2006.

We should move the baseline to represent how computers are used today. That might be 2004 or 2008, I don’t really care. One really perfect example of crazy behaviour that is still widespread is “tools that dump binary data to terminal emulators that are parsed as escape codes and cause the emulator to hang” - there are dozens of these foot guns, and removing these surprise behaviours would be far more likely to help those 1000 people in rural Washington than letting them run a terminal on a device that is possibly from before they were born.

> I'd love to hear some examples of what you think that would enable beyond what we currently have

I think the fact that we serialize everything to text to allow interoperability between tools is insane in this day and age. We spend so much computing time on this. Powershell has it right. The fact that in my hands I have a device that is many orders of magnitude faster than the device I first wrote code on (and this phone is probably an order of magnitude weaker than my primary workstation) and we’re still waiting for basic operations on files because of APIs and programs that were designed in the 80s when we needed to page text files

> I'm completely fine with you being annoyed by yet another Unicode or layout bug if 1,000 more people--let's just say in rural Washington--get the chance to be charmed by Linux.

One of the earliest memories I have of programming is having a computer say hello to me, and printing ASCII art. I'd guess Gen Z is far more familiar with Unicode than with ASCII art, and there are millions of people out there with non-ASCII characters in their names - shouldn't they be able to have the same access that we do, or are we saying that only poor Americans are entitled to have that magic?


> And yet some of the most common ones fail - here’s a thread about VSCode from this morning.

Well, not being an X user I can't see anything past "the vscode terminal is so poorly implemented it cannot deal with the ansi i am throwing at it", but as a Vim user you won't catch me defending VSCode. I will say if you're looking for perfect classes of applications you won't find any, i.e. if your standard for implementing things is that no implementation of a thing can have a bug, you'll never implement anything.

>> just use a library like Textualize or Bubbletea or rview (there are also dozens of these)

> I’m more of a user of these TUI apps, so I don’t have control over what framework they use.

I'm just trying to get at the outlines of your argument. If this wasn't what you meant, OK then.

> One really perfect example of crazy behaviour that is still widespread is “tools that dump binary data to terminal emulators that are parsed as escape codes and cause the emulator to hang”

This shouldn't do anything that running `reset` can't fix. If it does, that emulator has a bug. But further, that's not really a standard or anything to do with ancient computing. I can easily send you to a page that locks up your computer with bonkers JavaScript (it's called realclearpolitics lol). Terminal escape codes don't have a monopoly on locking up your apps.

> removing these surprise behaviours would be far more likely to help those 1000 people in rural Washington than letting them run a terminal on a device that is possibly from before they were born

My premise is that these people don't have computers. Also using a terminal on a device from before I was born sounds awesome (you can imagine what my YouTube recs look like haha)

> I think the fact that we serialize everything to text to allow interoperability between tools is insane in this day and age.

I think about this all the time, but primarily like in the realm of like JSON vs. msgpack. There's something very cool about being able to interpret a data stream just by looking at it (more or less). But it's also wildly inefficient, 99.999999999% of the time only computers are ever looking at it, and in the CLI case the general lack of structure or standard leads to annoying extra work and pretty evil bugs.
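For a feel of the overhead, here's the same record serialised as JSON text versus a hand-rolled binary layout (the binary wire format here is purely hypothetical):

```python
import json
import struct

record = {"id": 123456, "temp": 21.5, "ok": True}

as_json = json.dumps(record).encode()
# hypothetical fixed layout: u32 id, f64 temp, u8 flag (little-endian)
as_binary = struct.pack("<IdB", record["id"], record["temp"], record["ok"])

# 13 bytes vs ~40 - the binary form is smaller and cheaper to parse,
# but you can no longer eyeball it mid-pipeline.
print(len(as_json), len(as_binary))
```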

> One of the earliest memories I have of programming is having a computer say hello to me, and printing ascii art. I’d guess GenZ is far more familiar with Unicode than with ascii art, and there are millions of people out there with non-ascii characters in their names - shouldn’t they be able to have the same access that we do, or are we saying g that only poor Americans are entitled to have that magic?

There's no way I can dig it up now, but I remember reading some article about everything you'd have to go through to assuredly (more or less) print Unicode characters to a screen in C, like you're dealing with wchar_t, defining special things on Windows, making sure you have a supporting font, blah blah.

In fairness though, this is pretty easy in modern languages like Python or Go, plus I don't know how common it is for terminals to not render Unicode (if you're using a non-Unicode-aware terminal these days I'd be amazed, I mean how will npm display emojis??).

I don't know if Unicode is a good example for us to work with though. It is more computationally taxing so it works for your end, but it also extends computing to way more people so it works for my end too. A win-win doesn't really exercise the tension you and I are getting at.


Your last point is the key thing imo. The difference between features and bloat is simply whether people need them.

However we won’t always agree on what we “need”.


> But average world GDP per capita is something like $12,500. Electricity isn't free. Internet access isn't free.

These people are big users of the command line?


If you read on, you'll find that my point is that if even the most basic apps (CLIs and TUIs) have hefty system requirements, we stand little chance expanding the use of computers to the average person. I'm not sure what your point is though.

The biggest problem with the latency of these new tools isn't the tools themselves but rather the huge amount of back-and-forth chatter that happens with SSH.

If Bash works then these should work too. And if Bash doesn’t work then you’ll need something that supports local echo like Mosh instead of SSH.


I never really got mosh working nicely with the VPNs and ssh jumphosts I needed to use.

I would say vim is excellent over high latency connections, as you can slow down your typing and plan movements and queue up commands. Readline's ability to pop open an editor makes running complex shell commands so much nicer.


> - readline is dated

But at least it was --- pre-GPLv3, pre-prompt_toolkit, etc. --- a standard. It would be nice to again have a single de-facto standard line editing system that I could

1) customize once, for all my programs, and

2) compose with other programs (e.g. shells embedding in other programs).

It'd also be nice for it to be friendly to high-latency connections.

But alas, we're heading towards a world in which the terminal is just a GUI with Minecraft pixels.


> But alas, we're heading towards a world in which the terminal is just a GUI with Minecraft pixels

This has been the case for decades already. Some TUIs use ncurses, some don't. Some use 7-bit escape sequences for switching to drawing characters, others use 8-bit control codes. Neither fish nor (and correct me if I'm wrong on this one) zsh uses readline.

There never was a standard way. Some de facto standards exist, but even these were ignored as often as they were used. And the reason for this is that there never was a standard VT in the hardware days (DEC, Tektronix, etc all did things slightly differently), let alone in the xterm, VTE, et al days. And all of these different terminals have been driven by different operating systems. For example 'ps' on macOS uses different escape sequences to GNU 'ps', yet they mostly achieve the same output. So imagine how inconsistent things were when you had fundamentally different time-sharing systems.


You can use rlwrap to wrap many command-line commands with readline. It is my single de-facto standard line editing system.

> rendered text is static

The way I see it is that you're making it dynamic by entering other commands. This doesn't really feel dynamic when you don't know what to enter, but the more fluency you achieve, the more dynamic/in the flow it feels.


It's not dynamic though. Take my tree folding example and compare the speed of collapsing a tree in a GUI like Firefox vs doing the same with iteratively updated jq queries.

Much as I prefer command line tools for most tasks, it’s silly to say that they’re always just as dynamic as GUIs.

You can already recreate my tree folding example as a TUI using block characters and terminal mouse input. There’s nothing exotic about that. So all I’m suggesting is that this convenience should be built in as standard.


The issue is that you can't use the other tools of the terminal like grepping and friends on a TUI. You can get semi-dynamic behavior with stuff like piping into fzf, maybe there's something like this for JSON collapsing too?

> Shells like Elvish, Nushell, Powershell and my own shell, Murex......

Except nothing of this is really new; this is how REPLs on Xerox PARC workstations, at ETHZ and on Genera used to work.

The novelty of these shells is a side effect of UNIX wiping out the alternatives.


You’re calling them “new”, not me.

Besides, if you want to go down that line of thinking then nothing in technology is truly original. We all stand on the shoulders of giants.


I clearly pointed out that these are 1970s-80s ideas from outside Bell Labs; one just needs to check the dates of those listed systems.

Those that call them new, only know UNIX and Windows.


…or they are aware but instead using the term "new" to refer to the release date of these tools rather than the invention of the core concepts.

I feel like you’re picking a fight for the sake of an argument.


I wasn't the one that decided to drill down on the meaning of new....
