> On the other hand, if Windows had a proper shell and CLI tools, like Cygwin with zsh, but native and not an Ubuntu layer inside - that would be tremendous.
It is PowerShell, and it really is tremendous. Until recently I thought Windows had poor CLI support, then I discovered PowerShell and now I favor it even more than bash.
Am I crazy? Possibly, but PowerShell is truly a gem in CLI history. It is a thoughtfully crafted take on what a "command-line interface" should look like.
How so?
I've never appreciated what's actually good about powershell... you pass objects around? So...? Is that a thing that's useful?
If you want to just automate a task, having intermediate objects that are serializable (eg. strings) so you can `foo ... > blah` and inspect the value of blah before continuing (`cat blah | command2...`) has always seemed far more tangibly useful.
Having methods on an object you can invoke like a REPL for the OS sounds like a good idea, but I've never actually found it useful; it's like the Python REPL: useful for prototyping and doing stuff after you've imported the 50 packages and set up all of the environment, but once you open a new instance, you've got to spend the time doing that before you can actually do any work; and it's useless for scripting.
...but, powershell gets a lot of love from people; so what do you actually find it useful for?
Honestly curious, I've only ever touched it briefly and then swapped over to other things.
> If you want to just automate a task, having intermediate objects that are serializable (eg. strings) so you can `foo ... > blah` and inspect the value of blah before continuing (`cat blah | command2...`) has always seemed far more tangibly useful.
String parsing is the bane of my command line scripting experience. Even when targeting "identical" environments, all it takes is one changed installation default altering the output of one of my many commands for my scripts to break - usually in some non-obvious way that requires a good hour to get to the bottom of, rework, and fix. To keep such changes from forcing me to rewrite entire scripts every time, I try to centralize the text parsing and munging in one place, "deserializing" those strings once and feeding the result to the rest of the system. Shipping this deserialized state around in command line scripting languages can be awkward enough at times to warrant rewriting the entire thing in a proper programming language. Inter-operating between your new program and your existing scripts will, of course, require even more text parsing.
Don't get me wrong, sometimes munging text is your least horrible option. Powershell still lets you do that.
But Powershell's objects also let you, with great frequency, skip the "try to 'deserialize' text that was really formatted for humans and isn't versioned, can be ambiguous, and otherwise was never written with machine consumption in mind" step. If I feel the need for a 'proper' programming language for parts of my script, I can write C# modules and use them from powershell without writing a bunch of text (de)serialization code on either end. This singlehandedly eliminates entire swaths of the most brittle, opaque, and otherwise obnoxious code to ever grace my scripts.
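As a toy sketch of what I mean (the `DiskMath` type and the drive-space example are made up purely for illustration, not something from a real script of mine):

```powershell
# Compile a tiny C# helper inline and call it directly on PowerShell objects -
# no text (de)serialization on either side.
Add-Type -TypeDefinition @'
public static class DiskMath
{
    public static double FreeFraction(double free, double total)
    {
        return total == 0 ? 0 : free / total;
    }
}
'@

Get-PSDrive -PSProvider FileSystem |
    Where-Object { $_.Used -ne $null } |
    ForEach-Object {
        [pscustomobject]@{
            Drive        = $_.Name
            FreeFraction = [DiskMath]::FreeFraction($_.Free, $_.Used + $_.Free)
        }
    }
```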
I've been using shells on Linux VMs and Macs for years now and I've probably written less than 50 functions, and the number of times I've typed sed or awk is probably lower than 100. I pull out the real scripting languages for real jobs. I only use shells for quick things.
Sure, I could write a Python script right now that would read me the last lines of a log file on a remote server. Or, I could just type something like this and hit enter: ssh user@server "tail /path/to/log"
I think what GP is trying to say is that PS is in an awkward position between the two. It has a deeper understanding than bash/zsh/whatever, but it also requires more typing. Yes, PS will fix issues like the ones you have described, but typing in long names (at least for me) defeats the purpose of using a shell in the first place. I don't want to type in "Get-Item" or whatever a million times, nor do I want to ever worry that using redirection (e.g. "something > log.txt") will mess up because PS defaults are the way they are.
> I pull out the real scripting languages for real jobs. I only use shells for quick things.
It's my experience that the latter eventually morphs into the former "without question".
And go figure, the build server doesn't have python installed. Or only has python 2. Or only python 3. Ditto for a coworker - this being game development, a lot of those coworkers aren't programmers, and won't be able to debug "hey python is missing" on their own - sucking up IT and developer time.
> It has a deeper understanding than bash/zsh/whatever, but it also requires more typing.
Aliases, tab completion... you're not wrong, but I've not found it an issue in practice. In fact, rather the opposite: I have to do a lot more reading of documentation to decode bash/zsh scripts and whatever melange of implementation specific single letter flags they happen to be using. This is perhaps because I'll script anything that gets annoying. I don't spend a huge amount of time doing bespoke commands in a shell, though.
> "Get-Item"
gi
EDIT: "Get-Alias" (or gal) will share a lot of shorthands. TIL %{...} is just using % as an alias, and that ?{...} is another option.
> nor do I want to ever worry that using redirection (e.g. "something > log.txt") will mess up because PS defaults are the way they are.
I've done a lot of redirection without problems - if there's a footgun I should know to avoid, please share!
> It's my experience that the latter eventually morphs into the former "without question".
I suppose we do vastly different things with our shells. Looking through my history, it's mostly things like "cd", "ls", "vi", "make", etc., and my longest bash script that's stood the test of time is 12 lines long, with the most complex part of it being an if statement in a string (trust me, there's a reason for that). I've run much, much longer shell scripts, but I almost never write a shell script longer than 20 lines.
> And go figure, the build server doesn't have python installed [...] this being game development, a lot of those coworkers aren't programmers
AHHHHH ok we definitely do work in very different atmospheres! I suppose in instances where "coworkers aren't programmers, and won't be able to debug", I would just write a Python script and use PyInstaller so they could just double-click on a .exe
But if I'm on someone else's computer and they don't have Python or anything like it, then I would honestly just install Python. But I definitely see how you or anyone else would object to this, and I can totally understand the view that it's much better to use PS in this instance.
> Aliases, tab completion... you're not wrong
You're right, there are aliases and tab completion, just like on bash/zsh, but on PS I have to remember both "gi" and "Get-Item". Sure, I would use something like "gi" all the time, but whenever I look something up and see a StackOverflow answer that says "Get-Item", I have to know what that means, which means I have to memorize both the long and the short versions of a lot of things. On Linux shells, I feel like I only memorize the short thing, like "cat". Sure, I also have to know what it does, but the same applies to PS.
> I have to do a lot more reading of documentation to decode bash/zsh scripts and whatever melange of implementation specific single letter flags they happen to be using
The letter flags part is a fair criticism. But don't all shells suffer from that? It's the cost of writing quickly. I could Google "what is gi", but instead I choose to Google "what does set -E do?"
As for the part about "reading documentation to decode bash/zsh scripts", I think that this discussion sums up why what you're saying is true for PS as well: https://news.ycombinator.com/item?id=14034414
> I've done a lot of redirection without problems - if there's a footgun I should know to avoid, please share!
Here's my horror story. This is the reason I swore off PS, as stupid and emotionally-driven as that sounds.
I was working on two programs. One would do stuff and print JSON to stdout, and the other would take JSON from stdin and process it. I had a Linux VM running inside Windows. From my VM, I ran something like `program1 > file.json` and then I ran `cat file.json | program2`. This way I could inspect the JSON file at any time in case something went wrong in one of the two, independent programs. Everything was working just fine.
Then I stopped writing code and testing it in my VM. I decided to go the Windows route, and update my code outside of my VM, and then run my code in PS. I ran `program1 > file.json` and it worked like a charm. Then I ran `cat file.json | program2` or whatever you run in PS (it's been a while) - but it didn't work. So I assumed it was my fault. Time to debug. I looked at `file.json` line-by-line, and it was just fine, so program1 was fine. I looked at program2 line-by-line, and it was just fine, so program2 was fine. I went to my VM and ran `program1 | program2` and everything worked fine.
How was it possible that my code worked just fine in Linux, but not in Windows? It turns out that when I ran `program1 > file.json` in PS, it fucked up my json file in a way that was basically undetectable. I ran `program1` in PS, selected the output, copy-pasted it into a text editor, and saved the file as file.json. Then I could run `cat file.json | program2` or whatever from Windows and it worked like a charm.
To this day I am not sure what happened. Also, program2 supports a file name as an argument which it will then open and read, so some of the commands I listed may be slightly different from what I actually typed, but the gist of it working perfectly in bash on Linux but not on PS was enough to destroy me. Perhaps the issue was something about encoding? Sorry if what I'm saying does not seem very concrete. Here are some links that demonstrate (possibly different) issues people have using redirection:
> But if I'm on someone else's computer and they don't have Python or anything like it, then I would honestly just install Python.
The problem is scaling this to many coworkers. At some point it becomes "wait for I.T. to get around to automating the install across the fleet" or make your scripts install python, ninja-like. But it sounds like you're more able to rely on python, so that probably makes more sense for you (if only so you don't have to rewrite the same script for non-Windows boxes.)
> Here's my horror story. This is the reason I swore off PS, as stupid and emotionally-driven as that sounds
It sounds bad enough I can totally get where you're coming from. Heck, it's basically the exact same place I'm coming from with the "strings shot my dog" quip ;)
> Perhaps the issue was something about encoding?
Something to do with e.g. UTF-8 BOMs or line endings (\r\n vs \n) would be top of my paranoia list. I'd break out a hex editor or binary diffing tool (I've used 010 Editor a couple times for this) if you find yourself in the same situation again. Understanding exactly when I have a single string with newlines vs when I have an array of strings with implicit newlines when joined isn't something I've got my head perfectly wrapped around yet in powershell, and could be another possible cause.
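If it helps next time, here's a rough sketch of what I'd poke at first. This assumes Windows PowerShell 5.x, where `>` goes through Out-File and defaults to UTF-16 LE with a BOM (and CRLF line endings); `program1` stands in for your hypothetical producer from the story:

```powershell
program1 > file.json                              # on Windows PowerShell 5.x: UTF-16 LE + BOM, CRLF
program1 | Out-File -Encoding utf8 file.json      # UTF-8 (still with a BOM on 5.x)
program1 | Set-Content -Encoding Ascii file.json  # plain ASCII, if the data allows it

# Peek at the first few bytes to see what actually landed on disk:
Get-Content file.json -Encoding Byte -TotalCount 4 | ForEach-Object { '{0:X2}' -f $_ }
```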
> Sorry if what I'm saying does not seem very concrete.
You're offering what you know, and I appreciate it :). Sorry for the short reply (I need to be somewhere...)
I'm working with a large corpus of PowerShell scripts, mostly written by enthusiastic Microsoft consultants, and even when I know exactly what they do I cannot stand the mysterious imports, the sequences of script invocations and bare statements that leave the session with the desired invisible state (and, conversely, closing and reopening the session after every major command just in case), the automagical option handling, and so on.
1. It's amazing because Windows-only admins (or predominantly Windows admins with almost no Linux experience) have never seen anything like it before. Until PowerShell, the state of the art was VB scripting or batch files, both of which are (objectively) garbage. Regardless of how long the rest of us have been working with shell scripts, Python scripting, etc., Windows users have never had the opportunity to do similar things with similar tools which are included with the OS.
2. It's amazing because it does a lot of great things that even bash scripting can't do. The idea of passing around structured data can be super handy for a lot of common tasks. For example, on Linux I have to use 'ip addr list' to dump the interfaces, grep to get just the lines with addresses, and awk to pull out the addresses themselves - and only then do I have a list of IP addresses. It's a huge stupid hassle that I have to go through every single time I want to write a script that takes advantage of IP addresses.
Making everything a string makes sense when it's 1970 and you want everything to be compatible, but when basically none of the tools I use on a day-to-day basis provide the option for easily machine-parse-able output, it ends up very frustrating. The (theoretical?) promise of Powershell is that all output is machine-parse-able.
The benefit of passing objects around is that you could do things like "get me a list of network interfaces | filter by interfaces which are up | which have IP addresses | just show me the IP addresses". The few examples I've seen make it feel like your shell is some sort of half-bash/half-SQL system where you can filter, process, and loop over objects.
I can't count how many shell scripts I've had to write which parse output to get the list of data I want, then go back over that same output again to do actual work on it. You can hack a lot of stuff together with ugly hacks; getting all the interfaces on a MacOS machine with IPs except loopback? Maybe 'ifconfig | egrep "^[a-z]|inet[^6]" | grep -B1 'inet' | grep '^[a-z]' | grep -v lo' would do it. In most cases. Probably there's a better way to do it, but if you just want to get something written then you can hack it in like this, or loop over 'ifconfig -lu' (which, on my machine, shows 13 'up' network interfaces), etc.
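For contrast, the PowerShell version of that would (as I understand it) be roughly this sketch, assuming the NetAdapter/NetTCPIP cmdlets that ship with Windows 8 / Server 2012 and later:

```powershell
# "Interfaces that are up, minus loopback, just give me the IPv4 addresses."
$up = (Get-NetAdapter | Where-Object Status -eq 'Up').ifIndex
Get-NetIPAddress -AddressFamily IPv4 |
    Where-Object { $up -contains $_.InterfaceIndex } |
    Select-Object -ExpandProperty IPAddress
```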
I've been using PS for quite a bit of AWS automation lately and I have to say that I don't like it. Sure, you can pass objects around, but I've happened upon more than one cmdlet that does the wrong thing with the incoming object. In one case I was passing a "String" object to a cmdlet that accepts strings and it didn't know what to do with the object so it generated a fairly obtuse error. I had to manually cast the String object to a string. Grrr...
Another thing that bothers me is the lack of single-line composability. In most Unix shells you can pipe things around with abandon; it's not pretty but it works. On more than one occasion, while working with PS, I've had to create a cmdlet because there is no way (or I don't know how) to store intermediate values between cmdlets. One example was processing things in a loop: I had to store the current value in a variable and then process that variable on another line. I know someone here will give a solution, but I looked for an hour before giving up and creating a script file.
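To give a rough idea of what I ended up with (Get-ChildItem here is just a stand-in for the actual AWS cmdlets):

```powershell
# Store the current value in a variable, then process it on another line,
# instead of keeping everything in one pipeline.
$items = Get-ChildItem *.log
foreach ($item in $items) {
    $len = $item.Length
    '{0}: {1} bytes' -f $item.Name, $len
}
```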
On top of everything else, the cmdlets from Microsoft have differing switches for the same thing. One command might use -ServerName while another command will use -ComputerName. So, basically, you end up looking everything up before you can use it. I know Bash isn't much better, but at least I can expect that the tools are separate and not really designed to work together. I was expecting more consistency from PowerShell.
But why is it that the PowerShell terminal emulator is (graphically) even worse than a tty? I've worked on ttys that are easier on the eyes than Windows terminal emulators.
Windows is supposed to have better font support than Linux.
Also, if someone here has an answer: Why in the design of Windows aren't programs installed or symlinked in the PATH by default? I guess that was a design choice somewhere along the history of Windows/DOS. Is there a reason?
> Why in the design of Windows aren't programs installed or symlinked in the PATH by default? I guess that was a design choice somewhere along the history of Windows/DOS. Is there a reason?
Windows' way of handling program executable placement is the holy Registry. It's called 'Application Registration'[1] and was introduced to reduce the need to modify the system-wide PATH variable. (They thought it was a bad idea to modify a system variable so frequently, and I partially agree.)
You can find registered applications in `HKLM\Software\Microsoft\Windows\CurrentVersion\App Paths`. Very few programs use that feature, which is unfortunate, but popular applications like Chrome and Firefox register themselves in it. That's why you can invoke `chrome` in the 'Run' dialog.
Edit: More context: at the time App Paths was added, modifying PATH meant editing AUTOEXEC.BAT by hand, which was painful. Not only that, but PATH also had a length limit of 128 characters. You can find more details on Raymond Chen's blog, as useful as always.[2]
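To make the mechanism concrete, here's a sketch of what such a registration looks like (mytool.exe and C:\Tools\mytool are made-up examples; writing under HKLM needs an elevated prompt):

```powershell
# Register a hypothetical mytool.exe under App Paths instead of touching PATH.
$key = 'HKLM:\Software\Microsoft\Windows\CurrentVersion\App Paths\mytool.exe'
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name '(Default)' -Value 'C:\Tools\mytool\mytool.exe'
Set-ItemProperty -Path $key -Name 'Path' -Value 'C:\Tools\mytool'   # extra dir appended for the app's DLLs
```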
> Not only that, but PATH also had a length limit of 128 characters. You can find more details on Raymond Chen's blog, as useful as always.[2]
I should note there are still length limits - in practice you'll run into issues with as few as 2047 characters:
Debugging this is really annoying, as one of my coworkers found out when one too many applications decided to add multiple paths to PATH (for example, nVidia CodeWorks has added no less than 8 subdirectories of C:\NVPACK\ to PATH to support Android development - for gradle, ant, jdk, ndk, and the android SDK's support, build-tools, platform-tools, and regular tools.)
Said coworker ended up spending some time using directory junctions to shorten the paths in PATH to the point where his dev environment was useful again.
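A couple of quick checks plus the junction trick, sketched out (illustrative only; `C:\NVP` is a made-up short name, and `New-Item -ItemType Junction` needs PowerShell 5+):

```powershell
$env:Path.Length                                                                  # how close to ~2047?
$env:Path -split ';' | Sort-Object Length -Descending | Select-Object -First 5    # worst offenders

# Shorten a long entry by pointing a junction at it, then reference the short path in PATH:
New-Item -ItemType Junction -Path C:\NVP -Target C:\NVPACK
```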
Hm, interesting. I suppose this is what http://scoop.sh should be using? (The per-user setting, "HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\App Paths").
Oddly, it appears the python2.7 installer uses this (the global setting), but not the python3.x one. It would seem that python2 could register python.exe (as it currently does), and python3 could register python3.exe (as it currently does not).
It certainly doesn't seem to make much sense for python3 to have an option to add itself to the PATH environment variable, and an option to change the path length limit - but apparently not an option to use this "modern" way of registering itself? (Unless python2 and python3 installers, when fighting it out, default to only registering python2... which makes sense, but is painful.)
But based on a windows hyper-v vm with only python3 installed, it looks like python3 does not use this setting.
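For anyone who wants to check a machine themselves, a quick sketch of listing what's registered under App Paths (both per-machine and per-user):

```powershell
Get-ChildItem 'HKLM:\Software\Microsoft\Windows\CurrentVersion\App Paths',
              'HKCU:\Software\Microsoft\Windows\CurrentVersion\App Paths' -ErrorAction SilentlyContinue |
    ForEach-Object { '{0} -> {1}' -f $_.PSChildName, $_.GetValue('') }
```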
It is not only ugly but also slow - one of the pain points of using PowerShell. It seems that Microsoft didn't care much about the emulator until recently. Thankfully things are changing: it was improved a bit in the Windows 10 Anniversary Update, and Microsoft has promised to improve it further. I'm optimistic about it.
> Also, if someone here has an answer: Why in the design of Windows aren't programs installed or symlinked in the PATH by default? I guess that was a design choice somewhere along the history of Windows/DOS. Is there a reason?
This gets to the real issue of what makes PowerShell so horrible. It's not that anyone loves bash scripts, it's that there are a tonne of great utilities that bash scripts tie together. Windows doesn't have this.
Maybe it's great for Windows, but it's not great for working on Unix machines, which many of us do.
I'd be more open to Windows if trying to manage Unix boxes from it wasn't like trying to build a ship in a bottle. I always feel like I have one hand tied behind my back trying to do my job in Windows.
For simple web browsing and office work and such, it's fine.
I misspoke: PowerShell is a great shell! What I want is a shell plus the GNU toolchain, from coreutils onwards, within Windows. Babun takes me close, but not close enough. Namely, it's slow as hell, 64-bit isn't really there, and it's Cygwin. I'm one of those "runs Vim and writes their own Makefiles and uses GCC and stuff like that, and doesn't like Cygwin" guys.
I see. I'm also kind of a "runs Vim and writes Makefiles" guy, but I'm more hopeful about the Ubuntu layer becoming seamlessly integrated with native Windows than about waiting for Cygwin to improve. After all, WSL is official, and Microsoft seems to be putting a lot of effort into it.
As a side note, like you said, Cygwin is slow. I once measured how much time compiling things took on both Cygwin/MSYS2 and WSL: `./configure` was 3 times faster on WSL, and `make` was 2 times faster. I assume the reason is that WSL's process management is lighter. This is another reason I'm looking forward to seeing WSL's native integration improve.
Great question. I constantly use applications which need full speed and full GPU support; some are only on Linux, some are only on Windows, and there's some overlap where most from both are on macOS. A simple VM doesn't cut it - I've tried.
What about the other way around, Windows host + *nix VMs? I run FreeBSD in VirtualBox, forwarding apps to VcXsrv via PuTTY. Works very well. I even made a tray icon script to launch that setup https://github.com/myfreeweb/xvmmgr/blob/master/xvmmgr.psm1 :)
> I discovered PowerShell and now I favor it even more than bash
Same here. I might be biased because I never completely mastered bash, as I don't use it that much. But after the first bit of the learning curve is over, the rest just seems to go automatically, with seemingly way less searching the internet: things are just easier to discover and figure out by yourself, and that also makes them easier to remember. Plus you can visually debug it. Also, some of the bash things I'm addicted to (autojump and fzf) have pretty good clones for PS, namely ZLocation and PSFzf; those are real timesavers for navigation/history search for me.
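In case anyone wants to try them, this is roughly the setup (assumes PowerShellGet for Install-Module; PSFzf also needs the fzf binary itself on your PATH):

```powershell
# One-time install from the PowerShell Gallery:
Install-Module ZLocation, PSFzf -Scope CurrentUser

# In $PROFILE, so they load in every session:
Import-Module ZLocation
Import-Module PSFzf
```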
Powershell may be good (I wouldn't know), but the terminal emulator (cmd) is absolute garbage. Even in Windows 10, you still have to edit the registry just to use a decent font, and there isn't a reasonable way to change the colors, &c. I honestly can't tell what has changed in cmd since Windows NT.
Sure, there has not been much progress with terminal emulators in the past couple decades, but in Windows there has literally been none at all.
PowerShell is very serviceable, but it has weird points of failure - especially with some FOSS projects that seem to think cmd is as far as they'll go for Windows support, and then completely choke on PowerShell.
I think if I liked the rest of the .Net toolkit more, I'd be more enthused about it. It makes things livable on Windows, which is better than the dark days of Vista/7.
But powershell is an all or nothing proposition. At my job, the majority use OSX. I use Linux. Since we all use bash, there's very little friction here. I can't "just use powershell" in this scenario at all.
That does help, and is welcome. But convincing a team to switch something as fundamental as their shell is an uphill battle. I'm sure PowerShell is great (I've never used it), but shells always struck me as the kind of thing where "good enough is good enough".