What's the difference between piping to a shell and running a random executable or installing a random ppa? All 3 could do massive harm to your system.
One difference is that piping to a shell is a lot easier to inspect. So you could easily argue that piping to a shell is safer.
Heck, even downloading a source tarball and compiling it yourself isn't any safer unless you actually inspect that source. And who does that?
The issue isn't executing untrusted code, it's connections terminating early and causing your shell to execute incomplete code.
The article ends with this example code:

    TMP_DIR=`mktemp`
    rm -rf $TMP_DIR
And if the stream ends prematurely, the second line may be cut down to rm -rf / and executed as-is. While that wouldn't do anything anyway without --no-preserve-root added, it still brings up a good point: an interrupted connection can execute incomplete code that would have been safe had the command arrived in full.
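To make the failure mode concrete: a shell reading from a pipe executes whatever it has buffered when the stream hits EOF, even mid-line. Here is a hypothetical cleanup line (not the article's exact code) and the prefix a dropped connection could leave behind:

    # Full line as the author wrote it:
    rm -rf /tmp/install-workdir
    # If the connection dies eight bytes in, the shell sees and runs only:
    rm -rf /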
The fact that the change is so obvious and so simple and yet so many developers keep telling users to pipe the output into a shell is precisely why this is a "big deal".
You are assuming wget returns error codes reliably. It does not. Also, your example assumes write permissions to the current working directory and no risk of someone nefarious being able to write to the current working directory.
No, it doesn't turn into that. It turns into: wget https://install.sh && ls && less install.sh && less lala2 && echo "ok looks good enough" && ./install.sh # no sudo, it's a user-local install; it's just a package or program, ffs.
When you want packages installed for all users, or system-wide, you use your distribution's default package manager.
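As a minimal sketch of that inspect-then-run flow (the URL and filenames are placeholders), downloading into a fresh private directory also sidesteps the writable-working-directory concern raised above:

    # Hypothetical URL; substitute the project you're installing.
    dir=$(mktemp -d) || exit 1
    cd "$dir" || exit 1
    wget -q https://example.com/install.sh || exit 1
    less install.sh           # actually read it before running it
    sh ./install.sh           # no sudo: user-local install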
Then take that to its logical conclusion and read the source code for all the stuff you are installing, and the description of all the packages you are installing. Your argument was the strawman.
Would it make sense to mitigate this by creating an (even smaller) bootstrap script that itself downloads the "real" script and checks e.g. the SHA256 hash of the downloaded file before executing?
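A minimal sketch of such a bootstrap (the URL and hash are placeholders; sha256sum is the GNU coreutils tool, shasum -a 256 on macOS):

    #!/bin/sh
    set -e
    url="https://example.com/install.sh"
    # Placeholder: the published known-good SHA-256 of the real script.
    expected="0000000000000000000000000000000000000000000000000000000000000000"

    curl -fsSL "$url" -o install.sh
    # sha256sum -c expects "<hash><two spaces><filename>" on stdin
    echo "$expected  install.sh" | sha256sum -c - || {
        echo "checksum mismatch, refusing to run" >&2
        exit 1
    }
    sh ./install.sh

Of course this only helps if the published hash reaches you over a channel you trust more than the script itself.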
I realize that I tend to do that by default in my shell scripts. After a few years of Python, I always make a main() function which gets the command line parameters. Is this weird?
It just seems to me that main() should worry about interfacing with the outside world, and the rest of your code should really just be written as a library.
I think I do that to avoid global variables as much as possible, as well. Declaring things as "local" tends to keep me honest, as opposed to having a bunch of junk cluttering the global namespace.
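A minimal sketch of that layout (all names are made up):

    #!/bin/bash
    # Library part: defining these functions has no side effects.
    greet() {
        local name=$1          # "local" keeps it out of the global namespace
        echo "Hello, $name"
    }

    # main() owns the interface to the outside world: arguments, env, etc.
    main() {
        local name=${1:-world}
        greet "$name"
    }

    main "$@"

A side benefit in this thread's context: since nothing executes until the final main "$@" line, a script structured this way is largely inert if the download is truncated.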
Why that? Have we never heard of defaced web pages?
Granted, using shacat is much better than piping into sh. But the basic lesson from security breaches is that nothing is safe; you can only find ways to do things in a less catastrophic manner than others.
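For concreteness, a hypothetical shell stand-in for what shacat presumably does; the real tool's interface may differ:

    # Pass stdin through to stdout only if the complete input
    # matches a known SHA-256; on mismatch, emit nothing.
    shacat() {
        expected=$1
        tmp=$(mktemp) || return 1
        cat > "$tmp"
        echo "$expected  $tmp" | sha256sum -c --status - && cat "$tmp"
        status=$?
        rm -f "$tmp"
        return $status
    }

    # usage: curl -s https://example.com/install.sh | shacat <hash> | sh

If the hash doesn't match, sh receives empty input and nothing runs.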
> Why that? Have we never heard of defaced web pages?
Well, if you can't trust the website, you're screwed anyway. If the website is compromised, then absolutely any way they have of installing software is broken.
Good, but in some cases this won't work with the latest versions (the content will change); plus, you need to add instructions for installing shacat first, which defeats the point of having a single-line install. It's always been convenience over security.
Or possibly a self-checking script that only executes if it is complete, i.e. the script is escaped and must run unescape(escaped_script) to be lethal, but by then you can confirm that the script is in fact whole and as the creator intended it to be.
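A simpler variant of the same idea, as a sketch: define all the work inside a function and invoke it only on the very last line, so a truncated download defines nothing that runs:

    #!/bin/sh
    # Nothing executes while the function body is merely being defined.
    do_install() {
        echo "installing..."
        # real work goes here
    }

    # Deliberately the last line: if the stream is cut anywhere
    # above it, do_install is never invoked.
    do_install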
Those scripts are typically very short. They'll either download completely or not at all. And if one does abort short, what are the chances it cuts off in exactly the wrong place? And then you'd have to kill the wrong process at just the wrong moment for the aborted script to run at all.
Statistically a cosmic ray flips a few bits in your RAM every year. Theoretically those bit flips could transform a benign program into a malicious one, and I'm sure that's happened to somebody somewhere sometime.
But it's very unlikely to happen to you tomorrow. You could use ECC RAM to prevent this from happening, but how many people do that?
Counterpoint to the author: have you ever heard of anyone ever having anything catastrophic happen because of this, ever? Have you ever heard of anyone even having it fail with garbage?
Catastrophic? I can't claim that. But I can tell you that I had a server drop out on me when installing software this way. Here's a quick quote of the bug report I sent to Linode when I was installing Longview.
> Bug: Update the nginx (?) error pages to be bash friendly
>
> [root@marius ~]# curl -s https://lv.linode.com/nGiB | sudo bash
> bash: line 1: html: No such file or directory
> bash: line 2: syntax error near unexpected token `<'
> bash: line 2: `<head><title>502 Bad Gateway</title></head>'
>
> This could be done by just dumping out echo statements since it's already
> being parsed by the shell. Additional meta about the request (headers, etc)
> could be dumped into comments too…
>
> # 502 Bad Gateway
> echo
> echo "Uh oh: Something went sideways during the install!"
> echo ":("
> echo "Re-running the installer should clear things up."
> echo
> echo " sudo !!"
> echo
Yep, I agree this is generally bad practice and prone to error conditions. But I'm mainly addressing the article, since it claims this should never be done because of the possible catastrophic consequences, and I just don't think that scenario is likely enough to mount a campaign to abolish the practice :)
Other than what others already said about early connection termination, there is one major reason why you should not pipe to a shell.
To avoid users complaining about how dangerous it is.
This is why I distribute a tarball with an installer script in it. It's functionally almost the same, not that much harder, but it avoids all the backlash and makes my software more likely to be used.
Tend to agree. There's still a trust relationship in downloading source and using it, although I guess with a source bundle you can at least verify the hash and have the same (potential malware) as everyone else.