I have to take exception to some of this; it's "technically correct" but like much of shell, likely full of terrifying edge cases that are best avoided.
For instance, using eval to have variables with variable names is madness.
$ var="world"
$ eval "hello_$var=value"
$ eval printf '%s\n' "\$hello_$var"
I suppose the fancy business with printf is flexing, but I suspect the majority of people scanning documents like this would prefer to have examples with echo "${variable%%string_modifier}"
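(For the record, a quick sketch of that parameter-expansion style, with made-up values:)

file="archive.tar.gz"
printf '%s\n' "${file%%.*}"    # archive      (strip the longest '.*' match from the end)
printf '%s\n' "${file%.*}"     # archive.tar  (strip the shortest match from the end)
printf '%s\n' "${file#*.}"     # tar.gz       (strip the shortest match from the front)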
Escape sequences are mostly dependent on you being on a VT102 compatible terminal; I'm not sure what happens if you're on an ASR33 and you send this to it...
I'd consider type checking (is this a float) to be similarly cursed witchcraft -- after all I've run into all sorts of "floats" that aren't just 123.456 strings...
Overall -- there's a huge range of things shell can do that you probably should just avoid. Similarly, even if you can do something in a clever shell way (the bit shifting math, for instance, or extensive string manipulation), it should likely be done with external tools just because people understand "oh, awk or sed -- I know this!" instead of "what the hell is this line noise?" If you're writing performance optimized shell, well, probably put the keyboard down and walk away for a bit and reconsider your life choices.
Good doc; I'll probably stick to the man page though.
> I suppose the fancy business with printf is flexing, but I suspect the majority of people scanning documents like this would prefer to have examples with echo "${variable%%string_modifier}"
POSIX itself contains some interesting notes about `echo` and `printf`, e.g. from the `printf` page:
"The printf utility was added to provide functionality that has historically been provided by echo. However, due to irreconcilable differences in the various versions of echo extant, the version has few special features, leaving those to this new printf utility, which is based on one in the Ninth Edition system."
The behaviour of `echo` is especially inconsistent for escape sequences. Often, `echo -e` is used to enable escape sequences for it but this is not POSIX compliant.
For any "standards-preferring" shell scripting page, I'd thus think `printf` is the right way to present it, if only to advertise that this builtin exists.
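A small demonstration of the gap (the echo line's behaviour varies between shells; only the printf lines are pinned down by POSIX):

echo 'foo\nbar'             # one line in bash, two lines in dash and zsh
printf '%s\n' 'foo\nbar'    # always the literal text foo\nbar, then a newline
printf 'foo\nbar\n'         # always two lines (escapes in the format string are interpreted)
printf '%b\n' 'foo\nbar'    # always two lines (%b interprets escapes in the argument)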
> should likely be done with external tools just because people understand "oh, awk or sed -- I know this!" instead of "what the hell is this line noise?"
I am not actually sure about this "I know this" part. I think many users of AWK only ever use it to do some sort of "access the n-th column of input" as in `echo a b c | awk '{ print $2 }'`. Similarly, `sed` is often only known and used for "replace string a with string b in input" use cases.
> If you're writing performance optimized shell, well, probably put the keyboard down
There is some difference between performance optimized and "hugely inefficient" that can make the difference between a task taking minutes vs. a few seconds. It may not seem much but shell scripts are often integrated into automated processes (think of build environments or system startup tasks) and there, seconds quickly accumulate. Whether this is relevant to your use case, only you can know.
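A sketch of the kind of difference meant here (files.txt is a made-up input): spawning an external process per iteration versus using a builtin expansion.

while read -r line; do
    basename "$line"               # one fork+exec per line
done < files.txt

while read -r line; do
    printf '%s\n' "${line##*/}"    # same result, no processes spawned
done < files.txt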
I found this "Pure Sh Bible" to be highly interesting, but there are some places where it has some rough edges and it may not show the "best practices" for standard use cases. Still a very useful resource.
There are a huge variety of things you can do in shell that you simply shouldn't. If you're adventuring into a world where it matters if there's a newline at the end of your output as a result of your script running on noodlebsd or ache or freeGum, you've already lost.
Don't play the game. Don't do things where it matters what the exact output of echo gives you.
Did you know that you can't put a null into a shell variable? You can't. Also, you shouldn't ever be in a position to care. Did you know that you can fiddle with the "IFS" to let you parse CSVs? Don't ever do that; it works but nobody will ever understand what the hell you did. You can make a case statement and evaluate it with eval to make your own shell script that writes its own shell scripts. Please don't.
All of these adventures are possible, and work fine, and I've done them and come back to the code a decade later and I both understand what I was trying to do and the code still works across a variety of bourne shell interpreters. Nevertheless, these are things that shouldn't be done except to flex.
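(For the record, the IFS trick warned against above looks roughly like this; people.csv is made up, and it only "works" for naive CSV with no quoting or embedded commas:)

while IFS=, read -r name age city; do
    printf '%s is %s, from %s\n' "$name" "$age" "$city"
done < people.csv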
Yes. printf is not a "flex", it's the only safe way to display arbitrary strings. The link mentions one good reason to avoid echo (string might be "-n"), but another important reason is that the behaviour of echo is implementation-defined if the string contains backslashes (e.g. `echo 'foo\nbar'` displays one line in bash but two lines in dash and zsh).
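The always-safe pattern, whatever the string contains (a one-line sketch):

str='-n'
printf '%s\n' "$str"    # prints -n as intended; echo "$str" prints nothing at all in bash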
Here are my scary evals to use basic ANSI/vt220 color.
N="$(printf '\033[')" x=30
for a in Bl R G Y B M C W # 4-bit Black Red Green Yellow Blue Magenta Cyan White
do  eval $a='$N'"'"$((x))"m'" \
        b$a='$N'"'"$((60+x))"m'" \
        ${a}bg='$N'"'"$((10+x))"m'" \
        b${a}bg='$N'"'"$((70+x))"m'"    # bX=bright, Xbg=background, bXbg=bright background
    x=$((x+1))
done
N="${N}0m"
printf "$Y${Gbg}I am yellow on green.$N\n"
I pasted this into my Android terminal, and it works as advertised.
I'm scratching my head at most of it, and there are large portions that are incomplete or trivial. bash is ubiquitous. If you're trying to develop an entire web framework to run on a Bourne shell from busybox, dash, or FreeBSD 4, there's probably a better way to do it. Use the tool you have, not a hypothetical one compatible with 1983. YAGNI.
Here's a common pattern for appending to PATH system-wide once:
# /etc/profile.d/foo.sh
DIR=/opt/foo/bin
case ":$PATH:" in
    *":$DIR:"*) ;;
    *) PATH="$PATH:$DIR"; export PATH ;;
esac
unset DIR
If zsh or bash is available:
# $1 var
# $2 prepend this string to var
# $3 separator (: default if omitted)
# doesn't remove or prevent duplicates
prepend() {
    eval "export $1=$2\${$1:+${3-:}\$$1}"
}
# $1 var
# $2 append this string to var
# $3 separator (: default if omitted)
# doesn't remove or prevent duplicates
append() {
    eval "export $1=\${$1:+\$$1${3-:}}$2"
}
Your "zsh/bash" code doesn't actually use any zsh/bash-only features, so it is compatible with POSIX sh. But you have a quoting bug: you should escape the `$` in `$2` and `${3...}` to protect from double-eval.
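A sketch of how the unescaped expansion bites (GREETING and msg are made-up names):

msg='$(date)'               # a literal string that merely looks like a substitution
append GREETING "$msg"      # the eval'd string now contains a live $(date)
printf '%s\n' "$GREETING"   # prints the current date, not the literal $(date)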
I'd personally try to minimise the use of eval, e.g.
prepend() {
    local val
    eval "val=\$$1"
    export "$1=$2${val:+${3-:}$val}"
}
append() {
    local val
    eval "val=\$$1"
    export "$1=${val:+$val${3-:}}$2"
}
Programmers do that without realizing it all the time. There's like five classes of exploits that are just programmers assuming whatever variable they are interpolating is safe because they assume their language handles all security problems for them.
If you are at this level of required complexity as in those examples, you should use a proper programming language, not shell. Half those snippets fail with spaces in the wrong place, newlines, control characters, etc.
I think all such shell "magic" should come with a huge disclaimer pointing the user at better alternatives such as perl or python. And all snippets should have the necessary caveats on horribly broken and dangerous stuff such as "if this variable contains anything other than letters, your 'eval' will summon demons" in eval "hello_$var=value" and stuff...
I enjoy using bash, and throw together little scripts now and then. It is convenient to be able to wrap up some bash commands and turn them into a script, when I realize I’ve been using them repeatedly.
But, every time I see examples of how to write sh scripts properly, it makes me wonder if this is just the wrong way to look at the world. Maybe it would be easier to extend Python down to make it better for command line use. Xonsh or something.
Ansible is like Python but for scripting (orchestration). They used a YAML format and with all the curly braces and quoting, made it just as bad as shell.
Easier, lightweight syntax for shell-like pipes, command execution and catching stdin/stdout/stderr.
Something like Perl's IPC::Run.
Also, more shell-relevant stuff in the default distribution, so that one doesn't need to care about any modules (which is the primary reason for using bash or even sh; those are installed practically everywhere, along with at least coreutils). Edit: examples of things a standard Python doesn't really make easy would be quick and easy recursive directory traversal plus doing stuff to the files found there (like the unix 'find' tool), archive (de)compression, and file attribute operations (not only simple permissions but also ACLs, xattrs, mknod, etc.).
But the sister comment clarified it in another way, so this may be irrelevant.
Sorry, I was sloppy. I meant using it as the system shell. So, processing arguments, I guess, would be less of a big deal.
Convenience features, like ls and other classic shell commands being run without parentheses, would have to be handled… I’m not breaking any new ground here; actually, this has gotten me to look into xonsh and it looks pretty decent.
Yes, but actually no: I was tempted to include it but didn't. The one big argument for bash and sh is ubiquity and compatibility. Perl also has those. Python is somewhat lacking in those. Ruby is very lacking in both.
Using a more advanced language just because you find shell syntax to be wacky is like using a car to get groceries because you find panniers or a backpack to be wacky. It's the use case that matters; if you're 60 meters from the store, just use your bike, or walk.
There are plenty of cases in which Perl or Python will make things much more complicated than 5 lines of spooky-looking shell script. Sometimes a little mystical sorcery is what the doctor ordered.
Shell is full of ridiculous footguns though. It's like saying if the store is only 60m away (across an Indiana Jones-tier trap gauntlet) then just walk there.
Remember that time bumblebee accidentally deleted everyone's /usr? Or that time steam deleted everyone's homedir? Both because of the easiest to avoid bash footguns - spaces and empty variables.
My hard and fast rule that I've never regretted is - if you use a bash if statement, consider python. If you're going to do a for loop, you must instead use python.
Typically as a side effect once the programmer is in python at my prompting, they find having such easy access to libraries suddenly lets them make the whole script more robust - proper argparsing is an immediate one that comes to mind.
Frequently the reticence I see from people, especially juniors, is that they're worried about people thinking "haha they have to pull an entire python into their image just to run this script" or "wow they're so newbie/young that they can't even write some shell". I reassure them: don't worry, there's a reason we used Perl even back then too.
Personally I've found Python to have significantly fewer foot-guns than bash.
The biggest reason why I don't use it all of the time is that calling / piping commands takes a lot more typing, so it's easier to use bash for very simple shell scripts. And while there are libraries that simplify shell scripting, that adds external dependencies to your scripts.
> Remember that time bumblebee accidentally deleted everyone's /usr? Or that time steam deleted everyone's homedir? Both because of the easiest to avoid bash footguns - spaces and empty variables.
I have written my large share of bash scripts at this point in my life. However, I recently started a new project. I opened up a file and started noodling out a sh script. I stopped exactly because of what you are saying. I then installed powershell. I have not decided if I am going to use powershell, python or ansible yet for this. But as it is gluing a bunch of commands together with some string manipulation and some very simple math calculations powershell seems better for the job in this case.
Bash is also good at these things, but it feels like you are using weird archaic tools to get things done. They work very well, but the syntax of some of the commands you end up having to use fills entire O'Reilly books by itself. It has its own odd way of doing things. Bash is nice for when you know you cannot really manipulate the environment, as it is fully complete and usually 'table stakes' for what is installed. It is just kind of odd the way it works. In this case I decided to start with something that is more akin to what I am used to writing.
Bash may or may not. It depends on how your script was called. If your script is sourced you need to remember to set and restore the flags.
In bash your data can accidentally become code. "rm $fn" usually deletes one file, but it might one day delete a few (spaces), or wildcard expansion makes it delete many. With Python, calling the function to delete one file will always delete one file. Your function will never run with a "continue on errors" mode.
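The classic shape of that bug, as a sketch (fn is a made-up name):

fn='my important file'
rm $fn         # unquoted: tries to delete 'my', 'important', and 'file'
rm -- "$fn"    # quoted (plus --): deletes exactly the one intended file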
Oh come on: whatever is feeding files to the function, I'll just trick it into using some other data with different files. You don't need to "execute data" to have substitution bugs.
And it's easy to add status checks to your shell script, just like you can in Python. Exceptions are not the only way to stop on error. But it's sure a hell of a lot easier to have a non-working program in Python, whereas it's a lot easier for a shell script to keep working.
The "loop over the contents of a file" entry does not protect stdin.
Use an alternate descriptor.
while read -r line <&9
do printf '%s\n' "$line"
done 9< "$file"
Particularly problematic is ssh, which will pass the rest of the loop's stdin to any remote command that is able to consume it.
The looped commands will receive stdin of the loop when an alternate descriptor is used. I use 9 by habit after reading the flock manual page many years ago.
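The ssh case in miniature (hosts.txt is made up): without protection, ssh drains the loop's stdin, so the loop ends after the first host.

while read -r host; do
    ssh "$host" uptime       # consumes the rest of hosts.txt as its stdin
done < hosts.txt

while read -r host <&9; do
    ssh "$host" uptime       # now sees the loop's original stdin instead
done 9< hosts.txt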
::EXIT trap
dash will only call the EXIT trap on a normal exit. Add INT (and any other signal that you want from "kill -l") if you also want to catch abnormal terminations.
Windows busybox sh/bash only catches regular exits, no matter what else you add.
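A sketch of a cleanup pattern along those lines (cleanup and tmpfile are made-up names; rm -f is idempotent, so it is harmless if both traps fire):

tmpfile=$(mktemp) || exit 1
cleanup() { rm -f "$tmpfile"; }
trap cleanup EXIT
trap 'cleanup; exit 1' INT TERM HUP    # dash may skip EXIT on signal-induced deaths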
POSIX shell scripting is just broken. Don't get me wrong. I love doing it, but from a language design point it is abysmal.
Just this week I tried to write a script to apply a custom function to a list of files. But the amount of time you spend just to make sure it works with special cases like spaces or newline characters in file names is not healthy. After all, it is probably one of the most standard scenarios.
First, I thought I would use 'find -execdir' or 'find | xargs', but neither supports shell functions. So I went on to take the 'while read' road, just to learn that POSIX 'read' does not support all the options you would need to make that work with all special cases.
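One workaround that stays POSIX (a sketch; the pattern and the printf body are placeholders for your custom logic): inline the logic into sh -c and let find pass filenames as positional parameters, which survives spaces and newlines in names.

find . -type f -name '*.txt' -exec sh -c '
    for f in "$@"; do
        printf "processing: %s\n" "$f"    # your custom logic here
    done
' find-sh {} +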
There's an option to install the "bash" applet as a link to either ash or hush, the two shells that busybox comes with. Turns out that a large number of "bash scripts" use no bash-specific features in spite of using "bash" in the #!, or only a few bash-specific features like [[ ]].
It's disabled by default, and arguably not a good idea to enable it because compatibility is not great as you found out. Either way, "busybox bash" doesn't exist: only the option to alias ash or hush to bash.
The bash you're seeing is a symlink to busybox, as are all busybox applets.
You can configure busybox to install an applet named "bash", but it's not a full bash shell. It's basically busybox ash, with maybe a few bash-specific extensions implemented. For example, it doesn't support arrays; `arr=(10 20)` will give you a syntax error.
In a default configuration, there is no "bash" applet. You can optionally configure busybox to alias "bash" to either "ash" or "hush" (and likewise for "sh"). This allows the use of "#!/bin/bash" scripts, but only if they don't use bash-specific features not supported by busybox.
You can implement arrays in POSIX shell by changing the input separator to a newline and writing some simple helper functions to search for it. It’s not the prettiest solution but works for many use cases of arrays in shell scripts.
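A sketch of that approach, assuming the elements themselves contain no newlines:

list="first item
second item
third item"

old_ifs=$IFS
IFS='
'
set -f                      # also disable globbing during the unquoted expansion
for item in $list; do
    printf '[%s]\n' "$item"
done
set +f
IFS=$old_ifs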
I feel the same way about "advanced" shell scripting techniques as I do about "cutting-edge" accounting techniques: someone's probably going to jail.
I have written some big complex systems in shell and regretted it.
My rule of thumb is that if a script needs error handling, it's time to write it in a real programming language. Tricky shell programs running as root are a real danger.
It's nice to know that some of these techniques exist, but better not to use them. Many of them have to do with string manipulation, which is a sign that a proper language with data structures would be a win.
Many people don't realize that perl is part of the Debian base system. If you are going to go crazy with one-liners, then that's a better tool. I would generally recommend Python, though.
Avoiding bash-isms is useful so that you can run under busybox shell, reducing the size and dependencies of containers.
Embedded systems often use shell. If you are tempted to install bash for more power, a better solution is probably lua. It's smaller and saner.
Some nice stuff, but also some stuff where things get easier if you just switch to a language with fewer caveats (like Python).
My rule of thumb is to use Bash to write simple scripts, if they fit on the command line and have just one level of loop or if-else (lines can get very long, though ...). However, for more complicated stuff I use the alternative programs (find, sed, awk, ...), as they behave more predictably.
Furthermore, I am not sure one should use such simplified versions that are just wrong:
is_float() {
    # Usage: is_float "number"
    # The test checks to see that the input contains
    # a '.'. This filters out whole numbers.
    [ -z "${1##*.*}" ] &&
        printf %f "$1" >/dev/null 2>&1
}
I mean having a '.' in the string does make it a non-integer, but what about other floats like 2e3?
> where things get easier if you just switch to a language with fewer caveats (like Python).
For general, non-domain-specific things, it gets easier IMO for two main reasons: 1) obviously, when using tools one has more experience with, but 2) also when the tool is more widely available, and POSIX shell is probably one of the most (by default) available interpreters with a somewhat human-friendly input mode there is. I know that if I write a script or some functions in POSIX shell, I can copy it and run it wherever I want (besides Windows, but I don't really know anybody close to me that still uses that, especially in a server environment). Sure, if python/perl/... is available everywhere you want to run it, or is simple to install (i.e., any modern Linux or *BSD distro), then the second point doesn't carry that much weight anymore, besides maybe for minimal Alpine Linux CTs (< 8 MB for a full-fledged distro!).
> I mean having a '.' in the string does make it a non-integer, but what about other floats like 2e3?
They use the printf "%f" float format conversion so `printf "%f" 2e3` will work just fine, whereas using something that definitively is not a float makes printf exit with an error code as it fails to format the value as float, failing this check.
> They use the printf "%f" float format conversion so `printf "%f" 2e3` will work just fine, whereas using something that definitively is not a float makes printf exit with an error code as it fails to format the value as float, failing this check.
You are of course correct (sorry to sound like a language model); I missed the printf.
Actually, the function returns success if the input contains "." and printf succeeds, so it does not consider 2e3 to be a float. I agree that this function is odd (it's not even useful for identifying simple numbers of the form "123.456", because it considers 2.0e3 to be a float).
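If all one wants is "printf can parse this as a float", a less surprising sketch (is_floatish is a made-up name) is to drop the '.'-containment test entirely; then 2e3 and plain integers pass too, which may or may not be what you want:

is_floatish() {
    # Usage: is_floatish "number"
    printf %f "$1" >/dev/null 2>&1
}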
Also, there are these operators from C that the author includes. Is he suggesting these are available in POSIX sh?
Quoting from the Pure Sh Bible:
+= Plus-Equal (Increment a variable.)
-= Minus-Equal (Decrement a variable.)
*= Times-Equal (Multiply a variable.)
/= Slash-Equal (Divide a variable.)
%= Mod-Equal (Remainder of dividing a variable.)
<< Bitwise Left Shift
<<= Left-Shift-Equal
>> Bitwise Right Shift
>>= Right-Shift-Equal
& Bitwise AND
&= Bitwise AND-Equal
| Bitwise OR
|= Bitwise OR-Equal
~ Bitwise NOT
^ Bitwise XOR
^= Bitwise XOR-Equal
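For context, a sketch of how those operators look inside arithmetic expansion; as the discussion below notes, support for the assignment forms varies between shells:

x=3
printf '%s\n' $((x << 2))    # 12, bitwise left shift
printf '%s\n' $((x & 1))     # 1, bitwise AND
printf '%s\n' $((x ^ 2))     # 1, bitwise XOR
: $((x += 1))                # the assignment form; x is now 4
printf '%s\n' "$x"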
One could probably learn from reading the POSIX standard, the Almquist shell source code and then experimenting. I also like reading other peoples' shell scripts, if they are good. Always something new to learn. Unfortunately this "Bible" did not teach me anything new.
Thank you. This helps. But does that mean a POSIX shell, or any other UNIX utility, must implement these operators? Using them in shell scripts makes the scripts non-portable, e.g., Almquist sh, NetBSD Almquist sh or Debian Almquist sh do not support them. Maybe the author of the "Pure POSIX Sh Bible" always runs Bash in --posix mode. Hence "Pure POSIX sh".
This just means that you're going to have to find a better way to describe the level of backwards compatibility you're going for other than "POSIX compliance", because these do seem to be required in the standards.
Seems like there are different interpretations by authors/maintainers of what is required versus what is optional.
As an ordinary user, I am most interested, perhaps mistakenly, in POSIX for a single reason: portability. (Whether portability is a goal of POSIX I am not sure. I have not done much research. Maybe it isn't.) As a user, I want to be able to write scripts on Linux that run on BSD and vice versa. Perhaps I have conflated portability with "POSIX compliance". However, as a practical matter, I would not use these operators in scripts that I wanted to be portable. When I have the motivation, I am working on an unenhanced port of NetBSD sh to Linux, i.e., without the Herbert Xu changes. Being lazy, so far I have just added tab completion to dash so it feels more like NetBSD sh.
The "Bible" I would be interested in reading, if it exists, is the "Portable Sh Bible". (I am not a Bash user.) When I have a question I usually consult https://www.in-ulm.de/~mascheck/
I found something neat recently. The coreutils version of `test` doesn't have this, and when you use `test`, typically what you're using is the coreutils one.
But if you use `builtin test` to force the bash builtin variant of `test`, this has a nice `-v` switch, which allows you to check if a variable is set.
I found out about this recently when I had to use it in my bash argument processing library, to check if an option expecting a value had not been provided a value (since checking for an empty variable instead would mean that an actual empty string would also be ignored). (see here: https://git.sr.ht/~tpapastylianou/process_optargs/tree/main/...)
Note that `test -v` is not in POSIX sh (see https://pubs.opengroup.org/onlinepubs/9699919799/utilities/t...). And you don't need `builtin` because `test ...` invokes the builtin by default; if you want to run coreutils test you need to use `/usr/bin/test ...` or `env test ...` or `(exec test ...)`.
The usual way to check if $2 is present is to use `[ $# -ge 2 ]`. For named variables, I like to use `[ "${myvar+set}" != "set" ]` (the parameter expansion `${myvar+word}` expands to "word" if myvar is set or "" if it is unset, see https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V...).
(Often people want to treat an empty value as unset, so they just use `[ -n "$myvar" ]` to check that the value is non-empty. If you enable `set -u` to treat references to unset variables as errors, this should instead be `[ -n "${myvar-}" ]`.)
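The three cases those patterns distinguish, as a sketch:

unset myvar
[ "${myvar+set}" = set ] && echo set || echo unset            # unset
myvar=''
[ "${myvar+set}" = set ] && echo set || echo unset            # set (but empty)
[ -n "${myvar-}" ] && echo non-empty || echo 'empty/unset'    # empty/unset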
Some comments on the script you linked:
I highly recommend ShellCheck (https://www.shellcheck.net/), it picks up a legitimate issue in your script: `tr -s [:alpha:] Y` should be `tr -s "[:alpha:]" Y` (otherwise it fails if the current directory contains a single-letter filename).
Also, you should use `printf "%s\n" "$var"` instead of `echo "$var"` in case $var is "-n" for example.
All great comments, thank you! I feel the need to reply, even though there isn't a real need to, but here goes:
1. True; but in this case we're talking about bash specifically, so I prefer to use cleaner syntax whenever this is available. I find ${var+x} a bit too hacky ...
2. Huh, I was under the impression that bespoke commands overrode builtins. Good to know. What the hell is even the reason to have a 'builtin' keyword then ... just for "alias test=/usr/bin/test" style scenarios??
3. True about $#. Not sure why I didn't think of that, d'oh.
4. Thanks re Shellcheck. I do have it in mind, and was planning to use it prior to releasing a v1; but process_optargs came out of a larger project which is still in v0 (https://git.sr.ht/~tpapastylianou/bashtickets), and I only forked it into its own thing because someone asked me about it on HN, so I lost track. Thanks for the reminder. Having said that, I can't see why the error you describe would occur in this instance. Would you mind explaining? Why would the presence of a single-letter filename have anything to do with the piped output to tr?
5. Good point about printf (or about habitually passing "--" as an argument to things, I guess).
> I can't see why the error you describe would occur in this instance. Would you mind explaining?
Sure, it's due to pathname expansion. The most commonly used pattern is `*` (e.g. `echo *.txt` might expand to `echo foo.txt bar.txt`), but there is also `[...]` for matching a character range. If the current directory contains files named "i" and "j", then `tr -s [:alpha:] Y` will expand to `tr -s i j Y`, which is not what you want. (The reason it works when the current directory doesn't contain single-letter filenames is that a pattern which doesn't match any filenames is left as-is.)
Ah right, gotcha. Actually I think `i` and `j` would be fine, since the whole point is that [:alpha:] will expand to any of the characters contained in that range before it has a chance to be interpreted as the intended character class (and therefore, i and j are safe from accidental pathname expansion, since they don't appear in that range).
But yes, this indeed causes problems if you have a file called `:`, `a`, `l`, `p`, or `h` on the system.
True, but I consider this very hacky, error-prone, and unnecessary when a clear, bespoke test flag exists.
Also, if you prefer the "[ ... ]" syntax for testing, then you can use `-v` directly (since `[` is equivalent to the builtin test anyway).
Unfortunately, it seems that he has dropped off the grid. Didn't realize this until you mentioned it actually. It's really too bad. Not a lot is known about why, but more information can be found here: https://www.reddit.com/r/kisslinux/comments/lsbz8n/an_update...
I don't know about anyone else, but since ChatGPT, my shell game is probably the one that has levelled up the most.
A lot of things I would either have done manually (because I had forgotten shell programming) or would have done in a separate pyenv with 2-3 packages, I can now get done in about 4 minutes.
Beware, I would not trust ChatGPT to produce safe scripts. At minimum, consider using ShellCheck on the result to detect common mistakes such as not quoting variables. Writing safe scripts is very hard.
But the lines and code snippets presumably won’t match the style of the rest of the program (or each other), so I guess you’ll get programs that do things wildly differently from line to line.
Sounds like you've never actually used it for writing code, and are basing this on how you think it works. As someone who uses it for hours a day every day for writing code ... no, it does not have that problem.
I'll go a step further and say "just use zsh", which is clearly superior for scripting and installed so easily it hardly matters.
(aside: I don't think bash the "the default on most *nix systems"; it's not on BSD or macOS (which does ship with a very old bash), and while it's certainly common on Linux even there it's not universal)
It shouldn't. "Bible" etymologically derives from "book", and its meaning denotes an authoritative book, or "the" book on a topic.
While it is true that the "Christian Bible" has customarily been shortened to just "Bible", in principle it is appropriate to call any authoritative book that claims to be "the" book on a topic as a "Bible", without necessarily carrying any religious connotations.
That's not to say that people might not make that link mentally, but technically and etymologically speaking the term is not religious. In the same way that some people may no longer identify as gay=happy, but if you start getting offended about people having a "gay old time", then that would say more about your sensitivities than about how people choose to use the word in different contexts.
The common phrasing christians use is actually "The Holy Bible", with "Holy" being a common word in the bible meaning "set apart for God". So "The Holy Bible" is that particular "Bible" (book) which has been set apart for God.
In fact, calling it a Holy Bible implies there are Bibles which are not Holy. As one would expect, given a bible is just a book, in Greek.
It may be in poor taste, but from a religious person’s point of view, quite a bit of the current culture in most western nations is in quite poor taste. I understand and sympathize with your objection, but I think that those called to faith (including myself) need to make peace with secularization and do as we are taught: take the road of peace at all times, accept without condoning, approach all things with open hands, provide a better example, help when help is needed, and be willing with the courage of our convictions to testify for faith when asked (and only when asked).
Struggling to see it. Think this might just be a you thing? Maybe talk to a religious leader in your community, they might help you with a different perspective that'll speak to you.
From the Cambridge Dictionary definition of the English word "bible":
> a book, magazine, etc. that gives important advice and information about a particular subject: Vogue magazine quickly became the bible of fashionable women.
But in Hebrew, "The Torah" doesn't mean a certain book anyway. There are "The five one-part-of-five's of Torah". The Torah is like the teachings, or the postulates, or the theory. So there's Jehova's Torah, or Torat Hashem; but there's also your mother's torah: "Remember, my son, your father's Mussar (= mores), and do not abandon your mother's Torah (= teachings)".