DO NOT DO:  echo ... | read -r ...    or    ... | while read; do ...
    Bugs in the Bash shell. See the BUGS details below, and the workaround ("SOLUTION") sketch that follows them.

Exit code / return values can only be integers 0 <= x <= 255. Past 255 they wrap back around to 0.

Can "unshift"/shift the positional parameters inside a 'for x in "$@"; do' loop ("$@" is expanded once, up front, so the iteration is unaffected).

DO NOT DO:  expression && { stuff; } || { other stuff; }
    This is NOT like if/else, because other-stuff depends entirely on the LAST command in the 'stuff' block. Use if/else/fi if that is what you want to do!

<&- closes stdin; similar but not equivalent to >&- (which closes stdout).

"typeset" == "declare" in Bash.

Great associative array docs: http://www.artificialworlds.net/blog/2012/10/17/bash-associative-array-examples/

Use /bin/bash in the interpreter line, because Linux has no /usr/bin/bash (no difference on Solaris, since one dir is just a link to the other).
    Better is: #!/usr/bin/env bash, because it uses the search path.
    Old versions of env support just 2 params (probably 1st token + everything-after-the-first-token). Modern versions of env support the -S argument, which parses everything after it exactly as you would want. E.g.:
        #!/usr/bin/env -S node -e 'blah blah; blah'

No "print" command. No "whence" command. The external "which" command cannot, of course, determine whether a command is built in or not. But see "type".

$(subshell) just like ksh.

Token parsing: quoting inside $(cmd) is there to satisfy the command and to generate the required CHARACTER output. Tokenizing inside makes no difference, since the shell re-tokenizes the $(cmd) output; apparently you cannot pass unusual tokenization through. Double-quote the entire "$(...)" to get a single token instead of a tokenized result. (Apparently not needed with an X=$(...) assignment; see a few lines down.)
GOTCHA! A trailing \n, if present, is stripped out with both =$(...) and ="$(...)".

VARIABLE ASSIGNMENT special cases:
    Trailing newline is trimmed:  V9=$(echo -e "one two\n")  ===  V9=$(echo -e "one two")
    No reason to ever X='$(...)' or Y="$(...)":  X=$(...) already acts as if the $(...) were double-quoted.
    To assign an array var, do Y=($(...)).
    May make sense to do X="other$(...)other".
    By default $(...) captures just stdout. Stderr flows through, so do something with it.

Useful: scalar string length is ${#name}.

echo
    Nothing analogous to ksh's "-u2"; must suffix the command with 1>&2.
    '-n' works as expected.
    Escapes like \c and \n only work in xpg mode (or use "echo -e"!). xpg mode is set by -e (on) / -E (off) / "shopt [-su] xpg_echo".
    There is just no portable way to print a non-line-terminated string with echo. Use printf instead.

Arrays
    Like ksh arrays, except:
    Optionally declare like "declare -a name" instead of "typeset -A name", or "declare -a name=(el1...)" instead of "typeset -A name el1...".
    Regular arrays are called "indexed arrays" (-a). 0-based indexing.
    Hash/maps are called "associative arrays" (-A). String keys may not contain metacharacters, even if you quote them!
    [? I think that with older bashes, "declare -a x" creates one zero-length element in the array]
    Multi-assignment like "name=(val1 val2...)" for indexed arrays instead of "typeset -A name val1 val2...", and for associatives like "aa=([hello]=world [ab]=cd)".
    Append to an indexed array like  name=("${name[@]}" "four" "five").  N.b. this method is VERY SLOW.
    MUCH faster to read from a file like:
        ORIG_IFS="$IFS"
        IFS=$'\n'
        #GLOBIGNORE='*'    # Not necessary for me???
        ABC=($(< /tmp/numbers))
        IFS="$ORIG_IFS"
    OR append like  name[${#name[@]}]=six  INSTEAD OF 'typeset -A name "${name[@]}" "four" "five"'.
    printf -v works great with indexed vars:  printf -v ARR[7] '(%s)' 'helo w'
    Last-val idiom works just like ksh:  ${name[${#name[@]}-1]}
    N.b. that ${#name[@]} is VERY DIFFERENT from ${#name}.
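A quick sketch of the $(...) capture and re-tokenization behavior noted above (variable names are arbitrary):

    # Trailing newlines are stripped by $(...), quoted or not:
    V9=$(echo -e "one two\n")
    [ "$V9" = "one two" ] && echo "trailing newlines stripped"

    # In list contexts the captured output IS re-tokenized unless you double-quote it:
    printf '[%s]\n' $(echo "a  b")      # two words: [a] then [b]
    printf '[%s]\n' "$(echo "a  b")"    # one word:  [a  b]

    # Plain assignment already behaves as if the $(...) were double-quoted:
    X=$(echo "a  b")                    # X is exactly 'a  b'
    # Wrap in (...) when you want the words split into an array instead:
    Y=($(echo one two three))           # ${#Y[@]} is 3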
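And a small array cheat sheet pulling the notes above together. A sketch: it assumes bash >= 4 for the associative part, /tmp/numbers is the hypothetical input file from above, and the += append is an addition not covered in the notes:

    declare -a arr=(zero one two)                              # indexed, 0-based
    echo "${#arr[@]} elements; last is ${arr[${#arr[@]}-1]}"   # 3 elements; last is two

    arr=("${arr[@]}" three)        # works but is the SLOW copy-append
    arr[${#arr[@]}]=four           # cheap append
    arr+=(five)                    # also cheap (bash >= 3.1); not in the notes above

    printf -v 'arr[7]' '(%s)' 'helo w'
    echo "${arr[7]}"               # (helo w)

    ORIG_IFS=$IFS; IFS=$'\n'
    nums=($(< /tmp/numbers))       # one line per element
    IFS=$ORIG_IFS

    declare -A aa=([hello]=world [ab]=cd)                      # associative, bash >= 4
    echo "${aa[hello]}; ${#aa[@]} keys: ${!aa[@]}"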
Array length: ${#name[@]}. Length of the 1st element: ${#name}.

"$@" is severely broken in some recent bashes, but it's fixed w/ current.

"> filename" does not update the file timestamp, at least if the file was empty to begin with (i.e., it is unchanged).

INVOCATION
    Login shell: arg0 of -*, or the --login switch. Interactive shells always have i in $-.
    Interactive login OR --login:
        /etc/profile
        ~/.bash_profile OR ~/.bash_login OR ~/.profile
        ~/.bash_logout upon exit
    Interactive NON-login, OR if /etc/profile sources it, OR via rshd:
        ~/.bashrc
    Non-interactive:
        $BASH_ENV, plus ~/.bashrc IF invoked via a remote shell daemon.

IMPORTANT! IMPORTANT! IMPORTANT! IMPORTANT! IMPORTANT! IMPORTANT!
Excellent Bourne idiom to set variables based on command output:
    command | read -r VAR              # gets the entire stdin line
    command | read -r -a newArrayVar
...but broken in Bash!!!

BUGS (Applies to Bash's "sh" too. May exist in ksh too. Does not exist in zsh.)
    #1 "read" silently fails with "piped" stdin.
        read < /tmp/1line               works fine, whereas
        cat /tmp/1line | read           does not set $REPLY.
        Reason is: everything after | is run in a subshell, so no variable changes made in the subshell are accessible AFTER the subshell.
    #2 "exit" DOES NOT WORK inside of piped read blocks.
        while read; do exit 3; done < /tmp/atf          works fine, whereas
        cat /tmp/atf | while read; do exit 3; done      will not exit the shell.
    #3 There is a related, complicated problem where, inside code blocks reading from a command pipe or from a file input redirect, shell commands (at least) will gobble up all remaining input. The way to handle this is to close their stdin with <&-, but that often fails for unknown reasons, I think to do with tty setup. Just redirect their stdin from /dev/null instead for maximum reliability.

Alias values are expanded extremely close to as if you had typed them in. They are very different from functions in this respect.
    GOTCHA for multi-command aliases: the commands are not grouped in any way, so prefixes apply only to the first command and redirections apply only to the last command. VAR=x aliasname will only export VAR into the first inner command.

tee gotcha: tee is a separate program, so you lose the exit status of the "piping" program if you use tee:
    piping_program 2>&1 | tee -a file.log || echo failed    # BAD TEST: $? is tee's status, not piping_program's

Test for an integral variable (incl. a command-line param):
    case "$VAR" in *[!0-9]*) echo NON-INTEGER;; esac

To handle both true and false cases, use  cmd && true-action || false-action.  If you put the || first, the true action will always trigger.

&&, || chains: don't, because operators other than the first depend only on the evaluation of the final command within the previous block. I.e. with
    x && { y; z; } || a
execution of a depends both on x and on z. If you want a to depend on just x, then you should use if/else.

Mixing built-ins and prefixed variable assignments:
    "time" is a built-in and must precede the assignments:          time V=vee /tmp/echov.bash
    "nohup" is not a built-in, so the assignments must precede it:  V=vee nohup /tmp...
    N.b. nohup automatically sends stderr to stdout, so no need for 2>&1.
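A minimal demonstration of the "&&/|| is not if/else" warning above (the directory name is arbitrary):

    # Prints "created" AND "CLEANUP-FAILED": the || fires because the last
    # command of the braced block (false) failed, even though mkdir succeeded.
    mkdir -p /tmp/andor.$$ && { echo created; false; } || echo CLEANUP-FAILED

    # What was probably meant:
    if mkdir -p /tmp/andor.$$; then
        echo created
        false                           # a failure here no longer routes us into the else branch
    else
        echo CLEANUP-FAILED
    fi
    rmdir /tmp/andor.$$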
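For the tee gotcha above, bash itself offers PIPESTATUS and 'set -o pipefail'. A sketch, reusing the placeholder names piping_program and file.log from the note:

    piping_program 2>&1 | tee -a file.log
    rc=${PIPESTATUS[0]}                 # status of piping_program, not of tee
    [ "$rc" -eq 0 ] || echo "piping_program failed: $rc" 1>&2

    # Or make $? reflect the rightmost command in the pipeline that failed:
    set -o pipefail
    piping_program 2>&1 | tee -a file.log || echo "pipeline failed: $?" 1>&2
    set +o pipefail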
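The BUGS above are all the pipe/subshell problem. The workarounds sketched here (process substitution, command substitution / here-strings, and shopt -s lastpipe in bash >= 4.2 scripts) are generic ones, not necessarily the original "SOLUTION" referred to at the top:

    # Feed the loop by process substitution instead of a pipe; the loop (and
    # its variable assignments, and 'exit') now run in the current shell:
    while read -r line; do
        LAST=$line
    done < <(printf '%s\n' one two three)
    echo "LAST=$LAST"                   # LAST=three

    # Single line: command substitution or a here-string instead of | read
    VAR=$(printf '%s' "only line")
    read -r VAR2 <<< "another line"

    # bash >= 4.2, scripts only (job control off): run the last pipeline
    # element in the current shell
    shopt -s lastpipe
    printf '%s\n' a b c | while read -r x; do COUNT=$((COUNT+1)); done
    echo "COUNT=$COUNT"                 # COUNT=3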
Getting the parent dir of the executed script:
    http://stackoverflow.com/questions/59895/can-a-bash-script-tell-what-directory-its-stored-in
    My version:
        case "$0" in
            /*)  SCRIPTDIR="${0%/*}";;
            */*) SCRIPTDIR="$PWD/${0%/*}";;
            *)   SCRIPTDIR="$PWD";;
        esac
        case "$SCRIPTDIR" in *?/.) SCRIPTDIR="${SCRIPTDIR%/.}";; esac
        while [ -h "$SCRIPTDIR" ]; do SCRIPTDIR="$(readlink "$SCRIPTDIR")"; done

Wildcards in the case candidate and in the case patterns are independent of the file system:
    case ab* in ab*) echo Y;; esac   ===   case 'ab*' in 'ab*') echo Y;; esac
    For  case abb in ab*) echo Y;; esac  and  case ab* in abb) echo Y;; esac  it makes no difference whether files matching ab* exist in the current directory.

Can use curlies to make a redirect block, all in the same shell process. The opening curly must be followed by whitespace, and the last command before the closing curly must be terminated by ; or a newline.

[!...] is much more portable than [^...].

Great way to format env var contents:  printf -v VAR 'format str' args...    # == sprintf

Crazy comparisons for strings containing \\. Don't mix quote types, even though both quoted strings may appear to resolve to the same thing: single quotes keep \\ as two characters, double quotes collapse \\ to \.
    TRUE:   [ 'e:\a' = "e:\a" ]
    TRUE:   [ 'e:\a' = 'e:\a' ]      # etc.
    TRUE:   [ 'e:\\a' = 'e:\\a' ]    # etc.
    FALSE!: [ 'e:\\a' = "e:\\a" ]

Inheritance of functions and aliases:
    Aliases are NEVER imported. To use aliases from a script you must load them AND:  shopt -s expand_aliases
    By default, functions are not exported. Just:  export -f functionToExport
    Shell option '-a' automatically exports ALL env vars AND FUNCTIONS.

Cannot use a subscript with $* or $@. If you must, just copy them to a new named array.

Validating numbers: use negated character class(es), with either [^ or [!. For example:
    case "$NUM" in ''|*[^0-9]*) echo bad;; esac
Token parsing for case patterns is abnormal. With regular token parsing ''|x would be the same as |x, but that is not so in case patterns: you must use the ''| form to match the empty string.

The 'command' built-in is useful in shell functions to avoid self-recursion.

Character escape substitution is done by the "echo" implementation, not by the command-line parser. Therefore, for example, to pipe in a multi-line string you can't do  cat <<< 'one\ntwo';  you must instead do:
    cat <<< 'one
two'
    OR:  cat <<< $VAR_WITH_BINARY_NEWLINES
Major limitation of <<<: it always adds a \n to the end of the specified text, and there is no way to prevent that.

Quote nesting:
    Double-quote nesting works intuitively; an internal " may be escaped, as in "A string containing a \" character".
    But a single-quoted string MAY NOT CONTAIN a single quote (not even escaped).

Idiom to test for the existence of filesystem nodes:
    EXPAN=$(echo appl-*); [ "$EXPAN" = "appl-*" ] && echo T    # T means NO appl-* node exists

Self-logging. According to https://serverfault.com/questions/103501/how-can-i-fully-log-all-bash-scripts-actions :
    #!/bin/bash
    exec 3>&1 4>&2
    trap 'exec 2>&4 1>&3' 0 1 2 3
    exec 1>log.out 2>&1
    # Everything below will go to the file 'log.out':

LOGNAME vs. USER: USER leans marginally more toward the effective user, as opposed to who you logged in as. But the real difference is that LOGNAME is SysV-provided and USER is BSD-provided; other variants provide either one or both.

[...] vs. [[...]]: Don't know why [...] isn't covered by the Bash man page, but it is just the Bourne syntax for "test". [[...]] gives the extended operations for Korn, Bash, etc.

= vs. ==: Equivalent in Bash. = is Bourne.

env var string manipulation: ${VARNAME}
    Most useful conditional variable dereferencing: ${X:+y}. If X is set, then 'y'; if X is not set, it doesn't even produce a parameter.
    Op examples:
        #*/              parsimonious strip from beginning
        ##*/             greedy strip from beginning
        %...             same as above, but from the end
        /INPUT/OUTPUT    single pattern subst
        //INPUT/OUTPUT   global pattern subst
        /#INPUT/OUTPUT   single pattern subst at beginning
        /%INPUT/OUTPUT   single pattern subst at end
    Patterns are ONLY filename globs! Skip /OUTPUT to just delete.
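A few of the ops from the table above, applied to throwaway values:

    P=/usr/local/bin/bash
    echo "${P#*/}"          # usr/local/bin/bash   (parsimonious strip from beginning)
    echo "${P##*/}"         # bash                 (greedy strip from beginning)
    echo "${P%/*}"          # /usr/local/bin       (parsimonious strip from end)
    echo "${P%%/b*}"        # /usr/local           (greedy strip from end)

    S=foo.bar.foo
    echo "${S/foo/X}"       # X.bar.foo            (single subst)
    echo "${S//foo/X}"      # X.bar.X              (global subst)
    echo "${S/#foo/X}"      # X.bar.foo            (subst only at beginning)
    echo "${S/%foo/X}"      # foo.bar.X            (subst only at end)
    echo "${S/.bar/}"       # foo.foo              (no OUTPUT just deletes)

    unset X; echo "A${X:+y}B"   # AB  (X unset, no parameter produced)
    X=1;     echo "A${X:+y}B"   # AyB (X set, 'y' substituted)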
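And the number-validation pattern wrapped in a throwaway helper (is_uint is a hypothetical name):

    is_uint() {
        case "$1" in
            ''|*[!0-9]*) return 1;;     # empty, or contains a non-digit
            *)           return 0;;
        esac
    }
    for v in 42 007 '' 3.14 -1 12a; do
        if is_uint "$v"; then echo "'$v' is a non-negative integer"; else echo "'$v' is NOT"; fi
    done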
No good way to test for a SINGLE match of an FS glob. Best I can figure is to test whether the glob expands:
    _ar=(lmas*); [ "${_ar[0]}" != 'lmas*' ] && [ ${#_ar[@]} -eq 1 ] && [ -e "${_ar[0]}" ]
    As a function: see ~/code-templates/globTest.bash

Pattern matching of *ANY* (incl. multiple) FS node names:
    compgen -G 'wildcard*' >/dev/null
    Use '' or "" depending on whether you want $ expansion first. -G checks for the existence of matching files.
    OR:  _ar=(xmas.*); [ -e "${_ar[0]}" ] && ... [success]     # can use any other "test" operator here, e.g. -f for only a regular file
    OR:  ls -d xmas.* > /dev/null 2>&1 && ... [success]
    TODO: Which test is more portable, -a or -e?

getopt/getopts: see ~/code-templates/getopt.bash
    To use getopts inside a function, you MUST declare OPTIND (and maybe the getopts placeholder variable) as local vars.

The 'command' built-in is for executing a command without function or alias execution.

Outputting binary characters with echo -e:
    \1, \12, \123              octal
    \xa, \xab                  hex
    \ua, \uab, \uabc, \uabcd   unicode hex
    \cx                        control-x character

Coprocesses are awful: fricking complicated, with significant limitations. Use mkfifo + mktemp instead:
    FIFO=$(mktemp -t -u -p . fifo.XXXX)
    mkfifo "$FIFO"
    { trap 'rm "$FIFO"' EXIT; tee "$USER.log" < "$FIFO" | unbuffer -p tr e @ > transformed.txt; } &
    nohup sleeper.bash > "$FIFO" &

INTEGER MATH
    ((...)) is cool, but not for reals/decimals!
    Just use variables without the $ prefix, in a SINGLE expression.
    The exit status only tells you whether the expression evaluated to non-zero (0 if non-zero, 1 if zero), but assignments done inside the expression stick!
    These operators work: (nested groupings), x++, x--, ++x, --x, x += n, x -= n. N.b. 'x =+ n' parses as a plain assignment of +n, not an increment.

It is 'elif' (not 'elsif').

GOTCHA: an export prefix eliminates the useful return status when the assignment's RHS subshell fails:
    export X; X=$(ls /no/such/file) || echo Failure 1>&2    # works
    export X=$(ls /no/such/file) || echo Failure 1>&2       # does not work

sort -u with counts:  sort filename.txt | uniq -c
To order according to frequency/counts:  sort filename.txt | uniq -c | sort -n
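The ~/code-templates/globTest.bash referenced above is not reproduced here; a sketch of such a single-match test, using the same array-expansion trick (glob_matches_one is a hypothetical name; assumes nullglob is off):

    glob_matches_one() {
        local -a _ar=( $1 )     # $1 deliberately unquoted so the pattern expands
        # It expanded to something, to exactly one word, and that word exists:
        [ "${_ar[0]}" != "$1" ] && [ "${#_ar[@]}" -eq 1 ] && [ -e "${_ar[0]}" ]
    }

    if glob_matches_one 'appl-*'; then echo "exactly one appl-* node"; fi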
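A short ((...)) demo for the INTEGER MATH notes above:

    x=5
    (( x++, x += 10 ))          # assignments inside (( )) stick; x is now 16
    echo "x=$x"

    # Exit status is 0 when the expression is non-zero, 1 when it is zero,
    # so (( )) can drive if/while directly:
    if (( x > 10 )); then echo big; fi
    i=3
    while (( i-- )); do echo "tick $i"; done    # tick 2, tick 1, tick 0

    echo $(( 7 / 2 ))           # 3: integers only, no decimals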