Bash Script Sunday #2: Writing a Safe and Robust Bash Script

In today’s post, I’d like to give some insight into writing safer Bash scripts, starting with Bash’s built-in safety options and adding a few defensive habits that complement them.

Why Script Safety Matters

A poorly written script can cause unintended data loss, infinite loops, or security vulnerabilities. Writing robust scripts ensures they behave predictably and handle errors gracefully.

1. Enabling Safe Bash Options

Bash provides built-in options to catch errors early and prevent common pitfalls.

Add this at the start of your script:

set -euo pipefail

Breakdown of these options:

  • -e : Exit immediately if a command exits with a non-zero status.
  • -u : Treat expansions of unset variables as errors.
  • -o pipefail : Make a pipeline return the exit status of the last (rightmost) command that fails, or zero if every command succeeds.

Example: Handling unset variables safely

#!/usr/bin/env bash
set -euo pipefail

echo "Processing file: $filename"

Running this script without defining $filename aborts with a 'filename: unbound variable' error instead of silently continuing with an empty value.
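
Similarly, -o pipefail surfaces failures that a plain pipeline would hide. A minimal sketch (the input path is hypothetical):

#!/usr/bin/env bash
set -euo pipefail

# Without pipefail, the pipeline's exit status would be that of 'wc -l',
# which succeeds, so the failure of 'cat' would go unnoticed.
# With pipefail, the pipeline fails and set -e stops the script here.
cat /no/such/file | wc -l
echo "This line is never reached."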

2. Input Validation

Never assume user input is correct. Read it with read and validate it before you use it.

read -p "Enter a directory path: " dir
if [[ ! -d "$dir" ]]; then
    echo "Error: '$dir' is not a valid directory."
    exit 1
fi
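
The same pattern extends to other kinds of input. A minimal sketch validating a hypothetical port number with a regular expression:

read -rp "Enter a port number: " port
# 10# forces base-10 arithmetic so inputs with leading zeros are not read as octal.
if [[ ! "$port" =~ ^[0-9]+$ ]] || (( 10#$port < 1 || 10#$port > 65535 )); then
    echo "Error: '$port' is not a valid port number." >&2
    exit 1
fi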

3. Preventing Dangerous Expansions

When handling filenames, always quote variable expansions to prevent word splitting on spaces and unwanted interpretation of special characters.

Bad example:

rm -rf $dir/*

If $dir contains spaces, the unquoted expansion splits into several arguments and rm can delete paths you never intended. Worse, if $dir is unset or empty, the command expands to rm -rf /*, which is disastrous! Quoting alone does not fix the empty case, so guard against it explicitly.

Safe version:

rm -rf "${dir:?}"/*

The quotes prevent word splitting on spaces and special characters, and ${dir:?} makes Bash abort with an error if dir is unset or empty, so the glob can never fall back to /*.

4. Using Temporary Files Safely

Use mktemp to create temporary files instead of relying on predictable filenames; it guarantees a unique name, so concurrent runs cannot collide.

tmpfile=$(mktemp)
echo "Working in $tmpfile"
rm -f "$tmpfile"

5. Handling Errors Gracefully

Use trap to clean up resources when the script exits, whether it finishes normally or fails partway through.

cleanup() {
    echo "Cleaning up..."
    rm -f "$tmpfile"
}
# Register cleanup to run whenever the script exits.
trap cleanup EXIT
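
Putting it all together, here is a minimal skeleton that combines the options, mktemp, and trap from the sections above; the work step is a placeholder:

#!/usr/bin/env bash
set -euo pipefail

tmpfile=$(mktemp)

cleanup() {
    echo "Cleaning up..."
    rm -f "$tmpfile"
}
# The EXIT trap fires on normal exits and on early exits caused by set -e,
# so the temporary file is removed either way.
trap cleanup EXIT

echo "Working in $tmpfile"
# ... do the actual work here ...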

Conclusion

By following these principles, you can make your scripts safer, more reliable, and easier to maintain. Always test your scripts in a safe environment before using them in production.