Shell Course 5

Chapter 5: Automation with Shell Scripts

## The Power of Shell Scripts

We’ve explored many commands throughout this course, each powerful in its own right. However, the true magic happens when we combine these commands into shell scripts – reusable programs that can automate tasks, simplify complex operations, and save hours of repetitive work.

Shell scripting is the art of creating these miniature programs. It’s like having the ability to craft your own specialized tools, perfectly adapted to your specific needs. Whether you’re managing systems, processing data, or simply trying to make your digital life more efficient, shell scripts are invaluable allies.

Think of shell scripting as a form of digital alchemy – transforming the base elements of individual commands into gold by combining them in clever and powerful ways.

Why Learn Shell Scripting?

Before diving into the details, let’s understand why shell scripting is worth your time:

  1. Automation: Perform repetitive tasks without manual intervention
  2. Consistency: Ensure operations are performed the same way every time
  3. Documentation: Scripts serve as executable documentation of procedures
  4. Efficiency: Accomplish complex tasks with minimal effort
  5. Customization: Create tools tailored to your specific needs

The ability to script effectively is what separates casual shell users from power users. It’s the difference between knowing how to use tools and being able to create them.


## Your First Shell Script

Let’s start with a simple script that greets the user and displays some system information.

Creating a Script File

First, create a file with a .sh extension using your favorite text editor:

```bash
$ nano hello.sh
```

Now, add the following content:

```bash
#!/bin/bash

# This is a comment - it's ignored by the shell
# My first shell script

echo "Hello, $(whoami)!"
echo "Today is $(date)"
echo "Your working directory is $(pwd)"

# Display system information
echo -e "\nSystem Information:"
echo "---------------------"
echo "Hostname: $(hostname)"
echo "Uptime: $(uptime)"
echo "Kernel: $(uname -r)"
```

Save the file and exit the editor (in nano: Ctrl+O, Enter, Ctrl+X).

The Shebang Line

The first line #!/bin/bash is called the “shebang” or “hashbang” line. It tells the system which interpreter should be used to execute the script. In this case, we’re specifying the Bash shell. Other common shebangs include:

```bash
#!/bin/sh             # For the basic Bourne shell
#!/usr/bin/env python # For Python scripts
#!/usr/bin/perl       # For Perl scripts
```

The shebang line is crucial for scripts that will be executed directly (without explicitly calling the interpreter). Without it, the system won’t know how to interpret your script.

Making the Script Executable

Before you can run your script, you need to make it executable:

```bash
$ chmod +x hello.sh
```

Running Your Script

Now you can run your script in several ways:

```bash
$ ./hello.sh      # Execute directly (requires shebang)
$ bash hello.sh   # Explicitly using bash to execute
$ source hello.sh # Run in the current shell context
```

The difference between ./script and source script is subtle but important: the former runs in a new shell process, while the latter executes in your current shell. This means that variables defined in a sourced script remain available after the script completes, while variables in a directly executed script disappear when the script terminates.
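You can see this for yourself with a tiny throwaway script (vars.sh here is just a scratch file created for the demonstration):

```bash
$ echo 'greeting="hello from the script"' > vars.sh   # create a one-line test script

$ bash vars.sh
$ echo "$greeting"    # prints an empty line - the variable lived only in the child shell

$ source vars.sh
$ echo "$greeting"    # prints "hello from the script" - it ran in the current shell
```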


## Variables and Input

Scripts become truly powerful when they can work with different inputs and store information.

Defining Variables

Variables in shell scripts don’t need to be declared before use – simply assign a value:

```bash
name="World"
echo "Hello, $name!"
```

Important: There must be no spaces around the equals sign when assigning variables.

```bash
name = "World"  # This is WRONG
name="World"    # This is correct
```

Using Variables

To use a variable’s value, prefix it with a dollar sign:

```bash
echo "Hello, $name!"
# or with braces for clarity, especially when next to other text
echo "Hello, ${name}s!"  # Prints "Hello, Worlds!"
```

Command Substitution

You can capture the output of commands using command substitution:

```bash
current_date=$(date)
echo "The current date is: $current_date"

# Older style (still works but less preferred):
uptime=`uptime`
echo "System uptime: $uptime"
```

Reading User Input

The read command allows scripts to accept user input:

```bash
echo "What's your name?"
read username
echo "Hello, $username!"

# Read with a prompt in one line:
read -p "Enter your age: " user_age
echo "You are $user_age years old."
```

Environment Variables

Environment variables store information about your shell session. Common ones include:

```bash
echo "Your home directory is: $HOME"
echo "Your shell is: $SHELL"
echo "Your user name is: $USER"
echo "Your path is: $PATH"
```

Special Variables

Shell scripts have several special variables:

```bash
echo "Script name: $0"
echo "First argument: $1"
echo "Second argument: $2"
echo "All arguments: $@"
echo "Number of arguments: $#"
echo "Process ID: $$"
echo "Exit status of last command: $?"
```

These special variables are particularly useful for making scripts flexible. Instead of hardcoding values, use arguments to pass different inputs each time you run the script.


## Script Arguments

Arguments make your scripts flexible and reusable. Let’s create a script that uses arguments:

```bash
#!/bin/bash

# A script to show file information

if [ $# -eq 0 ]; then
    echo "Usage: $0 filename [filename2] ..."
    exit 1
fi

for file in "$@"; do
    if [ -f "$file" ]; then
        echo -e "\nFile: $file"
        echo "Size: $(du -h "$file" | cut -f1)"
        echo "Type: $(file -b "$file")"
        echo "Modified: $(stat -c %y "$file")"
    elif [ -d "$file" ]; then
        echo -e "\nDirectory: $file"
        echo "Contains: $(ls -1 "$file" | wc -l) items"
    else
        echo -e "\nError: $file does not exist"
    fi
done
```

Save this as fileinfo.sh, make it executable, and try it:

```bash
$ ./fileinfo.sh hello.sh /etc /non-existent-file
```

Always check if required arguments are provided, and show usage information when they’re not. The user will thank you later – and that user might be your future self!


## Control Structures

Control structures allow your scripts to make decisions and repeat operations. Let’s explore the most important ones.

Conditional Statements (if/else)

The basic syntax for conditional execution:

```bash
if [ condition ]; then
    # Commands to run if condition is true
elif [ another_condition ]; then
    # Commands to run if another_condition is true
else
    # Commands to run if all conditions are false
fi
```

Example:

```bash
#!/bin/bash

read -p "Enter a number: " num

if [ $num -lt 0 ]; then
    echo "The number is negative"
elif [ $num -eq 0 ]; then
    echo "The number is zero"
else
    echo "The number is positive"
fi
```

Test Conditions

The [ in conditionals is actually a command (essentially another name for the test command), which is why the spaces around the brackets matter and the closing ] is required as its final argument.
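Because [ is just a command, these two lines behave identically:

```bash
test -f /etc/passwd && echo "/etc/passwd exists"
[ -f /etc/passwd ] && echo "/etc/passwd exists"
```

Common test conditions include: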

String comparisons:

```bash
[ "$str1" = "$str2" ]   # Equal
[ "$str1" != "$str2" ]  # Not equal
[ -z "$str" ]           # Empty string
[ -n "$str" ]           # Non-empty string
```

Numeric comparisons:

```bash
[ $num1 -eq $num2 ]  # Equal
[ $num1 -ne $num2 ]  # Not equal
[ $num1 -lt $num2 ]  # Less than
[ $num1 -le $num2 ]  # Less than or equal
[ $num1 -gt $num2 ]  # Greater than
[ $num1 -ge $num2 ]  # Greater than or equal
```

File tests:

```bash
[ -e "$file" ]  # File exists
[ -f "$file" ]  # Regular file exists
[ -d "$file" ]  # Directory exists
[ -r "$file" ]  # File is readable
[ -w "$file" ]  # File is writable
[ -x "$file" ]  # File is executable
[ -s "$file" ]  # File is not zero size
```

Always quote your variables in tests to avoid problems with empty values or whitespace.
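Here's a quick sketch of the kind of failure quoting prevents, using an empty variable:

```bash
name=""

# Unquoted: after expansion this becomes [ = "something" ], which is a
# syntax error (bash complains about a missing or unexpected operator)
if [ $name = "something" ]; then echo "match"; fi

# Quoted: this becomes [ "" = "something" ], which is simply false - no error
if [ "$name" = "something" ]; then echo "match"; fi
```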

Modern Test Syntax

Modern scripts often use double brackets instead of single brackets for tests:

```bash
if [[ "$str1" == "$str2" ]]; then
    echo "Strings match"
fi
```

Double brackets [[ ]] offer advantages like:

  • No need to quote variables (though it’s still a good practice)
  • Support for pattern matching and additional operators
  • Logical operators use && and || instead of -a and -o
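For example, [[ ]] can match a variable against a glob pattern or a regular expression, which single brackets cannot do (a small sketch):

```bash
file="report.txt"

# Glob-style pattern match (note: the pattern itself is left unquoted)
if [[ "$file" == *.txt ]]; then
    echo "$file looks like a text file"
fi

# Regular-expression match with =~
if [[ "$file" =~ ^report ]]; then
    echo "$file starts with 'report'"
fi
```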

Case Statements

For multiple conditions checking against the same variable, case is often clearer:

```bash
#!/bin/bash

read -p "Enter a fruit name: " fruit

case "$fruit" in
    "apple")
        echo "Apples are red or green"
        ;;
    "banana"|"lemon")
        echo "That fruit is yellow"
        ;;
    "orange")
        echo "Oranges are orange"
        ;;
    *)
        echo "I don't know what color that fruit is"
        ;;
esac
```

Logical Operators

You can combine conditions with logical operators:

```bash
if [ $num -gt 0 ] && [ $num -lt 10 ]; then
    echo "Single digit positive number"
fi

# Or with modern syntax:
if [[ $num -gt 0 && $num -lt 10 ]]; then
    echo "Single digit positive number"
fi

# OR example:
if [ $num -lt 0 ] || [ $num -gt 100 ]; then
    echo "Number is out of range"
fi
```

## Loops

Loops allow scripts to perform repetitive tasks efficiently.

For Loops

The basic for loop syntax:

```bash
for variable in item1 item2 item3; do
    # Commands using $variable
done
```

Examples:

```bash
# Loop through a list
for fruit in apple banana orange; do
    echo "I like $fruit"
done

# Loop through files
for file in *.txt; do
    echo "Processing $file"
    # Do something with the file
done

# Loop through numbers
for i in {1..5}; do
    echo "Number $i"
done

# C-style loop
for ((i=1; i<=5; i++)); do
    echo "Count: $i"
done
```

While Loops

The while loop continues as long as a condition is true:

```bash
counter=1
while [ $counter -le 5 ]; do
    echo "Counter: $counter"
    counter=$((counter + 1))
done
```

A useful example is reading file contents line by line:

```bash
while read line; do
    echo "Line: $line"
done < input.txt
```

Until Loops

The until loop runs until a condition becomes true:

```bash
counter=1
until [ $counter -gt 5 ]; do
    echo "Counter: $counter"
    counter=$((counter + 1))
done
```

Loop Control

Control the flow of loops with break and continue:

```bash
for i in {1..10}; do
    if [ $i -eq 5 ]; then
        echo "Skipping 5"
        continue  # Skip the rest of this iteration
    fi

    if [ $i -eq 8 ]; then
        echo "Breaking at 8"
        break     # Exit the loop entirely
    fi

    echo "Number: $i"
done
```

Loops are the workhorses of shell scripts, handling everything from processing multiple files to repeatedly asking for valid user input. Master them, and you’ll dramatically increase your scripting power.
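As an example of that last point, here is a minimal sketch of a loop that keeps prompting until the user enters a number between 1 and 10:

```bash
#!/bin/bash

while true; do
    read -p "Enter a number between 1 and 10: " answer
    # Accept only digits in the valid range; otherwise ask again
    if [[ "$answer" =~ ^[0-9]+$ ]] && [ "$answer" -ge 1 ] && [ "$answer" -le 10 ]; then
        break
    fi
    echo "'$answer' is not a valid choice, please try again."
done

echo "You chose $answer"
```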


## Functions

Functions allow you to organize your code into reusable blocks, making scripts more modular and maintainable.

Defining Functions

The basic syntax for defining a function:

```bash
function_name() {
    # Commands
}

# Alternate syntax
function function_name {
    # Commands
}
```

Example:

```bash
#!/bin/bash

# Define a greeting function
say_hello() {
    echo "Hello, $1!"
}

# Define a calculator function
calculate() {
    if [ $# -ne 3 ]; then
        echo "Usage: calculate number1 operator number2"
        return 1
    fi

    case "$2" in
        "+") result=$(($1 + $3)) ;;
        "-") result=$(($1 - $3)) ;;
        "*") result=$(($1 * $3)) ;;
        "/") result=$(($1 / $3)) ;;
        *) echo "Invalid operator: $2"; return 1 ;;
    esac

    echo "$1 $2 $3 = $result"
    return 0
}

# Call functions
say_hello "World"
calculate 10 + 5
calculate 20 "*" 3
```

Function Arguments

Functions access their arguments using the same positional parameters as scripts:

```bash
my_function() {
    echo "First argument: $1"
    echo "Second argument: $2"
    echo "All arguments: $@"
    echo "Number of arguments: $#"
}
```

Return Values

Functions can return status codes using the return statement:

```bash
is_even() {
    if [ $(($1 % 2)) -eq 0 ]; then
        return 0  # Success (true)
    else
        return 1  # Failure (false)
    fi
}

if is_even 42; then
    echo "42 is even"
else
    echo "42 is odd"
fi
```

For returning actual values (not just status codes), use command substitution:

```bash
get_square() {
    echo $(($1 * $1))
}

result=$(get_square 5)
echo "5 squared is $result"
```

Note: Variables in functions are global by default. Use local to create function-scoped variables:

```bash
count_words() {
    local text="$1"
    local word_count=$(echo "$text" | wc -w)
    echo "The text contains $word_count words"
}
```

Functions are to scripts what commands are to the shell – building blocks that perform specific tasks. Well-written functions are self-contained, properly documented, and handle errors gracefully.


## Error Handling

Robust scripts handle errors gracefully, providing useful feedback and recovering when possible.

Exit Codes

Every command returns an exit code:

  • 0 indicates success
  • Any non-zero value indicates an error

You can check the exit code of the last command with $?:

```bash
ls /some/directory
if [ $? -eq 0 ]; then
    echo "Directory exists"
else
    echo "Directory does not exist"
fi
```

The set Command

The set command modifies shell behavior. Two useful options for error handling:

```bash
set -e  # Exit immediately if a command fails
set -u  # Treat unset variables as an error
```

Example usage:

```bash
#!/bin/bash
set -e
set -u

# Script will exit if any of these commands fail
cd /some/directory
cp important_file backup/
rm important_file
```

Custom Error Handling

Create your own error handling function:

```bash
#!/bin/bash

error_exit() {
    echo "ERROR: $1" >&2
    exit 1
}

# Use it in your script
if [ ! -f "$input_file" ]; then
    error_exit "Input file $input_file does not exist"
fi
```

Trapping Signals

The trap command lets you catch signals and perform actions before exiting:

```bash
#!/bin/bash

# Cleanup function
cleanup() {
    echo "Cleaning up temporary files..."
    rm -f /tmp/temp_file_$$
    echo "Done."
}

# Trap EXIT signal (runs when script exits for any reason)
trap cleanup EXIT

# Create a temporary file
echo "Creating temporary file..."
touch /tmp/temp_file_$$
echo "Processing..."
sleep 5
echo "Script completed."
# The cleanup function will run automatically here
```

Common signals you might want to trap:

  • EXIT: When the script exits for any reason
  • INT: When the script receives an interrupt signal (Ctrl+C)
  • TERM: When the script receives a termination signal
  • ERR: When a command returns a non-zero exit status
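For instance, trapping INT lets your script respond to Ctrl+C instead of stopping abruptly (a minimal sketch):

```bash
#!/bin/bash

# 130 is the conventional exit code for termination by SIGINT (128 + 2)
trap 'echo "Interrupted by user, exiting."; exit 130' INT

echo "Working... press Ctrl+C to interrupt."
sleep 30
echo "Finished without interruption."
```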

Proper error handling is often what distinguishes professional scripts from casual ones. It shows consideration for the user and anticipates real-world scenarios where things might not go according to plan.


## Practical Scripting Examples

Let’s put our knowledge to work with some practical examples:

Backup Script

A script to back up important directories:

```bash
#!/bin/bash

# Configuration
backup_dirs=("$HOME/Documents" "$HOME/Pictures")
backup_destination="/mnt/backup"
date_format=$(date +%Y%m%d_%H%M%S)
archive_name="backup_$date_format.tar.gz"

# Ensure backup destination exists
if [ ! -d "$backup_destination" ]; then
    mkdir -p "$backup_destination" || {
        echo "ERROR: Could not create backup destination directory" >&2
        exit 1
    }
fi

# Create temporary file list
temp_file=$(mktemp)

# Add files to backup list
for dir in "${backup_dirs[@]}"; do
    if [ -d "$dir" ]; then
        find "$dir" -type f -not -path "*/\.*" >> "$temp_file"
        echo "Added $dir to backup list"
    else
        echo "WARNING: $dir does not exist, skipping" >&2
    fi
done

# Check if we have files to backup
if [ ! -s "$temp_file" ]; then
    echo "ERROR: No files to backup" >&2
    rm "$temp_file"
    exit 1
fi

# Count files to back up
file_count=$(wc -l < "$temp_file")
echo "Backing up $file_count files to $backup_destination/$archive_name"

# Create the backup
tar -czf "$backup_destination/$archive_name" -T "$temp_file" && {
    echo "Backup completed successfully"
    echo "Archive: $backup_destination/$archive_name"
    echo "Size: $(du -h "$backup_destination/$archive_name" | cut -f1)"
} || {
    echo "ERROR: Backup failed" >&2
    rm "$backup_destination/$archive_name" 2>/dev/null
}

# Clean up
rm "$temp_file"
```

Log File Analyzer

A script to analyze log files and report errors:

```bash
#!/bin/bash

# Configuration
log_file="/var/log/syslog"
error_keywords=("error" "fail" "critical")
output_file="$HOME/log_analysis.txt"

# Check if log file exists
if [ ! -f "$log_file" ]; then
    echo "ERROR: Log file $log_file does not exist" >&2
    exit 1
fi

# Write header to output file
{
    echo "Log Analysis Report"
    echo "==================="
    echo "Generated: $(date)"
    echo "Log file: $log_file"
    echo ""
} > "$output_file"

# Extract error messages for each keyword
for keyword in "${error_keywords[@]}"; do
    echo "Searching for keyword: $keyword"

    # Count occurrences
    count=$(grep -i "$keyword" "$log_file" | wc -l)

    # Write to output file
    {
        echo "== $keyword (Found $count occurrences) =="
        echo ""

        if [ $count -gt 0 ]; then
            grep -i "$keyword" "$log_file" | head -10

            if [ $count -gt 10 ]; then
                echo "... and $(($count - 10)) more instances"
            fi
        else
            echo "No occurrences found"
        fi

        echo ""
    } >> "$output_file"
done

# Add summary information
{
    echo "Summary"
    echo "======="
    echo "Total lines in log: $(wc -l < "$log_file")"
    # -E is needed so the | in the combined pattern acts as alternation
    echo "Total errors found: $(grep -iE "$(IFS="|"; echo "${error_keywords[*]}")" "$log_file" | wc -l)"
    echo ""
    echo "Report complete"
} >> "$output_file"

echo "Analysis complete. Results saved to $output_file"
```

System Monitoring Script

A script to monitor system resources:

```bash
#!/bin/bash

# Configuration
check_interval=300  # 5 minutes
log_file="$HOME/system_monitor.log"
cpu_threshold=80
mem_threshold=80
disk_threshold=90

log_message() {
    echo "$(date '+%Y-%m-%d %H:%M:%S') - $1" >> "$log_file"
}

check_cpu() {
    # Get CPU usage (this method may vary by system)
    cpu_usage=$(top -bn1 | grep "Cpu(s)" | sed "s/.*, *\([0-9.]*\)%* id.*/\1/" | awk '{print 100 - $1}')
    cpu_integer=${cpu_usage%.*}

    if [ $cpu_integer -gt $cpu_threshold ]; then
        log_message "WARNING: High CPU usage: ${cpu_usage}%"
        return 1
    fi
    return 0
}

check_memory() {
    # Get memory usage percentage
    mem_usage=$(free | grep Mem | awk '{print $3/$2 * 100.0}')
    mem_integer=${mem_usage%.*}

    if [ $mem_integer -gt $mem_threshold ]; then
        log_message "WARNING: High memory usage: ${mem_usage}%"
        return 1
    fi
    return 0
}

check_disk() {
    # Get disk usage percentage for root partition
    disk_usage=$(df -h / | awk 'NR==2 {print $5}' | tr -d '%')

    if [ $disk_usage -gt $disk_threshold ]; then
        log_message "WARNING: High disk usage: ${disk_usage}%"
        return 1
    fi
    return 0
}

# Main monitoring loop
log_message "System monitoring started"

while true; do
    # Check all resources
    check_cpu
    check_memory
    check_disk

    # Wait for next check
    sleep $check_interval
done
```

These practical examples demonstrate how shell scripts can automate real-world tasks. They combine the concepts we've covered (variables, control structures, functions, and error handling) into cohesive, useful tools.


## Debugging Shell Scripts

Even the best programmers write code with bugs. Knowing how to debug scripts efficiently is a valuable skill.

Adding Verbosity

The simplest debugging technique is to add more output:

```bash
echo "DEBUG: value of variable is $variable"
```

For temporary debugging, you can create a debug function:

```bash
debug() {
    [ "$DEBUG" = "1" ] && echo "DEBUG: $*" >&2
}

# Then in your script
DEBUG=1
debug "Processing file: $file"
```

Using set -x

The -x option to set enables tracing mode, which prints each command before executing it:

```bash
#!/bin/bash
set -x  # Enable tracing
echo "Hello, world!"
name="John"
echo "Hello, $name!"
set +x  # Disable tracing
echo "This command is not traced"
```

You can also enable tracing for just a section of your script:

```bash
echo "Normal command"

set -x  # Enable tracing
# Problematic section
for i in {1..3}; do
    echo "Iteration $i"
done
set +x  # Disable tracing

echo "Back to normal"
```

ShellCheck

ShellCheck is an external tool that performs static analysis of shell scripts and points out common mistakes:

```bash
$ shellcheck myscript.sh
```

It’s available in most package managers and also as an online service at shellcheck.net.

Common bugs to watch for:

  1. Forgetting to quote variables
  2. Using [ $foo = bar ] instead of [ "$foo" = "bar" ]
  3. Using == in [ ] (use = instead, or use [[ instead of [)
  4. Forgetting to check if files exist before using them
  5. Assuming a command succeeded without checking its exit code
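To make the third point concrete, here is a quick sketch of the portable and non-portable forms:

```bash
# Works in bash, but == inside single brackets is a bashism and fails in some shells
if [ "$answer" == "yes" ]; then echo "ok"; fi

# Portable: use = with single brackets, or keep == and switch to [[ ]]
if [ "$answer" = "yes" ]; then echo "ok"; fi
if [[ "$answer" == "yes" ]]; then echo "ok"; fi
```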

Debugging shell scripts is partly art, partly science. Practice makes perfect, and even experienced shell scripters regularly use debug techniques to identify and fix problems in their code.


## Best Practices for Shell Scripting

Adopt these best practices to write clean, maintainable, and robust shell scripts:

Script Organization

  1. Start with a shebang: Always include #!/bin/bash (or appropriate interpreter)
  2. Add a header comment: Describe purpose, usage, and author
  3. Define constants at the top: Put configuration variables at the beginning
  4. Structure logically: Group related functions together

Example header:

```bash
#!/bin/bash
#
# backup.sh - Daily backup script for important files
#
# Usage: backup.sh [destination]
#
# Author: Your Name <your.email@example.com>
# Created: March 4, 2025
#

# Configuration
BACKUP_DIRS=("$HOME/Documents" "$HOME/Pictures")
DEFAULT_DESTINATION="/mnt/backup"
# ...
```

Coding Style

  1. Use consistent indentation: 2 or 4 spaces (no tabs)
  2. Use meaningful variable names: file_count instead of fc
  3. Add comments for complex sections: Explain why, not just what
  4. Use functions for reusable code: Break down complex tasks
  5. Keep functions focused: Each function should do one thing well

Robustness

  1. Quote your variables: Use "$variable" to handle spaces and special characters
  2. Check for required tools: Verify dependencies before using them
  3. Validate input: Don’t trust user input or external data
  4. Handle errors gracefully: Check exit codes and provide useful messages
  5. Use absolute paths: Avoid assuming current directory
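For the second point, a common pattern is a quick dependency check near the top of the script (a sketch; the tool names are just example dependencies):

```bash
# Abort early if a required tool is missing
for tool in tar gzip curl; do    # example dependencies - adjust to your script
    if ! command -v "$tool" >/dev/null 2>&1; then
        echo "ERROR: required tool '$tool' is not installed" >&2
        exit 1
    fi
done
```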

Security

  1. Don’t use eval with user input: It can lead to code injection
  2. Be careful with rm -rf: Always check variables before using them in destructive operations
  3. Set proper file permissions: Make scripts executable for the intended users only
  4. Don’t embed sensitive information: Use environment variables or secure storage
  5. Validate file paths: Prevent directory traversal attacks
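For the rm -rf point, one defensive idiom is bash's ${var:?} expansion, which aborts the script if the variable is unset or empty instead of letting the path collapse to something dangerous (a sketch; build_dir is a hypothetical variable):

```bash
# If $build_dir is unset or empty, bash prints the message and stops here
# rather than running rm -rf on "/" or the current directory.
rm -rf "${build_dir:?build_dir is not set}/"
```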

Portability

  1. Specify the shell explicitly: Use the shebang line
  2. Avoid bashisms in /bin/sh scripts: Not all features work in all shells
  3. Use POSIX features when possible: For maximum compatibility
  4. Test on different platforms: Ensure your script works as expected everywhere it needs to run
  5. Document dependencies: List required tools and their minimum versions

Remember: The goal of scripting is not just to get the job done, but to create tools that are reliable, maintainable, and usable by others (including your future self).


## Conclusion

Shell scripting is a powerful skill that enables you to automate tasks, create custom tools, and solve complex problems efficiently. In this chapter, we’ve covered the fundamental concepts of scripting, from basic syntax to best practices.

As you continue your shell scripting journey, remember that practice is essential. Start with small scripts to solve real problems you encounter, and gradually build more complex solutions as your confidence grows.

Don’t be afraid to read other people’s scripts to learn new techniques, and share your own work with the community. The shell scripting ecosystem is rich with examples and shared knowledge.

In the next and final chapter, we’ll explore advanced shell tools and practical tips that will further enhance your command-line productivity.

Shell scripting is both a science and an art. Like a master craftsperson, you’ll develop your own style and approaches over time, but the foundations we’ve covered here will serve you well throughout your journey.