---
name: bash-executor
description: Execute bash commands and scripts safely with validation, error handling, and security checks. Use for system operations, file management, text processing, and command-line tools.
version: 1.0.0
author: Production Skills Team
plugin: my-first-plugin
tags: bash, shell, linux, unix, scripting, cli
---
# Bash Executor Skill

Execute bash commands and shell scripts safely with comprehensive error handling, security validation, and best practices.

**Plugin:** my-first-plugin | **Version:** 1.0.0
## When to Use This Skill
Use this skill when the user needs to:
- Execute system commands and utilities
- Process text files (grep, sed, awk, cut, sort)
- Manage files and directories
- Run CLI tools (curl, wget, git, docker)
- Create shell scripts for automation
- Parse command output and logs
- Execute batch operations
- Perform system administration tasks
## Core Capabilities

- **Safe Command Execution**: Validates commands before running
- **Error Handling**: Comprehensive exit code checking
- **Security Validation**: Blocks dangerous patterns
- **Dry-Run Mode**: Preview commands before execution
- **Output Capture**: Structured stdout/stderr handling
- **Timeout Protection**: Prevents hanging processes
- **Script Generation**: Creates reusable bash scripts
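The timeout-protection and output-capture capabilities can be sketched as a small wrapper. Note this is a hypothetical illustration, not the bundled `scripts/execute_bash.sh`; the helper name `run_safely` is made up for the example.

```bash
#!/bin/bash
# Hypothetical sketch: run a command under a timeout, capture stdout and
# stderr separately, and report the exit code. Illustrative only.
run_safely() {
    local cmd="$1" limit="${2:-30}"
    local out err rc
    out=$(mktemp) && err=$(mktemp)
    timeout "${limit}s" bash -c "$cmd" >"$out" 2>"$err"
    rc=$?
    echo "exit_code=$rc"
    echo "stdout=$(cat "$out")"
    echo "stderr=$(cat "$err")"
    rm -f "$out" "$err"
    return "$rc"
}

run_safely 'echo hello' 5   # exit_code=0, stdout=hello
```

Capturing to temp files rather than a shared pipe keeps the two streams structured, so the caller can report them independently.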
## Execution Workflow

### Step 1: Understand the Task
Determine if bash is the right tool:
- ✅ System operations, file management
- ✅ Text processing with standard tools
- ✅ CLI tool orchestration
- ❌ Complex logic (use Python/Node.js)
- ❌ API calls (use Node.js/Python)
### Step 2: Validate Commands

Check for dangerous patterns:

```bash
bash scripts/validate_command.sh "<command>"
```
### Step 3: Execute Safely

Use the helper script:

```bash
bash scripts/execute_bash.sh "<command>" [timeout]

# OR for scripts:
bash scripts/execute_bash.sh <script-file> [timeout]
```
### Step 4: Handle Errors

Check exit codes and provide meaningful feedback:

- Exit 0: Success
- Exit 1-123: Command-specific errors
- Exit 124: Timed out (when run via `timeout`)
- Exit 126: Command found but not executable (often a permissions issue)
- Exit 127: Command not found
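The mapping above can be turned into user-facing feedback with a small case statement (the helper name `explain_exit` is illustrative):

```bash
#!/bin/bash
# Translate common exit codes into meaningful messages.
explain_exit() {
    case "$1" in
        0)   echo "Success" ;;
        124) echo "Timed out (killed by the timeout utility)" ;;
        126) echo "Command found but not executable (check permissions)" ;;
        127) echo "Command not found (check PATH)" ;;
        *)   echo "Failed with command-specific exit code $1" ;;
    esac
}

explain_exit 127   # -> Command not found (check PATH)
```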
## Security Guidelines

### CRITICAL: Always Block These Patterns

❌ **NEVER allow:**

```bash
rm -rf /                    # System destruction
dd if=/dev/zero             # Disk wiping
:(){ :|:& };:               # Fork bomb
chmod 777 /                 # Permission destruction
curl | bash                 # Arbitrary code execution
eval $user_input            # Code injection
mkfs.*                      # Filesystem formatting
```

✅ **Safe patterns:**

```bash
ls -la                      # Directory listing
grep "pattern" file.txt     # Text search
find . -name "*.log"        # File finding
tar -czf backup.tar.gz dir/ # Archiving
sed 's/old/new/g' file.txt  # Text replacement
```
### Command Validation Rules

- **No root operations**: Reject commands requiring sudo/root
- **Path restrictions**: Operate only in the current directory or its subdirectories
- **No destructive commands**: Block `rm -rf`, `dd`, `mkfs` without confirmation
- **No arbitrary downloads**: Validate URLs before `wget`/`curl`
- **No process killing**: Restrict `kill` and `killall`
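A validator enforcing rules like these might look as follows. This is a hedged sketch, not the bundled `scripts/validate_command.sh`, and the pattern list is illustrative, not exhaustive; regex matching alone cannot catch every dangerous command.

```bash
#!/bin/bash
# Sketch: reject a command string that matches a known-dangerous pattern.
validate_cmd() {
    local cmd="$1" p
    local dangerous=(
        'rm -rf /'                  # system destruction
        '^sudo |[;&|] *sudo '       # root operations
        'mkfs'                      # filesystem formatting
        ':\(\)'                     # fork bomb definition
        '(curl|wget).*\| *(ba)?sh'  # pipe-to-shell
    )
    for p in "${dangerous[@]}"; do
        if [[ "$cmd" =~ $p ]]; then
            echo "BLOCKED: matches '$p'" >&2
            return 1
        fi
    done
    echo "OK"
}

validate_cmd 'ls -la'   # -> OK
```

A real validator should treat its pattern list as a deny-list of last resort and still prefer an allow-list of known-safe commands where possible.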
## Error Handling Best Practices

### Always Check Exit Codes

```bash
#!/bin/bash
set -euo pipefail  # Exit on error, undefined vars, pipe failures

if some_command; then
    echo "Success"
else
    echo "Failed with exit code: $?"
    exit 1
fi
```
### Use Trap for Cleanup

```bash
#!/bin/bash
cleanup() {
    echo "Cleaning up..."
    rm -f /tmp/tempfile
}
trap cleanup EXIT ERR

# Your commands here
```
### Validate Input

```bash
#!/bin/bash
FILE="${1:-}"

if [[ -z "$FILE" ]]; then
    echo "Error: File argument required" >&2
    exit 1
fi

if [[ ! -f "$FILE" ]]; then
    echo "Error: File not found: $FILE" >&2
    exit 1
fi
```
## Common Use Cases

### 1. File Management

**Create Directory Structure:**

```bash
mkdir -p project/{src,tests,docs,config}
echo "Created project structure"
```
**Find and Process Files:**

```bash
#!/bin/bash
# Find all .log files modified in the last 7 days
# (IFS= preserves leading/trailing whitespace in filenames)
find . -name "*.log" -mtime -7 -type f | while IFS= read -r file; do
    echo "Processing: $file"
    # Process each file
done
```
**Bulk Rename Files:**

```bash
#!/bin/bash
# Rename all .txt files to .md
for file in *.txt; do
    if [[ -f "$file" ]]; then
        mv "$file" "${file%.txt}.md"
        echo "Renamed: $file -> ${file%.txt}.md"
    fi
done
```
### 2. Text Processing

**Search and Extract:**

```bash
#!/bin/bash
# Extract email addresses from a file
grep -Eo '[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}' input.txt | \
    sort -u > emails.txt
echo "Extracted $(wc -l < emails.txt) unique emails"
```
**CSV Processing:**

```bash
#!/bin/bash
# Extract specific columns from a CSV
cut -d',' -f1,3,5 input.csv | \
    grep -v "^#" | \
    sort > output.csv
```
**Log Analysis:**

```bash
#!/bin/bash
# Count error types in a log file
awk '/ERROR/ {print $5}' app.log | \
    sort | uniq -c | sort -rn
```
### 3. System Operations

**Disk Usage Analysis:**

```bash
#!/bin/bash
# Find the largest directories under the current path
du -h --max-depth=1 2>/dev/null | \
    sort -hr | head -20
```
**Process Management:**

```bash
#!/bin/bash
# Check if a process is running
if pgrep -f "myapp" > /dev/null; then
    echo "Process is running"
    pgrep -f "myapp" | xargs ps -p
else
    echo "Process is not running"
fi
```
**Archive and Compress:**

```bash
#!/bin/bash
# Create a dated backup
DATE=$(date +%Y%m%d)
tar -czf "backup_${DATE}.tar.gz" \
    --exclude='node_modules' \
    --exclude='.git' \
    ./project/
echo "Backup created: backup_${DATE}.tar.gz"
```
### 4. Data Pipeline

**Download and Process:**

```bash
#!/bin/bash
set -euo pipefail

URL="https://example.com/data.json"
OUTPUT="processed_data.json"

# Download
curl -fsSL "$URL" -o raw_data.json

# Process with jq
jq '.items[] | select(.status == "active")' raw_data.json > "$OUTPUT"
echo "Processed data saved to $OUTPUT"
```
**Multi-Step Pipeline:**

```bash
#!/bin/bash
set -euo pipefail

# Step 1: Download
echo "Downloading data..."
wget -q https://example.com/data.csv -O data.csv

# Step 2: Clean (strip carriage returns, drop empty lines, squeeze spaces)
echo "Cleaning data..."
sed 's/\r$//' data.csv | grep -v '^$' | tr -s ' ' > cleaned.csv

# Step 3: Analyze
echo "Analyzing..."
awk -F',' '{sum+=$3} END {print "Total:", sum}' cleaned.csv

echo "Pipeline completed"
```
### 5. Git Operations

**Repository Setup:**

```bash
#!/bin/bash
set -euo pipefail

REPO_NAME="${1:-my-repo}"
mkdir -p "$REPO_NAME"
cd "$REPO_NAME"

git init
echo "# $REPO_NAME" > README.md
echo "node_modules/" > .gitignore
echo ".env" >> .gitignore
git add .
git commit -m "Initial commit"
echo "Repository initialized: $REPO_NAME"
```
**Branch Management:**

```bash
#!/bin/bash
# Create a feature branch from main
git checkout main
git pull origin main
git checkout -b feature/new-feature
echo "Created and switched to feature/new-feature"
```
### 6. Docker Operations

**Container Management:**

```bash
#!/bin/bash
# Stop and remove all containers
docker ps -aq | xargs -r docker stop
docker ps -aq | xargs -r docker rm
echo "All containers stopped and removed"
```
**Image Cleanup:**

```bash
#!/bin/bash
# Remove dangling images
docker images -f "dangling=true" -q | xargs -r docker rmi
echo "Dangling images removed"
```
## Advanced Patterns

### Parallel Execution

```bash
#!/bin/bash
# Process files in parallel, 4 at a time.
# Note: process_file is a placeholder; if it is a shell function, it must
# be exported (export -f process_file) to be visible in the bash -c subshell.
find . -name "*.txt" -print0 | \
    xargs -0 -P 4 -I {} bash -c 'process_file "$@"' _ {}
```
### Progress Monitoring

```bash
#!/bin/bash
# Show progress for long operations
# (count via a glob and array, not by parsing ls output)
FILES=(*.txt)
TOTAL=${#FILES[@]}
COUNT=0
for file in "${FILES[@]}"; do
    COUNT=$((COUNT + 1))
    echo "Processing $COUNT/$TOTAL: $file"
    # Process file
done
```
### Error Recovery

```bash
#!/bin/bash
# Retry on failure
MAX_RETRIES=3
RETRY_DELAY=5

for i in $(seq 1 $MAX_RETRIES); do
    if some_command; then
        echo "Success on attempt $i"
        break
    else
        echo "Failed attempt $i/$MAX_RETRIES"
        if [[ $i -lt $MAX_RETRIES ]]; then
            echo "Retrying in ${RETRY_DELAY}s..."
            sleep $RETRY_DELAY
        fi
    fi
done
```
## Helper Scripts Available

### Execute Command/Script

```bash
bash scripts/execute_bash.sh "<command>" [timeout]
bash scripts/execute_bash.sh script.sh [timeout]
```

### Validate Command

```bash
bash scripts/validate_command.sh "<command>"
```

### Generate Script Template

```bash
bash scripts/generate_script.sh script_name.sh "Description"
```

### Dry Run Mode

```bash
DRY_RUN=1 bash scripts/execute_bash.sh "<command>"
```
## Debugging Techniques

### Enable Verbose Mode

```bash
#!/bin/bash
set -x  # Print commands as they execute
# Your commands here
set +x  # Disable verbose mode
```
### Check Variables

```bash
#!/bin/bash
set -u  # Error on undefined variables
echo "DEBUG: Variable value: ${MY_VAR:-not set}"
```
### Trace Execution

```bash
bash -x script.sh  # Run with tracing
```
## Performance Optimization

### Avoid Useless `cat`

```bash
# Slow: spawns an extra cat process
result=$(cat file.txt | grep pattern)

# Fast: let grep read the file directly
result=$(grep pattern < file.txt)
```
### Avoid Unnecessary Pipes

```bash
# Slow
cat file.txt | grep pattern | wc -l

# Fast
grep -c pattern file.txt
```
### Use Parallel Processing

```bash
# Serial (slow)
for file in *.txt; do
    process "$file"
done

# Parallel (fast) - NUL-delimited glob expansion, not parsed ls output
printf '%s\0' *.txt | xargs -0 -P 4 -I {} process {}
```
## Environment Variables

### Common Variables

```bash
HOME     # User home directory
PATH     # Executable search path
USER     # Current username
PWD      # Present working directory
OLDPWD   # Previous directory
SHELL    # Current shell
```
### Set Variables Safely

```bash
#!/bin/bash
# Set with default
VAR="${VAR:-default_value}"

# Export for child processes
export API_KEY="secret"

# Unset after use
unset API_KEY
```
## Troubleshooting

### "Permission denied"

```bash
# Check permissions
ls -la script.sh

# Fix permissions
chmod +x script.sh
```
### "Command not found"

```bash
# Check if the command exists
command -v mycommand

# Check PATH
echo "$PATH"

# Use the full path
/usr/bin/mycommand
```
### "No such file or directory"

```bash
# Check the file exists
[[ -f file.txt ]] && echo "Exists" || echo "Not found"

# Resolve the absolute path
realpath file.txt
```
### Script Hangs

```bash
# Wrap the whole script in a timeout
timeout 30s ./script.sh

# Or apply a timeout to the slow command inside the script
timeout 30s some_long_command
```
## Best Practices Summary

- ✅ Always use `set -euo pipefail` for safety
- ✅ Quote variables: `"$variable"`, not `$variable`
- ✅ Check exit codes: `if command; then ...`
- ✅ Use `[[ ]]` for conditions, not `[ ]`
- ✅ Validate input before processing
- ✅ Use meaningful variable names
- ✅ Add comments for complex operations
- ✅ Use `trap` for cleanup
- ❌ Never use `eval` with user input
- ❌ Avoid parsing `ls` output
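A quick demonstration of why the quoting rule matters, using a throwaway file name with a space in it:

```bash
#!/bin/bash
set -euo pipefail
# A filename containing a space stays one argument only when quoted.
f="my file.txt"
touch "$f"     # creates a single file named "my file.txt"
# touch $f     # unquoted: word-splits into two files, "my" and "file.txt"
if [[ -f "$f" ]]; then
    echo "found: $f"
fi
rm -f "$f"
```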
## When NOT to Use This Skill
- Complex data structures (use Python/Node.js)
- API interactions (use Node.js/Python)
- Heavy calculations (use Python)
- Cross-platform scripts (use Python)
- GUI applications (use appropriate tools)
## References

For more complex bash operations:

- `scripts/execute_bash.sh` - Safe command executor
- `scripts/validate_command.sh` - Command validator
- `scripts/generate_script.sh` - Script generator
- Bash manual: https://www.gnu.org/software/bash/manual/
- ShellCheck: https://www.shellcheck.net/