Bash Quoting: The Complete Guide (Single, Double, and No Quotes)

Bash has three quoting modes and they do completely different things. Getting them wrong produces silent bugs that only appear when a filename has a space or a variable contains a glob character. Here is the complete map.

The three quoting modes

Every character you type in a bash command can be interpreted literally or specially. Quotes control which interpretation applies. The three modes are:

Mode            Syntax   Variable expansion   Word splitting   Glob expansion
No quotes       $var     Yes                  Yes              Yes
Double quotes   "$var"   Yes                  No               No
Single quotes   '$var'   No                   No               No

No quotes lets bash do everything: expand variables, split the result on whitespace (and IFS characters), and expand any glob patterns like * or ?. This is correct only when you explicitly want all three behaviors — which is rarely the right default.

Double quotes expand variables and command substitutions ($(cmd)), but protect the result from word splitting and glob expansion. This is the right default for almost every variable reference.

Single quotes treat everything literally. No expansion of any kind occurs — not variables, not backslash sequences, not command substitutions. What you type is exactly what bash passes to the command.

NAME="John Doe"

echo $NAME     # word splits: two arguments: "John" and "Doe"
echo "$NAME"   # correct: one argument: "John Doe"
echo '$NAME'   # literal: prints the string $NAME unchanged
Rule of thumb

Double-quote every variable expansion unless you have a specific reason not to. The exceptions (word splitting you want, glob expansion you want) are far rarer than the bugs you avoid.
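The rare legitimate use of unquoted expansion is usually a variable meant to hold several arguments. In bash, an array expresses that intent without relying on word splitting at all. A minimal sketch (the flags and directory are arbitrary):

```shell
# An array is the safe way to carry multiple arguments in one variable
FLAGS=(-l -a)
ls "${FLAGS[@]}" /tmp >/dev/null   # each element arrives as its own argument

# Deliberate unquoted expansion also works for simple flag strings,
# but breaks as soon as any single argument contains a space
FLAGS_STR="-l -a"
ls $FLAGS_STR /tmp >/dev/null
```

The quoted array form stays correct even when an element contains spaces, which the string form can never express.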

Word splitting: why $var without quotes breaks

After bash expands a variable, it scans the result for characters in the IFS variable (Internal Field Separator). By default IFS contains a space, tab, and newline. Every IFS character in the expanded value becomes a word boundary — the value splits into multiple arguments.

This is the root cause of the classic "files with spaces" bug:

FILE="my report 2026.pdf"

# Broken: FILE splits into 3 words — cp receives 4 arguments
cp $FILE /backup/
# equivalent to: cp my report 2026.pdf /backup/
# cp: cannot stat 'my': No such file or directory

# Fixed: double quotes prevent splitting — cp receives 2 arguments
cp "$FILE" /backup/
# equivalent to: cp "my report 2026.pdf" /backup/

The same problem applies to command substitution. Without quotes, the output of $(cmd) is split and glob-expanded before being passed as an argument:

# Broken: the output is word-split on whitespace, so filenames
# containing spaces shatter into multiple words
for f in $(ls /tmp/uploads); do
    process $f
done

# Fixed: use a glob directly (preferred), or quote the substitution
for f in /tmp/uploads/*; do
    process "$f"
done
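When the file list genuinely has to come from a command (say, find with filters a glob cannot express), a read loop avoids word splitting entirely. A sketch, assuming filenames contain no newlines; the sample directory exists only for illustration:

```shell
# Sample directory with an awkward filename (illustration only)
dir=$(mktemp -d)
touch "$dir/my report.txt" "$dir/notes.txt"

# IFS= preserves leading/trailing whitespace; -r keeps backslashes literal
count=0
while IFS= read -r f; do
    printf 'processing: %s\n' "$f"
    count=$((count + 1))
done < <(find "$dir" -type f)

echo "$count"   # 2

rm -rf "$dir"
```

Each iteration receives one full line as `$f`, so "my report.txt" stays one filename.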

IFS can be customized, which is occasionally useful but more often a source of confusion. If a script is misbehaving around word splitting, check whether IFS has been modified:

# Check current IFS (printed as a quoted string to show whitespace)
printf '%q\n' "$IFS"
# Default output: $' \t\n'

# A script that reads CSV might set IFS=,
IFS=, read -r field1 field2 field3 <<< "a,b,c"
echo "$field1"   # a
echo "$field2"   # b

Glob expansion compounds the problem. If a variable expands to a value containing *, ?, or [...], bash will attempt to match those as filesystem patterns after word splitting:

PATTERN="report*.pdf"

# Without quotes: bash expands the glob against the filesystem
ls $PATTERN
# lists report-2024.pdf, report-2025.pdf, etc.

# With double quotes: the literal string is passed to ls
ls "$PATTERN"
# ls: cannot access 'report*.pdf': No such file or directory
# (correct behavior if you want the literal name)
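One more glob subtlety worth knowing: when an unquoted pattern matches nothing, bash passes it through as a literal string by default. The nullglob shell option changes that behavior; a short sketch in a scratch directory:

```shell
cd "$(mktemp -d)"        # empty directory, so nothing matches

# Default: an unmatched pattern stays literal
echo report*.pdf         # prints: report*.pdf

# nullglob: an unmatched pattern expands to nothing
shopt -s nullglob
files=(report*.pdf)
echo "${#files[@]}"      # prints: 0
shopt -u nullglob
```

The related failglob option makes an unmatched pattern an error instead, which is often what you want in scripts.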

When single quotes are correct

Single quotes are the right choice whenever you need a completely literal string — no interpretation whatsoever. The three cases where single quotes are consistently the correct choice:

Literal strings and fixed values that contain special characters you do not want interpreted:

# Passwords, API keys, strings with $ or \ characters
PATTERN='$2b$12$abcdefgh'         # bcrypt hash — $ must stay literal
grep 'price: \$[0-9]' file.txt    # grep regex with literal dollar sign

Regular expression patterns passed to grep, sed, or awk. These tools have their own metacharacters that overlap with bash's special characters. Single-quoting regex patterns prevents bash from interpreting them first:

# Without single quotes: bash interprets special characters first
grep "^[A-Z].*\.$" file.txt    # works but fragile

# With single quotes: pattern reaches grep unmodified
grep '^[A-Z].*\.$' file.txt    # correct and unambiguous

# Awk programs almost always need single quotes
awk '{print $1, $2}' file.txt  # $1/$2 are awk fields, not shell variables
awk "{print $1, $2}" file.txt  # WRONG: shell expands $1 and $2 first (empty)

Sed substitution expressions, which are full of dollar signs, backslashes, and backreferences that clash with double-quote processing:

# Double quotes: the shell processes $ and \ first, so escaping is needed
sed "s/price/\$10/"                   # \$ keeps the dollar literal

# Single quotes: nothing needs escaping from the shell
sed 's/price/$10/'
sed 's/\(first\) \(second\)/\2 \1/'   # backreferences work cleanly
Note

You cannot include a literal single quote inside single quotes. The workaround is to end the single-quoted string, escape the apostrophe, and re-open: 'it'\''s fine'. Or use $'it\'s fine' (ANSI-C quoting — see below).

The ANSI-C quoting trick: $'...'

ANSI-C quoting is the least-known quoting mode. The syntax is $'...' — a dollar sign immediately before a single-quoted string. Inside, backslash escape sequences are interpreted exactly as they are in C string literals.

This fills a real readability gap: in POSIX sh, putting a tab or newline into a variable means either typing the raw character inside quotes (invisible, and easily mangled by editors) or reaching for a printf command substitution. $'...' makes the intent explicit.

# Common escape sequences in $'...'
TAB=$'\t'
NEWLINE=$'\n'
BELL=$'\a'
ESC=$'\e'    # escape character (useful for terminal colors)
# (note: $'\0' is a NUL byte, which bash variables cannot store)

# Practical example: set IFS to only newline
IFS=$'\n'

# Embed a literal newline in a string
GREETING=$'Hello,\nWorld!'
echo "$GREETING"
# Hello,
# World!

ANSI-C quoting also provides the cleanest way to include a literal single quote without escaping gymnastics:

# The ugly workaround
echo 'it'\''s fine'

# Clean with ANSI-C quoting
echo $'it\'s fine'

# Or embed the apostrophe directly
echo $'it\x27s fine'   # \x27 is the hex code for '

This form is supported in bash, ksh93, and zsh. It is not in POSIX sh, but in practice if you are writing bash scripts the #!/usr/bin/env bash shebang already signals that POSIX portability is not a hard constraint.

Tip

Use $'\n' and $'\t' freely in bash scripts. They are clearer than printf workarounds and do not require a subshell.
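For scripts that must stay POSIX sh, the usual substitutes are printf (which interprets \t and \n itself) plus command substitution, with the caveat that $(...) strips trailing newlines. A sketch:

```shell
# printf interprets the escapes itself; works in any POSIX shell
printf 'col1\tcol2\n'

# Capture a tab in a variable via command substitution
TAB=$(printf '\t')

# A lone newline needs a sentinel character, because $(...)
# strips all trailing newlines from the output
NL=$(printf '\nx'); NL=${NL%x}
```

The sentinel trick (append x, then strip it with ${NL%x}) is the standard workaround for the trailing-newline stripping.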

The 5 quoting mistakes developers make repeatedly

  1. Unquoted array expansion with ${arr[@]}

    Arrays are one of the few places where bash quoting has a non-obvious correct form. Unquoted ${arr[@]} expands the elements and then subjects each one to word splitting and glob expansion. "${arr[@]}" (with double quotes) expands each element as exactly one word — which is almost always what you want.

FILES=("report 2026.pdf" "data backup.csv" "config.json")

# Wrong: word splits on spaces — 5 words instead of 3
for f in ${FILES[@]}; do echo "$f"; done
# report
# 2026.pdf
# data
# backup.csv
# config.json

# Correct: each element is one word
for f in "${FILES[@]}"; do echo "$f"; done
# report 2026.pdf
# data backup.csv
# config.json
  2. Heredoc indent stripping with wrong quoting

    The <<- heredoc form strips leading tabs (not spaces). If you indent with spaces, the whitespace appears in the output. More subtly: the heredoc delimiter being quoted or unquoted controls whether variable expansion happens inside the heredoc. An unquoted delimiter expands variables; a single-quoted delimiter is literal.

NAME="world"

# Unquoted delimiter — variables expand
cat <<EOF
Hello $NAME
EOF
# Output: Hello world

# Single-quoted delimiter — literal, no expansion
cat <<'EOF'
Hello $NAME
EOF
# Output: Hello $NAME

# Indented heredoc: the body and closing EOF below must be
# indented with real tab characters, not spaces
run_setup() {
	cat <<-EOF
	This line has a leading tab stripped.
	$NAME is still expanded here.
	EOF
}
  3. eval and quoting collapse

    eval runs a string as a bash command, which means quoting is processed twice: once when the string is built, and again when eval executes it. A variable containing special characters will break the inner command unless you double-escape or use printf '%q' to produce a safely quoted version.

FILENAME="my file.txt"

# Broken: eval sees: cat my file.txt — word splits
eval "cat $FILENAME"

# Correct: printf %q produces a shell-escaped string
SAFE=$(printf '%q' "$FILENAME")
eval "cat $SAFE"
# eval sees: cat my\ file.txt — correct
  4. SSH remote commands

    When you pass a command to ssh host 'command', the remote shell interprets the string again. Variables you want expanded locally must be quoted to survive the first shell; variables you want expanded on the remote host must be escaped to survive it. Getting this right requires thinking about two shells simultaneously.

LOCAL_DIR="/home/user/data"
REMOTE_DIR="/var/app"

# Both variables expand locally before ssh runs — fine here,
# since both are defined in the local shell
ssh host "cp $LOCAL_DIR/file $REMOTE_DIR/"

# $HOSTNAME: expanded locally — probably not what you want
ssh host "echo $HOSTNAME"

# $HOSTNAME: protected from the local shell, expanded remotely — correct
ssh host 'echo $HOSTNAME'

# Mixed: expand LOCAL_DIR here, REMOTE_USER on the remote
ssh host "cp $LOCAL_DIR/file /home/\$REMOTE_USER/"
  5. find -exec and argument quoting

    find -exec passes each found path to the command as a single argument — no shell is involved, so no word splitting can occur. Quoting {} as "{}" is harmless but pointless: your shell strips those quotes before find ever sees them, so the two forms are identical. If you need shell features (piping, redirection, complex logic), use -exec sh -c '...' _ {} \; and quote properly inside that inner shell.

# The quotes around {} are removed by your shell before find runs,
# so these two commands are identical — {} is already a single token
find /tmp -name "*.log" -exec rm "{}" \;
find /tmp -name "*.log" -exec rm {} \;

# If you need a shell: use sh -c with proper internal quoting
find /tmp -name "*.log" -exec sh -c 'gzip -9 "$1"' _ {} \;
# $1 inside sh -c receives the path as one argument, quoted correctly
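Mistake 1 has a twin in the positional parameters: "$@" expands each argument as its own word, while "$*" joins them all into one word using the first character of IFS. A minimal sketch (count_args is a throwaway helper, not a real utility):

```shell
# Helper: report how many arguments were received
count_args() { echo "$#"; }

demo() {
    count_args "$@"   # arguments stay separate
    count_args "$*"   # arguments joined into a single word
}

demo "my file.txt" "other file.txt"
# 2
# 1
```

Inside any function or script that forwards its arguments, "$@" is almost always the form you want.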
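Two related find patterns are worth keeping at hand: -exec ... {} + batches many paths into a single command invocation, and -print0 with xargs -0 moves NUL-delimited names through a pipe, which stays safe even for names containing spaces or newlines. A sketch against a throwaway directory:

```shell
dir=$(mktemp -d)
touch "$dir/a b.log" "$dir/c.log"

# Batched form: one wc invocation receives all matching paths at once
find "$dir" -name '*.log' -exec wc -c {} +

# NUL-delimited pipe: no word splitting anywhere in the pipeline
find "$dir" -name '*.log' -print0 | xargs -0 -n1 basename

rm -rf "$dir"
```

Prefer -print0/-0 over plain `find | xargs`, which word-splits on whitespace exactly like an unquoted command substitution.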
30 bash rescue scenarios, mapped out

Bash Unfucked covers quoting failures, word splitting bugs, and 28 more patterns that break real scripts — with exact commands to diagnose and fix each one.

Get Bash Unfucked — $2 →

Free 10-scenario sampler on GitHub →
