How to Use Linux CLI: A Complete Guide
Unlock the true power of the command line. Learn about Input/Output streams, Piping, Redirection, Environment Variables, and advanced shell techniques.
1. Redirection (> and >>)
By default, commands output to the screen (stdout). Redirection lets you send that output elsewhere—to a file, for instance.
Overwrite (>)
Creates the file if it does not exist, or replaces all content if it does.
ls > file_list.txt
Saves the directory listing to a file, overwriting previous contents.
Append (>>)
Adds to the end of the file without erasing existing content.
echo "Log entry" >> logs.txt
Adds a new line to the log file, preserving everything before.
2. The Pipe (|)
The pipe is the single most powerful operator in the Unix shell. It takes the output of the command on the left and feeds it as input to the command on the right.
cat access.log | grep "404" | sort | uniq -c | sort -rn | head -10

cat access.log: Reads the entire log file.
grep "404": Filters to only lines containing "404".
sort: Sorts lines alphabetically (needed for uniq to work).
uniq -c: Counts consecutive identical lines.
sort -rn: Sorts by count, numerically, in reverse (highest first).
head -10: Shows only the top 10 results.
Result: The top 10 most common 404 error URLs in your access log.
3. Environment Variables
Variables that persist in your shell session and are passed to child processes. They configure how programs behave.
export NODE_ENV=production
Sets a variable for the current session and all child processes.

echo $NODE_ENV
Prints the variable's value.

PORT=3000 npm start
Sets a variable for a single command only.

printenv or env
Lists all environment variables in the current session.

4. Command Substitution
Use the output of one command as an argument to another command.
echo "Today is $(date +%Y-%m-%d)"
Embeds the current date in a string.

tar -czvf backup-$(date +%F).tar.gz ./data
Creates a dated backup archive of the data directory.

kill $(pgrep node)
Terminates all processes named node, using pgrep's output as arguments to kill.

Mastering the Linux Command Line Interface
The Linux command line is built on a philosophy of composability—small, focused programs that do one thing well and can be combined in countless ways. This approach, inherited from Unix, enables power users to construct complex data processing pipelines without writing traditional programs. Understanding streams, pipes, and redirection transforms the CLI from a simple command executor into a sophisticated computing environment.
This guide covers the intermediate and advanced concepts that take your CLI skills to the next level. While the basic commands (ls, cd, mkdir) let you navigate and manipulate files, the techniques here let you automate, analyze, and orchestrate complex tasks with elegance and efficiency.
Understanding Standard Streams
Every Linux program has three standard streams: standard input (stdin), standard output (stdout), and standard error (stderr). By default, stdin comes from the keyboard, and both stdout and stderr go to the terminal. Understanding these streams is fundamental to mastering redirection and piping.
Stdin (file descriptor 0) is where a program reads input. When you type into a program, you are writing to its stdin. Stdout (file descriptor 1) is where normal output goes. Stderr (file descriptor 2) is where error messages go—separate from stdout so that errors do not get mixed into data being processed.
This separation of output and error streams is powerful. You can redirect stdout to a file for processing while still seeing errors on the screen: command > output.txt. Or redirect both: command > output.txt 2>errors.txt. Or combine them: command > all.txt 2>&1.
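The separation described above can be sketched with a tiny function that writes to both streams (demo is a hypothetical name used only for this example):

```shell
# A function that writes one line to stdout and one to stderr.
demo() {
  echo "normal output"
  echo "error output" >&2
}

demo > out.txt 2> err.txt   # stdout goes to out.txt, stderr to err.txt
demo > all.txt 2>&1         # both streams into all.txt; 2>&1 must come after >
```

Note the ordering in the last line: 2>&1 means "send stderr wherever stdout currently points," so it only captures both streams if it appears after the stdout redirection.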
The Art of Piping
The pipe operator | connects the stdout of one command to the stdin of another. This simple mechanism enables remarkable flexibility. Instead of writing a complex program, you chain together simple commands, each transforming the data in some way.
Consider log analysis. Without pipes, you might need to write a Python script that opens a file, reads lines, filters them, counts occurrences, and sorts results. With pipes, the same operation is a single line: grep pattern file | sort | uniq -c | sort -rn. Each command in the pipeline is simple and reusable.
The mental model is data flowing through a series of filters. Each filter transforms the data in some way—selecting rows, transforming columns, aggregating values, sorting results. Learning common filter commands (grep, sed, awk, cut, sort, uniq, head, tail) gives you a powerful toolkit for data manipulation.
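As a minimal illustration of the filter model, the following builds a small sample file (status.txt, invented for the example) and counts repeated lines:

```shell
# Create five sample lines: three "error", two "ok".
printf 'error\nok\nerror\nok\nerror\n' > status.txt

# sort groups identical lines, uniq -c counts each group,
# and sort -rn puts the most frequent group first.
sort status.txt | uniq -c | sort -rn
```

The first line of output reports the count 3 next to "error" (the exact leading whitespace from uniq -c varies by platform).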
Input and Output Redirection in Depth
Output redirection with > and >> is just the beginning. Input redirection with < feeds file contents to stdin: sort < unsorted.txt is equivalent to sort unsorted.txt for most commands, but the redirection form is more general.
Here-documents let you provide multi-line input inline: cat <<EOF starts a here-document that continues until a line containing only "EOF". This is commonly used in scripts to create files or send multi-line input to commands.
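A minimal here-document sketch, writing a two-line file (greeting.txt is an arbitrary name):

```shell
# Everything between <<EOF and the closing EOF line is fed to cat's stdin.
# Variables and command substitutions are expanded inside the body.
cat <<EOF > greeting.txt
Hello, $USER
Generated on $(date +%Y-%m-%d)
EOF
```

Quoting the delimiter (cat <<'EOF') suppresses that expansion, which is useful when the body itself contains $ characters that should be written literally.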
Here-strings (<<<) provide single-line input: grep pattern <<<"line to search". This is useful when you have a string in a variable and want to pass it to a command that expects stdin, not an argument.
Environment Variables and Configuration
Environment variables are key-value pairs that configure program behavior without requiring command-line arguments. They are inherited by child processes, so setting a variable in your shell affects all programs you launch.
The PATH variable is the most important: it lists directories to search for executable files. When you type python, the shell looks through each directory in PATH until it finds an executable named python. Understanding PATH is essential for troubleshooting "command not found" errors.
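A quick way to inspect PATH and see how a name resolves (using ls here as a stand-in for any command):

```shell
# Print each directory the shell searches, one per line.
echo "$PATH" | tr ':' '\n'

# Ask the shell which file it would actually execute for a given name.
command -v ls
```

command -v is the portable way to do this; which behaves the same for most purposes but is not specified by POSIX.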
Other common variables include HOME (your home directory), USER (your username), SHELL (your default shell), EDITOR (your preferred text editor), and language/locale settings like LANG and LC_ALL. Applications often define their own variables; Node.js uses NODE_ENV, Docker uses DOCKER_HOST.
Variables set with export persist for your session. To make them permanent, add the export commands to your shell configuration file: ~/.bashrc for Bash, ~/.zshrc for Zsh. These files run automatically when you start a new shell.
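A sketch of that workflow, using a scratch file demo_bashrc in place of the real ~/.bashrc so nothing permanent is touched:

```shell
# In practice rc_file would be ~/.bashrc (Bash) or ~/.zshrc (Zsh).
rc_file=./demo_bashrc

# Append the export line so future shells pick it up.
echo 'export EDITOR=vim' >> "$rc_file"

# Reload the file into the current shell (. is the portable form of source).
. "$rc_file"
echo "$EDITOR"
```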
Command Substitution and Scripting
Command substitution with $(command) captures a command's output for use in another command. This is fundamental for scripting and even useful interactively. The older backtick syntax `command` does the same thing but is harder to read and nest.
Common uses include timestamping (backup-$(date +%Y%m%d).tar.gz), dynamic file selection (vim $(find . -name "*.py" | head -1)), and extracting values from commands for use elsewhere.
Combined with variables and conditionals, command substitution lets you build sophisticated shell scripts. Even simple one-liners can be powerful: for f in *.jpg; do mv "$f" "${f%.jpg}.png"; done renames all JPG files to PNG (well, just the extension, but you get the idea). Be careful with the tempting variant for f in $(find . -name "*.jpg"): word splitting breaks filenames containing spaces, so prefer globs or find -exec for this kind of task.
Job Control and Background Processes
The shell provides job control for managing multiple processes. Appending & to a command runs it in the background, returning control immediately. Ctrl+Z suspends the current process, and bg resumes it in the background.
jobs lists background and suspended jobs. fg brings a job to the foreground. kill %1 kills job number 1. This is useful when you start a long-running command and realize you need to do something else.
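In a script, the same ideas can be exercised with $!, the PID of the most recent background job (a sketch; %1-style job specs are mainly for interactive shells):

```shell
# Start a long-running command in the background; $! holds its PID.
sleep 30 &
bg_pid=$!

jobs                          # list this shell's background/suspended jobs
kill "$bg_pid"                # interactively you could write: kill %1
wait "$bg_pid" 2>/dev/null || true   # reap it; exit status reflects the signal
```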
For processes that should survive logging out, use nohup or screen/tmux. Nohup prevents the hangup signal from killing the process when you disconnect. Screen and tmux are terminal multiplexers that create persistent sessions you can attach and detach from.
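A minimal nohup sketch (task.log is an arbitrary name; in real use you would log out rather than wait for the job):

```shell
# Run a command immune to the hangup signal, with output captured to a file.
nohup sh -c 'sleep 1; echo done' > task.log 2>/dev/null &

wait $!          # only for this demo; normally the job outlives your session
cat task.log
```

For anything interactive or long-lived, screen or tmux is usually the better choice, since you can reattach to the session and see its output later.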
Advanced Text Processing
For more complex transformations, sed and awk are indispensable. Sed is a stream editor for pattern-based substitutions: sed 's/old/new/g' file replaces all occurrences. Awk is a pattern-scanning and processing language, ideal for working with columnar data.
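A small demonstration on made-up columnar data (scores.txt is a scratch file):

```shell
# Two whitespace-separated columns: name and score.
printf 'alice 42\nbob 17\n' > scores.txt

# sed: substitute by pattern on each line.
sed 's/alice/ALICE/' scores.txt

# awk: operate on columns; print the name and the score doubled.
awk '{ print $1, $2 * 2 }' scores.txt
```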
cut extracts columns: cut -d, -f1,3 file.csv extracts the first and third comma-separated fields. tr translates or deletes characters: tr 'a-z' 'A-Z' converts to uppercase. wc counts lines, words, and characters.
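These three can be tried together on a scratch CSV (people.csv, invented for the example):

```shell
printf 'id,name,city\n1,Ada,London\n' > people.csv

cut -d, -f1,3 people.csv      # first and third comma-separated fields
tr 'a-z' 'A-Z' < people.csv   # uppercase the whole file
wc -l < people.csv            # number of lines
```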
Building Proficiency
The path to CLI mastery is practice and exploration. Use man to read manual pages for commands. Use --help for quick reference. Try commands in safe environments before using them on important data. Keep a personal notes file of useful commands and pipelines.
As you become comfortable, you will find yourself reaching for the CLI even when GUI alternatives exist, simply because it is faster and more precise. The investment in learning pays dividends throughout your technical career, across operating systems and roles.