Tested on Ubuntu 22.04 / 24.04, Debian 12 and RHEL-based systems using the default curl package
The curl command is a widely used Linux utility for transferring data between a client and a remote server. It supports many protocols, including HTTP, HTTPS, FTP, FTPS, SCP, SFTP, LDAP, and FILE.
System administrators and developers commonly use curl to:
- Download or upload files
- Test REST APIs
- Inspect HTTP headers and status codes
- Troubleshoot network connectivity
curl is powered by the libcurl library, which provides reliable and efficient data transfer capabilities.
Installing curl on Linux
Most modern Linux distributions ship with curl preinstalled. If you see curl: command not found, install it using your package manager.
RHEL, CentOS, Fedora
sudo dnf install curl
Ubuntu, Debian
sudo apt install curl
curl quick reference (most used options)
| Task | Command |
|---|---|
| View webpage content | curl https://example.com |
| Save output to a file | curl -o file.html https://example.com |
| Download file with original name | curl -O https://example.com/file.zip |
| Download multiple files | curl -O url1 -O url2 |
| Resume interrupted download | curl -C - -O https://example.com/file.iso |
| Limit download speed | curl --limit-rate 2m -O https://example.com/file.iso |
| Fetch only HTTP headers | curl -I https://example.com |
| Check HTTP status code | curl -IsL https://example.com \| grep HTTP |
| Follow redirects | curl -L https://example.com |
| Silent mode (scripts) | curl -s https://example.com |
| Send POST data | curl -d "key=value" https://example.com/api |
| Download with authentication | curl -u user:pass ftp://example.com/file.txt |
| Upload file via FTP | curl -T file.txt -u user:pass ftp://example.com/file.txt |
| Use HTTP proxy | curl -x proxy:port https://example.com |
| Authenticated proxy | curl -U user:pass -x proxy:port https://example.com |
Downloading and saving content with curl
Viewing content without saving
Use this when you only want to inspect the response and do not need to store it as a file.
Displays the raw HTML or response body directly in the terminal.
curl https://linux.die.net
The output is printed to standard output and is not saved locally.
Saving response output as files
Use this when the URL returns content that should be stored locally. Writes the response content to a file instead of printing it to the terminal.
curl -o page.html https://linux.die.net
The file is saved in the current working directory with the specified name.
Downloading files from URLs
These examples apply when the URL points to a downloadable file.
Downloads the file and saves it using the filename provided by the server.
curl -O https://example.com/file.zip
The file is saved in the current directory.
You can also download multiple files in a single command.
curl -O url1 -O url2
Each file is saved using its original name.
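Besides repeating -O, curl can expand numeric ranges written in brackets inside a URL and download every match in one command. A minimal sketch of this globbing feature, run against local file:// URLs so it works without network access (all paths are placeholders):

```shell
# Create two sample files to stand in for the remote resources.
mkdir -p /tmp/glob_demo/src /tmp/glob_demo/dst
echo "one" > /tmp/glob_demo/src/part1.txt
echo "two" > /tmp/glob_demo/src/part2.txt

# [1-2] expands to part1.txt and part2.txt; -O keeps each original name.
# Quote the URL so the shell does not interpret the brackets itself.
cd /tmp/glob_demo/dst
curl -s -O "file:///tmp/glob_demo/src/part[1-2].txt"
```

The same bracket syntax works with HTTP URLs, e.g. `"https://example.com/file[1-3].zip"`.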
Controlling filenames for downloads
Use this when you want to explicitly choose the output filename.
The -o option allows you to rename the downloaded file.
curl -o custom_name.zip https://example.com/file.zip
The downloaded file is saved with the specified name.
To keep the original filename provided by the server, use -O.
curl -O https://example.com/file.zip
curl automatically determines the filename from the URL.
Controlling download location
By default, curl saves files in the current working directory.
You can change the destination directory by specifying a full or relative path.
curl -o /path/to/directory/file.zip https://example.com/file.zip
The destination directory must already exist.
Combining filename and location control
Use these methods when you need to control both where the file is saved and what it is called.
Save to a specific directory with a custom filename.
curl -o /path/to/downloads/my_backup.zip https://example.com/file.zip
Save to a specific directory while keeping the original filename (the --output-dir option requires curl 7.73 or later).
curl --output-dir /path/to/downloads -O https://example.com/file.zip
The file is saved as:
/path/to/downloads/file.zip
Quick rule to remember
- Use -o → choose the filename and path explicitly
- Use -O → keep the original filename
- Use --output-dir → change the directory without renaming
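These options can be sketched side by side. The example below uses a local file:// URL so it runs without network access; all paths are placeholders:

```shell
# Prepare a sample "remote" file and a download directory.
mkdir -p /tmp/curl_demo/downloads
echo "sample" > /tmp/curl_demo/source.txt

# -o: choose the filename and path explicitly.
curl -s -o /tmp/curl_demo/renamed.txt file:///tmp/curl_demo/source.txt

# -O: keep the original filename (saved into the current directory).
cd /tmp/curl_demo/downloads
curl -s -O file:///tmp/curl_demo/source.txt

# --output-dir: keep the name but change the directory (curl 7.73+).
# curl -s --output-dir /tmp/curl_demo/downloads -O file:///tmp/curl_demo/source.txt
```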
Handling large or interrupted downloads
Resume an interrupted download
Continues downloading from the point where the transfer was interrupted.
This avoids restarting the download from the beginning.
curl -C - -O https://releases.ubuntu.com/22.04/ubuntu-22.04-desktop-amd64.iso
Sample output (resuming):
** Resuming transfer from byte position 73400320
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 3.6G 100 3.6G 0 0 4.8M 0 --:--:-- --:--:-- --:--:-- 4.8M
If the file already exists locally, curl automatically resumes from the last byte.
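In scripts, the resume option is often wrapped in a retry loop so a dropped connection is picked up automatically. A sketch under that assumption (the function name, attempt cap, and delay are arbitrary choices, not curl built-ins):

```shell
# Keep resuming a download until it completes or too many attempts fail.
# -C - continues from the size of any existing partial file;
# -f turns HTTP errors (e.g. 404) into failures instead of saving an
# error page; the 5-attempt cap stops the loop from spinning forever.
resume_download() {
  url=$1
  attempts=0
  until curl -fsS -C - -O "$url"; do
    attempts=$((attempts + 1))
    if [ "$attempts" -ge 5 ]; then
      echo "giving up after $attempts attempts" >&2
      return 1
    fi
    echo "transfer interrupted, resuming..." >&2
    sleep 5
  done
}

# Usage (the Ubuntu ISO URL from the example above):
# resume_download https://releases.ubuntu.com/22.04/ubuntu-22.04-desktop-amd64.iso
```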
Limit download speed
Restricts the maximum transfer rate to avoid saturating the network.
Useful when downloading large files on shared or production systems.
curl --limit-rate 2m -O https://example.com/largefile.iso
Here, the download speed is capped at 2 MB per second.
Supported suffixes include k (KB), m (MB), and g (GB).
Inspecting HTTP responses
Fetch only HTTP headers
Retrieves response headers without downloading the response body.
Useful for checking server type, content type, and cache headers.
curl -I https://linux.die.net
Sample output:
HTTP/2 200
content-type: text/html; charset=UTF-8
server: nginx
last-modified: Wed, 07 Feb 2024 10:12:22 GMT
Check HTTP status code
Quickly verifies whether a request succeeded or failed.
Common status codes include 200 (OK), 301 (Redirect), and 404 (Not Found).
curl -IsL https://linux.die.net | grep HTTP
Sample output:
HTTP/2 200
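If you want only the numeric code, curl's --write-out variables avoid the grep step entirely. A sketch (the http_status function name is illustrative, not part of curl):

```shell
# Print just the HTTP status code of the final response.
# -s silences progress, -o /dev/null discards the body,
# -L follows redirects, and -w '%{http_code}' emits the code afterwards.
http_status() {
  curl -sL -o /dev/null -w '%{http_code}' "$1"
}

# Usage:
# http_status https://example.com    # e.g. 200
```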
Follow HTTP redirects
Ensures curl follows redirect responses automatically.
Required when a URL redirects to a new location.
curl -L https://linux.die.net
Without -L, curl stops at the first redirect and does not fetch the final page.
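To see where a redirect chain ends up, --write-out can also report the last URL curl fetched. A sketch (the final_url function name is illustrative):

```shell
# Follow redirects and print the final URL curl arrived at.
# -L follows each redirect, -o /dev/null discards the body,
# and -w '%{url_effective}' prints the URL of the last response.
final_url() {
  curl -sL -o /dev/null -w '%{url_effective}' "$1"
}

# Usage:
# final_url http://example.com    # prints the URL of the final response
```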
Silent and script-friendly usage
Run curl without progress output
Suppresses the progress meter and error messages.
This makes the output clean and suitable for scripting.
curl -s https://linux.die.net
If an error occurs, curl exits silently; add -S (--show-error) alongside -s to keep error messages visible while still hiding the progress meter.
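A common script pattern combines -s and -S with -f, which makes HTTP errors (4xx/5xx) produce a non-zero exit code the script can branch on. A sketch (the fetch_or_fail function name is illustrative):

```shell
# Fetch a URL quietly but fail loudly:
# -f returns a non-zero exit code on HTTP errors instead of
# printing the error page, -s hides the progress meter,
# -S still prints genuine errors to stderr.
fetch_or_fail() {
  curl -fsS "$1"
}

# Usage:
# if body=$(fetch_or_fail https://example.com); then
#   echo "fetched ${#body} bytes"
# else
#   echo "download failed" >&2
# fi
```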
Sending data and authentication
Send POST request data
Sends form data or payload to the server using an HTTP POST request.
Frequently used when testing REST APIs and web forms.
curl -d "name=John&role=admin" https://example.com/api
Sample output (JSON response):
{
"status": "success",
"message": "Data received"
}
By default, curl sends data using application/x-www-form-urlencoded.
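For JSON APIs that form encoding is usually wrong, so the Content-Type header must be set explicitly. A minimal sketch (the function name and endpoint are placeholders):

```shell
# Send a JSON body instead of form-encoded data.
# Without the header, curl labels -d payloads as
# application/x-www-form-urlencoded.
post_json() {
  curl -sS -H "Content-Type: application/json" -d "$1" "$2"
}

# Usage (placeholder endpoint):
# post_json '{"name": "John", "role": "admin"}' https://example.com/api
```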
Download from authenticated FTP server
Downloads a file from an FTP server that requires authentication.
Credentials are passed using the username:password format.
curl -u user:password ftp://ftp.example.com/file.txt
Upload a file to FTP server
Uploads a local file to a remote FTP server.
Useful for backups, file sharing, and automated uploads.
curl -T localfile.txt -u user:password ftp://ftp.example.com/remote.txt
The file is transferred to the specified remote path.
Proxy and advanced usage
Use an HTTP proxy
Routes the request through a specified proxy server.
Commonly required in enterprise networks where direct internet access is blocked.
curl -x proxy_host:port https://example.com
Authenticate with proxy server
Supplies credentials when the proxy server requires authentication.
The format is username:password.
curl -U username:password -x proxy_host:port https://example.com
Proxy authentication with HTTPS destination
Most proxies use HTTP even when accessing HTTPS websites.
Explicitly specifying the proxy scheme avoids ambiguity.
curl -U username:password -x http://proxy_host:port https://example.com
The proxy establishes a tunnel and the HTTPS connection remains encrypted end-to-end.
Authenticate with both proxy and destination server
Use this when both the proxy and the target server require credentials.
curl -U proxyuser:proxypass -u appuser:apppass -x proxy_host:port https://example.com
-U is used for proxy authentication, while -u is used for server authentication.
Prompt for proxy password securely
Avoid exposing passwords in command history or scripts.
curl -U username -x proxy_host:port https://example.com
curl prompts for the proxy password interactively.
Use environment variables for proxy authentication
Commonly used in automation, CI/CD pipelines, and scripts.
export http_proxy="http://username:password@proxy_host:port"
export https_proxy="http://username:password@proxy_host:port"
curl https://example.com
This method keeps credentials out of command-line arguments.
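A related variable, no_proxy, lists hosts that should bypass the proxy entirely, which is common for internal services. A sketch where the proxy host, port, and credentials are all placeholders:

```shell
# Route traffic through a proxy, but let listed hosts connect directly.
# proxy_host, port 3128, and the credentials are placeholders.
export http_proxy="http://username:password@proxy_host:3128"
export https_proxy="http://username:password@proxy_host:3128"
export no_proxy="localhost,127.0.0.1,.internal.example.com"

# curl https://example.com      # goes through the proxy
# curl http://127.0.0.1:8080/   # connects directly (matches no_proxy)
```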
Dictionary lookup using DICT protocol
Fetch word definition
Retrieves the definition of a word from a DICT server.
Useful for quick lookups directly from the command line.
curl dict://dict.org/d:linux
Sample output (truncated):
151 "linux" gcide "The Collaborative International Dictionary of English"
Linux \Li*nux\, n.
An open-source Unix-like operating system kernel
originally developed by Linus Torvalds.
This query contacts dict.org and returns the definition for the specified word.
Frequently Asked Questions
1. What is the curl command in Linux used for?
curl is used to transfer data to or from servers using supported protocols like HTTP, HTTPS, FTP, and SFTP. It is commonly used for downloading files, testing APIs, and checking connectivity.
2. How do I run curl without showing progress output?
Use the -s or --silent option to suppress the progress meter and error messages.
3. How can I check HTTP status code using curl?
You can use curl -IsL https://example.com | grep HTTP to print the status line of the final response.
4. Does curl follow redirects by default?
No. You must use the -L or --location option to make curl follow HTTP redirects.
Summary
The curl command is a versatile and script-friendly tool for transferring data over the network.
In this guide, you learned how to:
- Download and save content efficiently
- Resume large or interrupted downloads
- Inspect HTTP headers and status codes
- Use curl silently in scripts and automation
- Send data to APIs and authenticate with servers
- Work with proxies and advanced networking scenarios
By understanding these commonly used options, you can use curl confidently for troubleshooting, automation, and API testing in real-world Linux environments.