How to Install and Use cURL Command on Ubuntu or CentOS Linux

This post will guide you through installing the cURL tool on your Ubuntu or CentOS Linux server. It also shows how to use the curl command to download a web page or a file on Ubuntu Linux 16.04/18.04, and how to fix the error messages “curl: command not found” or “the program curl is currently not installed” on Ubuntu or CentOS Linux.

What is cURL?


cURL is a command line tool for getting or sending data using URL syntax. It supports many common network protocols, currently including HTTP, HTTPS, FTP, FTPS, SCP, SFTP, TFTP, LDAP, LDAPS, DICT, TELNET, FILE, IMAP, POP3, SMTP and RTSP. cURL supports HTTPS and performs SSL certificate verification by default when a secure protocol such as HTTPS is specified. You can use the curl command to download or upload files over any of the supported protocols, including HTTP, HTTPS, and FTP. You can also use curl to get the HTTP header information of a URL, pass HTTP authentication, and save or send cookies.

Curl offers a busload of useful tricks like proxy support, user authentication, FTP upload, HTTP post, SSL connections, cookies, file transfer resume, Metalink, and more. As you will see below, the number of features will make your head spin!


Installing cURL from the Default Repository


By default, the curl package is already installed on most Linux distributions. You can check whether curl is installed on your Ubuntu system with one of the following commands:

$ curl --help

Or

$ curl --version

Outputs:

root@devops:~# curl --version
Command 'curl' not found, but can be installed with:
apt install curl

root@devops:~# curl --help
Command 'curl' not found, but can be installed with:
apt install curl
root@devops:~#

If you get the error message “curl: command not found” from those commands, the curl tool is not installed on your system. You can then install the curl package from the default official Ubuntu repository with the apt install command:

$ sudo apt update
$ sudo apt install curl

Outputs:

root@devops:~# sudo apt update
Hit:1 http://mirrors.aliyun.com/ubuntu bionic InRelease
Hit:2 http://mirrors.aliyun.com/ubuntu bionic-security InRelease
Hit:3 http://mirrors.aliyun.com/ubuntu bionic-updates InRelease
Ign:4 http://dl.google.com/linux/chrome/deb stable InRelease
Hit:5 http://mirrors.aliyun.com/ubuntu bionic-proposed InRelease
Hit:6 http://dl.google.com/linux/chrome/deb stable Release
Hit:7 http://mirrors.aliyun.com/ubuntu bionic-backports InRelease
Reading package lists... Done
Building dependency tree
Reading state information... Done
19 packages can be upgraded. Run 'apt list --upgradable' to see them.

root@devops:~# sudo apt install curl
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
augeas-lenses cpu-checker cryptsetup cryptsetup-bin db-util db5.3-util dmeventd extlinux hfsplus
ibverbs-providers icoutils ipxe-qemu ipxe-qemu-256k-compat-efi-roms ldmtool libafflib0v5 libaio1
libaugeas0 libbfio1 libcacard0 libconfig9 libdate-manip-perl libdevmapper-event1.02.1 libewf2
libfdt1 libhfsp0 libhivex0 libibverbs1 libintl-perl libintl-xs-perl libiscsi7 libldm-1.0-0
liblvm2app2.2 liblvm2cmd2.02 libnl-route-3-200 librados2 librbd1 librdmacm1 libsdl1.2debian
libspice-server1 libstring-shellquote-perl libsys-virt-perl libtsk13 libusbredirparser1 libvirt0
libwin-hivex-perl libxen-4.9 libxenstore3.0 linux-headers-generic-hwe-16.04
linux-image-generic-hwe-16.04 lsscsi lvm2 msr-tools osinfo-db qemu-block-extra
qemu-system-common qemu-system-x86 qemu-utils scrub seabios sgabios sleuthkit supermin zerofree
Use 'sudo apt autoremove' to remove them.
The following NEW packages will be installed:
curl
0 upgraded, 1 newly installed, 0 to remove and 19 not upgraded.
Need to get 159 kB of archives.
After this operation, 395 kB of additional disk space will be used.
Get:1 http://mirrors.aliyun.com/ubuntu bionic-security/main amd64 curl amd64 7.58.0-2ubuntu3.6 [159 kB]
Fetched 159 kB in 0s (760 kB/s)
Selecting previously unselected package curl.
(Reading database ... 230124 files and directories currently installed.)
Preparing to unpack .../curl_7.58.0-2ubuntu3.6_amd64.deb ...
Unpacking curl (7.58.0-2ubuntu3.6) ...
Setting up curl (7.58.0-2ubuntu3.6) ...
Processing triggers for man-db (2.8.3-2ubuntu0.1) ...
root@devops:~#
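
If you are on CentOS instead of Ubuntu, the equivalent installation step uses the yum package manager (curl ships with most CentOS releases by default, so this is usually only needed on minimal installs):

$ sudo yum install curl

On CentOS 8 and later you can run sudo dnf install curl instead; both commands pull curl from the default repositories.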

If you also want to install the PHP cURL extension (php-curl) on your Ubuntu system, use the following apt command:

$ sudo apt install php-curl

Outputs:

root@devops:~# sudo apt install php-curl
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
augeas-lenses cpu-checker cryptsetup cryptsetup-bin db-util db5.3-util dmeventd extlinux hfsplus
The following additional packages will be installed:
php-common php7.2-common php7.2-curl
The following NEW packages will be installed:
php-common php-curl php7.2-common php7.2-curl
0 upgraded, 4 newly installed, 0 to remove and 19 not upgraded.
Need to get 925 kB of archives.
After this operation, 6,827 kB of additional disk space will be used.
Do you want to continue? [Y/n] Y
Get:1 http://mirrors.aliyun.com/ubuntu bionic/main amd64 php-common all 1:60ubuntu1 [12.1 kB]
Get:2 http://mirrors.aliyun.com/ubuntu bionic-security/main amd64 php7.2-common amd64 7.2.15-0ubuntu0.18.04.1 [882 kB]
Get:3 http://mirrors.aliyun.com/ubuntu bionic-security/main amd64 php7.2-curl amd64 7.2.15-0ubuntu0.18.04.1 [28.8 kB]
Get:4 http://mirrors.aliyun.com/ubuntu bionic/main amd64 php-curl all 1:7.2+60ubuntu1 [1,996 B]
Fetched 925 kB in 0s (1,978 kB/s)
Selecting previously unselected package php-common.
(Reading database ... 230131 files and directories currently installed.)
Preparing to unpack .../php-common_1%3a60ubuntu1_all.deb ...
Unpacking php-common (1:60ubuntu1) ...
Selecting previously unselected package php7.2-common.
Preparing to unpack .../php7.2-common_7.2.15-0ubuntu0.18.04.1_amd64.deb ...
Unpacking php7.2-common (7.2.15-0ubuntu0.18.04.1) ...
Selecting previously unselected package php7.2-curl.
Preparing to unpack .../php7.2-curl_7.2.15-0ubuntu0.18.04.1_amd64.deb ...
Unpacking php7.2-curl (7.2.15-0ubuntu0.18.04.1) ...
Selecting previously unselected package php-curl.
Preparing to unpack .../php-curl_1%3a7.2+60ubuntu1_all.deb ...
Unpacking php-curl (1:7.2+60ubuntu1) ...
Setting up php-common (1:60ubuntu1) ...
Created symlink /etc/systemd/system/timers.target.wants/phpsessionclean.timer → /lib/systemd/system/phpsessionclean.timer.
Setting up php7.2-common (7.2.15-0ubuntu0.18.04.1) ...
Creating config file /etc/php/7.2/mods-available/calendar.ini with new version
Setting up php7.2-curl (7.2.15-0ubuntu0.18.04.1) ...
Creating config file /etc/php/7.2/mods-available/curl.ini with new version
Setting up php-curl (1:7.2+60ubuntu1) ...

Once the curl package is installed, you can verify that the installation succeeded by checking the installed version with the following command:

$ curl --version

Outputs:

root@devops:~# curl --version
curl 7.58.0 (x86_64-pc-linux-gnu) libcurl/7.58.0 OpenSSL/1.1.0g zlib/1.2.11 libidn2/2.0.4 libpsl/0.19.1 (+libidn2/2.0.4) nghttp2/1.30.0 librtmp/2.3
Release-Date: 2018-01-24

Protocols: dict file ftp ftps gopher http https imap imaps ldap ldaps pop3 pop3s rtmp rtsp smb smbs smtp smtps telnet tftp
Features: AsynchDNS IDN IPv6 Largefile GSS-API Kerberos SPNEGO NTLM NTLM_WB SSL libz TLS-SRP HTTP2 UnixSockets HTTPS-proxy PSL
root@devops:~#

From the above output, you can see that the currently installed version of curl is 7.58.0.

Installing cURL from the Source Package


Since the default repositories are not updated frequently, they may provide an older version of the package. If you want to install the latest version of cURL on your Ubuntu system, you have to build it from the source package. Here are the steps:

#1 Download the cURL source archive from the official cURL website with the following command (at the time of writing, the latest stable version of cURL is 7.64.0):

$ wget https://curl.haxx.se/download/curl-7.64.0.tar.gz

Outputs:

root@devops:~# wget https://curl.haxx.se/download/curl-7.64.0.tar.gz
--2019-03-07 21:50:52--  https://curl.haxx.se/download/curl-7.64.0.tar.gz
Resolving curl.haxx.se (curl.haxx.se)... 151.101.230.49, 2a04:4e42:36::561
Connecting to curl.haxx.se (curl.haxx.se)|151.101.230.49|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4032645 (3.8M) [application/x-gzip]
Saving to: ‘curl-7.64.0.tar.gz’
curl-7.64.0.tar.gz       100%[==================================>]   3.85M  1.97MB/s    in 2.0s
2019-03-07 21:50:54 (1.97 MB/s) - ‘curl-7.64.0.tar.gz’ saved [4032645/4032645]
root@devops:~#

#2 Extract all files from the downloaded cURL archive with the following command:

$ tar -xvf curl-7.64.0.tar.gz

#3 Change the current directory to curl-7.64.0:

$ cd curl-7.64.0/

#4 Compile and install cURL with the following commands:

$ ./configure
$ make
$ sudo make install

Then cURL should be installed on your Ubuntu Linux server.
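
To confirm that the newly built curl is the one your shell picks up, refresh the shared library cache and check the reported version (a quick sketch; with the default configure prefix the binary is installed to /usr/local/bin):

$ sudo ldconfig
$ which curl
$ curl --version

If curl --version still reports the old version, make sure /usr/local/bin comes before /usr/bin in your PATH.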

Checking a URL with CURL


If you want to check whether a given URL is valid, you can use the curl command with the -L option to print the web content. For example, let’s check whether http://bing.com is valid. Type:

$ curl -L http://bing.com

Outputs:

root@devops:~# curl -L http://bing.com
<!DOCTYPE html><html lang="zh"><script type="text/javascript" >//<![CDATA[
si_ST=new Date
//]]></script><head><link id="bgLink" rel="preload" href="/az/hprichbg/rb/BrittlebushBloom_ZH-CN9198170508_1920x1080.jpg" as="image" /><link rel="preload" href="/sa/simg/hpc26.png" as="image" /><meta content="text/html; charset=utf-8" http-equiv="content-type"/><script type="text/javascript">

From the above output, curl prints the HTML source of the URL, which indicates that the URL is valid.

Now let’s check an invalid URL with the curl command:

$ curl -L http://abc.abcd

Outputs:

root@devops:~# curl -L http://abc.abcd
curl: (6) Could not resolve host: abc.abcd

From the above output, the domain name could not be resolved, so the URL is invalid.
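
If you only care about whether a URL responds, rather than its content, curl’s -w option can print just the final HTTP status code. This is a minimal sketch: -s silences the progress meter and -o /dev/null discards the page body:

$ curl -s -o /dev/null -w "%{http_code}\n" -L http://bing.com
200

A code in the 2xx range means the URL is reachable, while 000 is printed when the host cannot be resolved or connected to.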

Download a Single File with CURL


You can save or download the output of the curl command from a URL by using the -o or -O options.

The -o (lowercase) option saves the output to a file with the filename you specify. The -O (uppercase) option saves the output to a file with the same name as on the remote web server.

For example, let’s download one HTML file from a remote web server with the following command:

$ curl -O https://www.osetc.com/en/how-to-install-google-chrome-browser-on-ubuntu-16-04-18-04.html

Outputs:

root@devops:~# curl -O https://www.osetc.com/en/how-to-install-google-chrome-browser-on-ubuntu-16-04-18-04.html
% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Dload  Upload   Total   Spent    Left  Speed
100 91604    0 91604    0     0  14058      0 --:--:--  0:00:06 --:--:-- 23108

root@devops:~# ls  how-to-install-google-chrome-browser-on-ubuntu-16-04-18-04.html
how-to-install-google-chrome-browser-on-ubuntu-16-04-18-04.html

From the above output, the HTML file is downloaded to the current working directory, and you can see that it was saved with the same file name as on the remote web server.

If you want to specify a different file name for the downloaded HTML file, type the following command with the -o option:

$ curl -o newfile.h https://www.osetc.com/en/how-to-install-google-chrome-browser-on-ubuntu-16-04-18-04.html

Outputs:

root@devops:~# curl -o newfile.h https://www.osetc.com/en/how-to-install-google-chrome-browser-on-ubuntu-16-04-18-04.html

% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Dload  Upload   Total   Spent    Left  Speed
100 91603    0 91603    0     0  16413      0 --:--:--  0:00:05 --:--:-- 21048

root@devops:~# ls newfile.h
newfile.h

As you can see from the above output, the HTML file is downloaded with the specified file name.

Download Multiple Files with CURL


If you want to download multiple files at once with the curl command on your Ubuntu system, you can use multiple -O options, each followed by a download link. The syntax is:

$ curl -O downloadLink1 -O downloadLink2

For example, to download two files at once, run the following command:

$ curl -O https://www.osetc.com/en/how-to-install-blender-on-ubuntu-16-04-18-04-linux.html -O  https://www.osetc.com/en/how-to-install-bower-on-ubuntu-16-04-18-04-linux.html

Outputs:

root@devops:~# curl -O https://www.osetc.com/en/how-to-install-blender-on-ubuntu-16-04-18-04-linux.html -O  https://www.osetc.com/en/how-to-install-bower-on-ubuntu-16-04-18-04-linux.html
% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Dload  Upload   Total   Spent    Left  Speed
100 92768    0 92768    0     0  61233      0 --:--:--  0:00:01 --:--:-- 61233
100 87688    0 87688    0     0   119k      0 --:--:-- --:--:-- --:--:--  378k

root@devops:~# ls
how-to-install-blender-on-ubuntu-16-04-18-04-linux.html  how-to-install-bower-on-ubuntu-16-04-18-04-linux.html
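
curl also supports URL globbing, which is handy when the files you want follow a numbered pattern; a single -O then covers every URL generated by the glob. The following is only a sketch with a made-up URL pattern:

$ curl -O "https://www.example.com/page-[1-3].html"

Quoting the URL keeps the shell from trying to expand the brackets itself.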

Resume an Interrupted Download with CURL


If a download is interrupted for some reason, for example a lost network connection or pressing Ctrl + C, you can easily resume it with curl’s -C option. This option is very useful when you download large files, such as Linux ISO images.

Continue/Resume a previous file transfer at the given offset. The given offset is the exact number of bytes that will be skipped, counting from the beginning of the source file before it is transferred to the destination.  If used with uploads, the FTP server command SIZE will not be used by curl.

Use “-C -” to tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out.

If this option is used several times, the last one will be used.

For example, let’s download the Chrome installation package from Google’s official site with the following curl command, interrupt the download with Ctrl + C, and then resume it with the -C option. Type:

$ curl -O https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb

$ curl -C - -O https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb

Outputs:

root@devops:~# curl -O https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb

% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current

Dload  Upload   Total   Spent    Left  Speed

 35 54.6M   35 19.5M    0     0  3984k      0  0:00:14  0:00:05  0:00:09 4010k^C

From the above output, you can see that the download was at 35% when it was terminated.

root@devops:~# curl -C - -O https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb

** Resuming transfer from byte position 22016000

% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current

Dload  Upload   Total   Spent    Left  Speed

100 33.6M  100 33.6M    0     0  5655k      0  0:00:06  0:00:06 --:--:-- 6446k

From this output, you can see that the previous download resumed from where it left off.

Limit the Rate of Data Transfer with CURL


If you want to limit the rate of data transfer when using curl to download files, you can use the --limit-rate option to specify the maximum transfer rate curl should use for both downloads and uploads.

This option is very useful if you have a limited pipe and do not want the transfer to use your entire bandwidth. The given speed is measured in bytes/second by default, unless a suffix is appended. Appending ‘k’ or ‘K’ counts the number as kilobytes, ‘m’ or ‘M’ makes it megabytes, and ‘g’ or ‘G’ makes it gigabytes. Examples: 200K, 3m and 1G.

For example, if you want to limit the download speed to 1 kB/s when downloading the Chrome installation package, use the following command:

$ curl -O --limit-rate 1k https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb

Outputs:

root@devops:~# curl -O --limit-rate 1k  https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Dload  Upload   Total   Spent    Left  Speed
0 54.6M    0  3595    0     0   1872      0  8:30:06  0:00:01  8:30:05  940

From the above output, you can see that the current transfer speed stays close to 1 kB/s.

Get HTTP Header Information of a URL with CURL


If you want to fetch the HTTP headers of a specified URL with curl, you can use the -I option. When used on an FTP or FILE URL, curl displays only the file size and last modification time.

Let’s see the following example to get all the HTTP response headers of https://www.osetc.com, type:

$ curl -I https://www.osetc.com

Outputs:

root@devops:~# curl -I https://www.osetc.com
HTTP/1.1 200 OK
Date: Sat, 09 Mar 2019 10:33:45 GMT
Server: Apache/2.4.6 (CentOS) OpenSSL/1.0.2k-fips PHP/5.4.16
X-Powered-By: PHP/5.4.16
Link: <https://www.osetc.com/wp-json/>; rel="https://api.w.org/"
Content-Type: text/html; charset=UTF-8

Follow HTTP Location Headers with CURL


When a requested page has moved to a different location, indicated by a Location: header and a 3xx response code, curl does not follow the HTTP Location header by default. You can use the -L or --location option to make curl redo the request on the new location.

For example, when you request the URL www.osetc.com, it is redirected to a new location: https://www.osetc.com. If you want curl to follow the redirect to the new location, you need to use the -L option. Type the following commands:

$ curl www.osetc.com
$ curl -L www.osetc.com

Outputs:

root@devops:~# curl  www.osetc.com
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="https://www.osetc.com/">here</a>.</p>
</body></html>

From the above output, you can see that the requested document has moved to https://www.osetc.com.

root@devops:~# curl -L  www.osetc.com | head
% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Dload  Upload   Total   Spent    Left  Speed
100   230  100   230    0     0    502      0 --:--:-- --:--:-- --:--:--   502
0     0    0     0    0     0      0      0 --:--:--  0:00:02 --:--:--     0<!DOCTYPE html>
<html lang="en-US" class="no-js no-svg">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="profile" href="http://gmpg.org/xfn/11">
<script>(function(html){html.className = html.className.replace(/\bno-js\b/,'js')})(document.documentElement);</script>

From the above output, you can see that with the -L option curl followed the redirect to the new location and downloaded the HTML source code of https://www.osetc.com.

Pass HTTP Authentication with CURL


If you are using curl to request web content that requires a username and password, you can use the -u option to pass the credentials for remote server authentication. If you specify only the user name, curl will prompt for the password. The user name and password are split on the first colon character.

The Syntax is as below:

$ curl -u username:password https://www.osetc.com
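
If you would rather not put the password on the command line, where it can end up in your shell history, pass only the user name and curl will prompt for the password interactively. A small sketch, assuming a hypothetical protected path:

$ curl -u username https://www.osetc.com/protected/
Enter host password for user 'username':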

Download Files from FTP Server with CURL


If you want to download files from a remote FTP server, you can use the curl command with the -u and -O options together.

For example, suppose you want to download a file called text.tar that is located under the pub directory of the FTP server ftp://myftp.com/. Type the following command to download the file to your local disk:

$ curl -u ftpuser:ftppassword -O ftp://myftp.com/pub/text.tar

This command will try to download the text.tar file from the myftp ftp server and store it in the current directory on your system.

If the specified FTP URL refers to a directory instead of a file, curl will only list the files under that directory. Drop the -O option in that case, like the command below:

$ curl -u ftpuser:ftppassword ftp://myftp.com/pub/

Upload Files to FTP Server with CURL


The above section explained how to download a file from an FTP server; you can also upload a file to a remote FTP server with curl’s -T option.

For example, to upload a file called text.tar to the FTP server ftp://myftp.com/pub/, just type the following command:

$ curl -u ftpuser:ftppassword -T ./text.tar ftp://myftp.com/pub/
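
curl’s -T option also accepts a brace list, so several files can be uploaded in one invocation; the file names below are hypothetical:

$ curl -u ftpuser:ftppassword -T "{text1.tar,text2.tar}" ftp://myftp.com/pub/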

Set User Agent in HTTP Request with CURL


If you want to specify the user-agent string that curl sends to the HTTP server, you can use the -A or --user-agent option.

For example, to set the user-agent string to “this is user agent from scott”, use the following command:

$ curl www.osetc.com/ --user-agent "this is user agent from scott"

Outputs:

root@devops:~# curl www.osetc.com/  --user-agent "this is user agent from scott"
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="https://www.osetc.com/">here</a>.</p>
</body></html>

[scott@test logs]# tail -f access.log
111.198.231.159 - - [09/Mar/2019:11:20:29 +0000] "GET / HTTP/1.1" 301 230 "-" "this is user agent from scott"
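
The -A option is simply the short form of --user-agent, so the same request could also be written as:

$ curl -A "this is user agent from scott" www.osetc.com/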

Check If a URL Supports the HTTP/2 Protocol with CURL


If you want to check whether a URL supports the HTTP/2 protocol, you can use the curl command with the -I option combined with the grep command. For example, to check whether https://www.osetc.com supports HTTP/2, type the following command:

$ curl -I https://www.osetc.com | grep HTTP

Outputs:

root@devops:~# curl -I https://www.osetc.com | grep HTTP
% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Dload  Upload   Total   Spent    Left  Speed
0     0    0     0    0     0      0      0 --:--:--  0:00:01 --:--:--     0
HTTP/1.1 200 OK

From the above output, the server answered with HTTP/1.1, which indicates that HTTP/2 was not negotiated for this request.
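
You can also ask curl to explicitly offer HTTP/2 with the --http2 option (available here because the Features line of curl --version lists HTTP2); a server that supports it will answer with an HTTP/2 status line. A small sketch:

$ curl -sI --http2 https://www.osetc.com | grep -i '^HTTP'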

Save Cookies with CURL


If you want curl to write all cookies to a file after a completed operation, you can use the -c or --cookie-jar option, which writes all cookies from curl’s in-memory cookie storage to the given file at the end of the operation.

For example, to save cookies to a file called bingcookie.txt after fetching https://www.bing.com/index.html, use the following commands:

$ curl --cookie-jar bingcookie.txt https://www.bing.com/index.html -O
$ cat bingcookie.txt

Outputs:

root@devops:~# curl --cookie-jar bingcookie.txt https://www.bing.com/index.html -O
% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
Dload  Upload   Total   Spent    Left  Speed
0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0

root@devops:~# cat bingcookie.txt
# Netscape HTTP Cookie File
# https://curl.haxx.se/docs/http-cookies.html
# This file was generated by libcurl! Edit at your own risk.
#HttpOnly_.bing.com     TRUE    /       FALSE   0       _EDGE_S F=1&SID=17D7595C2B246363189E54462A0A6204
#HttpOnly_.bing.com     TRUE    /       FALSE   1585835514      _EDGE_V 1
.bing.com       TRUE    /       FALSE   1585835514      MUID    11E7B9E8E89860DE07EBB4F2E9B6610E
#HttpOnly_www.bing.com  FALSE   /       FALSE   1585835514      MUIDB   11E7B9E8E89860DE07EBB4F2E9B6610E

Send Cookies with CURL


You can pass data to the HTTP server in the Cookie header when making HTTP requests with curl. Use the -b or --cookie option to send specific cookies to the remote HTTP server. The data should be in the format “NAME1=VALUE1; NAME2=VALUE2”, or you can specify a cookie file to send. For example, to send the cookie file called bingcookie.txt with an HTTP request to the remote server, type:

$ curl --cookie bingcookie.txt https://www.bing.com
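
Cookies can also be given directly on the command line in that NAME=VALUE format instead of a file; the cookie names below are only examples:

$ curl --cookie "name1=value1; name2=value2" https://www.bing.com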

Use Proxy to Download a File with CURL


If you need to use a proxy server, for example proxyauto.test.com on port 8080, to fetch a web page with curl, use the -x option. The proxy string can be specified with a protocol:// prefix. If no protocol is specified, or http:// is used, it is treated as an HTTP proxy. Use socks4://, socks4a://, socks5:// or socks5h:// to request a specific SOCKS version.

HTTPS proxy support via https:// protocol prefix was added in 7.52.0 for OpenSSL, GnuTLS and NSS.

Like this:

$ curl -x proxyauto.test.com:8080 https://www.osetc.com/en/how-to-install-bower-on-ubuntu-16-04-18-04-linux.html

If the proxy server requires authentication, use the -U option to specify the user name and password separated by a colon, like this:

$ curl -x proxyauto.test.com:8080 -U username:password https://www.osetc.com/en/how-to-install-bower-on-ubuntu-16-04-18-04-linux.html
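
For a SOCKS proxy, mentioned above, the same -x option takes one of the socks prefixes; the host and port here are hypothetical:

$ curl -x socks5h://proxyauto.test.com:1080 https://www.osetc.com/en/how-to-install-bower-on-ubuntu-16-04-18-04-linux.html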

Send Mail with CURL


You can also use curl to send mail through an SMTP server. Use --mail-from to specify the single address the mail is sent from, and --mail-rcpt to specify a recipient address, user name or mailing list name. Like this:

$ curl --mail-from sender@osetc.com --mail-rcpt receiver@osetc.com smtp://smtpserver.com
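
In practice you also hand curl the message to send, typically by uploading a text file that contains the mail headers and body; this is only a sketch, assuming a local file called mail.txt:

$ curl --mail-from sender@osetc.com --mail-rcpt receiver@osetc.com --upload-file mail.txt smtp://smtpserver.com

If the SMTP server requires authentication, add the usual -u username:password option.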

Conclusion


From this guide you should now know how to install the cURL tool on Ubuntu or CentOS Linux using different methods, and how to use the curl command for common tasks. If you want more information about cURL, go to the official cURL website.

 
