Wget is a software package for retrieving content from web servers using the HTTP, HTTPS and FTP protocols. It is a non-interactive command-line tool, so it can easily be called from scripts, cron jobs, terminals without X Window System support, and so on. Its features include recursive download, conversion of links for offline viewing of downloaded HTML, and support for proxies.
Let's have a look at some examples of using Wget:
1. Download a webpage
$ wget http://www.examplewebsite.com/
2. Download a file from an FTP server
$ wget ftp://ftp.examplewebsite.com/source-code.tar.gz
If you specify a directory instead of a file, Wget will retrieve the directory listing, parse it, and convert it to HTML:
$ wget ftp://ftp.examplewebsite.com/
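The recursive download and link-conversion features mentioned earlier can be combined to save a whole site for offline browsing. A minimal sketch, using the same placeholder URL as the examples above (adjust the depth to taste):

```shell
# Mirror a site two levels deep. --convert-links rewrites the links
# in the saved pages so they point at the local copies, and
# --page-requisites also fetches the images and stylesheets each
# page needs to display properly.
wget --recursive --level=2 --convert-links --page-requisites \
     http://www.examplewebsite.com/
```

Wget saves the result under a directory named after the host, so the local copy can be opened directly in a browser.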
Both cURL and Wget can be used to download files, so the question is which one to use. They share some similarities but also have differentiating factors; which one to use depends on your needs.
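As a quick illustration of the overlap, the single-file download from the earlier example can be done with either tool (the URL is the same placeholder). One visible difference: Wget writes to a local file and follows redirects by default, while curl prints to stdout unless told otherwise:

```shell
# Wget saves the response to a local file by default.
wget http://www.examplewebsite.com/source-code.tar.gz

# curl writes to stdout by default; -O saves the file under its
# remote name, and -L makes it follow redirects the way Wget
# already does out of the box.
curl -O -L http://www.examplewebsite.com/source-code.tar.gz
```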