How to `wget` a list of URLs in a text file?

ShanZhengYang · Dec 6, 2016

Let's say I have a text file containing hundreds of URLs, e.g.

http://url/file_to_download1.gz
http://url/file_to_download2.gz
http://url/file_to_download3.gz
http://url/file_to_download4.gz
http://url/file_to_download5.gz
....

What is the correct way to download each of these files with `wget`? I suspect there's a command like `wget -flag -flag text_file.txt`

Answer

dim-0 · Dec 6, 2016

A quick `man wget` gives me the following:

[..]

-i file
--input-file=file

Read URLs from a local or external file. If - is specified as file, URLs are read from the standard input. (Use ./- to read from a file literally named -.)

If this function is used, no URLs need be present on the command line. If there are URLs both on the command line and in an input file, those on the command lines will be the first ones to be retrieved. If --force-html is not specified, then file should consist of a series of URLs, one per line.

[..]

So: `wget -i text_file.txt`
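
For completeness, here is a short sketch of common variations (the file name urls.txt and the downloads/ directory are illustrative, not from the question):

```
# Download every URL listed, one per line, in urls.txt
wget -i urls.txt

# Save the downloads into a target directory (-P)
wget -i urls.txt -P downloads/

# Resume any partially downloaded files when rerunning the list (-c)
wget -c -i urls.txt

# As the man page excerpt notes, "-" reads the URL list from standard input
cat urls.txt | wget -i -
```

On reruns, `-nc` (`--no-clobber`) skips files that already exist locally, which is handy when working through a list of hundreds of URLs.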