How to download all links to .zip files on a given web page using wget/curl?

uyetch · Nov 23, 2012 · Viewed 66k times

A page contains links to a set of .zip files, all of which I want to download. I know this can be done with wget or curl. How is it done?

Answer

creaktive · Nov 26, 2012

The command is:

wget -r -np -l 1 -A zip http://example.com/download/

What the options mean:

-r,  --recursive          specify recursive download.
-np, --no-parent          don't ascend to the parent directory.
-l,  --level=NUMBER       maximum recursion depth (inf or 0 for infinite).
-A,  --accept=LIST        comma-separated list of accepted extensions.
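Since the question also mentions curl: curl has no recursive mode, so a common workaround is to fetch the page, extract the .zip links, and download each one. The sketch below assumes the links are relative hrefs on the same placeholder page used above; absolute URLs or links on other hosts would need extra handling.

```shell
#!/bin/sh
# Fetch the page, pull out href="...zip" attributes,
# strip the href="..." wrapper, then download each file.
curl -s http://example.com/download/ \
  | grep -oE 'href="[^"]+\.zip"' \
  | sed 's/^href="//; s/"$//' \
  | while read -r link; do
      # -O saves the file under its remote name
      curl -O "http://example.com/download/$link"
    done
```

The grep/sed pair is a deliberately simple extractor; for pages with messier markup, an HTML-aware tool would be more robust.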