Let's say another tool has created a list of image URLs, and you need to download each of those images and do something with them. Instead of manually pasting each URL into a browser and saving the image, this trick will download them all in one go.
Create a text file, such as images.txt, and enter all the URLs into it, one per line:
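The URLs below are made up for illustration; substitute your own. A quick way to create the file from the shell:

```shell
# Create images.txt with one (hypothetical) image URL per line
cat > images.txt <<'EOF'
https://example.com/images/photo-01.jpg
https://example.com/images/photo-02.jpg
https://example.com/images/photo-03.jpg
EOF
```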
Assuming you have wget installed, cd into the folder containing the text file and run this:
wget -i images.txt
It'll download the images into the same folder, and you'll end up with a folder structure like this:
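The exact filenames depend on your URL list; with three hypothetical JPEGs from example.com, the folder would look something like this:

```
.
├── images.txt
├── photo-01.jpg
├── photo-02.jpg
└── photo-03.jpg
```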
And that's it! You now have all the images listed in the text file, saved locally.
Update: Jan 17th 2022
If you add -x after the file name, it will replicate the remote directory structure. Taking our earlier example, the updated command:
wget -i images.txt -x
will yield the following:
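Instead of a flat folder, wget now mirrors each URL's host and path. With the same hypothetical example.com URLs, you'd get something like:

```
.
├── images.txt
└── example.com
    └── images
        ├── photo-01.jpg
        ├── photo-02.jpg
        └── photo-03.jpg
```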
Update: Nov 21st 2023
Sometimes, you might need to set a header. Here's how to do that:
header='--header=User-Agent: Mozilla/5.0 (Windows NT 6.0) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.97 Safari/537.11'
wget "$header" -i images.txt -x
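One thing to note about this pattern: the quotes around $header matter. Unquoted, the shell would split the value on spaces and wget would see several broken arguments; quoted, the whole --header=... string is passed as a single argument. A quick sketch, reusing the same variable, that shows the difference:

```shell
# Same header variable as in the command above
header='--header=User-Agent: Mozilla/5.0 (Windows NT 6.0) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.97 Safari/537.11'

# printf '%s\n' prints each argument on its own line, so piping to
# wc -l counts how many arguments the shell actually passes:
printf '%s\n' "$header" | wc -l   # quoted: one argument
printf '%s\n' $header | wc -l     # unquoted: split into many arguments
```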