

How to Download All Files from a Website with Wget

Here’s a mini tutorial on how to download every file linked on a web page with a single terminal command, thanks to the Wget application.

Imagine that you have found a very interesting website, with many free and public resources. For example, an image bank, video game assets, documents, PDFs, or whatever.

You would like to download them all. Sure, you could right-click and choose “Save as…” on each one, but… clicking a button 500 times is not our style 😉.

Normally, in a case like this, you would look for an application to help with the download. But it isn’t necessary: we can solve it very easily with our old friend Wget.

Wget is a command-line tool that makes it easy to perform web requests. There are far more sophisticated solutions out there, but Wget is a very quick and simple option when all we want is to download files.
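
For instance, downloading a single file is as simple as passing its URL (the address below is just a placeholder):

# Download one file into the current directory (placeholder URL)
wget https://www.example.com/files/manual.pdf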

On Linux, you can use Wget directly, since most distributions come with it pre-installed. On Windows, you can grab one of the available builds, such as GNU Wget 1.21.4 for Windows, or simply use WSL.
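
If you are not sure whether you already have it, checking takes one command and installing it takes another; the install line below assumes a Debian/Ubuntu system, so adapt it to your package manager:

# Check whether Wget is available and which version is installed
wget --version

# Install it if it is missing (Debian/Ubuntu; adapt to your distribution)
sudo apt install wget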

We just have to use this command, for example:

wget --no-parent -A pdf -r http://www.website.com/url

In this example,

  • The -r option downloads recursively, following the links it finds into all the subfolders of the url folder.
  • The -A pdf option accepts only files with the PDF extension and discards everything else.
  • The --no-parent option tells Wget not to climb into (or download from) the folders above the one specified.

Of course, you can configure and play with the parameters to adapt them to your needs.
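
For example, here are a few common variations, all built from standard Wget options (the extensions and values are just illustrative):

# Accept several image extensions instead of only PDFs
wget --no-parent -A jpg,png,gif -r http://www.website.com/url

# Be gentle with the server: wait 1 second between requests and cap the bandwidth
wget --no-parent -A pdf -r --wait=1 --limit-rate=200k http://www.website.com/url

# Limit the recursion to 2 levels of subfolders
wget --no-parent -A pdf -r -l 2 http://www.website.com/url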

It is important to note that this only works for links to files that are directly available on the Internet, that is, public ones. If a file sits behind some security mechanism, such as a captcha or a login form, this approach won’t work.

And that’s how easily you can download all the files from a website with a single terminal command, without having to write a script or use a third-party program. Until next time!