How to view and retrieve websites offline
Although we are constantly connected to the internet, there are various reasons why you might want to download certain content to your computer or mobile device.
The reasons can be many: your connection speed may prevent you from watching a video or listening to a song without interruptions, the website in question may disappear or change its content, or you may want to guard against internet outages and blockages.
In previous articles we have seen different ways of downloading things from the internet, such as online videos, or getting the audio and video of any page without having to dig into its source code, something we can also do through your browser's Inspect Element function.
But there is one option that is often overlooked: downloading complete web pages. Perhaps that is because today we have faster connections, or because most of us browse from mobile devices.
Downloading complete web pages can be very useful, for example, for getting all the multimedia content of a page or website in one go. Or simply as a backup, so we can enjoy the content later without accessing the web again.
It may also be the case that the website has disappeared or is no longer available, either because the server has gone down, because its author has taken it down, or because our provider blocks it.
For these and many other reasons, you will be glad to know that there are specialized tools to analyze the content of a web page and save it. Let's look at three examples.
HTTrack Website Copier
Free, easy to use and available for Windows, Linux and Android, HTTrack Website Copier connects to the web page we indicate and shows us its content tree.
From there, based on the rules we configure, HTTrack will download everything we select: folders and subfolders, HTML and PHP files, images and even videos. Anything that is linked from the website.
As a bonus, we can create filters to discard specific file formats so we do not download what does not interest us. And while the content is being saved, we can skip files we consider unnecessary.
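The two steps these copiers perform, building the content tree and filtering it, can be sketched in a few lines of Python. This is only an illustration of the idea, not HTTrack's actual code: the standard library's html.parser collects every linked resource on a page, and a small filter discards the extensions we are not interested in.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href/src targets of a page, the way a site copier builds its content tree."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def filter_links(links, excluded_extensions):
    """Drop links whose file extension is on the exclusion list."""
    return [l for l in links if not l.lower().endswith(tuple(excluded_extensions))]

# A toy page with a few linked resources.
html = """
<html><body>
  <a href="index.html">Home</a>
  <img src="logo.png">
  <a href="video.mp4">Clip</a>
  <script src="app.js"></script>
</body></html>
"""

parser = LinkExtractor()
parser.feed(html)
print(parser.links)                          # every linked resource found
print(filter_links(parser.links, [".mp4"]))  # same list with video files excluded
```

A real tool would then fetch each surviving link and repeat the process on every HTML page it finds.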
Cyotek WebCopy
Also free but exclusive to Windows, Cyotek WebCopy serves both to inspect the internals of any website and to download the content that interests us.
From a single window, organized into panels and tabs, we have access to the content tree of the address we enter.
Then, based on rules we set to include or exclude files, WebCopy will save the website's content on our computer.
Another use we can get out of Cyotek WebCopy is finding broken links or lost pages on our own website, so our visitors are spared an unpleasant surprise.
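One way a tool like this can detect broken links is to compare every link it finds against the set of pages it actually managed to fetch. A rough sketch, with a hypothetical in-memory site map standing in for real HTTP requests:

```python
# Toy site map: each page lists the internal pages it links to.
# In a real tool these sets come from crawling and HTTP status codes.
site = {
    "index.html": ["about.html", "blog.html"],
    "about.html": ["index.html"],
    "blog.html": ["index.html", "old-post.html"],  # old-post.html no longer exists
}

def find_broken_links(site):
    """Return (page, target) pairs where the linked target page is missing."""
    existing = set(site)
    return [
        (page, target)
        for page, targets in site.items()
        for target in targets
        if target not in existing
    ]

print(find_broken_links(site))  # pages that point at something missing
```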
Darcy Ripper
If you need a tool similar to the previous ones but one that works on Windows, Mac and Linux alike, Darcy Ripper is one of the best options: it requires no installation and is compatible with any operating system, since it is a Java executable.
Like the previous solutions, Darcy Ripper analyzes a website from the link we provide and downloads its content according to the rules we stipulate, so we get only what we want. We can, for example, limit the size of what is downloaded, or download only to a certain level of depth within the site's structure.
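The depth limit these tools offer amounts to a breadth-first traversal that stops following links beyond a chosen level. A minimal sketch over an in-memory link graph (a real crawler would fetch each URL instead of looking it up in a dictionary):

```python
from collections import deque

def crawl(start, links, max_depth):
    """Breadth-first traversal visiting pages up to max_depth levels from the start page."""
    visited = {start}
    queue = deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        if depth == max_depth:
            continue  # do not follow links beyond the configured depth
        for target in links.get(page, []):
            if target not in visited:
                visited.add(target)
                queue.append((target, depth + 1))
    return visited

# Toy link graph: "home" links to "a" and "b", "a" to "c", and so on.
links = {
    "home": ["a", "b"],
    "a": ["c"],
    "c": ["d"],
}

print(sorted(crawl("home", links, max_depth=1)))  # one level deep only
print(sorted(crawl("home", links, max_depth=2)))  # one level further
```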
At the end, we will see a history of everything downloaded, along with any errors or problems that occurred during the download.
Other advantages include the ability to open several connections at once to speed up the task, to watch progress in real time, and to handle both HTTP and HTTPS pages.
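"Several connections at once" boils down to downloading multiple files concurrently. A sketch with Python's concurrent.futures, where a stubbed download function stands in for the real HTTP requests a tool like Darcy Ripper would make:

```python
from concurrent.futures import ThreadPoolExecutor

def download(url):
    """Stand-in for a real download; a real tool would fetch and save the file here."""
    return f"saved {url}"

urls = ["site/index.html", "site/logo.png", "site/style.css"]

# Each worker thread plays the role of one of the parallel connections.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(download, urls))  # map preserves the input order

print(results)
```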