I’ve recently been experimenting with HTTrack, an open-source utility for downloading a complete local copy of a website. HTTrack is essentially a web crawler: point it at a site’s homepage and it will retrieve every page reachable from there, along with the assets those pages reference.
From the HTTrack homepage:
“[HTTrack] allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site’s relative link-structure.”
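In practice, the command-line invocation is straightforward. As a minimal sketch (the URL and output directory here are placeholders, not from the original post):

```shell
# Mirror a site into the ./my-mirror directory.
# HTTrack crawls from the given URL, downloads pages and assets,
# and rewrites links so the copy browses correctly offline.
httrack "https://example.com/" -O ./my-mirror
```

HTTrack also ships with a GUI (WinHTTrack on Windows, WebHTTrack on Linux) that wraps the same engine, so the command line is optional.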
I thought I’d share my experience with it.
Read more on Create a Local Copy of a Website with HTTrack…