http://www.httrack.com/
It allows you to download a World Wide Web site from the Internet to a local directory, recursively building all the directories and getting HTML, images, and other files from the server to your computer.
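For example, mirroring a site from the shell looks something like this (example.com and the output directory are just placeholders, swap in whatever you want to archive):

  httrack "http://www.example.com/" -O /home/user/mirrors/example "+*.example.com/*" -v

The -O option sets where the copy goes, the "+..." filter keeps it from wandering off to other domains, and -v gives verbose output.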
Very useful app. I have archived a few web sites with content that I want to keep, in case the site goes down.
It runs on both Linux and Windows.
If you use SUSE and are too lazy to compile from source, you can use the
i386 RPM package (built for Mandrake & Red Hat) and it will work with no problem.
I am not advocating mixing RPMs from different distributions, but in this case it works.
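If you go the binary route, installing is just the usual (the exact file name depends on the version you grab):

  rpm -ivh httrack-3.xx.i386.rpm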
Copy web sites the -r way.
Hello my good Sir :)

Void Main wrote: How does it differ from wget? Oh BTW, Howdy! Where you been the last year? I see NYC in profile.
I have sent you a PM btw.
At any rate, no real difference besides the fact that it gives you an index in your browser of all the sites you have archived, which makes it kinda useful if you have archived a lot of them. You can use this program either from your browser or from the shell.
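For comparison, the wget way of mirroring a site would be something along these lines (again, example.com is just a placeholder):

  wget -r -k -p -np http://www.example.com/

That is, -r for recursive, -k to convert links for local browsing, -p to grab page requisites like images and CSS, and -np so it doesn't climb up to the parent directory. It gets you the files, but no nice index of everything you've archived.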