Copy web sites the -r way.

bazoukas
programmer
Posts: 192
Joined: Tue Jan 14, 2003 1:38 pm
Location: NYC

Copy web sites the -r way.

Post by bazoukas » Mon Sep 26, 2005 6:10 am

http://www.httrack.com/

It allows you to download a World Wide Web site from the Internet to a local directory, recursively building all the directories and getting the HTML, images, and other files from the server onto your computer.

Very useful app. I have archived a few web sites with content that I want to keep, in case the site ever goes down.
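
If you want to run it from the shell instead of the browser interface, a command along these lines should mirror a site into a local directory (the URL, output path, and domain filter are just placeholders, so adjust them for the site you are saving):

    # mirror the site into ~/mirrors/example, staying inside the example.com domain
    httrack "http://www.example.com/" -O ~/mirrors/example "+*.example.com/*" -v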


It runs on both Linux and Windows.
If you use SUSE and you are too lazy to compile from source, you can use the i386 RPM package (built for Mandrake & Red Hat) and it will work with no problem.
I am not advocating mixing RPMs from different distributions, but in this case it works.
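
For the lazy route it is basically a one-liner (the exact file name depends on the version you grab from the site):

    # install the prebuilt i386 package directly
    rpm -ivh httrack-3.x.i386.rpm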

Void Main
Site Admin
Posts: 5716
Joined: Wed Jan 08, 2003 5:24 am
Location: Tuxville, USA

Post by Void Main » Mon Sep 26, 2005 8:16 am

How does it differ from wget? Oh BTW, howdy! Where have you been the last year? I see NYC in your profile. :)

bazoukas
programmer
Posts: 192
Joined: Tue Jan 14, 2003 1:38 pm
Location: NYC

Post by bazoukas » Mon Sep 26, 2005 8:44 am

Void Main wrote: How does it differ from wget? Oh BTW, howdy! Where have you been the last year? I see NYC in your profile. :)
Hello my good Sir :)
I have sent you a PM, btw.

At any rate, there is no real difference besides the fact that it gives you an index in your browser of all the sites you have archived, which makes it kinda useful if you have archived a lot of them. You can use this program either from your browser or from the shell.
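
For comparison, the "-r way" with wget on the same placeholder URL would be roughly:

    # recursive mirror: stay below the start directory, grab page requisites,
    # and rewrite links so the copy browses locally
    wget -r -np -p -k http://www.example.com/

httrack just wraps the same kind of recursive fetch in a friendlier front end and keeps that index of everything you have mirrored.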
