[Israel.pm] RIP sites with perl

yaron at kahanovitch.com yaron at kahanovitch.com
Sun Jul 22 12:49:02 PDT 2007


If you want a mechanism that lets you iterate over pages and download them, you should consider WWW::Mechanize.
Take a look at some examples:
WWW::Mechanize::Examples => http://search.cpan.org/author/PETDANCE/WWW-Mechanize-1.30/lib/WWW/Mechanize.pm
WWW::Mechanize::FAQ => http://search.cpan.org/author/PETDANCE/WWW-Mechanize-1.30/lib/WWW/Mechanize/FAQ.pod
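For instance, a minimal sketch of the idea (the start URL here is a placeholder, and saving each image under its basename is just one naming choice, not a requirement of the module):

```perl
#!/usr/bin/perl
# Sketch: fetch a page with WWW::Mechanize, save the images it
# references, and list its links (the hooks you would use to recurse).
use strict;
use warnings;
use WWW::Mechanize;
use File::Basename qw(basename);

my $start = 'http://example.com/';            # placeholder URL
my $mech  = WWW::Mechanize->new( autocheck => 1 );

$mech->get($start);

# Download every image referenced on the page.
for my $img ( $mech->images ) {
    my $file = basename( $img->url_abs->path ) or next;
    # ':content_file' tells LWP to write the response body to disk.
    $mech->get( $img->url_abs, ':content_file' => $file );
    $mech->back;                              # return to the page
}

# Iterate over the page's links -- e.g. to follow them recursively.
for my $link ( $mech->links ) {
    print $link->url_abs, "\n";
}
```

To mirror a whole site you would wrap the link loop in a queue, keeping a hash of URLs already visited so you do not fetch the same page twice.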

Yaron Kahanovitch
----- Original Message -----
From: "Chanan Berler" <bc.other at gmail.com>
To: Perl at perl.org.il
Sent: 09:22:48 (GMT+0200) Africa/Harare, Sunday, 22 July 2007
Subject: [Israel.pm] RIP sites with perl

Hi All,

I am trying to rip a secured site found on the net.
Currently I use the wget -r command installed on my computer
(I have WinXP), but I am looking to do the same using Perl.

Q1: Is there a package / module I can use to rip sites?
Q2: If I want to rip only the images / docs from that site,
      is there a better way to do it than using wget A.jpg -r -c "url"?


----     Chanan Berler    ----
Perl mailing list
Perl at perl.org.il
