Script that Auto-Downloads Files From a Specified URL

开发者 https://www.devze.com 2023-04-02 07:03 Source: Internet
I am trying to write a script that searches a website containing download links for specific MIME types (.mp3, .avi, .mpeg), and I want to be able to crawl that site and then download any files stored there. The pseudocode for this script:

Input URL
function search ()
{
    Search URL for matching MIME types and download to dir on my pc
}

Okay, that was really bad, but I am still learning. Would a Perl script be best for this?


Take a look at the wget command. Here is an example command which will search a site recursively for all mp3, avi and mpeg files and save them into the current directory:

wget -r -H -nd -N -np -A.mp3,.avi,.mpeg http://www.someurl.com

This is what the options mean:

-r   turn on recursive retrieving
-H   enable spanning across hosts when retrieving recursively
-nd  save all files in a single directory
-N   turn on timestamping
-np  never ascend to the parent directory when retrieving recursively
-A   comma-separated list of file name suffixes to accept

You can also add other options for recursion depth, timeouts etc. See man wget for more information.


Yes, it absolutely would. Have a look at the WWW::Mechanize module.
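Since the answers stop at pointers, here is a rough sketch of the core idea the pseudocode describes (fetch a page, collect its links, keep only the media suffixes) written in Python rather than Perl; a WWW::Mechanize version would follow the same structure. The URL and the suffix list are placeholders from the question, not anything prescribed:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlretrieve

# Suffixes from the question; extend as needed.
MEDIA_SUFFIXES = (".mp3", ".avi", ".mpeg")

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def media_links(html, base_url):
    """Return absolute URLs of links whose suffix matches MEDIA_SUFFIXES."""
    parser = LinkCollector()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links
            if href.lower().endswith(MEDIA_SUFFIXES)]

# Hypothetical usage: after fetching a page's HTML, save each match locally.
# for url in media_links(fetched_html, "http://www.someurl.com/"):
#     urlretrieve(url, url.rsplit("/", 1)[-1])
```

This only handles a single page; recursing into sub-pages (what wget's -r does) would mean repeating the fetch-and-parse step on every .html link found, with a visited-set to avoid loops.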
