Can i use wget to download multiple files from linux terminal

Suppose I have a directory accessible via HTTP, e.g.

Http://www.abc.com/pdf/books

Inside that folder there are many PDF files.

Can I use something like

wget http://www.abc.com/pdf/books/*


wget -r -l1 -A.pdf http://www.abc.com/pdf/books


From the wget man page:

   Wget can follow links in HTML and XHTML pages and create local versions of remote web sites, fully recreating the directory structure of the original site.  This is
   sometimes referred to as ``recursive downloading.''  While doing that, Wget respects the Robot Exclusion Standard (/robots.txt).  Wget can be instructed to convert the
   links in downloaded HTML files to the local files for offline viewing.

and

   Recursive Retrieval Options
   -r
   --recursive
       Turn on recursive retrieving.

   -l depth
   --level=depth
       Specify recursion maximum depth level depth.  The default maximum depth is 5.
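Putting the answer's options together, here is a minimal sketch of a full command; note that -np and -nd are extra (but standard) wget flags added here to keep the crawl inside the books directory and to flatten the local output:

# Fetch every linked .pdf from the index page, one level deep.
#   -r       turn on recursive retrieval
#   -l1      recurse at most one level below the starting URL
#   -np      never ascend to the parent directory
#   -nd      save files directly, without recreating the remote directory tree
#   -A.pdf   download only files whose names end in .pdf
wget -r -l1 -np -nd -A.pdf http://www.abc.com/pdf/books/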


It depends on the web server and its configuration. Strictly speaking, a URL is not a directory path, so http://something/books/* is meaningless.

However, if the web server serves http://something/books as an index page listing all the books on the site, then you can play around with the recursive and spider options, and wget will happily follow any links found on that index page.
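For example, the --spider flag makes wget check links without saving anything, which is a convenient way to preview what a recursive run would fetch (a sketch, reusing the hypothetical http://something/books URL):

# Traverse the index page without downloading; wget reports each link it
# visits, so you can confirm the PDFs are actually linked and reachable.
wget --spider -r -l1 http://something/books/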
