I do have permission to do this.
I've got a website with about 250 pages from which I need to download the 'product descriptions' and 'product images'. How do I do it? I'd like to get the data out into a CSV, so that I can put it in a DB table. Could someone point me to a good tutorial to get started on this? I should be using cURL, right?
So far, I've got this from another Stack Overflow question, How do I transfer wget output to a file or DB?:
curl somesite.com | grep ... | sed -e "s/^\(.*\)$/INSERT INTO tableName (columnName) VALUES ('\1');/" | psql dbname
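Since what I really want is a CSV I can load into a table, I'm guessing the same idea could emit CSV rows instead of raw INSERTs. Untested sketch; the page URL, the "description" class, and the products table/column names are all placeholders:

#!/bin/bash
# Hypothetical: fetch one page, keep only the description markup,
# strip the tags, double any embedded quotes, and wrap the result as a CSV field
curl -s "http://somesite.com/product-page.html" \
  | grep '<div class="description">' \
  | sed -e 's/<[^>]*>//g' -e 's/"/""/g' -e 's/^\(.*\)$/"\1"/' \
  >> products.csv

# Then bulk-load the CSV instead of piping INSERTs one by one
psql dbname -c "\copy products(description) FROM 'products.csv' CSV"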
And I created this, which sucks, to get the images:
#!/bin/bash
# Pull the page source, take the 8th quote-delimited field,
# keep the lines containing jpg, and download each image
lynx --source "http://www.site.com" | cut -d\" -f8 | grep jpg | while read -r image
do
    wget "http://www.site.com/$image"
done
I pieced it together by watching this video: http://www.youtube.com/watch?v=dMXzoHTTvi0.
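I'm guessing something like this would be less fragile than counting quote-delimited fields with cut. Untested sketch; it still assumes the images are relative paths under www.site.com:

#!/bin/bash
# Match every src="...jpg" attribute directly instead of relying on field position
lynx --source "http://www.site.com" \
  | grep -o 'src="[^"]*\.jpg"' \
  | sed -e 's/^src="//' -e 's/"$//' \
  | while read -r image
do
    wget "http://www.site.com/$image"
done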
You want to do what's called screen scraping.
Here are some links to get you started:
- http://www.bradino.com/php/screen-scraping/
- http://www.developertutorials.com/tutorials/php/easy-screen-scraping-in-php-simple-html-dom-library-simplehtmldom-398/
- http://www.weberdev.com/get_example-4606.html
- http://www.google.com/search?q=screen+scraping+php
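If you want to stay in shell rather than PHP, the whole job is just a loop over your ~250 page URLs: fetch, extract the fields you care about, append a CSV row. A rough sketch only; the pages.txt list, the markup patterns, and the column layout are assumptions you'd adapt to the real site:

#!/bin/bash
# Assumes pages.txt lists one product-page URL per line
while read -r url; do
  html=$(curl -s "$url")
  # Placeholder patterns: adapt these to the site's real markup
  desc=$(echo "$html" | grep -o '<div class="description">[^<]*' | sed 's/<[^>]*>//')
  img=$(echo "$html" | grep -o 'src="[^"]*\.jpg"' | head -n 1 | sed -e 's/^src="//' -e 's/"$//')
  # CSV-escape embedded quotes, then append one row per page
  printf '"%s","%s"\n' "${desc//\"/\"\"}" "$img" >> products.csv
  # Assumes the src values are absolute URLs; prepend the site root if they're relative
  wget -q "$img"
done < pages.txt

Once products.csv exists you can bulk-load it into your table with psql's \copy, which will be far faster than issuing one INSERT per page.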