Say we have a project that requires web scraping: parsing short strings (< 40) and scraping web pages (getting metadata and such). I'm aware that Perl has great, well-suited CPAN modules for this job, so I could take that route without much trouble. But I don't have a clue about the speed- and memory-related trade-offs.
So, which would you choose? (Maybe Python?) And in terms of speed, which one is better for this job? Please explain.
Thanks in advance.
Use Perl or Python. Both have tons of libraries for web scraping.
In Python you could use BeautifulSoup to parse even the crappy kind of HTML lots of pages like to use.
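For instance, a minimal sketch of that approach, assuming the `requests` and `beautifulsoup4` packages are installed and using a placeholder URL:

```python
import requests
from bs4 import BeautifulSoup

# Fetch a page (placeholder URL) and parse it; the built-in html.parser
# is lenient enough for most sloppy markup.
response = requests.get("https://example.com")
soup = BeautifulSoup(response.text, "html.parser")

# Grab the title and whatever <meta> tags the page exposes.
title = soup.title.string if soup.title else None
meta = {
    (tag.get("name") or tag.get("property")): tag.get("content")
    for tag in soup.find_all("meta")
    if (tag.get("name") or tag.get("property")) and tag.get("content")
}

print(title)
print(meta)
```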
I once successfully used Perl with WWW::Mechanize in such a context. Hopefully you don't need to evaluate JavaScript.
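(For comparison with the Python suggestions above, here is a rough sketch of that browser-style session flow using `requests.Session` and BeautifulSoup. This is only a loose analogue of what WWW::Mechanize does, not its API, and the URL is a placeholder.)

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

# WWW::Mechanize keeps state as you navigate; requests.Session is a rough
# Python analogue (cookies persist across requests in the same session).
session = requests.Session()
start = session.get("https://example.com")  # placeholder URL
soup = BeautifulSoup(start.text, "html.parser")

# Follow the first link on the page, similar in spirit to follow_link().
link = soup.find("a", href=True)
if link is not None:
    next_page = session.get(urljoin(start.url, link["href"]))
    print(next_page.url, next_page.status_code)
```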
I would go with Perl... I heard a rumor that it was the language Google used initially... Python performs well too.