I'm using HtmlUnit for web testing and it is great. But I also want to test the download time for the total page, including linked images, JavaScript and CSS files, as well as ensure the links are valid. Is there a way to tell HtmlUnit to grab all the dependencies for any given HTML page?
I don't know of anything as part of HtmlUnit itself, but it's not hard to do yourself.
Pull out all the dependent links yourself (e.g. use getByXPath() to retrieve the <a>, <img>, <script> and <link> tags), then iterate through them and have HtmlUnit pull each resource down individually.
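For example, something along these lines. This is a minimal sketch, assuming a 2.x-era com.gargoylesoftware.htmlunit API (newer 3.x releases use the org.htmlunit package instead); the URL and the XPath expression are placeholders to adapt to your pages:

```java
import java.net.URL;

import com.gargoylesoftware.htmlunit.Page;
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlElement;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class PageDependencyCheck {

    public static void main(String[] args) throws Exception {
        // WebClient is AutoCloseable in recent 2.x releases; older versions
        // would call webClient.closeAllWindows() instead.
        try (WebClient webClient = new WebClient()) {
            // Report broken links via status codes rather than exceptions.
            webClient.getOptions().setThrowExceptionOnFailingStatusCode(false);

            long pageStart = System.currentTimeMillis();
            HtmlPage page = webClient.getPage("http://example.com"); // placeholder URL
            long total = System.currentTimeMillis() - pageStart;
            System.out.println("HTML document: " + total + " ms");

            // Elements that reference external resources or link targets.
            String xpath = "//img[@src] | //script[@src]"
                    + " | //link[@rel='stylesheet'][@href] | //a[@href]";

            for (Object o : page.getByXPath(xpath)) {
                HtmlElement element = (HtmlElement) o;
                String ref = element.getAttribute("src");
                if (ref.isEmpty()) {
                    ref = element.getAttribute("href");
                }
                if (ref.isEmpty() || ref.startsWith("#") || ref.startsWith("javascript:")) {
                    continue;
                }

                // Resolve relative references against the page URL.
                URL url = page.getFullyQualifiedUrl(ref);

                long start = System.currentTimeMillis();
                Page resource = webClient.getPage(url);
                long elapsed = System.currentTimeMillis() - start;
                total += elapsed;

                int status = resource.getWebResponse().getStatusCode();
                System.out.println(status + "  " + elapsed + " ms  " + url);
                if (status >= 400) {
                    System.out.println("  -> broken link!");
                }
            }

            System.out.println("Total (sequential) download time: " + total + " ms");
        }
    }
}
```

Note that with JavaScript enabled, HtmlUnit will already have fetched scripts and stylesheets while loading the page, so re-requesting them here can double-count (or hit the client's cache); it's still fine for checking that every referenced resource actually resolves.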
Just be careful using this to determine download times: unless you emulate how a "real" browser retrieves resources in parallel, you will not get an accurate measure. A tool like Xenu would be better suited to that job.