I have a regular, nested HTML unordered list of links, and I'd like to scrape it with PHP and convert it to an array.
The original list looks something like this:
<ul>
<li><a href="http://someurl.com">First item</a>
<ul>
<li><a href="http://someotherurl.com/">Child of First Item</a></li>
<li><a href="http://someotherurl.com/">Second Child of First Item</a></li>
</ul>
</li>
<li><a href="http://bogusurl.com">Second item</a></li>
<li><a href="http://bogusurl.com">Third item</a></li>
<li><a href="http://bogusurl.com">Fourth item</a></li>
</ul>
Any of the items can have children.
(The actual screen scraping is not a problem, I can do that.)
I'd like to turn this into a PHP array, of just the links, while keeping the hierarchical nature of the list. Any ideas?
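To illustrate the shape I'm after (the keys here are just an example, not a requirement), something like:
array(
    array(
        'href'     => 'http://someurl.com',
        'children' => array(
            array('href' => 'http://someotherurl.com/', 'children' => array()),
            array('href' => 'http://someotherurl.com/', 'children' => array()),
        ),
    ),
    array('href' => 'http://bogusurl.com', 'children' => array()),
    array('href' => 'http://bogusurl.com', 'children' => array()),
    array('href' => 'http://bogusurl.com', 'children' => array()),
)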
I've looked at using Simple HTML DOM and phpQuery, which both use jQuery-like syntax, but I can't seem to get the syntax right. I can get all the links, but I end up losing the hierarchy and the order.
Thanks.
Use DOMDocument and SimpleXMLElement along the lines of:
$doc = new DOMDocument();
$doc->loadHTML($html);                           // tolerant HTML parsing
$xmlStr = $doc->saveXML($doc->documentElement);  // re-serialize as well-formed XML
$xml = new SimpleXMLElement($xmlStr);

$links = array();
foreach ($xml->xpath('//a') as $a) {
    // Cast to string; $a['href'] is itself a SimpleXMLElement.
    $links[] = (string) $a['href'];
}
Note the (string) cast: without it, each entry in $links is a SimpleXMLElement object rather than a plain string.
Cheat sheet for XPath queries (PDF)
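That gives you a flat list, though. Since you want to keep the nesting, here is a minimal sketch of a recursive walk using plain DOMDocument (the function name listToArray and the array shape are my own, not from any library), which nests children under their parent link:
function listToArray(DOMElement $ul) {
    $result = array();
    foreach ($ul->childNodes as $li) {
        // Skip whitespace text nodes between the <li> elements.
        if (!($li instanceof DOMElement) || $li->tagName !== 'li') {
            continue;
        }
        $item = array('href' => null, 'children' => array());
        foreach ($li->childNodes as $child) {
            if (!($child instanceof DOMElement)) {
                continue;
            }
            if ($child->tagName === 'a') {
                $item['href'] = $child->getAttribute('href');
            } elseif ($child->tagName === 'ul') {
                // Recurse into the nested list.
                $item['children'] = listToArray($child);
            }
        }
        $result[] = $item;
    }
    return $result;
}

$doc = new DOMDocument();
$doc->loadHTML($html);
$tree = listToArray($doc->getElementsByTagName('ul')->item(0));
print_r($tree);
For the sample list above, $tree should come out as one top-level item with two children, followed by three childless items, in document order.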