I'm attempting to write a Google Chrome extension that effectively crawls a page, since Chrome extensions allow cross-origin XHR requests.
However, when it does, it also tries to load EVERY SINGLE IMAGE on the page. The images never actually load, since their paths are all relative, but the failed requests clog the console with errors.
My question is: can I do a jQuery.get() to request a web page without accidentally trying to preload all of its images?
EDIT
The code looks like this:
$.get(
    url,
    function parseData(data) {
        console.log("Images are automatically preloaded once " +
                    "this function exits, for some reason");
    },
    'html'
);
Since you don't show your code, my guess is that you load the result into the DOM? Once the images are detected by the DOM, the browser will probably try to load them.
So perhaps load the response into a plain variable and replace/remove all the images first?
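A minimal sketch of that idea (the data-original-src attribute name and the regexes are only illustrative, and regex-based HTML rewriting is fragile, but it keeps the images from ever becoming live elements):

$.get(
    url,
    function (data) {
        // Neutralize every src attribute inside <img> tags while the page
        // is still a plain string, so the browser has nothing to fetch
        // once the markup is parsed.
        var safeHtml = data.replace(/<img\b[^>]*>/gi, function (tag) {
            return tag.replace(/\bsrc\s*=/gi, 'data-original-src=');
        });
        // Parse into a detached element; no image requests are made.
        var $page = $('<div></div>').html(safeHtml);
        console.log($page.find('a').length + ' links found');
    },
    'html'
);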
Put the pulled data into a hidden div, remove the img elements, and then show the hidden div:
HTML
<div id="junk"></div>
Script
$(document).ready(function () {
    // Keep the scratch container hidden until the fetched markup is cleaned up.
    $('#junk').hide();
});

function parseData(data) {
    // Inject the fetched page, strip every img it contains, then reveal it.
    $('#junk').html(data).find('img').remove();
    $('#junk').show();
}
Example: http://jsfiddle.net/E69UR/
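If the goal is to avoid the image requests entirely, another possible variant (not part of the answers above, and assuming a Chrome version whose DOMParser accepts 'text/html') is to parse the response into a detached document. Documents created this way have no browsing context, so their images are never downloaded:

$.get(
    url,
    function (data) {
        // Parse into an inert document: its images are not fetched and
        // its scripts do not run.
        var doc = new DOMParser().parseFromString(data, 'text/html');
        // The detached document can still be queried with jQuery.
        var titles = $(doc).find('h1, h2').map(function () {
            return $(this).text();
        }).get();
        console.log(titles);
    },
    'html'
);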