I am trying to retrieve some information about any given webpage (namely the … of the page and the … of the domain) and then make an asynchronous jQuery POST with the retrieved information. Unfortunately, the JavaScript execution reaches the $.post(), but the actual web request is never made. Here is my code:
$.get('../embed', {u: url}, function(html) {
    alert('got "' + html + '"');
    // Second, chained request: post the retrieved embed HTML to the server
    $.post('/media/add', { story: storyid, caption: caption, type: 5, title: title, content: html, meta: meta }, function(data) {
        var obj = jQuery.parseJSON(data);
        var thumb = imageUrlFromMedia(obj);
        var clip = addToClipboard(obj.id, thumb || '/img/icons/embedly.png', obj.name);
    });
});
Is it even possible to make another AJAX call in the success handler of $.get()? Has anyone here had success chaining multiple HTTP requests like this before?
You can use $.ajax to find errors in the XMLHttpRequest or in the JSON parsing (it may not catch JSON parsing errors, unfortunately, but the response should still come through as text even if it fails to parse). Anyway, to get failure messages, refactor like so:
$.ajax({
    url: '../embed',
    data: {u: url},
    success: function(html) {
        // ...
        // the $.post call goes here
        // ...
    },
    error: function(XMLHttpRequest, textStatus, errorThrown) {
        // Danger, Will Robinson
    }
});
If you don't see a second request at all, this first request is the one that is failing. If you still don't find an error, try doing the same with the $.post; you should consider refactoring it to $.ajax as well. By the way, JSON parsing is automatic if you use $.getJSON (for GET requests), so you wouldn't need jQuery.parseJSON; the equivalent for $.ajax is specifying dataType: 'json'.
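For example, here is a minimal sketch of the inner request rewritten as $.ajax with dataType: 'json' and an error callback. The URL, the data fields, and the imageUrlFromMedia/addToClipboard helpers are taken from your code; the rest is illustrative, not a drop-in replacement:

$.ajax({
    url: '/media/add',
    type: 'POST',
    data: { story: storyid, caption: caption, type: 5, title: title, content: html, meta: meta },
    dataType: 'json',  // jQuery parses the response for you, no jQuery.parseJSON needed
    success: function(obj) {
        var thumb = imageUrlFromMedia(obj);
        var clip = addToClipboard(obj.id, thumb || '/img/icons/embedly.png', obj.name);
    },
    error: function(xhr, textStatus, errorThrown) {
        // Fires on HTTP failures, and on parse failures when dataType is 'json'
        alert('POST failed: ' + textStatus);
    }
});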