I keep running into a problem with multiple AJAX requests that I send to refresh different components of a web page. I always use my custom AJAX functions (shared below) to make the calls, and I find that the requests sometimes collide. I even space the requests two seconds apart, hoping that serializes them, but they still collide from time to time. What am I doing wrong? Or rather, what changes can I make to these functions so they work reliably every time?
Functions:
function get_xmlhttp_obj()
{
    try
    {
        // Firefox, Opera 8.0+, Safari
        xmlHttp = new XMLHttpRequest();
    }
    catch (e)
    {
        // Internet Explorer
        try
        {
            xmlHttp = new ActiveXObject("Msxml2.XMLHTTP");
        }
        catch (e)
        {
            try
            {
                xmlHttp = new ActiveXObject("Microsoft.XMLHTTP");
            }
            catch (e)
            {
                alert(invalidbrowser);
            }
        }
    }
    return xmlHttp;
}
function passUrl(url1)
{
    url1 = url1 + "&sid=" + Math.random();
    xmlHttp = get_xmlhttp_obj();
    xmlHttp.onreadystatechange = function() { stateChanged(xmlHttp); };
    xmlHttp.open("GET", url1, true);
    xmlHttp.send(null);
}
function passposturl(url1, params)
{
    xmlHttp = get_xmlhttp_obj();
    xmlHttp.open("POST", url1, true);
    xmlHttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    xmlHttp.setRequestHeader("Content-length", params.length);
    xmlHttp.setRequestHeader("Connection", "close");
    xmlHttp.onreadystatechange = function() { stateChanged(xmlHttp); };
    xmlHttp.send(params);
}
function stateChanged(xmlHttp)
{
    if (xmlHttp.readyState == 1 || xmlHttp.readyState == 2 || xmlHttp.readyState == 3 || xmlHttp.readyState == 0)
    {
        // Wait state. I load loading text or image here
    }
    else if (xmlHttp.readyState == 4)
    {
        // catch the response and display it in the specific container.
        if (loadflag == "global_live_xjournal_feed")
        {
            document.getElementById("global_live_xjournal_feed").innerHTML = xmlHttp.responseText;
            setTimeout("call_refresh_global_xjournal_feed", 5000);
        }
    }
}
function call_refresh_global_xjournal_feed(obj)
{
    if (obj == "global_xjournal_feed")
    {
        var url = "ajax_activity_feed.php?id=" + obj;
        passUrl(url);
        loadflag = "global_live_xjournal_feed";
    }
}
If you want to ensure your async calls are being performed in a serial fashion you can always use a recursive asynchronous call that has a setTimeout() in the callback. So for example:
function DoSomething() {
    $.get('/ajax/call', function() {
        // do whatever
        setTimeout(DoSomething, 2000);
    });
}
This example uses jQuery, but you could just as easily swap $.get() out for your own function.
This way you never start another asynchronous call until the previous one has completed. The interval may end up a bit longer than 2 seconds, but everything will run serially.
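For comparison, here is a minimal sketch of the same pattern using a plain XMLHttpRequest instead of $.get(); the pollFeed name is made up, and the URL is just borrowed from the question's code:

// Sketch of the same serial pattern without jQuery: the next request is
// only scheduled once the current one has finished.
function pollFeed() {
    var xhr = new XMLHttpRequest();        // one local object per request
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4) {
            // handle xhr.responseText here ...
            setTimeout(pollFeed, 2000);    // chain the next call after completion
        }
    };
    xhr.open("GET", "ajax_activity_feed.php?id=global_xjournal_feed&sid=" + Math.random(), true);
    xhr.send(null);
}
pollFeed();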
The way you have your code, each request uses a different xmlHttp object, so there is no collision at that level: each object has its own separate state. When processing the results, you just have to avoid relying on global state. As long as you look only at the state of the particular xmlHttp object whose callback fired in order to decide what to do, you should be fine. If two requests finish at the exact same moment, one will get into the JS queue before the other, run its statechange handler to completion, and then the second one will get its statechange notification.
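To make that concrete, here is a rough sketch of a request function that keeps everything local to the call, so two in-flight requests can never step on each other; the fetchInto name and the containerId parameter are assumptions for illustration, not part of the original code:

// Sketch: keep the request object and its target container local to each call.
function fetchInto(url, containerId) {
    var xhr = get_xmlhttp_obj();           // local variable, not the shared global
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4) {
            document.getElementById(containerId).innerHTML = xhr.responseText;
        }
    };
    xhr.open("GET", url + "&sid=" + Math.random(), true);
    xhr.send(null);
}

// Usage: each call owns its own xhr and its own container.
fetchInto("ajax_activity_feed.php?id=global_xjournal_feed", "global_live_xjournal_feed");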
For asynchronous calls like this, there is usually zero reason to space them out unless you need the results of the first before firing the second. In fact, you get better end-to-end performance if you start both requests right away and then just wait for them both to complete. Whichever one completes first gets its statechange callback, and the second one comes in after that.
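If you do want a single point where you know both requests have finished, a simple counter in shared scope is enough; this is only a sketch, and whenBothDone / handleBoth are hypothetical names:

// Sketch: fire two independent requests immediately and run one handler
// once both have completed.
function whenBothDone(urlA, urlB, handleBoth) {
    var remaining = 2;
    var results = {};

    function start(url, key) {
        var xhr = get_xmlhttp_obj();
        xhr.onreadystatechange = function() {
            if (xhr.readyState == 4) {
                results[key] = xhr.responseText;
                remaining--;
                if (remaining === 0) {
                    handleBoth(results);   // both requests have finished
                }
            }
        };
        xhr.open("GET", url, true);
        xhr.send(null);
    }

    start(urlA, "a");
    start(urlB, "b");
}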
FYI, if you are debugging with breakpoints or looking at debug console output, you will have to pay strict attention to which xmlHttp object is which when interpreting the output since different stages of their progress may be interleaved.
My AJAX library can assist with this in two possible ways:
1) The library allows multiple calls to be "packaged" into a single request, so that a handler is only called when ALL of the calls have completed.
2) The other is to create a request pool with a single caller (HTTP object). Normally pools have multiple callers (allowing multiple simultaneous requests), but limiting a pool to a single caller forces any requests queued to it to run serially.
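To illustrate the single-caller idea (this is a generic sketch of a serial queue, not the DP_AJAX API), something along these lines queues URLs and only ever keeps one request in flight:

// Generic sketch of a serial request queue: the next queued URL starts
// only when the current request finishes.
var serialQueue = {
    pending: [],
    busy: false,
    add: function(url, onDone) {
        this.pending.push({ url: url, onDone: onDone });
        this.next();
    },
    next: function() {
        if (this.busy || this.pending.length === 0) return;
        var self = this;
        var job = this.pending.shift();
        var xhr = get_xmlhttp_obj();
        this.busy = true;
        xhr.onreadystatechange = function() {
            if (xhr.readyState == 4) {
                job.onDone(xhr.responseText);
                self.busy = false;
                self.next();               // start the next queued request, if any
            }
        };
        xhr.open("GET", job.url, true);
        xhr.send(null);
    }
};

Calling serialQueue.add() repeatedly then guarantees the requests run one after another, no matter how quickly they are added.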
I've found the library very useful in my work; it can be found here:
http://depressedpress.com/javascript-extensions/dp_ajax/
Hope this helps.