Okay so first off, sorry if the title doesn't make much sense...I'm not sure how else to summarize it!
So here's the issue:
I am using jQuery to attach a click event to my form's submit button. The jQuery click event triggers some GA code to track a virtual page view so I can use it as a step in a Goal funnel.
But what happens is that there's no delay between the GA code executing and the submit, so I'm concerned that GA isn't actually getting the data.
When I look at what's happening in Firebug or HttpFox (browser add-ons that inspect the requests/responses) vs. Charles Proxy (an external sniffer, separate from the browser), I am seeing two different things.
With Firebug/HttpFox I see the GET request to GA, but with a status of 0, and it shows up as (Aborted) NS_BINDING_ABORTED
...though it does show bytes having been sent etc., just nothing for the response.
But with Charles Proxy, I am seeing the same GET request with a status of 200 and the 1x1 pixel response.
So my theory here is that GA is receiving the data, but the browser is moving on before it gets the actual response. As long as GA is getting the data, I'm okay with that. But it's just my theory and I don't know for sure...
I know I can write the code to simply delay the execution of the submit by 500ms or whatever as insurance, but I don't want to have to do that if it's not necessary...
And I know that, if nothing else, I can just check whether the data shows up in GA, but GA has a 24-48 hour delay on data, so it is hard to QA.
Does anybody know or have any suggestions from experience...has anybody else experienced this "abort" thing and can say one way or the other if it is necessary to delay the submit or whatever?
HttpFox is not a real sniffer; it just tries to mimic one, so the data you see in it is not always what is really happening in the background. Charles should give you a better picture, and if you're seeing the 200 status code in Charles, chances are the hit is going through.
The bad news is that when you fire hits at the time the page unloads (outbound clicks, in-site link clicks, form submissions, window.unload, etc.), they won't always go through. This happens because the Google Analytics JS call basically appends a GIF to the page and returns right after that; the browser then loads the GIF. Once the code returns, the browser is free to go, and if it navigates away from the page it will cancel any pending requests it may have, including that small GIF image. So the browser might not have sent the tracking request at all, or it might have sent it but the TCP connection didn't go through, and the browser would need to resend the packet but is no longer willing to do so.
So if accuracy is a need for you, you should add a 200-500 ms delay. But remember that Google Analytics is not an accuracy tool, and if some events don't go through it probably won't affect the final outcome of your analysis.
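Something like this is what I mean. Just a rough sketch, assuming the classic async snippet (_gaq) and a hypothetical #my-form selector and virtual path:

```javascript
// Rough sketch of the delay approach (classic async ga.js / _gaq assumed).
$('#my-form').submit(function (event) {
  event.preventDefault();   // hold the real submit briefly
  var form = this;

  // Fire the virtual page view used as the goal funnel step.
  _gaq.push(['_trackPageview', '/virtual/form-submitted']);

  // Give the __utm.gif request a few hundred ms to leave the browser,
  // then do the native submit (bypasses this jQuery handler).
  setTimeout(function () { form.submit(); }, 300);
});
```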
According to the information on this page: Sending Data to Google Analytics, there is a possibility that your data is not really being sent (the bad news, as Eduardo said). Transcribing the most important information from that page, related to your question:
Many browsers stop executing JavaScript as soon as the page starts unloading, which means your analytics.js commands to send hits may never run.
An example of this is when you want to send an event to Google Analytics to record that a user clicked on a form's submit button. In most cases, clicking the submit button will immediately start loading the next page, and any ga('send', ...) commands will not run.
The solution to this is to intercept the event to stop the page from unloading. You can then send your hit to Google Analytics as usual, and once the hit is done being sent, you can resubmit the form programmatically.
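For reference, a rough sketch of what that looks like with analytics.js; the form selector, event category/action, and timeout value here are just placeholders:

```javascript
// Sketch of the intercept-and-resubmit pattern, assuming analytics.js
// (the ga() queue) is loaded and the form has id="my-form".
$('#my-form').on('submit', function (event) {
  event.preventDefault();          // stop the page from unloading right away
  var form = this;
  var submitted = false;

  function submitForm() {
    if (!submitted) {
      submitted = true;
      form.submit();               // native submit, skips this jQuery handler
    }
  }

  // Send the hit; resubmit the form once GA reports the hit was sent.
  ga('send', 'event', 'Form', 'submit', { hitCallback: submitForm });

  // Safety net in case the callback never fires (e.g. GA is blocked or slow).
  setTimeout(submitForm, 1000);
});
```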