
Logging Requests With Timing

What is the best way to log HTTP requests for a web application, including AJAX requests, so that I can later go back and query "I want to know how many times this request was made, and how long it took to complete on average", or "show me the top 5 highest average time requests"?

Would you use a separate database from the current production DB to log these things, to prevent all of those inserts from causing I/O slowdown, or does this end up not really making a big impact?

Would you batch up the log entries and then push them to the DB, or would you do a single insert for each request?

Is there a better way to add this request logging with timings, besides wrapping each request handler in the application logic like this:

start = CurrentTime()
/* request handler code */
end = CurrentTime()
Insert(requestName, start, (end - start))


You should be able to use your web server logs for this purpose. Apache and IIS logs both capture URL, query string, response code and duration. If the AJAX requests receive data via HTTP POST you will need to change the web server's configuration to capture that data if it's important to you. Then the best tool I've found for log analysis is Microsoft's Log Parser, which does a great job working with large files with a SQL syntax to calculate answers to the kinds of questions you're asking.

However, if you're intent on rolling your own, use some sort of local logging. And use a logging framework - such as log4j - which is smart enough to buffer disk writes to minimize I/O, roll the log files, and delete old files. This is a more scalable approach for clustered servers; otherwise each server is hitting the database constantly. If you want to put the data in a table, make it a batch process, once an hour or once a day for instance.
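As a rough illustration (just a sketch assuming log4net on .NET; the RequestTimingLog wrapper and the tab-separated line format are my own invention, and the buffering/rolling would live in appender configuration, e.g. a BufferingForwardingAppender over a RollingFileAppender), the application code only emits one timing line per request:

using System;
using System.Diagnostics;
using log4net;

public static class RequestTimingLog
{
    // Buffering, rolling and retention are configured on the appender,
    // so the application just writes one line per request.
    private static readonly ILog Log = LogManager.GetLogger("RequestTiming");

    public static void Time(string requestName, Action handler)
    {
        var sw = Stopwatch.StartNew();
        try
        {
            handler();
        }
        finally
        {
            sw.Stop();
            // Tab-separated so an hourly/daily batch job can bulk-load it into a table.
            Log.InfoFormat("{0}\t{1}", requestName, sw.ElapsedMilliseconds);
        }
    }
}

An hourly or daily job can then bulk-load the resulting file into your stats table and truncate it.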


Seems like an excellent use-case for Google Analytics (see event tracking, in particular).

If it's not an option, think early about scalability:

  1. Don't log from within the server page you're serving, as this can end up working against caching. Use a script or 1x1 image to toss in parameters, and have that operate in a separate (non-cached) process.

  2. Avoid hammering your hard drive if you end up making it DB-based. Use memory-based storage to accumulate stats as they come in, and periodically persist them to your database, as sketched below. (Recall that Google Analytics was overwhelmed when Google initially opened it up.)
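A minimal sketch of that second idea in C# (the class, the flush interval, and PersistToDatabase are all hypothetical placeholders, not a specific library):

using System;
using System.Collections.Concurrent;
using System.Threading;

// Accumulates per-request counters in memory and flushes them to the
// database on a timer, so individual requests never touch the disk.
public class RequestStats
{
    private readonly ConcurrentDictionary<string, (long Count, long TotalMs)> _stats =
        new ConcurrentDictionary<string, (long Count, long TotalMs)>();
    private readonly Timer _flushTimer;

    public RequestStats(TimeSpan flushInterval)
    {
        _flushTimer = new Timer(_ => Flush(), null, flushInterval, flushInterval);
    }

    public void Record(string requestName, long elapsedMs)
    {
        _stats.AddOrUpdate(requestName,
            _ => (1, elapsedMs),
            (_, cur) => (cur.Count + 1, cur.TotalMs + elapsedMs));
    }

    private void Flush()
    {
        foreach (var name in _stats.Keys)
        {
            if (_stats.TryRemove(name, out var entry))
            {
                // Persist one aggregated row per request name; the schema is up to you.
                PersistToDatabase(name, entry.Count, entry.TotalMs);
            }
        }
    }

    private void PersistToDatabase(string name, long count, long totalMs)
    {
        // Placeholder: write (name, count, totalMs, DateTime.UtcNow) to your stats table.
    }
}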


If your goal is to simply track the timing of requests and save them to a DB, but you don't want the inserts to potentially bog down your requests, then you should use a DB that supports high-speed inserts, such as MongoDB. If you tweak the connection string to strict mode = off, then the inserts are basically asynchronous.

Here's a good starting point:

The NoSQL Movement, LINQ, and MongoDB
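For reference, a sketch of fire-and-forget inserts with the current official C# driver, where an unacknowledged write concern plays roughly the same role as the old "strict mode = off" connection-string setting (the database, collection and field names here are made up):

using System;
using MongoDB.Bson;
using MongoDB.Driver;

public class RequestLogStore
{
    private readonly IMongoCollection<BsonDocument> _requests;

    public RequestLogStore(string connectionString)
    {
        var client = new MongoClient(connectionString);
        // Unacknowledged writes: the driver does not wait for a server reply,
        // so logging adds almost no latency to the request being timed.
        _requests = client.GetDatabase("logging")
            .GetCollection<BsonDocument>("requests")
            .WithWriteConcern(WriteConcern.Unacknowledged);
    }

    public void Log(string requestName, long elapsedMs)
    {
        _requests.InsertOne(new BsonDocument
        {
            { "name", requestName },
            { "ms", elapsedMs },
            { "at", DateTime.UtcNow }
        });
    }
}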

If you're doing this in ASP.NET MVC only, then you could put that together with a custom action filter and get it out of your controller code as well. I suppose you can also achieve that with a custom ASP.NET module and get all the requests (even non-ASP.NET ones) in IIS 7+.
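Something like this, as a minimal sketch (the attribute name and the hand-off to your store are placeholders):

using System.Diagnostics;
using System.Web.Mvc;

// Apply per controller/action, or register globally in GlobalFilters.Filters,
// to time every MVC action without touching the controller code.
public class RequestTimingAttribute : ActionFilterAttribute
{
    private const string Key = "__requestTimer";

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        filterContext.HttpContext.Items[Key] = Stopwatch.StartNew();
    }

    public override void OnResultExecuted(ResultExecutedContext filterContext)
    {
        if (filterContext.HttpContext.Items[Key] is Stopwatch sw)
        {
            sw.Stop();
            var name = filterContext.HttpContext.Request.Url?.AbsolutePath ?? "unknown";
            // Hand off to whatever store you chose (file, MongoDB, SQL, ...).
            // e.g. LogRequest(name, sw.ElapsedMilliseconds);
        }
    }
}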

If NoSQL / MongoDB isn't your thing, you can use SqlCommand.BeginExecuteNonQuery for the inserts rather than making them blocking calls. If you insert into the same DB as production, though, you can cause some contention there.
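A rough sketch of that (classic ADO.NET; note that on older .NET Framework versions the Begin/End async methods also require Asynchronous Processing=true in the connection string, and the table and column names are invented for the example):

using System.Data.SqlClient;

public static class RequestLogDb
{
    // Fire-and-forget insert: BeginExecuteNonQuery returns immediately and the
    // callback just cleans up, so the request isn't blocked waiting on the INSERT.
    public static void LogRequest(string connectionString, string requestName, long elapsedMs)
    {
        var connection = new SqlConnection(connectionString);
        var command = new SqlCommand(
            "INSERT INTO RequestLog (Name, DurationMs, LoggedAt) VALUES (@name, @ms, GETUTCDATE())",
            connection);
        command.Parameters.AddWithValue("@name", requestName);
        command.Parameters.AddWithValue("@ms", elapsedMs);

        connection.Open();
        command.BeginExecuteNonQuery(ar =>
        {
            try { command.EndExecuteNonQuery(ar); }
            finally
            {
                command.Dispose();
                connection.Dispose();
            }
        }, null);
    }
}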

