I'm writing a web service which, for each method call, must write a log entry into a database. At peak times, calls to this method can be very frequent (up to 1-2k requests/minute), and our server is not very powerful.
I want to use a List<Log> to cache the log entries, and then:
- batch insert 30-40 rows to the database, which greatly reduces the overhead
- when there have been no requests for 30-60 seconds, write all of the remaining cached entries to the database.
The first condition is straightforward, but I don't know how to implement the second one.
How can I do this, without adding too much overhead to my web service?
EDIT: I solved this based on Wheat's suggestion.
- For each log entry, I send it directly to an MSMQ queue and forget about it.
- A separate service runs continuously: it takes all log entries currently in the queue, bulk inserts them into the database, and then sleeps for 30 seconds.
MSMQ was very helpful in this case!
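For reference, a rough sketch of that pattern using System.Messaging (the queue path, the Log type's serialization, and the BulkInsert helper are placeholder assumptions, not from the original post):

```csharp
using System;
using System.Collections.Generic;
using System.Messaging; // requires the MSMQ Windows feature to be installed

class LogTransport
{
    const string QueuePath = @".\private$\weblogs"; // hypothetical queue name

    // Web service side: fire-and-forget a log entry into the durable queue.
    public static void EnqueueLog(Log entry)
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Send(entry); // returns immediately; MSMQ persists the message
        }
    }

    // Separate service: drain everything currently queued, bulk insert, sleep.
    public static void DrainLoop()
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(Log) });
            while (true)
            {
                var batch = new List<Log>();
                try
                {
                    // Keep receiving until the queue is momentarily empty;
                    // a zero timeout throws when nothing is waiting.
                    while (true)
                        batch.Add((Log)queue.Receive(TimeSpan.Zero).Body);
                }
                catch (MessageQueueException) { /* queue empty */ }

                if (batch.Count > 0)
                    BulkInsert(batch); // hypothetical SqlBulkCopy wrapper

                System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30));
            }
        }
    }
}
```

Because MSMQ stores the messages on disk, entries queued before a crash or restart are still there when the drain service comes back up.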
You could use MSMQ or SQL Server Service Broker.
It sounds like you want to trigger a task on some interval (every 30-60 seconds). The trick is to have ASP.NET open and iterate through the cache at your desired interval without relying on an incoming web request to trigger it. There is some discussion of this in the article Easy Background Tasks With ASP.NET.
I'll suggest 2 options for storage with this approach:
1. For an ASP.NET and IIS-only solution, you could write to the ASP.NET Application Cache:
// Fetch the existing list (or start a new one), append, and store it back.
List<Log> myLogs = (List<Log>)Cache["MyLogs"] ?? new List<Log>();
myLogs.Add(new Log { Text = "foo" });
Cache["MyLogs"] = myLogs;
Consider this problem, though. The ASP.NET Application Cache isn't durable storage. What happens when your application dies, or IIS is reset, or the machine loses power? Do you care about what was in cache?
2. Embed a SQL Compact database in your application and write your logs there. Combine this storage mechanism with the Easy Background Tasks With ASP.NET approach.
Personally, I'd go with the MSMQ or SQL Server Service Broker option suggested in another answer, since it guarantees durability. Hopefully you have full control of that web server to leverage these components: one requires a Windows component to be installed/enabled/secured, and the other is a SQL Server-specific feature.
I would suggest using NServiceBus to do your logging. It is durable, so even if your server dies, as soon as you restart (as long as your disk hasn't failed) NServiceBus will continue where it left off. So if you set it to send messages at an interval, any messages you had queued up will be transmitted, even if the computer had been powered down and back up in the meantime.
So you want to persist data someplace and slowly drain it to the database. This sounds like a job for redis or memcached. Essentially you have a cron or scheduled task that runs every 30 seconds, finds up to 40 entries waiting to be written, and writes them to the database.
Redis can easily handle 1-2k requests a minute and has a LIST primitive that should work nicely for log strings.
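As a sketch of that approach using the StackExchange.Redis client (the client choice, key name, and BulkInsert helper are my assumptions, not from the answer):

```csharp
using System.Collections.Generic;
using StackExchange.Redis;

IDatabase db = ConnectionMultiplexer.Connect("localhost").GetDatabase();

// Producer side (per request): push the serialized log entry onto a Redis list.
db.ListLeftPush("logs", logJson);

// Consumer side (scheduled task every 30 s): pop up to 40 entries, bulk insert.
var batch = new List<string>();
for (int i = 0; i < 40; i++)
{
    RedisValue v = db.ListRightPop("logs");
    if (v.IsNull) break;
    batch.Add(v); // RedisValue converts implicitly to string
}
if (batch.Count > 0)
    BulkInsert(batch); // hypothetical bulk-insert helper
```

Pushing on one end of the list and popping from the other preserves insertion order, so the entries reach the database in the order they were logged.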
I suggest you use System.Timers to create a timer that periodically checks whether you are idle, and then makes your saves to the database.
System.Timers rather than the other timer classes, because this kind of timer is suited to the job you describe.
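A minimal sketch of such a timer, assuming a shared log list and a last-request timestamp maintained by the web methods (the FlushToDatabase helper is hypothetical):

```csharp
using System;
using System.Collections.Generic;
using System.Timers;

static class LogFlusher
{
    static readonly object Sync = new object();
    static readonly List<Log> Pending = new List<Log>();
    static DateTime _lastRequestUtc = DateTime.UtcNow;

    // Call from each web method after appending its log entry to Pending.
    public static void Touch()
    {
        lock (Sync) _lastRequestUtc = DateTime.UtcNow;
    }

    public static void Start()
    {
        var timer = new Timer(30000) { AutoReset = true }; // check every 30 s
        timer.Elapsed += (s, e) =>
        {
            lock (Sync)
            {
                // Idle for 30+ seconds with entries pending? Flush them.
                if (Pending.Count > 0 &&
                    DateTime.UtcNow - _lastRequestUtc > TimeSpan.FromSeconds(30))
                {
                    FlushToDatabase(Pending); // hypothetical bulk-insert helper
                    Pending.Clear();
                }
            }
        };
        timer.Start();
    }
}
```

The lock matters here: the Elapsed handler runs on a thread-pool thread, so the list and timestamp are shared between request threads and the timer callback.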
I would use HttpRuntime.Cache and its events to get notified after a certain time without access to an object in the cache (sliding expiration). That way, you can "touch" a LoggerInformation object in the cache whenever you put something in your log list. When the expiration takes place, simply bulk insert the log data. When a new log entry occurs, you (re)create the LoggerInformation object in the cache with a new sliding expiration.
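A sketch of that sliding-expiration trick, assuming the runtime cache is available (the key name and FlushLogs helper are illustrative):

```csharp
using System;
using System.Web;
using System.Web.Caching;

static class LoggerMarker
{
    // "Touch" the marker whenever a log entry is added; re-inserting it
    // resets the 30-second sliding expiration window.
    public static void Touch()
    {
        HttpRuntime.Cache.Insert(
            "LoggerInformation",          // illustrative cache key
            new object(),
            null,                         // no dependencies
            Cache.NoAbsoluteExpiration,
            TimeSpan.FromSeconds(30),     // sliding expiration
            CacheItemPriority.NotRemovable,
            OnExpired);
    }

    // Fires once no request has touched the marker for 30 seconds.
    static void OnExpired(string key, object value, CacheItemRemovedReason reason)
    {
        if (reason == CacheItemRemovedReason.Expired)
            FlushLogs(); // hypothetical: bulk insert the buffered entries
    }
}
```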
The ASP.NET cache classes provide all you need for the sliding expiration and the removal events; see the corresponding MSDN documentation.
But, as already said, this is not persisted, and all buffered logs will be lost if your web server goes down before they are written to the database.