I want to write (or implement) a quick and easy error-logging solution for our website. I figure RSS is fine as the output format.
I was thinking of piping Apache's error log into a simple script that, provided the URL passes a blacklist, logs the entry into a SQLite database. For each entry the database would store the IP address, the URL, a hash of the error, the current count for that hash, a datetime, etc.
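A minimal sketch of what that pipe target could look like, assuming Apache is configured with something like `ErrorLog "|/usr/local/bin/errlog.py"`. The table layout, blacklist, and file path here are illustrative assumptions, not a fixed design (the real one would also carry the IP and URL columns described above):

```python
# Hypothetical pipe target for Apache's ErrorLog directive.
# Schema, path, and blacklist are assumptions for illustration.
import hashlib
import sqlite3
import sys

BLACKLIST = ("favicon.ico", "robots.txt")  # noise we never want logged

SCHEMA = """
CREATE TABLE IF NOT EXISTS errors (
    hash       TEXT PRIMARY KEY,
    message    TEXT,
    count      INTEGER NOT NULL DEFAULT 1,
    first_seen TEXT DEFAULT CURRENT_TIMESTAMP,
    last_seen  TEXT DEFAULT CURRENT_TIMESTAMP
)
"""

def record(conn, line):
    """Insert the error line, or bump the count if its hash already exists.

    Returns False when the line is filtered out by the blacklist.
    Requires SQLite >= 3.24 for the ON CONFLICT upsert syntax.
    """
    if any(word in line for word in BLACKLIST):
        return False
    digest = hashlib.sha1(line.encode("utf-8")).hexdigest()
    conn.execute(
        """INSERT INTO errors (hash, message) VALUES (?, ?)
           ON CONFLICT(hash) DO UPDATE SET
               count = count + 1,
               last_seen = CURRENT_TIMESTAMP""",
        (digest, line),
    )
    conn.commit()
    return True

if __name__ == "__main__":
    conn = sqlite3.connect("/var/log/site-errors.db")  # assumed path
    conn.execute(SCHEMA)
    for raw in sys.stdin:
        record(conn, raw.strip())
```

Hashing the message and upserting keeps repeated errors as a single row with a rising count, which is what makes the RSS feed readable later.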
I was going to capture JavaScript/AJAX errors by having JavaScript make image requests that deliberately 404, so that those errors get logged as well.
We already capture PHP errors in their own log; I could easily integrate that into this database too. Another script would then generate the RSS feed.
I was hoping logrotate would handle rotating the SQLite database file to prevent it from getting too large.
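For what it's worth, a logrotate stanza for this could look roughly like the following (path and intervals are assumptions). Note the caveat: SQLite treats a zero-length file as a fresh database, so `copytruncate` plus a `CREATE TABLE IF NOT EXISTS` on startup can limp along, but a rotation landing mid-transaction can lose or corrupt entries, so treat this as best-effort only:

```
# /etc/logrotate.d/site-errors -- sketch, not battle-tested.
# Rotating a live SQLite file is inherently racy; a rotation that
# interrupts a write can drop or corrupt that transaction.
/var/log/site-errors.db {
    weekly
    rotate 8
    copytruncate
    compress
    missingok
}
```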
Granted this won't scale; are there any issues I should avoid? Any better quick-and-hacky solutions?
Hmmm. That sounds like you're going to collect an awful lot of data, more than you will ever be able to make sense of. What is the exact reasoning behind this? Do you want to simply have an error-free website (an effort I certainly respect), or do you have specific bugs/situations you want to address/prevent by using exhaustive reporting?
If you have to log JavaScript errors (an idea I find interesting but am not quite sure what to think of yet), why not report them via AJAX? That's much cleaner than creating 404 requests; logging 404 errors will make you very unhappy very quickly anyway. Do it for a week on a public web server and you will know what I mean :)
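To sketch that AJAX alternative: client-side, a `window.onerror` handler can POST the error details as JSON to a reporting endpoint; server-side, a tiny endpoint parses it and writes to the same database as the Apache log. Here is a rough server-side sketch (endpoint path, field names, and port are all assumptions), kept in Python to match the log-pipe script:

```python
# Hypothetical receiver for JS error reports POSTed by window.onerror.
# Field names and the /jslog path are assumptions for illustration.
import json
from wsgiref.simple_server import make_server

def parse_report(body):
    """Extract the fields we care about from the POSTed JSON body."""
    data = json.loads(body)
    return {
        "message": str(data.get("message", "")),
        "url": str(data.get("url", "")),
        "line": int(data.get("line", 0)),
    }

def app(environ, start_response):
    length = int(environ.get("CONTENT_LENGTH") or 0)
    report = parse_report(environ["wsgi.input"].read(length))
    # ... insert `report` into the same SQLite table as the Apache log ...
    start_response("204 No Content", [])
    return [b""]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()  # assumed port
```

Because the report arrives as structured JSON instead of a mangled 404 URL, you get the real message, source file, and line number without polluting the access log.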