
Optimizing mysql / PHP based website | 300 qps

开发者 https://www.devze.com 2023-03-03 21:10 Source: web

Hey, I currently have over 300 qps on my MySQL. There are roughly 12,000 UIP a day / no cron, on fairly heavy PHP websites. I know it's pretty hard to judge whether that's OK without seeing the website, but do you think it's total overkill? What is your experience? If I optimize the scripts, do you think I could get a substantially lower qps? I mean, if I only get down to 200 qps that won't help me much. Thanks


currently have over 300+ qps on my mysql

Your website can run on a Via C3, good for you!

do you think that it is a total overkill?

That depends whether it's:

  • 1 page/s doing 300 queries: yeah, you've got a problem.
  • 30-60 pages/s doing 5-10 queries each: then you've got no problem.
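The point of the list above is that raw qps means nothing until you divide it by your page rate. A back-of-the-envelope sketch in Python (the page rates here are illustrative, not measurements from the question):

```python
# Rough sanity check: 300 qps can mean very different things
# depending on how many pages per second produce those queries.

def queries_per_page(qps: float, pages_per_second: float) -> float:
    """Average number of queries each page view issues."""
    return qps / pages_per_second

# One heavy page per second doing all 300 queries: a problem.
print(queries_per_page(300, 1))    # 300.0 queries/page
# 40 pages/second sharing the load: perfectly normal.
print(queries_per_page(300, 40))   # 7.5 queries/page
```

Measure your real pages/second from the webserver logs and plug it in; that single ratio tells you which of the two bullets you're in.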

12000 UIP a day

We had a site with 50,000-60,000 UIP a day, and it ran on a Via C3 (your toaster is a datacenter compared to that crap server). The torrent tracker used about 50% of the CPU, so only half of that tiny CPU was available to the website, which never seemed to use any significant fraction of it anyway.

What is your experience?

If you want to know whether you are going to kill your server, or whether your website is optimized, the following carries close to zero information:

  • UIP (unless you get Facebook-like numbers)
  • queries/s (unless you're above 10,000; I've seen a cheap dual-core blast 20,000 qps using Postgres)

But the following is extremely important:

  • dynamic pages/second served
  • number of queries per page
  • time duration of each query (ALL OF THEM)
  • server architecture
  • vmstat, iostat outputs
  • database logs
  • webserver logs
  • database's own slow_query, lock, and IO logs and statistics
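For the per-query timings and the slow-query statistics mentioned above, MySQL's slow query log is the usual starting point. A minimal my.cnf sketch (the file path and threshold are example values, adjust to your setup):

```ini
# my.cnf — [mysqld] section (illustrative values)
[mysqld]
slow_query_log                = 1
slow_query_log_file           = /var/log/mysql/slow.log
long_query_time               = 0.5   # log queries slower than 0.5 s
log_queries_not_using_indexes = 1
```

Summarize the resulting log with `mysqldumpslow`, and pair it with `vmstat 1` and `iostat -x 1` on the box to see whether CPU or disk is the actual bottleneck.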

You're not focusing on the right metric...


I think you are missing the point here. Whether 300+ qps is too much depends heavily on the website itself, on the users per second visiting it, on the background scripts running concurrently, and so on. You should be able to measure and/or compute an average query throughput for your server to understand whether 300+ qps is fair or not. And, by the way, it also depends on what these queries are asking for (a couple of fields, or large amounts of binary data?).

Surely, if you optimize the scripts and/or reduce the number of queries, you can lower the load on the database, but without specific data we cannot properly answer your question. To lower a 300+ qps load to under 200 qps, you would on average have to cut your total queries by at least a third.
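To put a rough number on "fair or not" with the figures from the question (12,000 unique visitors a day, 300 qps), you can estimate what each visitor "costs" in queries. This assumes, purely for illustration, that the 300 qps is a flat daily average, which real traffic never is:

```python
# Back-of-the-envelope: queries per unique visitor per day.
# Assumes 300 qps is a steady daily average (an oversimplification).

SECONDS_PER_DAY = 24 * 60 * 60
qps = 300
unique_visitors = 12_000

queries_per_day = qps * SECONDS_PER_DAY
queries_per_visitor = queries_per_day / unique_visitors

print(f"{queries_per_day:,} queries/day")          # 25,920,000 queries/day
print(f"{queries_per_visitor:,.0f} per visitor")   # 2,160 per visitor
```

Over two thousand queries per unique visitor is a strong hint the pages are chatty; redo the same arithmetic with your real peak-hour numbers to see how much headroom optimization could realistically buy.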


Optimizing a script can do wonders. I've taken scripts from 3 minutes down to 0.5 seconds simply by optimizing how the calls were made to the server. That is an extreme situation, of course. I would focus mainly on minimizing the number of queries by combining them where possible. Maybe get creative with your queries to include more information in each hit.
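One common way to combine queries is replacing a query-per-row loop with a single IN() lookup. A sketch of the idea using parameterized placeholders (the `users` table and column names are made up; the point is the query count, not the schema):

```python
# The "N+1" pattern: one statement per id -> N round trips to MySQL.
# The combined pattern: one IN() statement -> a single round trip.

def one_query_per_id(user_ids):
    """The slow way: N separate statements (shown as SQL strings)."""
    return ["SELECT id, name FROM users WHERE id = %s" for _ in user_ids]

def one_combined_query(user_ids):
    """The fast way: a single statement with one placeholder per id."""
    placeholders = ", ".join(["%s"] * len(user_ids))
    return f"SELECT id, name FROM users WHERE id IN ({placeholders})"

ids = [3, 7, 42]
print(len(one_query_per_id(ids)))  # 3 statements
print(one_combined_query(ids))     # SELECT id, name FROM users WHERE id IN (%s, %s, %s)
```

The same trick in PHP (mysqli/PDO) cuts three round trips to one; across a busy page it's often the difference between 50 queries and 5.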

And going from 300 to 200 qps is actually a huge improvement. That's a 33% drop in traffic to your server... that's significant.


You should not focus on the script; focus on the server.

You don't say whether these 300+ queries are causing issues. If your server is not dying, there is no reason to lower the number. And if you have already done the optimization, focus on the server: upgrade it or buy more servers.

