I'm getting many hits from Google, but it's not like the user always hits the same page; each visit lands on a different page. So caching with memcached won't work, I think, because the user always reaches another page, which has to be generated and cached first.
So what could I do to reduce the load on the server? I already implemented Sphinx search, which helped reduce the load and improve the speed of the search page.
Any ideas to reduce the load?
- Use MySQL partitions
- Use an opcode cacher (for example, eAccelerator)
- Use nginx as a front-end (for all static content) and Apache as a back-end
- Maybe use some NoSQL solutions
- Set up your MySQL server properly (the InnoDB settings in particular)
- I suggest using the InnoDB engine for tables with many rows
- Set up the right indexes on your tables
- Use MySQL replication
- Set up some limits for search bots (see the sketch below)
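On that last point, here is a minimal sketch of one way to limit crawlers, assuming a PHP front controller; the 5.0 load threshold and the user-agent list are made-up example values. Well-behaved bots back off when they get a 503 with a Retry-After header.

<?php
// Minimal sketch: throttle crawlers when the 1-minute load average is high.
// The 5.0 threshold and the bot list are example values -- tune them for your server.

$botPattern = '/googlebot|bingbot|slurp|yandex/i';   // common crawler user-agents
$isBot = isset($_SERVER['HTTP_USER_AGENT'])
    && preg_match($botPattern, $_SERVER['HTTP_USER_AGENT']);

if ($isBot) {
    $load = sys_getloadavg();          // [1min, 5min, 15min]; not available on Windows
    if ($load !== false && $load[0] > 5.0) {
        header('HTTP/1.1 503 Service Temporarily Unavailable');
        header('Retry-After: 3600');   // ask the bot to come back in an hour
        exit;
    }
}

// ...normal page generation continues here...

Bing and Yandex also honor a Crawl-delay directive in robots.txt; for Googlebot you adjust the crawl rate in Webmaster Tools instead.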
Well, there are two approaches to caching web applications:
- You cache the whole response: using ob_start(), ob_flush(), etc. you capture the generated output, write it to a file, and just serve that file on the next request (a sketch follows below). You should take care of disk space and delete stale files with cron. (Benefits: fewer I/O operations and lower memory usage.)
- You cache aggregated data, so the heavy aggregation does not have to run again next time.
Choose whichever fits :)
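A minimal sketch of the first approach (a file-based full-page cache built on output buffering); the cache directory, TTL and key scheme are just example values, and render_page() is a hypothetical stand-in for whatever actually builds your page:

<?php
// Minimal sketch of a file-based full-page cache using output buffering.
// Cache directory, TTL and key scheme are example values -- adapt to your app.

$cacheDir  = __DIR__ . '/cache';
$ttl       = 600;                                  // seconds a cached page stays valid
$cacheFile = $cacheDir . '/' . md5($_SERVER['REQUEST_URI']) . '.html';

// Serve the cached copy if it exists and is still fresh.
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    readfile($cacheFile);
    exit;
}

// Otherwise capture the generated page into the output buffer...
ob_start();
echo render_page();                                // hypothetical: your real page generation code

// ...write the buffer to the cache file and send it to the client.
if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0755, true);
}
file_put_contents($cacheFile, ob_get_contents(), LOCK_EX);
ob_end_flush();

A cron job such as find /path/to/cache -mmin +10 -delete takes care of pruning old files, as mentioned above.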