THE FASTEST Smarty Cache Handler


Does anyone know if there is an overview of the performance of different cache handlers for smarty?

I compared the Smarty file cache with a memcache handler, but it seemed that memcache had a negative impact on performance.

I figured there would be a faster way to cache than through the filesystem... am I wrong?


I don't have a systematic answer for you: I haven't done any serious performance tests, but in all the time I've used Smarty, I have found the file cache to work best.

One thing that definitely improves performance is to disable checking whether the template files have changed. This avoids having to stat the .tpl files on every request.
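A minimal sketch of that setting with Smarty 3's public configuration API (the directory paths and template name are placeholders, not from the original answer):

    <?php
    require_once 'libs/Smarty.class.php';   // placeholder path to Smarty

    $smarty = new Smarty();
    $smarty->setTemplateDir('./templates');   // placeholder directories
    $smarty->setCompileDir('./templates_c');
    $smarty->setCacheDir('./cache');

    // Use the built-in file cache with a one-hour lifetime.
    $smarty->caching = Smarty::CACHING_LIFETIME_CURRENT;
    $smarty->cache_lifetime = 3600;

    // Skip the stat() on every .tpl file: templates are assumed not to
    // change in production, so compiled/cached output is reused as-is.
    $smarty->compile_check = false;

    $smarty->display('index.tpl');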


File caching is fine when you have a single server instance, or a shared drive (NFS) in a server cluster. But when you have a web server cluster (two or more web servers serving the same content), the problem with file-based caching is that the cache is not synchronized across the web servers. Running a simple rsync on the cache directories is error prone; it may work flawlessly for a while, but it is not a stable solution. The best solution for a cluster is distributed caching, i.e. memcache: a separate server runs a memcached instance, and each web server has the PHP Memcache extension installed. Each server then checks for the existence of a cached page/item; if it exists, it is pulled from memcache, otherwise it is generated from the database and saved into memcached. When you are dealing with clusters, you cannot skimp on a good caching mechanism: if you are running a cluster, your site already has (or soon will have) more traffic than a single server can handle.
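A minimal sketch of that check-then-generate pattern using the PHP Memcache extension (the host, key scheme, TTL, and generate_page() helper are placeholders I am assuming for illustration):

    <?php
    // Connect to the shared memcached instance (host/port are placeholders).
    $memcache = new Memcache();
    $memcache->connect('memcache.internal', 11211);

    $key = 'page:' . md5($_SERVER['REQUEST_URI']);

    // Every web server in the cluster runs this same check.
    $html = $memcache->get($key);
    if ($html === false) {
        // Cache miss: build the page from the database/templates...
        $html = generate_page();            // hypothetical helper
        // ...and store it so the other servers can reuse it (TTL 600 s).
        $memcache->set($key, $html, 0, 600);
    }

    echo $html;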

There is a beginner-level cluster environment that can be implemented at relatively low cost. You set up two colocated servers (an nginx load balancer and a memcached server); then, using free shared web hosting, you create accounts for the same domain on several free hosts and install your content on each of them. You configure the nginx load balancer to point to the IP addresses of the free web hosts. The free web hosts must have the PHP 5 Memcache extension installed or the solution will not work.

Then you set the DNS for the domain with your registrar to point at the nginx IP (which would be a static IP if you are colocating). Now when someone accesses your domain, nginx forwards the request to one of the web servers hosted on the free accounts.
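A minimal sketch of the load-balancer side of such a setup in nginx (the IP addresses and domain are placeholders for the free-host IPs and your own domain):

    upstream free_hosts {
        # IPs of the free shared-hosting accounts (placeholders)
        server 203.0.113.10;
        server 203.0.113.11;
    }

    server {
        listen 80;
        server_name example.com;   # the domain whose DNS points at this static IP

        location / {
            proxy_pass http://free_hosts;
            proxy_set_header Host $host;   # shared hosts route by virtual host name
        }
    }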

You may also want to consider a CDN to offload the serving of static content.
