compressing entire webpages (HTML and JS)

I have found some tools like this one that let me create \"auto-extracting\" javascript for javascript code in a web page, which employ a variety of techniques to minimize transfer size.

I have found some tools, like this one, that let me create "auto-extracting" JavaScript for the JavaScript code in a web page; they employ a variety of techniques to minimize transfer size.

I have a webpage with a rather large chunk of JavaScript code in it, but since I haven't gotten around to optimizing its file size yet, I was thinking about doing the same sort of thing with the HTML parts of my website too. On my blog page the PHP script pulls HTML snippets from a large number of text files and concatenates them into one giant HTML page, which is then sent out. Chrome tells me that compressing it with gzip would reduce the file size by two-thirds.

However, I turned gzip compression off because of a problem: if you downloaded any of the zip archives I host via Internet Explorer, IE would neglect to gunzip them, so the downloaded file was always corrupted. I suppose I can turn gzip back on once I fix that little issue, but for the time being I'd like to see whether I can make a self-extracting HTML page. Is it possible to have JavaScript extract a giant HTML string and add the entire chunk as a child of the body element? Would that work?
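For concreteness, here is a minimal sketch of the kind of injection being asked about (the markup and variable names are placeholders, not taken from the actual site):

```javascript
// Placeholder sketch: append one big HTML string to <body> once the page loads.
window.onload = function () {
  var bigHtmlChunk =
    '<div id="blog">' +
    '<article><h2>Post title</h2><p>Post body goes here.</p></article>' +
    '</div>';

  // insertAdjacentHTML parses the string and appends the resulting nodes
  // as children of <body> without replacing what is already there.
  document.body.insertAdjacentHTML('beforeend', bigHtmlChunk);
};
```

Mechanically this works; the answer below explains why it is still usually a bad trade.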


It will be slower to do that, and very error-prone. Any JavaScript error will cause the entire page not to render, and your SEO will be absolutely destroyed. Stick to regularly rendered HTML: as the browser downloads and parses the HTML, it begins fetching other resources (images, scripts, CSS) and rendering the layout. Don't focus strictly on the smallest download size; focus on the quickest overall experience.

Make heavy use of the freely available CDNs. There are two big ones, Google and Microsoft, that host a variety of scripts such as jQuery and Modernizr. Stick with Google where possible; it seems to have much higher adoption than Microsoft, and thus a higher chance of a warm cache. Beyond that, use CDNJS for other publicly available libraries -- they host a lot of them.

Minify your existing JavaScript, and enable content compression for both static and dynamic pages. Don't force it on; let the browser request it. What version of IE are you seeing the corruption on? I haven't seen that be an issue since IE6...
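For the dynamic pages, here is a minimal sketch of browser-negotiated compression, assuming the blog page is rendered by a PHP script as described in the question (the snippet directory is hypothetical). ob_gzhandler checks the request's Accept-Encoding header and only gzips the output when the browser actually asked for it:

```php
<?php
// Compress only when the client advertises gzip support; otherwise buffer plainly.
if (!ob_start('ob_gzhandler')) {
    ob_start(); // zlib unavailable: fall back to uncompressed output buffering
}

echo '<!DOCTYPE html><html><body>';
foreach (glob('posts/*.txt') as $snippet) { // hypothetical snippet files
    echo file_get_contents($snippet);
}
echo '</body></html>';

ob_end_flush(); // send the buffer, compressed only if the client supports it
```

Keep the zip downloads out of this handler (and out of any server-wide gzip rule), since gzipping those downloads is what appears to have tripped up IE in the first place.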

Using JavaScript packers will make your site feel slower in exchange for saving a few more bytes of transfer on your end. Not only does an extra script have to run, but you're also asking the user's browser to perform an additional (and potentially expensive) step before your scripts can execute.

If you're serving individual files for download (via the Save As dialog), don't layer gzip Content-Encoding over a content type of 'application/zip'. The actual Zip format is available in PHP; use those libraries instead.
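A hedged sketch of that approach using PHP's ZipArchive extension (file names and paths here are hypothetical): build a real .zip and send it with no gzip Content-Encoding, so the bytes the browser saves are exactly the bytes of a valid archive.

```php
<?php
// Build a .zip on the fly and stream it to the browser untouched.
$tmp = tempnam(sys_get_temp_dir(), 'zip');
$zip = new ZipArchive();
if ($zip->open($tmp, ZipArchive::CREATE | ZipArchive::OVERWRITE) === true) {
    $zip->addFile('/path/to/files/report.txt', 'report.txt'); // hypothetical file
    $zip->close();
}

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="report.zip"');
header('Content-Length: ' . filesize($tmp));
readfile($tmp); // raw archive bytes, no extra compression layer
unlink($tmp);
```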


As a quick win, Cloudflare has an auto-minify feature for HTML, JS and CSS. We've been using it for a little while now with good results. Definitely worth a look.
