Four posts out of six: we are more than halfway through our series examining Google’s web performance best practices in detail. So far we have covered caching, minimizing round-trip time, and slimming down cookies, all with the goal of painlessly eliminating the components that add unnecessarily to your page load time, and to the impatience of your end users.
Faster websites mean a better user experience, better search engine rankings, and more conversions. One out of four visitors won’t wait more than 4 seconds for a page to load.
The essential data carried within a packet across the internet, i.e. the cargo of a data transmission, is known as the payload. Most network packets are split into three parts: header, payload, and trailer. The payload is the body of the packet, and if a packet has a fixed length, the payload may be padded with blank data to fill it out.
Minimizing the payload size can significantly reduce network latency. And not only that: your bandwidth bill will be smaller, too.
Steps to Take
Your best options for slimming down an overweight payload are compression and minification of text-based files such as scripts and stylesheets, re-compression of some downloadable files, zero-body components, and more.
1. Turn on gzipping!
This is the easiest and most effective step. How much improvement can you expect from gzip? On average, around a 70% reduction. Google Page Speed will warn you if gzip is not enabled.
You should gzip JavaScript, CSS, plain text, HTML, XML… basically anything that’s not a binary file.
To enable compression, configure your web server to set the Content-Encoding: gzip header on all compressible resources. You can also use deflate, which relies on the same compression algorithm, but it is not as widely supported, so gzip is the recommended choice.
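Production servers usually enable this with a single directive (mod_deflate in Apache, gzip on in nginx), but the mechanics are easy to see in code. Here is a minimal sketch using Node’s built-in http and zlib modules; the file name index.html and port 8080 are just placeholders for the example:

```js
const http = require('http');
const zlib = require('zlib');
const fs = require('fs');

http.createServer((req, res) => {
  const raw = fs.createReadStream('index.html'); // placeholder file
  // Only compress when the client advertises gzip support.
  if (/\bgzip\b/.test(req.headers['accept-encoding'] || '')) {
    res.writeHead(200, { 'Content-Type': 'text/html', 'Content-Encoding': 'gzip' });
    raw.pipe(zlib.createGzip()).pipe(res);
  } else {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    raw.pipe(res);
  }
}).listen(8080);
```

Note how the server checks the Accept-Encoding request header first: a client that doesn’t announce gzip support gets the uncompressed bytes.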
To ensure that your content compresses well, Google’s Best Practices also advise specifying CSS key-value pairs in the same order where possible (alphabetize them, for instance) and doing the same for HTML attributes. Use consistent casing, lowercase wherever possible, and consistent quoting. The more repetitive and uniform your markup, the better gzip’s dictionary-based algorithm can compress it.
2. Minify: JavaScript, CSS, HTML
According to the Best Practices, minifying JavaScript code can save many bytes of data and speed up downloading, parsing, and execution. In practice this means removing unnecessary bytes: extra spaces, line breaks, and indentation. In a nutshell, you strip everything from your code that is not essential for execution.
Minifying CSS has the same effect and the same benefits as minifying JavaScript: reduced network latency, better compression, and faster browser loading and rendering. The best way to get it done is with free tools such as YUI Compressor or cssmin.js. Finally, minifying HTML, including any inline JavaScript and CSS it contains, saves many more bytes of data.
How much size reduction can you expect from minification? It can make your scripts up to 85% leaner.
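Minification is also easy to automate as part of a build step. As a minimal sketch, assuming the terser npm package is installed, this script minifies a JavaScript source string and prints the result:

```js
const { minify } = require('terser');

const source = `
  // Add two numbers and log the result.
  function add(first, second) {
    return first + second;
  }
  console.log(add(1, 2));
`;

minify(source).then(({ code }) => {
  // Comments, whitespace, and long identifiers are gone, e.g.:
  // function add(n,r){return n+r}console.log(add(1,2));
  console.log(code);
});
```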
3. Other recommendations
Other recommendations include:
- remove unused CSS: many websites reuse the same external CSS file for all of their pages, even when many of the rules it defines don’t apply to the current page.
The best way to minimize the latency caused by stylesheet loading and rendering is to cut down on the CSS footprint, i.e. remove or defer the rules that the current page doesn’t use. Conveniently, when you run Page Speed against a page that references CSS files, it identifies every CSS rule that doesn’t apply to that page.
- defer loading of JavaScript: in some browsers, all other downloads are blocked while JavaScript is being processed, which adds considerably to latency. Deferring the loading of JavaScript until it is actually needed can help; see the first sketch after this list.
- optimize images: proper formatting and compression. Improperly optimized images take up more space than they need to; for users on slow connections it is especially important to keep image sizes to a minimum.
Basic optimization includes cropping unnecessary space, reducing color depth to the lowest acceptable level, removing image comments, and saving the image in an appropriate format. This can be done with any image editing program. Advanced optimization involves further (lossless) compression of JPEG and PNG files.
- serve scaled images: when you want to display the same image at several sizes, serve a single image resource and use HTML or CSS in the containing page to scale it, but make sure the file itself is no larger than the largest size actually shown; shipping an oversized original and shrinking it in the browser wastes bytes. The second sketch after this list shows one way to pre-scale images.
- serve resources from a consistent URL: assign each shared resource one unique URL, to eliminate duplicate download bytes and additional round-trip time. When you need to reference the same resource from multiple places in a page, make sure it is always fetched from that single URL.
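Deferred loading needs nothing more than a few lines of browser JavaScript. A minimal sketch, assuming a non-critical script at the hypothetical path /js/analytics.js that can wait until the page has finished rendering:

```js
// Inject a non-critical script only after the page has fully loaded,
// so it never blocks the initial render.
window.addEventListener('load', () => {
  const script = document.createElement('script');
  script.src = '/js/analytics.js'; // hypothetical deferred script
  script.async = true;
  document.body.appendChild(script);
});
```

Modern browsers also support the defer and async attributes directly on script tags, which achieve much the same thing declaratively.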
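Pre-scaling and recompressing images is just as easy to script. A minimal sketch, assuming the sharp npm package and a hypothetical oversized source file hero-original.jpg, that writes an 800-pixel-wide, recompressed copy:

```js
const sharp = require('sharp');

sharp('hero-original.jpg')   // hypothetical oversized source image
  .resize({ width: 800 })    // scale down to the largest displayed width
  .jpeg({ quality: 80 })     // recompress at a lower, acceptable quality
  .toFile('hero-800w.jpg')   // write the optimized copy
  .then((info) => console.log(`wrote ${info.size} bytes`));
```

For the “advanced optimization” case where no quality loss is acceptable, lossless tools such as jpegtran and OptiPNG do the further compression.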
They say that the average site today is 1.5 MB in size. Too big! That makes keeping your payload clean all the more important, and the effort basically comes down to minifying your code, compressing files with gzip, and serving images scaled and optimized. Not too much, if you consider the results.