
Patience on Web: How to Make a Website Faster!

DRAFT

Patience is a virtue!

Unless it costs you billions. It's well documented that long load times, a sluggish UI, and an unresponsive app are among the surest ways to lose users, buyers, and customers. Some years ago Amazon calculated that a page load slowdown of just one second could cost it $1.6 billion in sales per year. More generally, if an eCommerce site is making $100,000 per day, a one-second delay can cost it $20,000. Users' expectations are only going to rise; fast load times, a smooth UX, and intelligent interaction aren't an afterthought for serious businesses whose bottom line relies on their technology. In fact, reliability and performance can be distinguishing features for startups, not to mention that search engines now take a site's loading time into account in search ranking. Overall, a good experience improves user satisfaction.

“Every millisecond matters.”
Arvind Jain, a Google engineer

Improving Performance and Load Times

Measure first, optimize second; start with the biggest solvable bottleneck.

Outline:

  • Images
  • Svg
  • Spritesheet
  • Minification (css,js,html)
  • Http Compression (gzip)
  • Caching On Client
  • Caching On Server
  • Cdn
  • Ajax
  • New Protocols: Http2, Websocket
  • Blocking stylesheets (link media print)
  • Blocking javascript, Async, Defer
  • Streaming response on Server
  • Less specific CSS rules are faster
  • Redirects
  • Server Side Rendering of View
  • Service Worker & Offline
  • Streaming Api

First, the easiest things you can do are:

Install Pagespeed module: https://developers.google.com/speed/pagespeed/module/

Run your site through Pagespeed Insights: https://developers.google.com/speed/pagespeed/insights/

Images

Use Correct Format

Use JPG over PNG where possible, mainly when transparency isn't required. Use SVG over all other formats when possible, which is usually only feasible for simpler graphics.

Compress Images automatically and manually

Use build tools to minify images, such as gulp-imagemin or grunt-contrib-imagemin, optionally combined with gulp-imageoptim. For large images, a designer should compress them manually with a better compression algorithm while keeping an eye on quality.
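A minimal gulp sketch of the build-tool approach, assuming gulp and a CommonJS-compatible version of gulp-imagemin; the paths are placeholders:

    // Lossless image optimization as a build step.
    const gulp = require('gulp');
    const imagemin = require('gulp-imagemin');

    gulp.task('images', () =>
      gulp.src('src/images/**/*.{png,jpg,svg,gif}')
        .pipe(imagemin())                 // default plugins, lossless
        .pipe(gulp.dest('dist/images'))
    );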

There are also several standalone tools, paid and free (CLI, online, GUI, and Photoshop plugins), all of which can help further and sometimes do a better job.

more information: http://addyosmani.com/blog/image-optimization-tools/
more information: https://youtu.be/pNKnhBIVj4w?t=170
more information: https://www.udacity.com/course/viewer#!/c-ud892/l-5332430837/m-5325220785
more information: http://jamiemason.github.io/ImageOptim-CLI/comparison/jpeg/jpegmini-and-imageoptim/desc/

Serve Images Responsively and Generate Different sizes of images for different resolutions and screen widths

Use grunt-responsive-images (generates images of varying sizes) together with imager.js (lazy loads the appropriate image at the minimum size and resolution needed).
Also check out the img srcset attribute and the picture element: http://alistapart.com/article/responsive-images-in-practice
Browser support is still patchy and the syntax is complex, but it lets you define different image formats and different sources for different sizes and resolutions.
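A hedged sketch of the srcset/picture approach; the file names, widths, and breakpoint below are placeholders, not from the original:

    <!-- The browser picks the smallest image that satisfies the current viewport and density. -->
    <picture>
      <source media="(min-width: 800px)" srcset="hero-800.jpg 1x, hero-1600.jpg 2x">
      <img src="hero-400.jpg"
           srcset="hero-400.jpg 400w, hero-800.jpg 800w"
           sizes="100vw"
           alt="Hero image">
    </picture>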

 

Detect and Use Webp

On the server: it's becoming more common for web clients to send an Accept request header indicating which content formats they are willing to accept in response. If a browser indicates in advance that it will accept the image/webp format, the web server knows it can safely send WebP images, greatly simplifying content negotiation.

https://developers.google.com/speed/webp/faq#how_can_i_detect_browser_support_using_javascript
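A minimal sketch of the server-side idea; it assumes a Node/Express server and that a .webp variant exists next to every .jpg/.png (both assumptions for illustration, not production-grade negotiation):

    const path = require('path');
    const express = require('express');
    const app = express();

    app.get('/images/:name', (req, res) => {
      // Browsers that support WebP advertise it in the Accept request header.
      const supportsWebp = (req.headers.accept || '').includes('image/webp');
      const file = supportsWebp
        ? req.params.name.replace(/\.(jpe?g|png)$/i, '.webp')
        : req.params.name;
      res.sendFile(path.join(__dirname, 'images', file));
    });

    app.listen(3000);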

On the client: use Modernizr to load the correct image format. http://www.stucox.com/blog/using-webp-with-modernizr/

 

Serve low Resolution and then load high resolution in background.
If speed trumps quality in your case, load a lower-quality image by default, then load a second, higher-quality image with JavaScript and swap the src of the low-quality image for the high-quality one. The only downside is that more bandwidth gets used, so this is recommended mainly for desktop users.
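A small sketch of the swap; the data-hires attribute is an assumption invented for this example:

    // Start with the low-quality src already in the markup, then swap in the
    // high-resolution version once it has finished downloading.
    document.querySelectorAll('img[data-hires]').forEach(function (img) {
      var hires = new Image();
      hires.onload = function () { img.src = hires.src; };
      hires.src = img.dataset.hires;
    });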

Enable client side caching
Set max-age to a high value on static resources via server-side configuration, and whenever a file changes, change its name. The easiest way to do this is gulp-rev: it appends the file's hash to its filename, so when the file changes its name changes too, which invalidates the cache.
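A minimal gulp-rev sketch; the paths are placeholders:

    const gulp = require('gulp');
    const rev = require('gulp-rev');

    gulp.task('rev', () =>
      gulp.src('dist/**/*.{css,js,png,jpg}')
        .pipe(rev())             // e.g. app.css -> app-d41d8cd9.css
        .pipe(gulp.dest('dist'))
        .pipe(rev.manifest())    // manifest mapping original -> hashed names
        .pipe(gulp.dest('dist'))
    );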

Use Progressive Jpg
Save images as progressive JPGs, which is not the default in Photoshop and other programs. Progressive JPGs take slightly longer to load in full, but the whole image becomes visible early and sharpens layer by layer, instead of being rendered top to bottom pixel by pixel.

Inline small image
Small images can be converted to data URIs and inlined right into the CSS, sparing an extra HTTP request.

https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/image-optimization#eliminating-and-replacing-images

 

Svg

Use build tools to minify svg files, like gulp-svgmin or grunt-contrib-svgmin.

Inline SVG files (note that inlining disables caching, so test and compare the benefits).

Manually optimize: sometimes it's possible to remove points from an SVG without affecting quality, which can be done with tools like https://github.com/svg/svgo-gui

 

 

Spritesheet

A spritesheet is basically all the small images of a site combined into one big image; the individual graphics are then referenced with the background-image property in CSS and clipped. This turns 13 requests for 13 assets into 1 request for 13 assets. A minimal CSS sketch follows below.

There are many tools that assist in the process.
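A minimal CSS sketch of the technique; the file name, icon size, and offsets are placeholders:

    /* One downloaded image, many icons. */
    .icon {
      background-image: url('sprite.png');
      width: 32px;
      height: 32px;
      display: inline-block;
    }
    .icon-search { background-position: 0 0; }
    .icon-cart   { background-position: -32px 0; }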

Minification (css,js,html)

Inline the critical CSS and JS. Minify and concatenate the rest of the JS and CSS into one big file each.

Minification simply removes whitespace and shortens variable names. Check out https://github.com/gmarty/grunt-closure-compiler

For CSS, minifiers can remove unused rules, strip whitespace, and merge duplicate rules. Check out https://github.com/ben-eb/gulp-uncss
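As one common way to do the concatenate-and-minify step (an alternative to the Closure Compiler plugin linked above), here is a gulp sketch using gulp-concat and gulp-uglify; the paths and output name are placeholders:

    const gulp = require('gulp');
    const concat = require('gulp-concat');
    const uglify = require('gulp-uglify');

    gulp.task('scripts', () =>
      gulp.src('src/js/**/*.js')
        .pipe(concat('app.min.js'))   // one big file, one HTTP request
        .pipe(uglify())               // strip whitespace, shorten variable names
        .pipe(gulp.dest('dist/js'))
    );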

 

Http Compression (gzip)

Whether you use nginx or Apache, both make it easy to enable gzip for all resources, which reduces transfer size. Network speed, not computing speed, is the bottleneck on today's devices, so spending CPU on compression usually pays off, though there is a break-even point.
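If the app is served straight from Node rather than behind nginx or Apache, the compression middleware gives the same result; a minimal sketch, assuming Express:

    const express = require('express');
    const compression = require('compression');

    const app = express();
    app.use(compression());            // gzip responses when the client sends Accept-Encoding: gzip
    app.use(express.static('public')); // static assets now go out compressed
    app.listen(3000);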

However, measure and test. Gzip can sometimes produce larger files, especially for very small files, because of how gzip works.

Pre-compress files at build time; nginx understandably uses the lowest compression level for per-request gzipping. Check out https://www.npmjs.com/package/gulp-zopfli

The Cloudflare CDN automatically gzips the files it caches.

Caching On Client

Enable caching of resources by setting an expiration date on all of them. Good practice is to set a far-future expiration date and then change the filename whenever the content changes. The expiration date tells the browser how long the file is expected not to change; the browser saves it and, next time, serves it from memory instead of hitting the server.
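A sketch of far-future caching for static assets, assuming Express; pair it with hashed filenames (see gulp-rev above), and note that the immutable option needs a recent Express/serve-static:

    const express = require('express');
    const app = express();

    app.use(express.static('public', {
      maxAge: '365d',   // Cache-Control: max-age=31536000
      immutable: true,  // the file will never change under this name
    }));

    app.listen(3000);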

However, page load performance shouldn't rest entirely on client-side caching. The page needs to be fast as is; caches are often unreliable and easily invalidated, and therefore a bad footing for a performance strategy.

 

Caching On Server

On the server side, implement a thorough caching strategy. One of the best ways to do this is Memcached.

Cache static pages or pages that change slowly. Hitting the cache is faster than hitting the database for two reasons: the cache lives in memory whereas the database stores data on disk, and the cache usually stores the end result, whereas the database often has to 'calculate' the final data. A toy sketch of the idea follows below.

See the Udacity course on web development.
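A toy illustration of the cache-then-database idea only (a Map standing in for Memcached); fetchArticleFromDb is a stand-in for a real query:

    const cache = new Map();

    async function getArticle(id) {
      if (cache.has(id)) return cache.get(id);      // fast: final result already in memory
      const article = await fetchArticleFromDb(id); // slow: disk access plus computation
      cache.set(id, article);                       // store the end result for next time
      return article;
    }

    // Stand-in for the real database call.
    function fetchArticleFromDb(id) {
      return new Promise((resolve) =>
        setTimeout(() => resolve({ id, html: '<article>...</article>' }), 100));
    }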

Domain Sharding

A browser limits how many simultaneous connections it can open to one host. Allocate subdomains to certain resources and requests won't be backlogged waiting for another resource to free up a connection.

Another benefit is that cookies are sent along with every HTTP request, and there is no point in sending cookie data for static assets. So the domain static.example.com can be cookie-free while www.example.com carries cookies. If, however, cookies have been set for example.com, they will be sent to static.example.com as well; in that scenario it's best to buy another domain and allocate it to cookie-free resources.

However, there is a DNS lookup cost associated with too many domains, so measure and test.

Cdn

Use a CDN for all static resources; CDNs usually have domain sharding built in by default. CDN servers are physically closer to the user, so delivery time is lower, and they cache results and serve them faster. As with everything, test and measure.

 

Ajax

Load the essential resources (HTML, CSS, JS) first, then post-load everything else using JavaScript. Make sure to enable caching on Ajax responses by setting an Expires date.

Preloading means fetching assets for upcoming pages once everything for the current page has loaded; this puts content in the browser's cache before it is needed.
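A small sketch of post-loading non-critical content; the fragment URL and the #related container are assumptions for illustration:

    // Fetch non-critical content only after the initial page has loaded.
    window.addEventListener('load', function () {
      fetch('/fragments/related-articles.html')
        .then(function (res) { return res.text(); })
        .then(function (html) {
          document.querySelector('#related').innerHTML = html;
        });
    });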

Configure ETags

Remove ETags from HTTP response headers. They lead the browser to treat the same resource as different when the ETag was generated by a different server, as happens with CDNs and proxies.

 

Iframes slow down things

If possible, don't use iframes; they slow everything down and there is no way around that.

 

 

Delegate Events

It's better to have one parent element listening for the click event and deciding what to do based on the event target than to have 10 buttons each with their own click listener. Too many event listeners clog the JS event loop unnecessarily.
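A small sketch of event delegation; the #toolbar container, the data-action attribute, and handle() are all invented for this example:

    // One listener on the parent instead of one per button.
    document.querySelector('#toolbar').addEventListener('click', function (e) {
      var button = e.target.closest('button[data-action]');
      if (!button) return;               // click landed outside any button
      handle(button.dataset.action);     // decide what to do from the event target
    });

    function handle(action) { console.log('clicked:', action); }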

 

 

New Protocols: Http2, Websocket

Blocking stylesheets (link media print)


Normal Critical Path: Get html → Parse/Construct Dom tree → Get Css & JS → Parse/Construct CSSOM tree  → Run JS  → Merge & Render CSSOM and whatever amount of DOM is present → Paint.

Put a media query on link tags; this tells the browser not to block on stylesheets that don't apply immediately. Stylesheet downloading is already asynchronous natively (JS downloading, in contrast, is synchronous and blocks the parser).

<link rel="stylesheet" type="text/css" href="print.css" media="print">

Inline the critical CSS in the head and load the rest of the CSS using JavaScript, which can be done by hiding the main div and showing it via JS once the CSS has loaded.
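A sketch of loading the non-critical stylesheet with JavaScript after first paint; the file path is a placeholder:

    window.addEventListener('load', function () {
      var link = document.createElement('link');
      link.rel = 'stylesheet';
      link.href = '/css/non-critical.css';
      document.head.appendChild(link);
    });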

Just moving the stylesheet link tag down in the body won't help since, unlike the DOM, the CSSOM isn't built incrementally: all the CSS is downloaded and parsed before any of it is applied.

JS runs after the CSSOM is done, and JS blocks DOM parsing until it has finished. Large CSS will delay JS fetching/execution, which delays DOM parsing, which delays paint. So if a CSS file is big and hasn't finished loading and parsing, it will stall the DOM parser as soon as the parser encounters a script tag.

Script tags at the bottom of the body are a different story: there they no longer block parsing, but they do block rendering. The browser can construct the DOM without waiting for all the JS files to download, but their execution time still delays the rendering and painting of the page.

more information: https://www.youtube.com/watch?v=hW4FDYeONdg

Blocking javascript, Async, Defer

Normal Critical Path: Get html → Parse/Construct Dom tree → Get Css & JS → Parse/Construct CSSOM tree  → Run JS  → Merge & Render CSSOM and DOM → Paint

Here’s what happens when a browser loads a website:

  1. Fetch the HTML page (e.g. index.html)
  2. Begin parsing the HTML
  3. The parser encounters a <script> tag referencing an external script file.
  4. The browser requests the script file. Meanwhile, the parser blocks and stops parsing the other HTML on your page.
  5. After some time the script is downloaded and subsequently executed.
  6. The parser continues parsing the rest of the HTML document.

Step 4 causes a bad user experience. Your website basically stops loading until you’ve downloaded all scripts. If there’s one thing that users hate it’s waiting for a website to load.

Why does this even happen?

Any script can insert its own HTML via document.write() or other DOM manipulations. This implies that the parser has to wait until the script has been downloaded & executed before it can safely parse the rest of the document. After all, the script could have inserted its own HTML in the document.

Add the async attribute to any script tag that refers to JavaScript which isn't critically important; async prevents parser blocking. Putting script tags at the bottom of the body alone isn't enough, since that delays their download until the very end of the HTML file.

The defer attribute is similar to async, except that the order of the script tags is preserved and deferred scripts run only after the document has finished parsing.
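A short illustration of the two attributes; analytics.js and app.js are placeholder names:

    <!-- Runs as soon as it has downloaded; order is not guaranteed. -->
    <script src="analytics.js" async></script>
    <!-- Runs after parsing finishes, in document order. -->
    <script src="app.js" defer></script>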

more information: http://stackoverflow.com/questions/436411/where-is-the-best-place-to-put-script-tags-in-html-markup

Streaming response on Server

To start getting data to the user as soon as possible, streaming is an easy solution. Even if the database is taking time to respond, you can probably start sending the header- and navigation-related HTML right away.

It's quite easy in Node.js: you just write or pipe whenever you get data.
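A minimal Node sketch of the idea: flush the shell of the page immediately, then end the response when the slower data arrives. fetchArticlesHtml is a stand-in for the real data access:

    const http = require('http');

    http.createServer(function (req, res) {
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.write('<html><head><title>...</title></head><body><nav>...</nav>'); // shell goes out right away
      fetchArticlesHtml().then(function (html) {
        res.end(html + '</body></html>'); // rest of the page follows when ready
      });
    }).listen(8080);

    // Stand-in for a slow database call.
    function fetchArticlesHtml() {
      return new Promise(function (resolve) {
        setTimeout(function () { resolve('<main>articles...</main>'); }, 200);
      });
    }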

Or use HTTP/2's multiple data frames.

Less specific CSS rules are faster

This could be seen as a micro-optimization, but it's something to keep in mind nonetheless; effort shouldn't be spent on it unless it's really easy to fix or testing shows that CSS is the bottleneck in the critical rendering path.

The basic idea is that the more specific a CSS selector is, the more time the browser takes to apply that style to the DOM.

div p { ... }  /* is slower than */
p { ... }

The reason is that for every node the browser comes across in the DOM tree, it has to check other DOM nodes to determine whether the selector's constraints hold.

Redirects

There are three types of redirects: DNS, HTML, and JavaScript; here we are talking about HTML and JavaScript redirects. Redirects increase page load time because the browser has to make a new request, which involves a DNS lookup, a TCP handshake, and TLS negotiation, so it's even more costly for HTTPS sites. The worst case might look like:

http://example.com → https://example.com → https://www.example.com → https://m.example.com

Solutions Include: 

Use responsive design so the mobile site and the desktop site are the same site.

Use adaptive design: serve a custom site to mobile users by sniffing HTTP headers.

Server Side Rendering of View

Sending a script to the client that in turn requests more data and then renders it accumulates extra request overhead. Rendering the content on the server often reduces load time by cutting those extra requests.

However, it needs to be measured: rendering on the server can increase the time to deliver the whole HTML, whereas before, the first HTML arrived faster. So even if the total time is smaller, the site might appear to take longer because the first thing shows up much later.

more information: https://youtu.be/d5_6yHixpsQ?t=223

 

Streaming Api

I need to do more research on this, but the main idea is that the client makes an Ajax request and gets the data back in a streaming fashion.

https://github.com/whatwg/streams

 

Service Worker and Offline Viewing

It's a new improvement over AppCache: it acts as a programmable cache between the browser and the server which you can control using JavaScript. A minimal registration sketch follows after the links below.

need to do more research..

https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers

https://www.youtube.com/watch?v=4uQMl7mFB6g
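A minimal registration sketch; the /sw.js path is a placeholder and the actual caching logic would live in that file:

    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/sw.js')
        .then(function (reg) { console.log('service worker registered for', reg.scope); })
        .catch(function (err) { console.error('registration failed', err); });
    }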

 

PreResolve DNS

This is new: by adding these hints in the head tag, you tell the browser to start DNS lookups for those domains. When an actual resource request later targets one of them, the DNS lookup won't be needed, since the server's IP address will already be cached.

    <link rel="dns-prefetch" href="//www.domain1.com">
    <link rel="dns-prefetch" href="//www.domain2.com">

 

 

 

 

All resources

http://stackoverflow.com/questions/436411/where-is-the-best-place-to-put-script-tags-in-html-markup

http://jamiemason.github.io/ImageOptim-CLI/comparison/jpeg/jpegmini-and-imageoptim/desc/

http://addyosmani.com/blog/image-optimization-tools/

https://kinsta.com/learn/page-speed/

https://developers.google.com/speed/articles/optimizing-javascript

https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/image-optimization#eliminating-and-replacing-images

https://developer.yahoo.com/performance/rules.html

https://developers.google.com/speed/pagespeed/module/filter-domain-rewrite

https://developers.google.com/web/fundamentals/performance/critical-rendering-path/render-blocking-css

https://developers.google.com/web/fundamentals/performance/critical-rendering-path/adding-interactivity-with-javascript

https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers

https://developers.google.com/web/fundamentals/performance/critical-rendering-path/?hl=en

https://developers.google.com/web/fundamentals/performance/critical-rendering-path/analyzing-crp?hl=en

https://developers.google.com/web/fundamentals/performance/critical-rendering-path/measure-crp?hl=en

http://www.smashingmagazine.com/2015/11/why-performance-matters-part-2-perception-management/

http://www.pocketjavascript.com/blog/2015/11/23/introducing-pokedex-org
