Google is working on a way for Chrome to label slow websites

Shawn Knight

A tough task: While Google's intentions seem well-placed, successfully implementing such a system could be incredibly tricky. Aside from encouraging site owners to adopt best practices for optimizing their sites for speed, I'm not sure how effective it can really be. For example, will it be able to account for server hiccups and traffic spikes, and do so without bias (including toward Google's own sites and services)?

Speed is a core attribute of any web browser worth using, but with so many factors contributing to a website's perceived zippiness, how can you reliably differentiate between poor network conditions and a truly slow website? Google wants to help.

The tech giant is in the early stages of creating a badging system that’ll identify websites that offer high-quality experiences – namely, those that are optimized for speed and have a solid track record of delivering on a consistent basis.

In announcing the initiative, members from Google’s Chrome team said the feature may take a number of forms, adding that they are experimenting with different options to see which provide the most value for users. For example, they may lean on historical load latencies as one metric.

More advanced iterations may even identify when a site is likely to be slow based on the device you are using or network conditions.
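To make the idea concrete, here is a minimal sketch of what a badging heuristic built on historical load latencies might look like. The threshold, function name, and "slow/fast" labels are illustrative assumptions for this sketch, not Chrome's actual implementation.

```python
from statistics import median

# Assumed cutoff for labeling a site "slow" -- an illustrative value,
# not a figure from Google's announcement.
SLOW_THRESHOLD_MS = 3000

def classify_site(load_times_ms):
    """Label a site from a history of page-load times (milliseconds)."""
    if not load_times_ms:
        return "unknown"  # no history yet, so no badge either way
    # Median resists one-off server hiccups and traffic spikes better
    # than the mean, addressing the consistency concern above.
    return "slow" if median(load_times_ms) > SLOW_THRESHOLD_MS else "fast"

print(classify_site([1200, 900, 1500]))   # consistently quick history
print(classify_site([4200, 5100, 3900]))  # consistently slow history
```

Using the median of past loads is one way such a system could reward sites with "a solid track record of delivering on a consistent basis" rather than penalizing a single bad request.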

Google highlights a couple of tools – PageSpeed Insights and Lighthouse – as starting points to evaluate the performance of your website and encourages owners not to wait to optimize their sites.

Masthead credit: Loading by Yuttanas

Will be interesting to see their results.

Personally, I've found two metrics which have a major impact on page loading:

  1. number of objects loaded
  2. loading from non-site resources (like a CDN)

The number of unique <img>, <script>, and <style> tags causes lots of server file I/O. Only three are really needed: one for scripts, one for CSS, and one for the graphics combined into a single sprite file.

The other problem, which the site developer cannot control, is the effect of a CDN deployment: static content is pushed to edge servers, and if there are network issues with those references, the website itself takes the blame.
 
I like this idea; it will encourage more web developers to use GTmetrix and optimize their sites.
 