
The importance of performance - Web Directions

When I first started developing for the web, networks were glacially slow by today’s standards. In the early to mid 1990s a dialup modem speed of 14400bps (bits per second) was common, and 56Kbps (kilobits per second) was a very fast connection. That’s bits, so divide by 8 to get bytes per second–we’re talking roughly 1,800 to 7,000 bytes per second.

So developers had to be particularly parsimonious with their use of this bandwidth, or pages would effectively never load. Now, there wasn’t much if any JavaScript being delivered, but there were images–so many images. We used images for text (because the fonts available were so limited, we thought it would be great to do ‘typography’ by loading up tens of kilobytes worth of pictures of text, impacting not only page load times but accessibility as well). We used images for the borders of things (go and take a look at the amount of markup, and the number of images, needed to make an element with rounded corners before border-radius).

The img element even gained a lowsrc attribute, where you’d put the URL of a low resolution (and so smaller) image to be downloaded and shown while the browser downloaded the bigger, higher resolution image file, which you linked to, then as now, in the src attribute.
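In markup it looked something like the snippet below (the file names are illustrative; lowsrc was never standardised and is ignored by modern browsers):

```html
<!-- Historical technique: the browser showed photo-small.jpg
     while photo-full.jpg was still downloading -->
<img lowsrc="photo-small.jpg" src="photo-full.jpg" alt="A sunset over the bay">
```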

At some point we worked out that, due to the way HTTP/1.1 works, a single large file in most cases gave higher network performance than a number of smaller files, so we started concatenating our files together–several CSS files became one larger CSS file (a related technique was ‘domain sharding‘, where a site’s files would be hosted at different domains to get around HTTP/1.1’s limit on the maximum number of simultaneous connections from a browser to a given domain).

And then there was minifying our files–removing redundant whitespace and, in JavaScript, making variable names as short as possible–and gzipping our files as well.

In the early 2000s, at least in much of the global north, the rise of cable and other broadband internet services, and the dramatic associated increase in network speeds, saw performance take a back seat–and, perhaps not coincidentally, a huge increase in the use of JavaScript (which today accounts for more or less the same amount of page weight as images do). Couple this with the rise of smartphones and the associated lower performance of mobile networks (the original iPhone didn’t even use 3G networks), and performance once again became a focus for developers, as it has been ever since.

In the last few years, a lot of the responsibility for improved web site and application performance has moved away from the backend and operations, and toward front end developers. While some of the older techniques are still valid and useful, others, like concatenation, are now detrimental to performance: with HTTP/2 widely adopted, a number of smaller files downloaded in parallel often gives better performance than a single larger file.

Performance as a page rank signal

We’ve long known (often more anecdotally than scientifically) that performance correlates with important commercial metrics like shopping cart abandonment rates and site bounce rates. Meanwhile, the tools (like those in our browsers, but also Lighthouse, the venerable and ever evolving WebPageTest, and a slew of great commercial services as well) and techniques for improving performance have improved enormously over the years.

But this month sees the importance of performance step up to another level, as Google begins using performance as a signal as part of their ranking systems.

Google’s focus is on Core Web Vitals: (presently) three metrics, Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS), all relatively new, user-centric metrics that focus on perceived page load performance.
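Each metric comes with published “good” and “poor” thresholds (per Google’s web.dev documentation as of 2021). A small sketch that classifies a measured value against them–`rateVital` is a hypothetical helper, not part of any library:

```javascript
// Google's published Core Web Vitals thresholds (2021):
// [good-if-at-most, poor-if-above]
const THRESHOLDS = {
  LCP: [2500, 4000], // milliseconds
  FID: [100, 300],   // milliseconds
  CLS: [0.1, 0.25],  // unitless layout-shift score
};

function rateVital(name, value) {
  const [good, poor] = THRESHOLDS[name];
  if (value <= good) return 'good';
  if (value <= poor) return 'needs improvement';
  return 'poor';
}

console.log(rateVital('LCP', 1800)); // "good"
console.log(rateVital('CLS', 0.3));  // "poor"
```

In the field you’d feed this from real measurements, for example via Google’s web-vitals JavaScript library or the browser’s PerformanceObserver API.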

So now more than ever, paying attention to web page performance is a critical aspect of front end development.

Lazy Load has you covered

New in 2021, and coming up in mid June, Lazy Load is our frontend performance focussed conference that covers these issues and more in real depth. Annie Sullivan, who leads the team that develops Core Web Vitals at Google, goes in depth–alongside a dozen other presentations on the latest in web performance, MC’d and curated by Henri Helvetica.

It couldn’t be better timed, and with early bird pricing of just $145 before Friday April 28th it couldn’t be better value.

In 2021 we have a whole series of events for Front End Developers

Across 2021 Web Directions is presenting a series of online conferences for front end designers and developers. Focussed deep dives, they go far beyond what you might expect from conference presentations.

Learn more and register now

Across the year we have 6 conferences planned, the first 4 announced, focussing on Front End Performance, JavaScript, Progressive Web Apps, Accessibility, and Security, privacy and identity.

Priced individually from $145, or attend all 6, plus get access to our conference presentation platform Conffab for just $595, or $59 a year.
