Page bloat update: The average top 1000 web page is 1795 KB in size


Page bloat. It’s insidious, and it’s one of the greatest causes of slow load times. Today let’s talk about why the average web page has grown by 186% since 2010, and what we can do to help mitigate its impact on performance.

Remember last spring, when I speculated that an intriguing downward spike in the size of the average top 1000 web page was probably a blip, not a trend? It looks like I was right.

It was pretty exciting news at the time: according to the HTTP Archive, in May of this year the average top 1000 page was 1491 KB in size, down from the 1575 KB noted in November 2013. Even so, I was inclined to take the pessimistic view that this was an isolated incident, triggered by a decrease in “other” content types (caused either by a decrease in the use of video on home pages or by an undocumented change in the HTTP Archive’s testing process).

1. The average top 1000 web page is 1795 KB in size.

The average page has grown by 20% in just six months, from 1491 KB in May to 1795 KB today. If that number doesn’t blow you away (and, just my two cents, it should), then perhaps this one will: the average page has grown by 186% since 2010.
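For the skeptical, the arithmetic checks out. (Note that the roughly 628 KB baseline below isn’t a figure I’m quoting from the HTTP Archive; it’s simply what a 186% increase to 1795 KB implies for 2010.)

$$\frac{1795 - 1491}{1491} \approx 0.204 \approx 20\%, \qquad \frac{1795\ \text{KB}}{1 + 1.86} \approx 628\ \text{KB in 2010}$$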

This graph offers a good snapshot of how this payload breaks down across the various content types:

Page bloat - November 2014

As you can see, “other” accounts for a growing portion of payload, so let’s start by discussing that…

2. “Other” content type is back in growth mode.

The average page contains 210 KB worth of “other” content; my best guess is that this is primarily third-party scripts. (I don’t have any guesses as to what caused the dip in this area back in the spring. If you do, please throw your ideas my way.)

I talk a lot about third-party scripts (here, here, and here, for example), so I don’t want to rehash all that. TL;DR version: Not only do third parties increase page weight and latency, they also represent the single greatest potential point of failure for web pages. All it takes is one non-responsive, unoptimized third-party script to take down an entire site.

If you’re not already deferring your scripts or using asynchronous versions, then you need to do that. You should also be constantly auditing and monitoring your scripts and pruning the dead wood. (I was at a conference recently where someone told me about doing an audit on a site that had dozens of old scripts that were making dead-end server calls. He visibly shuddered.)

Page bloat - November 2014
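If you’re wondering what that deferred loading looks like in practice, here’s a minimal sketch in TypeScript. The script URL and the load-event trigger are placeholders, not anyone’s production setup; the point is that a third-party script injected this way can’t block your page from rendering, and a failure degrades silently instead of taking the whole site down with it.

```typescript
// Inject a third-party script asynchronously so it can never block rendering.
function loadScriptAsync(src: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const script = document.createElement("script");
    script.src = src;
    script.async = true;
    script.onload = () => resolve();
    script.onerror = () => reject(new Error(`Failed to load ${src}`));
    document.head.appendChild(script);
  });
}

// Hypothetical usage: wait until the page's own content has loaded,
// then fetch the third-party widget. If it fails, the page shrugs it off.
window.addEventListener("load", () => {
  loadScriptAsync("https://thirdparty.example.com/widget.js").catch(() => {
    // A dead third-party server now costs you one feature, not your site.
  });
});
```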

3. Images account for 56% of the average page’s total size.

When I started using the web, it was all text, no images. I remember the day I was first able to add an image to a page — what a rush! That rush makes sense. Human beings love visuals. And that’s not going to change. So if we’re going to continue to use them, we need to make them work smarter, not harder.

Right now, the average page is 1795 KB, and exactly 1000 KB of that payload is images. And odds are that many, if not most, of those images are unoptimized: uncompressed, unconsolidated, incorrectly formatted, and wrongly sized. Go here and here to get started on fixes.

Page bloat: Images
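To make the “wrongly sized” point concrete, here’s one hedged sketch (TypeScript again, with made-up image URLs): instead of shipping a single oversized file to every device, expose a few pre-scaled variants and let the browser pick the smallest one that will still look sharp, via srcset.

```typescript
// Give the browser size-appropriate choices instead of one oversized image.
// Assumes the server hosts pre-scaled variants at these hypothetical URLs.
function makeResponsive(img: HTMLImageElement, baseUrl: string): void {
  img.srcset = [
    `${baseUrl}-480.jpg 480w`,
    `${baseUrl}-960.jpg 960w`,
    `${baseUrl}-1920.jpg 1920w`,
  ].join(", ");
  // Tell the browser how wide the image will render, so it can choose
  // the smallest variant that still looks sharp on this screen.
  img.sizes = "(max-width: 600px) 100vw, 50vw";
  img.src = `${baseUrl}-960.jpg`; // fallback for browsers without srcset
}

document
  .querySelectorAll<HTMLImageElement>("img[data-base]")
  .forEach((img) => makeResponsive(img, img.dataset.base!));
```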

4. Flash continues to decline, while custom font use continues to rise.

Most of us know about Moore’s Law. And some of us know about Wirth’s Law. I was joking with a fellow Velocity conference-goer that I want to coin Tammy’s Law: for every performance-leeching content type that we see fall out of use, a new one rises to take its place. And the new one has the potential to be an even worse performance problem than the one it displaces.

To illustrate, note the decline in the use of Flash over the past four years. And then note the rise of custom fonts. To clarify, I’m not saying that custom fonts are inherently evil. They don’t have to be performance bad guys. Get tips here, here, and here on how you can have your gorgeous custom-font cake and eat it, too.

Page bloat: Flash versus custom fonts
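As one example of having that cake and eating it, here’s a sketch using the browser’s Font Loading API (the “BrandFont” name and the .woff2 URL are placeholders I made up): the page renders immediately in a fallback font, and opts into the custom font only once it has actually arrived.

```typescript
// Load a custom font without letting it block (or blank out) the text.
// "BrandFont" and the .woff2 URL are hypothetical placeholders.
const brandFont = new FontFace(
  "BrandFont",
  "url(/fonts/brandfont.woff2) format('woff2')"
);

brandFont
  .load()
  .then((loadedFont) => {
    document.fonts.add(loadedFont);
    // CSS keyed off this class switches from the fallback to BrandFont.
    document.documentElement.classList.add("fonts-loaded");
  })
  .catch(() => {
    // If the font never arrives, readers simply keep the fallback font.
  });
```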

Conclusion

If page bloat is hurting desktop performance — which it certainly is — just think of the pain it’s causing for mobile. Mobile use has overtaken desktop use, yet 2 MB pages are waiting for us just around the corner. And we’re increasingly aware that m-dot sites are not a cure-all. This is a problem that’s only going to get worse.

Tammy Everts

As a former senior researcher, writer, and solution evangelist for Radware, Tammy Everts spent years researching the technical, business, and human factor sides of web/application performance. Before joining Radware, Tammy shared her research findings through countless blog posts, presentations, case studies, whitepapers, articles, reports, and infographics for Strangeloop Networks.
