Stop the presses: Has the average web page actually gotten SMALLER?

May 21, 2014 — by Tammy Everts

According to the HTTP Archive, the average top 1,000 web page is 1491 KB in size, 5% smaller than it was six months ago, when the average page reached a record size of 1575 KB.

But let’s not start celebrating yet.

Does this finding represent the start of a new trend toward smaller pages, or is it just an isolated incident? To answer this question, we need to look at the Archive’s other findings.


HTTP Archive: Page growth

1. The average top 1,000 web page is 1491 KB, down from 1575 KB six months ago.

Yes, this represents only a 5% decrease in page size, but after watching pages explode over the past three years, any news about smaller pages seems like good news. But as I’ve already said, let’s not put on our party hats just yet.

HTTP Archive: Average page size

2. The only content type that experienced significant shrinkage was “other”.

The “other” content type is a bit of a grey area. As the graph below indicates, it surged dramatically in size in late 2012, when the HTTP Archive changed the way it gathers data. Prior to the change, the tests stopped at document complete (window.onload). Afterward, the tests ran until the end of network activity, which increased the size and number of requests per page, and which led to an increase in the catch-all “other” category. The best guess is that this category includes video and third-party content.

Let’s try to understand why “other” content has shrunk by more than half — from 262 KB last November to 121 KB now. It’s unlikely that this shrinkage is due to any decrease in third-party content, since by all accounts, third-party scripts are on the rise. (Steve Souders recently shared findings that third-party calls can make up more than 50% of page requests.) So ruling out third-party content leaves us speculating that either the shrinkage is due to a decrease in use of video (quite possible) or an undocumented change in the testing process (somewhat possible).

HTTP Archive: “Other” content

3. Growth due to images is rampant.

We’re in love with images. Right now, they comprise a whopping 57% of the average page’s weight. Six months ago, images comprised 51% of a page’s weight.

If our love affair with images is only going to increase — and I’m sure it will — we need to get a handle on how we use them. Images are a major performance hurdle. Too often, they’re in the wrong format or uncompressed or unoptimized — or all three. We can do much better.
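One concrete way to get a handle on image weight is responsive images: where supported, the `srcset` attribute lets the browser pick the smallest file that fits the display, so small screens never download desktop-sized images. A minimal sketch (the filenames and widths here are hypothetical):

```html
<!-- Hypothetical filenames; the browser picks the smallest candidate
     that satisfies the layout width and the device's pixel density. -->
<img src="hero-480.jpg"
     srcset="hero-480.jpg 480w,
             hero-960.jpg 960w,
             hero-1920.jpg 1920w"
     sizes="(max-width: 600px) 100vw, 960px"
     alt="Hero image">
```

Pairing this with proper format choice and compression addresses all three of the problems above: format, compression, and sizing.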

HTTP Archive: Page growth due to images

4. Custom fonts have overtaken Flash.

The graph below beautifully illustrates an aspect of the human condition that I find both amusing and frustrating: You can count on the fact that if you have a team of geniuses in room A working to fix one problem, there will be a team of equally brilliant people in room B developing a technology that will create a new problem.

In this case, we see how sites are dropping Flash from their pages, thereby negating one type of performance problem. But then along came custom fonts to create a brand-new set of performance challenges. See that point where the two lines intersect? That point represents the crux of the problem with our great big hominid brains.

(Note that custom fonts don’t have to be performance bad guys. See here and here for tips.)
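As one sketch of what “performant custom fonts” can look like (the font name and file here are hypothetical): serve a compressed WOFF file, check for a locally installed copy first with `local()`, and use `unicode-range` to limit the download to the character set the page actually uses.

```html
<style>
  /* Hypothetical font name and file. WOFF keeps the payload small,
     local() skips the download entirely when the font is installed,
     and unicode-range restricts the file to the basic Latin subset. */
  @font-face {
    font-family: "BrandFont";
    src: local("BrandFont"),
         url("brandfont-latin.woff") format("woff");
    unicode-range: U+0000-00FF;
  }
  body { font-family: "BrandFont", Georgia, serif; }
</style>
```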

HTTP Archive: Flash and custom fonts

Conclusion: Is this page size decrease a trend or just a one-off?

I’m inclined to take the pessimistic view that this is an isolated incident. Almost every other content type is on the rise — particularly images and custom fonts, both of which can incur major performance penalties — and that growth will ultimately overshadow the minor win we’re seeing right now from the shrinking “other” category.

Also bear in mind that smaller pages don’t necessarily mean faster pages. While egregiously large payload is definitely a contributing factor to slow load times, there are other variables — such as sluggish third-party content — that can slow down or block a page from rendering. And as our own research has found, many site owners are failing to build pages that render key content first, a critical user experience flaw.
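On the third-party point: a synchronous script tag blocks the HTML parser until the script downloads and executes, which is how a sluggish third party can hold up rendering of key content. A minimal sketch (the script URL is hypothetical) of the blocking pattern versus the non-blocking `async` attribute:

```html
<!-- Blocking: parsing stops until this script downloads and runs. -->
<script src="https://thirdparty.example.com/widget.js"></script>

<!-- Non-blocking: downloads in parallel, executes when ready,
     so the page can keep rendering key content in the meantime. -->
<script async src="https://thirdparty.example.com/widget.js"></script>
```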


Tammy Everts

As a former senior researcher, writer, and solution evangelist for Radware, Tammy Everts spent years researching the technical, business, and human factor sides of web/application performance. Before joining Radware, Tammy shared her research findings through countless blog posts, presentations, case studies, whitepapers, articles, reports, and infographics for Strangeloop Networks.

8 comments

  • Chris Love

    May 23, 2014 at 9:34 am

My guess is that, since this is scoped to the top 1,000 sites, a video content provider changed the way video is delivered. For example, I read the Dallas Morning News site every day, following the decline of the Texas Rangers. They auto-play a video in the right column. Changing that video to not autoplay and to download an image instead would change each page’s payload. So a common service switching from autoplay to a screenshot could account for the change in “other” here. Maybe it’s YouTube or someone like that.


  • Tammy Everts

    May 26, 2014 at 9:42 am

    I hadn’t thought of that, and I agree — it’s a good guess. Thanks for the food for thought, Chris.


  • Alexandru C

    May 26, 2014 at 12:49 pm

Well, I think we will see even more scripts in the future and even less HTML. For example, if you analyze Facebook, you will notice that after login you see only some parts of the site, and the rest is downloaded asynchronously from their API. That’s a good example of how to improve your bandwidth usage and your site’s functionality. Plus, new features from HTML5 are being used right now, so we could see more and more AJAX requests in the future.


  • Carlo

    May 26, 2014 at 2:33 pm

    What software did you use to make these graphs?


  • Chris Wilson

    May 26, 2014 at 10:57 pm

    Custom fonts are sometimes not necessary to understand the page (purely decorative) – browser makers could provide a way to turn them off, or defer loading them.

    Sometimes they’re a more efficient way of delivering scalable symbols and monochrome diagrams than using SVG. (e.g. http://thenounproject.com)

    Would also like to know which software you used to produce these beautiful graphs!


  • Tammy Everts

    May 28, 2014 at 11:05 am

    Thanks for the kind words about the graphics. I can’t take much credit. I use https://infogr.am/, which does all the cool stuff for me. 🙂


  • Pingback: There are more mobile-optimized sites than ever. So why are mobile pages getting bigger? | Web Performance Today

  • Sheila

    May 28, 2014 at 10:07 pm

    I understand HOW @font-face works, but I don’t really understand why it gets used as much as it does. Call me old-school, but I prefer listing multiple fonts.

font-family: “The nicest-looking easy-to-read font that fits the design”, “A similar, but more common font that fits the design”, “The closest font to the first choice that is included with Windows OSs”, “Ditto, but for Mac”, “serif or sans-serif, depending on whether it’s for the main content or for headers”;

That’s as heavy as I go for fonts, and I’ve been known to occasionally just use font-family: sans-serif; for main content. There’s nothing quite so nice as seeing your favorite fonts used on a website, even if it’s only because it’s styled to use, say, Gabriola and Corbel, because that’s what you chose as the default in your browser settings.

