Why does a typical ecommerce page take 6.5 seconds to load primary content?

October 15, 2014 — by Tammy Everts

Every quarter at Radware, we release a new “state of the union” report, with key findings about the web performance of the world’s most popular ecommerce sites.

Every quarter, we find that the median top 100 ecommerce site takes longer to render feature content than it took the previous quarter.

Every quarter, we field the question: But how could this possibly be happening? Networks, browsers, hardware… they’re all getting better, aren’t they?

The answer to this question is: Pages are slower because they're bigger, fatter, and more complex than ever. Size and complexity come with a performance price tag, and that price tag gets steeper every year.

In this post, I’m going to walk through a few of the key findings from our latest report. Then I’m going to share a few examples of practices that are responsible for this downward performance trend.

First, some background

Since 2010, we’ve been measuring and analyzing the performance of the top 500 online retailers (as ranked by Alexa.com). We look at web page metrics such as load time, time to interact (the amount of time it takes for a page to render its feature “above the fold” content), page size, page composition, and adoption of performance best practices. Our goal is to obtain a real-world “shopper’s eye view” of the performance of leading sites, and to track how this performance changes over time. We release the results four times a year in quarterly “state of the union” reports. (You can see a partial archive of past reports here.)

Finding 1: The median page takes 6.5 seconds to render feature content, and 11.4 seconds to fully load.

[Chart: median page speed metrics, 2013 vs. 2014]

If you care about the user experience, then Time to Interact (TTI) is a metric you should care about. TTI is the amount of time it takes for a page to download and display its primary content. (On ecommerce sites, primary content is usually some kind of feature banner or hero image.) Most consumers say they expect pages to load in 3 seconds or less, so this sets the bar for TTI.
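
If you want a rough sense of this number for your own pages, the Navigation Timing API (available in most modern browsers) is a reasonable starting point. The sketch below is a crude stand-in for TTI, not the methodology we use in the report: it assumes a hypothetical hero image inside a .hero container and simply logs how long after navigation that image finished loading.

    <script>
      // Crude TTI stand-in: log how long after navigation the hero image
      // finished loading. The ".hero img" selector is hypothetical, and this
      // snippet must run before the image finishes loading to catch the event.
      var hero = document.querySelector(".hero img");
      if (hero) {
        hero.addEventListener("load", function () {
          var elapsed = Date.now() - performance.timing.navigationStart;
          console.log("Feature image displayed ~" + elapsed + " ms after navigation");
        });
      }
    </script>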

We found the median top 100 ecommerce page has a Time to Interact of 6.5 seconds — more than twice as slow as the ideal TTI of 3 seconds. This means that when a visitor comes to that median page, this is what she or he sees (rendered in a timed filmstrip view, one of my favourite WebPagetest features):

[Filmstrip: how the median page renders over its first several seconds]

Finding 2: The median page has slowed down by 23% in just one year.

Not only is this much slower than users expect, it’s gotten slower over time. In our Fall 2013 report, we found that the median page had a TTI of 5.3 seconds. In other words, the median page has slowed down by 23%. That’s a huge increase.

Case study after case study has proven to us that, when it comes to page speed, every second counts. Walmart found that, for every 1 second of performance improvement, they experienced up to a 2% conversion increase. Even more dramatically, at the recent Velocity NY conference, Staples shared that shaving just 1 second off their median load time yielded a 10% conversion increase.

[Chart: conversion gains at Walmart and Staples per 1-second improvement]

Finding 3: “Page bloat” continues to be rampant.

This finding is consistent quarter over quarter: pages keep getting bigger, both in terms of payload and number of resources. With a payload of 1492 KB, the median page is 19% larger than it was one year ago.

[Chart: growth in median page size (KB)]
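
If you're curious how your own pages compare, the Resource Timing API (supported in most current browsers, though it doesn't expose transfer sizes) lets you count requests and spot the slowest ones right from the console. A minimal sketch:

    <script>
      // Count the page's resource requests and list the five slowest,
      // using the Resource Timing API. Transfer sizes aren't exposed here,
      // so total payload still has to come from a tool like WebPagetest.
      var resources = performance.getEntriesByType("resource");
      console.log("Resource requests: " + resources.length);
      resources
        .slice()
        .sort(function (a, b) { return b.duration - a.duration; })
        .slice(0, 5)
        .forEach(function (r) {
          console.log(Math.round(r.duration) + " ms  " + r.name);
        });
    </script>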

Looking at page size addresses just one part of the problem. In my opinion, page complexity is arguably a much bigger problem than page size. To understand why, you need to know that while 1492 KB is a pretty hefty payload, this number is actually down considerably from the peak payload of 1677 KB we reported three months ago, in our Summer 2014 state of the union. At that time, the median page contained 100 resource requests, fewer than the current median of 106 resources. So the median page today is significantly smaller but has 6% more resources than it did three months ago. Here’s what that means…

Some of the most common performance culprits

Every resource request introduces incremental performance slowdown, as well as the risk of page failure. Let me illustrate with waterfall snippets showing a few real-world examples. (If you’re new to waterfall charts, here’s a tutorial on how to read them. TL;DR version: long blue bars are bad.)

Before you look at these, I want to point out that the goal here is not to publicly shame any of these site owners. Not only are these examples typical of what you might find on many ecommerce sites, they’re most definitely not the most egregious examples I’ve seen. I chose them specifically because of how typical they are.

1. Hero images that take too long to download.

[Waterfall snippet: hero image with a long download time]
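
There's no single fix, but a good starting point is to reference the hero image directly in the markup, pre-scaled and compressed for the slot it actually fills, rather than injecting it late via CSS or JavaScript. A minimal sketch (file name and dimensions are hypothetical):

    <!-- Hero image referenced directly in the HTML so the browser's
         lookahead parser can start downloading it early. The file is
         pre-scaled and compressed for its display size. -->
    <div class="hero">
      <img src="/images/fall-sale-hero-960x400.jpg" width="960" height="400"
           alt="Fall sale: up to 40% off">
    </div>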

2. Stylesheets that take too long to download and parse, or that are improperly placed and block the page from rendering.

[Waterfall snippet: slow, render-blocking stylesheets]
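
The general cure is well established: combine and minify your stylesheets, load them with plain <link> tags in the <head>, and avoid @import, which serializes downloads. A minimal sketch (file name hypothetical):

    <head>
      <!-- One combined, minified stylesheet, loaded with a plain <link>
           in the <head>. Avoid @import and avoid injecting stylesheets
           late in the body, both of which delay rendering. -->
      <link rel="stylesheet" href="/css/site.min.css">
    </head>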

3. Custom fonts that require huge amounts of CSS, or heavy JavaScript, or are hosted externally.

[Waterfall snippet: heavy custom font resources]
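
A lighter-weight approach is to self-host a subsetted font file and declare only the weights the page actually uses, instead of pulling in a third-party loader script or a kitchen-sink stylesheet. A minimal sketch (the font and file names are hypothetical):

    <style>
      /* Self-hosted, subsetted web font: one @font-face rule per weight
         actually used on the page, with a system fallback while it loads. */
      @font-face {
        font-family: "ShopSans";
        src: url("/fonts/shopsans-regular-subset.woff") format("woff");
        font-weight: 400;
        font-style: normal;
      }
      body { font-family: "ShopSans", Georgia, serif; }
    </style>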

4. Poorly implemented responsive design. (Related to 2, but merits its own callout given the popularity of RWD.)

[Waterfall snippet: poorly implemented responsive design (Neiman Marcus)]
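
A common failure mode is shipping the full desktop image to every device and merely scaling it down with CSS or JavaScript. The newer srcset attribute (still gaining browser support as I write this, with older browsers simply falling back to the src image) lets the browser pick an appropriately sized file instead. A sketch with hypothetical file names:

    <!-- Let the browser choose an appropriately sized image instead of
         downloading the desktop version on every device. Browsers without
         srcset support fall back to the 480px src image. -->
    <img src="/images/feature-480.jpg"
         srcset="/images/feature-480.jpg 480w,
                 /images/feature-960.jpg 960w,
                 /images/feature-1600.jpg 1600w"
         sizes="100vw"
         alt="Feature banner">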

Takeaway

Back when I started using the web, you could make pages any colour you wanted, as long as that colour was grey. I still remember the excitement I felt the very first time I was able to use colour on the background of a page. (It was yellow, if you’re wondering.) I remember when the <center> tag came along. And good golly, I remember when we were first able to add images to pages. That was a heady (and animated gif-filled) time.

Sure, pages used to be leaner and faster. They also used to look really, really boring. I don’t long for the return of a grey, graphic-less web.

This is all to say that I love images, stylesheets, custom fonts, and responsive design. They give designers and developers unprecedented control over the look and feel of web pages. They make pages more beautiful across an ever-increasing number of platforms and devices. But they can also inflict massive performance penalties — penalties that cannot be completely mitigated by faster browsers, networks, and gadgets. And this impact is increasing, rather than decreasing, over time.

Whether we're site owners, designers, developers, UX professionals, or something else entirely, we need to be mindful of this. Performance is the responsibility of every single person who touches a web page, from conception through to launch.

Get the report: State of the Union: Ecommerce Page Speed & Web Performance [Fall 2014]

Tammy Everts

As a former senior researcher, writer, and solution evangelist for Radware, Tammy Everts spent years researching the technical, business, and human factor sides of web/application performance. Before joining Radware, Tammy shared her research findings through countless blog posts, presentations, case studies, whitepapers, articles, reports, and infographics for Strangeloop Networks.

15 comments

  • stephen

    October 20, 2014 at 9:45 pm

    Very good article: rich information, good organization, and a logical methodology.

  • non-believer

    October 21, 2014 at 7:29 pm

    Hello Tammy –

    Please allow me to challenge the data you present in this post. How did you manage to stretch the objects served from Akamai to 5–7 seconds?

    Here’s a quick test for Neiman Marcus’ page that has nearly 500 (!!) objects:
    http://www.webpagetest.org/result/141022_NM_5PZ/1/details/

    Indeed, there are several ~1 MB images that take ~1 second to download, but nothing like the numbers you presented above. Could you share a webpagetest.org URL illustrating your point, please?

  • Tammy Everts

    October 22, 2014 at 9:11 am

    I’m happy to share. The waterfall comes from the median run (test 2) in this set of nine tests:

    http://www.webpagetest.org/result/140922_JN_9dd6c87536e91c5f52870dd6a56a4fda/

    This test was performed a month ago. It looks like yours was performed yesterday, so clearly different pages were tested.

    I want to re-state what I said in my post: The goal here is not to publicly shame any of these site owners. Not only are these examples typical of what you might find on many ecommerce sites, they’re most definitely not the most egregious examples I’ve seen. I chose them specifically because of how typical they are.

  • Thomas Stensitzki

    October 23, 2014 at 6:52 am

    After working in the web performance field for more than five years now, I must admit that web designers still have not understood the dependency between the number and size of page objects and the time required to render a page properly. A few years ago we had to deal with a small number of client types and therefore cared about the load times for those few clients only. Today we have to deal with a larger number of clients and different data transmission types and qualities.
    CDNs are a solution to enhance page load times, but CDNs are not *the* solution. The overall answer is: it depends.
    I am looking forward to next year's measurements of page load times and the number of objects added to a web page by innovative web designers.

  • non-believer

    October 23, 2014 at 2:33 pm

    Tammy –

    Thank you for the details. So, your test was executed over a simulated 1.5 Mbps DSL connection, whereas the same test without limiting the bandwidth takes under 3 seconds to complete: http://www.webpagetest.org/result/141023_NA_11ZV/

    According to OOKLA (http://www.netindex.com/download/2,1/United-States/), a typical US household connection is significantly better: 30 Mbps. Even their mobile speeds are way better than the 1.5 Mbps you simulated in the test.

    Given the above facts, would you still call your test results “typical”?

  • non-believer

    October 23, 2014 at 2:49 pm

    Is the only way to promote yourself and your company to moderate the comments that you're incapable of arguing with?
    Please, don't be a journalist; be an honest geek…

  • Tammy Everts

    October 23, 2014 at 4:16 pm

    Non-believer: I don’t censor comments. Your first comment was flagged for moderation because it contained more than one link. Standard spam protocol.

    To respond to the question in your previous comment:

    In all my research into US connection speed numbers, I’ve found that the numbers vary hugely depending on the source. OOKLA’s data is at the more optimistic end of the spectrum, and I can’t find specifics on the numbers they cite. They call it a “household download index”, but they don’t define what that means. For example, they don’t state whether they’re basing it on average speeds or peak speeds. I suspect it’s the latter, based on comparison to Akamai’s “state of the internet” reports (http://www.akamai.com/stateoftheinternet/ ), which give numbers for both average and peak Mbps. OOKLA’s index loosely correlates to Akamai’s peak broadband numbers.

    OOKLA also focuses solely on broadband connection speed. Broadband is, by definition, an internet connection that is above a certain speed (4 Mbps), regardless of the connection technology being used. So if they're focusing solely on broadband connections, then yes, their numbers are going to skew higher.

    Akamai’s “state of the internet” reporting also focuses primarily on broadband. According to Akamai, average (not peak) broadband connection speeds run 12-16Mbps for the 10 fastest states. The slowest states perform much worse.

    But again, that’s focusing just on broadband. Neither OOKLA nor Akamai report on the huge population of users who experience download speeds below the threshold for broadband. Non-broadband users aren’t outliers. According to Akamai, in many states broadband adoption is only around 50%. And in some states, broadband adoption rates are actually declining.

    As someone who lives some distance from a major urban center, I can tell you that 1-2Mbps connection speeds are a (sadly) common occurrence in my house, especially in the evenings when everyone in my neighborhood gets home and starts streaming movies and doing whatever else it is they do online. There’ve been many, many times when I’ve gotten incredibly frustrated waiting for a file to download or stream, and I’ve performed many, many speed tests with sub-2Mbps results. My experience is a reality for many people. If you’re one of the lucky people with access to good broadband, then it’s easy to forget that many others don’t have this luxury.

    For the reasons I’ve cited so far, I do believe that testing over a DSL connection with a 1.5Mbps connection is a valid way to measure the user experience and raise a red flag for site owners that their sites may be delivering a sub-par experience to a considerable swath of their visitors.

    You didn’t ask about this, but I want to mention that I think WebPagetest’s default RTT for DSL (50ms) is extremely optimistic. All the latency research I’ve been privy to points to numbers more in the 75-150ms range.

    An issue that's gotten lost in this comment thread is a trend that, to me, is much more concerning — the fact that, given the same set of sites and identical test parameters, median load times and time to interact have dramatically slowed down over the past four years (the amount of time we've been performing these tests). Page size and complexity are huge issues. If I were a site owner, this is what I'd be worried about.

  • non-believer

    October 23, 2014 at 8:48 pm

    Hello Tammy,

    First of all, allow me to apologize for the emotional comment as I mistook the “moderating” message. I am utterly embarrassed…

    According to the FCC’s Consumer Broadband report (http://www.fcc.gov/reports/measuring-broadband-america-2014) that covers all kinds of residential DSL/cable/fiber/sat connectivities, average latency is ~35msec and “the average subscribed speed is now 21.2 Mbps”, which is well in-line with OOKLA’s and Akamai’s reports.

    Thus, I am afraid that the 1.5 Mbps used for your report is a definite outlier. This is a great technique to amplify the deficiencies of sloppy web design: inflated images that contain unnecessary metadata, objects that aren't compressed, CSS/JS that isn't minified and blocks the browser's critical rendering path, JS that isn't carefully async'ed, third-party content that isn't deferred, etc.

    I also believe that retailers like Neiman Marcus paranoidly compare their users' experiences against competitors'. Needless to mention, Neiman Marcus' target audience is far more likely to live in large metropolitan areas, where even entry-level home internet starts at 3–6 Mbps.

    Again, my point is that end-user experience along with waterfall illustrations are far from being “typical”.

  • Tammy Everts

    October 23, 2014 at 9:50 pm

    I’m not contradicting the FCC’s report. Again, they’re focusing exclusively on measuring broadband averages, not on connection speeds for all internet users. As I mentioned earlier, broadband is by definition any connection speed greater than 4Mbps, regardless of whether it’s delivered by cable, DSL, etc. And as I also mentioned, according to Akamai’s data about broadband adoption, adoption rates are around 50% in many states, and rates are actually declining in some states.

    “Typical” is a loose word. It clearly means different things to different people. What’s typical for you may differ from what’s typical to me. And I appreciate that there can be more than one kind of “typical” experience. But in my experience, and bearing in mind that the research I’m writing about is intended to represent the broadest possible range of internet users, not just those using broadband, I stand by the argument that these numbers represent a typical user experience for a significant cohort of internet users.

    To address your point about Neiman-Marcus, you may very well be correct. I’m not familiar with their online demographic, so I can’t speak to that. But most retailers recognize that a significant number of people who shop at their sites live outside urban centers.

    Thanks for your comments and questions. I really do appreciate your taking the time to challenge our test assumptions. It’s not the first time we’ve fielded questions like yours, and it’s good for us to keep reviewing and rationalizing our test parameters. We monitor broadband adoption/speed, as well as other test variables, with the understanding that we’ll change our test parameters when any of these variables reaches a tipping point. The DSL/connection speed parameter will require updating at some point, probably in the not-too-distant future.

  • Mohamed A. Hassan

    October 27, 2014 at 2:41 pm

    Hello Tammy,

    I think that if the graphics presented in your article point to an issue, it would be an infrastructure issue or latency in serving the assets. What kind of CSS takes 7 seconds to load!

    Another good example is 960_grid.css taking over 6 seconds to load! http://cl.ly/image/0s0V3T1Z2W1o

    UX or designers can't take care of these issues; that's what the front-end developer does.

    Thanks,
    Mohamed

  • Pingback: The Week That Was – Week Ending 18th Oct 2014 | Practical Performance Analyst

  • Pingback: Ecommerce Links: October 2014

  • Pingback: Ecommerce Links: October 2014 - Matthew Hardesty

  • Jason Goldberg

    November 3, 2014 at 6:39 am

    Great stuff as always, Tammy. While WPT is a great benchmarking tool, we always have internal debates about testing parameters for synthetic users, so I'm fascinated to see non-believer's arguments, but I do think they obscure the main point. You've been running tests with consistent parameters for multiple years, and pages are getting slower. If you changed your parameters as non-believer seems to be asking, then your Y-O-Y data would be compromised.

    Non-believer can argue that a segment of shoppers gets a better experience than the benchmark, but their experience is still worse than it used to be and than it could be. Non-believer, I can assure you that most e-commerce sites have large segments of users who are absolutely getting slower web-perf experiences than the mean runs with these parameters. Sites don't (or shouldn't) use benchmark tools to measure their own customers' experiences (that's what real user testing is for); benchmarking tools allow us to see how the rest of the world is doing, and to fairly compare changes over time, which is exactly what "State of the Union: Ecommerce Page Speed & Web Performance" is for. I for one find it to be a very valuable tool.

    (Written from the crappy 1M connection in my Las Vegas hotel room)

  • Pingback: Ecommerce Links: October 2014 | ColderICE: Ecommerce News, Hints, Tips & Tricks
