
Measuring Web Performance and Digital Experience

By Alex Henthorn-Iwane
7 min read


Pretty much everyone in business today knows that websites and related digital channels like mobile apps and service APIs are mission critical. It's also dead obvious that strong web performance is foundational to delivering the seamless digital experiences that produce optimal business outcomes. And it doesn't take a rocket scientist to know that website performance is built on and dependent on the Internet.

Yet, performance management across the Internet is less well understood than you might expect. That's not because there's been an absolute lack of performance data or experience monitoring. Web operations teams have long collected user experience metrics like page load time, application performance metrics like HTTP response time, and infrastructure monitoring metrics like network latency. And let's not forget that web teams generally use Content Delivery Network (CDN) providers to cache web pages closer to users, and managed DNS providers to deliver faster responses to end-user DNS queries.
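As a rough illustration of how two of these metrics can be collected, here's a minimal Python sketch that times a DNS resolution and an HTTP response from a single vantage point. The target hostname is a placeholder, and real monitoring platforms use many distributed agents and far more careful timing; this is just to show what's being measured:

```python
import socket
import time
from urllib.request import urlopen

def dns_response_time_ms(hostname: str) -> float:
    """Time a DNS resolution (getaddrinfo) in milliseconds."""
    start = time.perf_counter()
    socket.getaddrinfo(hostname, 443)
    return (time.perf_counter() - start) * 1000.0

def http_response_time_ms(url: str) -> float:
    """Time from request start to first response byte, in milliseconds."""
    start = time.perf_counter()
    with urlopen(url, timeout=10) as resp:
        resp.read(1)  # wait for the first byte of the response body
    return (time.perf_counter() - start) * 1000.0

if __name__ == "__main__":
    host = "www.example.com"  # placeholder target
    print(f"DNS:  {dns_response_time_ms(host):.1f} ms")
    print(f"HTTP: {http_response_time_ms('https://' + host):.1f} ms")
```

In practice you'd also repeat each measurement and take medians, since single samples of DNS and HTTP timing are noisy.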

But what constitutes "good" Internet performance for digital business? And what constitutes "competitive" Internet performance for digital services? In talking with customers, it became clear to us that performance management discussions often get pulled into subjective opinion. Everything could always be better, but you don't have unlimited time and budget to invest. What operations teams need is comprehensive, objective benchmarking data on which to base trade-off decisions as they seek to continuously improve customer digital experience. Unfortunately, most operations teams don't have the time for this sort of endeavor, and commissioning a study from an analyst firm can be very costly.

The DXPBR is Born

This need is why we decided to create the Digital Experience Performance Benchmark Report (DXPBR). We measured the performance of the top 20 U.S. consumer websites in each of three industry verticals: retail, media & entertainment, and travel & hospitality. We collected Internet, network, server, and experience metrics every ten minutes for 60 days from 69 vantage points across 36 cities, spanning multiple broadband and other ISPs. All in all, we took over 300M unique measurements.
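That headline number is easy to sanity-check with back-of-the-envelope Python arithmetic. The per-round metric count below is our assumption, not a figure from the report:

```python
days = 60
rounds_per_day = 24 * 6        # one test round every ten minutes
vantage_points = 69
sites = 3 * 20                 # 20 sites in each of three verticals

test_rounds = days * rounds_per_day * vantage_points * sites
metrics_per_round = 9          # assumed mix of Internet, network, server, experience metrics
total_measurements = test_rounds * metrics_per_round

print(f"{test_rounds:,} test rounds")          # 35,769,600
print(f"{total_measurements:,} measurements")  # 321,926,400 — over 300M
```

So capturing on the order of nine metrics per test round is enough to clear 300M measurements over the study window.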

From these measurements, we were able to chart the comparative performance of websites within each vertical and examine performance statistics across all 60 sites. Here are a few interesting takeaways that you can read about in the report.

  • Distinct performance patterns exist within each industry: If you look at the scatter plots of metrics like network latency vs. HTTP response time, you'll see distinct clusters and spectrums of performance across the three industry verticals. For example, top retail sites fall into two different groups of HTTP response times along a mostly uniform range of network latency, while media & entertainment sites show a more consistent range of HTTP response times but two distinct clusters of network latency.
  • Even though the U.S. market is quite mature in terms of fiber routes and metro broadband, Internet performance varies, in some cases significantly, across CDN providers, ISPs and geographies.
  • Building a robust stack of DNS, network and HTTP response time performance is a sound strategy for most organizations to deliver optimal digital experience. For example, we saw a significant correlation between those sites that had strong DNS and network latency performance and those that landed at the very top of the HTTP response time rankings.

An Internet Performance Bar

There's one more takeaway that you'll find in the report. After looking at average and median performance ranges and the correlations between various metrics, we decided to define a minimum Internet performance bar for the U.S. market: 25 ms DNS response time, 15 ms round-trip network latency to the CDN edge, and 350 ms HTTP response time. As we've shared this concept with customers, the response has been positive. Customers we spoke with have told us that it's helpful to have a rule of thumb for these KPIs.
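In code, checking your own measurements against that bar is a one-liner per KPI. A small sketch, where the sample measurements are hypothetical:

```python
# Minimum Internet performance bar from the report (U.S. market)
PERFORMANCE_BAR_MS = {
    "dns_response": 25.0,
    "network_rtt_to_cdn_edge": 15.0,
    "http_response": 350.0,
}

def meets_bar(measurements_ms: dict) -> dict:
    """Return, per KPI, whether the measured value is at or under the bar."""
    return {kpi: measurements_ms[kpi] <= limit
            for kpi, limit in PERFORMANCE_BAR_MS.items()}

# Hypothetical site: DNS and HTTP clear the bar, network RTT misses it
sample = {"dns_response": 18.0, "network_rtt_to_cdn_edge": 22.0, "http_response": 290.0}
print(meets_bar(sample))
```

A check like this turns the rule of thumb into something an operations dashboard can alert on.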

Beyond the Benchmark

Benchmarks are essential to help you make sound investments in optimizing your digital customer experience. But don't forget that the Internet and your many third-party providers are not a static environment; they're a complex, unpredictable ecosystem that lies outside of your control. Things can and will go wrong (just read some of our outage blog posts). When they do, you need to be able to quickly isolate the provider or factor causing the performance impact, and escalate effectively with multi-layer data so that your providers can help you recover fast.

So, to get the benchmarking and baselining data you need to build competitive web performance, download the 2019 Digital Experience Performance Benchmark Report today. And to understand how to keep delivering excellent digital experiences even when the going gets rough, check out some of our case studies from companies like Zendesk.

Subscribe to the ThousandEyes Blog

Stay connected with blog updates and outage reports delivered while they're still fresh.
