News Sites Are Fatter and Slower Than Ever

Frederic Filloux
Monday Note
Jul 13, 2015


An analysis of download times highlights how poorly designed news sites are. That’s more evidence of poor implementation of ads… and a strong case for ad blockers.

Website designers live in a bubble; they’re increasingly disconnected from users. Their work requirements include design (fonts, layout elements), advertising (multiple ad-serving and analytics tags), data collection (even though most sites collect far more data than they are able to process), A/B testing, and other marketing voodoo.

Then, when a third-party vendor shows up with the tool-everyone-else-uses, the pitch stresses simplicity: ‘Just insert a couple of lines of code’, or ‘A super-light JavaScript’. Most often, corporate sales and marketing drones kick in: ‘We need this tool’, or ‘Media-buying agencies demand it’. The pressure can even come from the newsroom struggling to improve its SEO scores, asking for new gadgets “To better pilot editorial production” or “To rank higher in Google News”.

Quite often, these contraptions are used to conceal professional shortcomings that range from an inability to devise good ad formats that won’t be rejected by users (and, better, will be clicked on), to a failure to produce good journalism that naturally finds its way to readers without needing titillating stimuli.

This troublesome trend also reveals a leadership failure: since no one is given final authority over performance, a web site (and sometimes an app) ends up as a disorganized pile of everyone’s demands. Refusing to choose leads to anarchy.

In the process, publishers end up sacrificing a precious commodity: SPEED.

Today, a news site’s web page consists of a pile of scripts and requests to multiple hosts, in which the relevant content makes up only an insignificant proportion of the freight. (On the same subject, see this post by John Gruber, and Dean Murphy’s account of An hour with Safari Content Blocker in iOS9.)

Consider the following observations: when I click on a New York Times article page, it takes about 4 minutes to download 2 megabytes of data through… 192 requests, some to the Times’ own hosts, most to a flurry of other servers hosting scores of scripts. Granted: the most useful part — a 1,700-word / 10,300-character article plus pictures — will load in less than five seconds.

But when I go to Wikipedia, a 1,900-word story loads in 983 milliseconds, requiring only 168 kilobytes of data through 28 requests.

To put it another way: a Wikipedia web page carrying the same volume of text will be:
— twelve times lighter in kilobytes
— five times faster to load relevant items
— about 250 times faster to fully load all elements contained in the page because it will send 7 times fewer requests.
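
For readers who want to check the arithmetic, here is a quick back-of-the-envelope sketch in Python; the inputs are simply the approximate figures quoted above.

```python
# Rough check of the NYT vs. Wikipedia ratios, using the figures quoted above.
nyt = {"kilobytes": 2048, "full_load_s": 240, "content_load_s": 5, "requests": 192}
wiki = {"kilobytes": 168, "full_load_s": 0.983, "content_load_s": 0.983, "requests": 28}

print(nyt["kilobytes"] / wiki["kilobytes"])             # ~12x lighter
print(nyt["content_load_s"] / wiki["content_load_s"])   # ~5x faster for the relevant content
print(nyt["full_load_s"] / wiki["full_load_s"])         # ~244x faster to fully load everything
print(nyt["requests"] / wiki["requests"])               # ~6.9x fewer requests
```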

And the New York Times is definitely not the worst in its league.

I hear cohorts of technical people yelling that I’m comparing oranges — a barebones, advertising-free, non-profit web site — and peaches — a media-rich commercial site that must carry all the bells and whistles. I surely am, and this is exactly my point. Having spent hours measuring web sites from different perspectives — and knowing how the sausage factory works — I’m certain we have gone way too far in overloading our web sites with poorly implemented, questionable features.

Two major industry trends should force us to reconsider the way we build our digital properties. The first is the rise of ad blockers, which pride themselves on providing faster navigation and putting less strain on computers. Users have made publishers pay a steep price for sacrificing browsing comfort and speed: in some markets, more than 50% of visitors use ad-blocking extensions.

The second trend is the rise of mobile browsing, which accounts for half of pageviews in mature markets. And in emerging countries, users leapfrog the desktop altogether and access the web en masse through mobile.

Let’s consider the ad blocker problem.

To assess their impact on page-loading time, I selected eight news sites with different characteristics: free and mostly ad-supported (Buzzfeed, Spiegel, HuffPo and the Guardian), subscription-based (FT.com), mixed model (NYT), or ad-free and text-based (Medium).

The table below shows how activating an ad blocker impacts requests and data transfers. Since a page can continue to download elements 15 minutes after the initial request, I arbitrarily measured the number of requests and the amount of data transferred once the site had come to some kind of rest, i.e. once the most important elements had been downloaded.

[Table: requests and data transferred, with and without an ad blocker]

Note the number of server requests: it ranges from 99 (Medium) to 400+ for the HuffPo or FT.com. Requests from a site like the Washington Post are impossible to measure since the site never stops downloading, endlessly calling data for auto-play videos; these megabytes are often rendered by Flash Player, well known for overheating CPUs (you can hear your laptop’s fans spin up to a drone-like whine).

One caveat: this is a crude assessment. I used Chrome’s developer tools, toggling Adblock Plus on and off, and made about fifty downloads of these URLs; the values should be considered indicative only, even if the ad blockers’ impact looks quite consistent. Another note: ad blockers cut the requests and transfers tied to ad formats, but many scripts embedded in a page are not affected.
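
For anyone who wants to reproduce this kind of crude tally, here is a minimal sketch of one way to do it: export a HAR file from Chrome’s DevTools Network panel (“Save all as HAR”), once with the ad blocker on and once with it off, then count requests, transferred kilobytes and distinct hosts. The file names below are placeholders, not my actual captures.

```python
# Summarize one or more HAR captures: request count, transferred KB, distinct hosts.
import json
import sys
from urllib.parse import urlparse

def summarize(har_path):
    with open(har_path, encoding="utf-8") as f:
        entries = json.load(f)["log"]["entries"]
    # Chrome records the on-the-wire size in "_transferSize"; fall back to bodySize.
    total_bytes = sum(
        max(e["response"].get("_transferSize", e["response"].get("bodySize", 0)), 0)
        for e in entries
    )
    hosts = {urlparse(e["request"]["url"]).netloc for e in entries}
    return len(entries), total_bytes, len(hosts)

# Usage: python har_summary.py nyt_noblock.har nyt_adblock.har
for path in sys.argv[1:]:
    n_requests, n_bytes, n_hosts = summarize(path)
    print(f"{path}: {n_requests} requests, {n_bytes / 1024:.0f} KB, {n_hosts} hosts")
```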

Overall, depending on the advertising load of the site’s home page, an ad blocker cuts data transfers by half to three-quarters.

Now, let’s have a look at mobile issues.

Unfortunately, today’s analysis is restricted to mobile sites and does not include applications. But it gives an idea of how web sites are implemented for mobile use, whether through responsive design (dynamic adjustment to the window’s size) or dedicated mobile sites.

First, let’s consider the effect of network conditions encountered in real life, and rarely factored in by developers, who enjoy super-fast networks in their offices.

I took an article page from the New York Times mobile site to see how fast it loads.

[Table: load time for a New York Times mobile article page under various network conditions]

No surprise: the page takes six times longer to load on a regular 2G network than on 4G. Based on these observations, most mobile news sites would need to be redesigned to deal with the harsh realities of many cellular networks in Africa or Asia. (By ignoring this, publishers leave an open field to Google’s and Facebook’s ultra-light apps.)

For the test, I took seven news sites (FT.com doesn’t have a mobile version, nor a responsive one — it relies on its great web app for mobile use).

The test sample is an article page, with no ad blocking activated. I measured the difference between web and mobile versions, simulating an iPhone 5 operating over a “good 3G network” with a 1 Mbps transfer rate and a 40-millisecond round-trip delay. Again, I stopped measuring requests and byte transfers after 30 seconds, which coincides with apparent completion for most sites (even if many continue to download minor stuff almost endlessly).
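
A similar simulation can also be scripted, for instance with Selenium and the Chrome DevTools Protocol; here is a rough sketch. The URL, viewport size and throughput figures are illustrative assumptions, not my exact setup.

```python
# Simulate a small phone on a ~1 Mbps / 40 ms RTT connection and time the page load.
from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument("--headless=new")
driver = webdriver.Chrome(options=options)
driver.set_window_size(320, 568)  # iPhone 5-sized viewport

driver.execute_cdp_cmd("Network.enable", {})
driver.execute_cdp_cmd("Network.emulateNetworkConditions", {
    "offline": False,
    "latency": 40,                         # round-trip delay, in milliseconds
    "downloadThroughput": 1_000_000 // 8,  # bytes per second (~1 Mbps)
    "uploadThroughput": 1_000_000 // 8,
})

driver.get("https://mobile.nytimes.com/")  # placeholder article URL
load_ms = driver.execute_script(
    "return performance.timing.loadEventEnd - performance.timing.navigationStart;"
)
print(f"load event fired after {load_ms} ms")
driver.quit()
```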

[Table: requests and data transferred, web versions vs. mobile versions]

For mobile use, most of the analyzed properties reduce the number of requests and bytes transferred — by up to 80% in the Guardian’s case. However, considering that the speed ratio between 3G networks and Wi-Fi access is 1:20, there is room for serious improvement.

Since its initial release 22 years ago, the HyperText Markup Language (HTML) has gone through many iterations that have made web sites richer and smarter than ever. But this evolution also came with loads of complexity and a surfeit of questionable features. It’s time to swing the pendulum back toward efficiency and simplicity. Users are asking for it and will punish those who don’t listen.

— frederic.filloux@mondaynote.com
