What is Latency and How Does It Affect My Site’s Ranking?

When you enter a web address and wait for the page to load, you may notice a delay before it opens, anywhere from a fraction of a second to a few excruciating seconds. That wait is called latency.

Latency reveals a strange and wondrous truth about the internet: your information bounces between several servers in various places across the globe before it reaches your screen, depending on the location of the origin server and a handful of other factors along the way.

You Won’t Rank if Your UX Sux

Load time is a major player in user experience. As users come to expect the internet to think as quickly as they do and react as an extension of their bodies and minds, reducing latency becomes THE optimization priority.

Not only that, but user experience is a primary ranking factor in Google’s algorithm. The search engine may choose not to index your site if the onsite experience is subpar. For these reasons, business owners in particular should be aware of the mechanics of latency and employ a good web developer who can keep track of latency issues and industry updates.

While latency was more of an issue in the early 2000s, constant behind-the-scenes improvements are being made to hone user experience as our standards for human-tech synchronization increase.

The Latency Map

High latency is primarily due to propagation delay as your request for a “page object” travels at the speed of light to router points across the US or, in the case of some government agencies, a satellite station, and then returns to deliver the goods to your home network. Once the page loads, you can rest assured that the requested data will arrive as quickly as your connection’s bandwidth allows.

As User Experience Solution Evangelist Tammy Everts states:

“To put this in real-world terms, say you visit a web page and that page contains 100 objects — things like images, CSS files, etc. Your browser has to make 100 individual requests to the site’s host server(s) in order to retrieve those objects. Each of those requests experiences at least 20-30ms of latency. (More typically, latency is in the 75-140 ms range, even for sites that use a CDN.) This adds up to 2 or 3 seconds, which is pretty significant when you consider it as just one factor that can slow your pages down.”
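To make that arithmetic concrete, here is a quick back-of-the-envelope sketch in Python. The object count and per-request latency come straight from the quote above; the `parallel_connections` knob is a simplification added for illustration, since real browsers overlap many requests at once.

```python
# Back-of-the-envelope estimate of how per-request latency adds up.
# Assumes every page object needs its own round trip and that requests
# are not fully overlapped -- a simplification of real browser behavior.

objects_on_page = 100          # images, CSS files, scripts, etc.
latency_per_request_ms = 25    # 20-30 ms at best; 75-140 ms is more typical
parallel_connections = 1       # raise this to model concurrent requests

total_seconds = objects_on_page * latency_per_request_ms / parallel_connections / 1000
print(f"Estimated latency cost: {total_seconds:.1f} seconds")  # 2.5 seconds here
```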

Other causes of latency include transmission delays in the physical medium transmitting the information. Processing delays can occur as information passes through proxy servers, or computer systems or applications that act as intermediaries for requests for information from individuals or other servers. Information sometimes “globe trots” from network to network across the internet, causing further delay.

At low latencies, data transfers feel nearly instantaneous. The result? Instant satisfaction and zero discussion. Most users have no idea that the information they just received made many stops in far-flung locations to reach their machine in response to a command.

[Image: broadband latency map]

Breaking it Down

The step-by-step causes of latency can be broken down into the following, in order:

  • Propagation delay: As discussed above, this lag is a function of how long it takes information to travel through the communications medium at the speed of light (3 × 10⁵ km/sec) from source to destination. Propagation delay = distance / speed (see the sketch after this list).

 Interesting fact: the speed of light is actually lower in copper wire or fiber optic cable, reduced by what is called the velocity factor (VF).

  • Serialization delay: Serialization delay is the time it takes to convert bytes (each equaling 8 bits) of data stored in a computer’s memory into a stream of bits transmitted across the communications medium. This factor can create a significant delay on links with lower transmission rates, but for most links, the delay is fractional compared with other latency causes. Serialization delay = packet size in bits / transmission rate in bits per second (also computed in the sketch after this list).

  • Data protocols: Protocols exist that serve as “handshakes” to synchronize the transmitter and receiver so they can update each other on link status and correct transmission errors. These exchanges take time to propagate across a link and can contribute to latency.
  • Routing and switching: IP networks like the internet forward packets from source to destination through routers and switches, which continually update their decisions about which router should be used next to move the information toward its destination. To conceptualize this process, think of frogs hopping across lilypads, strategizing between leaps.

  • Buffer management and queuing: Queuing latency refers to the amount of time an IP packet spends in a queue waiting for transmission. This waiting time is due to over-utilization of the outgoing link once the routing/switching decision has been made, and it can contribute around 20 ms of latency.
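The first two formulas above are simple enough to put into code. Here is a minimal sketch of both calculations (referenced from the propagation and serialization bullets); the link distance, packet size, transmission rate, and 0.67 velocity factor are illustrative assumptions, not measured values.

```python
# Minimal sketch of the propagation and serialization formulas above.
# Distance, packet size, link rate, and velocity factor are illustrative.

SPEED_OF_LIGHT_KM_S = 3e5   # ~300,000 km/sec in a vacuum
VELOCITY_FACTOR = 0.67      # light travels slower in copper or fiber

def propagation_delay_ms(distance_km: float) -> float:
    """Propagation delay = distance / speed."""
    speed_km_s = SPEED_OF_LIGHT_KM_S * VELOCITY_FACTOR
    return distance_km / speed_km_s * 1000

def serialization_delay_ms(packet_size_bytes: int, link_rate_bps: float) -> float:
    """Serialization delay = packet size in bits / transmission rate in bits per second."""
    return packet_size_bytes * 8 / link_rate_bps * 1000

# Example: a 1,500-byte packet crossing ~4,000 km of fiber on a 10 Mbps link.
print(f"Propagation:   {propagation_delay_ms(4000):.1f} ms")          # ~19.9 ms
print(f"Serialization: {serialization_delay_ms(1500, 10e6):.2f} ms")  # ~1.20 ms
```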

Measuring Network Latency

If latency is a problem for your site, network tools like ping tests and traceroute can measure your latency rate by determining round-trip time, or how long it takes a given network packet to travel from its source to its destination and back.

If you and/or users access your site through DSL or cable connections, your latency is likely less than 100 milliseconds (ms), with less than 25 ms as the most desirable rate. Satellite connections, on the other hand, produce latencies of 500 ms or more, a frustrating delay for any user.
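On the command line, `ping example.com` and `traceroute example.com` (or `tracert` on Windows) report round-trip times directly. If you would rather script it, here is a rough Python sketch that times a TCP handshake as a stand-in for round-trip time; the host, port, and number of attempts are illustrative choices.

```python
# Rough round-trip-time check: time how long a TCP handshake takes.
# Ping/traceroute give more detail; this only needs the standard library.

import socket
import time

def connect_rtt_ms(host: str, port: int = 443, attempts: int = 5) -> float:
    """Average time to open a TCP connection to host:port, in milliseconds."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass  # connection established; close it immediately
        samples.append((time.perf_counter() - start) * 1000)
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print(f"Approximate round-trip time: {connect_rtt_ms('example.com'):.0f} ms")
```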

Because network latency can affect your site’s ranking (due to time lags when search engines review cached versions of your site) as well as user experience, doing everything you, your SEO, and your web developer can to correct the issue is paramount.

The Fix-It Plan

There are several ways to correct latency:

  • Shorten server round trips by making content available closer to users.

  • Allow more requests to occur at once.

  • Reduce the number of round trips.

  • Make improvements to the browser cache so that it can store files and reuse them for current and repeat visits (a short server-side sketch follows this list).
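For the browser-cache bullet above, the server has to opt in by telling browsers how long a file may be reused. Here is a minimal sketch built on Python’s standard `http.server`; the one-year `max-age` value is an illustrative assumption, not a recommendation for every asset.

```python
# Minimal sketch: serve static files with a long Cache-Control lifetime so
# repeat visits can reuse them without another round trip. The one-year
# max-age is illustrative; apply it only to assets that rarely change.

from http.server import HTTPServer, SimpleHTTPRequestHandler

class CachingHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # Tell browsers (and any CDN in front) the response may be reused for a year.
        self.send_header("Cache-Control", "public, max-age=31536000")
        super().end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), CachingHandler).serve_forever()
```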

Make no mistake: industry experts are working around the clock to correct latency at its source and improve user experience as the demand for – and expectation of – instant, accurate data continues to grow. A few ways in which latency is actively reduced include:

1. Creation of multiple connections: Browser vendors use multiple connections to create simultaneous requests to host servers. Since 2008, most browsers have updated from 2 connections per domain to 6. These vendors are also working to improve the browser cache.

2. SPDY Protocol: This Google-developed protocol adds a session layer on top of SSL that allows multiple concurrent streams over a single connection, amplifying your browser’s abilities. (SPDY later served as the basis for HTTP/2.)

3. Local content: Content delivery networks, or CDNs (because there’s a slick abbreviation for everything these days), cache content in distributed servers located regionally or worldwide, bringing content closer to users and reducing round-trip time. CDNs are effective at correcting desktop latency, though it should be noted that they do not improve mobile latency.

4. Front-end optimization: Finally, an SEO solution! By consolidating page objects into bundles, latency is greatly reduced. Why? Fewer page objects require fewer trips to the server. Tools that help with this process include Strangeloop’s Site Optimizer.
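As a rough illustration of the bundling idea (not a stand-in for any particular tool), here is a small Python sketch that concatenates a directory of CSS files into a single file, so the browser makes one request instead of many. The paths are hypothetical, and real build tools also minify and fingerprint the output.

```python
# Sketch of front-end bundling: merge many small CSS files into one so the
# browser pays the latency cost once instead of once per file. Paths are
# hypothetical; real tools also minify and version the bundle.

from pathlib import Path

def bundle_css(source_dir: str, output_file: str) -> int:
    parts = []
    for css_file in sorted(Path(source_dir).glob("*.css")):
        parts.append(f"/* {css_file.name} */\n{css_file.read_text()}")
    Path(output_file).write_text("\n".join(parts))
    return len(parts)

if __name__ == "__main__":
    count = bundle_css("static/css", "static/bundle.css")
    print(f"Bundled {count} stylesheets into a single request")
```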

Whether your user’s online activities consist of day trading or gaming, latency greatly affects their experience. By reducing your site’s load time to less than 3 seconds, you do your users and your business a significant favor.

One could read about latency for days. If that sounds like your best day ever (or if you’re even moderately interested) here are a few excellent resources:

A Look at Latency and Search Engine Ranking by Eric Enge

How Latency Can Make Even Fast Internet Connections Feel Slow (and some great tips for measuring latency).

Happy optimizing!