Latency

In computing, "latency" describes some type of delay. It typically refers to a delay in transmitting or processing data, which can have many causes. Two common examples are network latency and disk latency, which are explained below.

1. Network latency

Network latency describes a delay that takes place during communication over a network (including the Internet). For example, a slow router may cause a delay of a few milliseconds when one system on a LAN tries to connect to another through the router. A more noticeable delay may happen if two computers from different continents are communicating over the Internet. There may be a delay in simply establishing the connection because of the distance and number of "hops" involved in making the connection. The "ping" response time is a good indicator of the latency in this situation.
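One rough way to observe network latency is to time how long a TCP connection takes to establish, which captures the round-trip delay much like a ping does. The sketch below is a minimal illustration (the helper name and the local throwaway server are illustrative, not part of any standard tool):

```python
import socket
import threading
import time

def measure_connect_latency(host, port, timeout=2.0):
    """Return the time, in milliseconds, to establish a TCP connection --
    a rough stand-in for the round-trip delay a tool like ping reports."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close it immediately
    return (time.perf_counter() - start) * 1000.0

# Demo against a throwaway local listener; in practice you would
# measure against a remote host, where distance and hops add delay.
server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=server.accept, daemon=True).start()

ms = measure_connect_latency("127.0.0.1", port)
print(f"connection latency: {ms:.2f} ms")
```

On a loopback connection this is a fraction of a millisecond; over the Internet, the same measurement can run into the tens or hundreds of milliseconds.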

2. Disk latency

Disk latency is the delay between the time data is requested from a storage device and when the data starts being returned. Factors that affect disk latency include the rotational latency (of a hard drive) and the seek time. A hard drive with a rotational speed of 5400 RPM, for example, has almost twice the rotational latency of a drive that rotates at 10,000 RPM. The seek time, which involves the physical movement of the drive head to read or write data, can also increase latency. Disk latency is why reading or writing a large number of files is typically much slower than reading or writing a single contiguous file. Since SSDs do not rotate like traditional HDDs, they have much lower latency.
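The rotational-latency comparison above can be checked with a little arithmetic: on average the platter must spin half a revolution before the requested sector passes under the head. A short sketch (the function name is just for illustration):

```python
def avg_rotational_latency_ms(rpm):
    """Average rotational latency: half a revolution, in milliseconds.
    One revolution takes 60_000 / rpm ms, so the average wait is half that."""
    return 0.5 * 60_000 / rpm

print(avg_rotational_latency_ms(5400))    # ~5.56 ms
print(avg_rotational_latency_ms(10_000))  # 3.0 ms
```

At 5400 RPM the average wait is about 5.6 ms versus 3 ms at 10,000 RPM, which is the "almost twice" figure in the text.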

3. Other types of latency

Many other types of latency exist, such as RAM latency (a.k.a. "CAS latency"), CPU latency, audio latency, and video latency. The common thread between all of them is some type of bottleneck that results in a delay. In the computing world, these delays are usually only a few milliseconds, but they can add up to create noticeable slowdowns in performance.

NOTE: It is important not to confuse latency with other measurements like data transfer rate or bandwidth. Latency refers to the delay before the data transfer starts, not the speed of the transfer itself.
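The distinction in the note above can be made concrete: the total time to fetch data is roughly the fixed latency plus the size divided by the bandwidth. A small sketch (the function and the example numbers are illustrative assumptions):

```python
def transfer_time_ms(latency_ms, size_bytes, bandwidth_bytes_per_s):
    """Total time = fixed startup delay (latency) + size / transfer rate."""
    return latency_ms + size_bytes / bandwidth_bytes_per_s * 1000.0

# Hypothetical link: 50 ms latency, 10 MB/s bandwidth.
small = transfer_time_ms(50, 1_000, 10_000_000)       # 1 KB  -> 50.1 ms
large = transfer_time_ms(50, 1_000_000, 10_000_000)   # 1 MB  -> 150.0 ms
print(small, large)
```

For the 1 KB request, nearly all of the 50.1 ms is latency; for the 1 MB request, bandwidth dominates. This is why latency, not bandwidth, governs how responsive a connection feels for many small transfers.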

Updated March 3, 2017

Definitions by TechTerms.com
