Latency


In computing, "latency" describes some type of delay. It typically refers to a delay in transmitting or processing data, which can have a wide variety of causes. Two common examples are network latency and disk latency, which are explained below.

1. Network latency

Network latency describes a delay that takes place during communication over a network (including the Internet). For example, a slow router may cause a delay of a few milliseconds when one system on a LAN tries to connect to another through the router. A more noticeable delay may happen if two computers from different continents are communicating over the Internet. There may be a delay in simply establishing the connection because of the distance and number of "hops" involved in making the connection. The "ping" response time is a good indicator of the latency in this situation.
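The round-trip measurement that "ping" reports can be sketched in a few lines of Python. This is an illustration rather than anything from the article: it times one small message echoed over a localhost TCP connection (the helper names `start_echo_server` and `measure_rtt` are made up for this example), so it measures loopback latency rather than latency across the Internet.

```python
import socket
import threading
import time

def start_echo_server() -> int:
    """Start a one-shot TCP echo server on an ephemeral localhost port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)

    def serve():
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(64))  # echo the message straight back
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()[1]

def measure_rtt(port: int) -> float:
    """Return the round-trip time, in milliseconds, for one small message."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        start = time.perf_counter()
        sock.sendall(b"ping")
        sock.recv(64)  # block until the echo comes back
        return (time.perf_counter() - start) * 1000

port = start_echo_server()
rtt_ms = measure_rtt(port)
print(f"round-trip latency: {rtt_ms:.3f} ms")
```

On a loopback connection the result is typically well under a millisecond; pointing the same measurement at a host on another continent would show latency of tens to hundreds of milliseconds.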

2. Disk latency

Disk latency is the delay between the time data is requested from a storage device and when the data starts being returned. Factors that affect disk latency include the rotational latency (of a hard drive) and the seek time. A hard drive with a rotational speed of 5400 RPM, for example, will have almost twice the rotational latency of a drive that rotates at 10,000 RPM. The seek time, which involves the physical movement of the drive head to read or write data, can also increase latency. Disk latency is why reading or writing a large number of files is typically much slower than reading or writing a single contiguous file. Since SSDs do not rotate like traditional HDDs, they have much lower latency.
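The "almost twice" comparison above follows from a simple formula: on average, the drive must wait half a revolution for the requested sector to pass under the head. A short sketch (the function name is chosen for this example):

```python
def avg_rotational_latency_ms(rpm: int) -> float:
    """Average rotational latency: half a revolution, in milliseconds."""
    ms_per_revolution = 60_000 / rpm  # 60,000 ms per minute / revolutions per minute
    return ms_per_revolution / 2

print(avg_rotational_latency_ms(5_400))   # ~5.56 ms
print(avg_rotational_latency_ms(10_000))  # 3.0 ms
```

At 5400 RPM the average rotational latency is about 5.56 ms, versus 3 ms at 10,000 RPM, a ratio of roughly 1.85, hence "almost twice."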

Other types of latency

Many other types of latency exist, such as RAM latency (a.k.a. "CAS latency"), CPU latency, audio latency, and video latency. The common thread between all of these is some type of bottleneck that results in a delay. In the computing world, these delays are usually only a few milliseconds, but they can add up to create noticeable slowdowns in performance.

NOTE: It is important not to confuse latency with other measurements like data transfer rate or bandwidth. Latency refers to the delay before the data transfer starts rather than the speed of the transfer itself.
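The distinction can be made concrete with a back-of-the-envelope model (an illustration, not from the article): the total time to fetch data is the fixed latency before any bytes arrive plus the size of the data divided by the bandwidth.

```python
def transfer_time_s(latency_ms: float, size_mb: float, bandwidth_mbps: float) -> float:
    """Total time = fixed latency before data flows + size / transfer rate.

    size_mb is in megabytes; bandwidth_mbps is in megabits per second,
    hence the factor of 8.
    """
    return latency_ms / 1000 + size_mb * 8 / bandwidth_mbps

# A 1 MB file over a 100 Mbit/s link with 50 ms of latency takes ~0.13 s;
# 50 ms of that is latency, so for small transfers latency dominates.
print(transfer_time_s(50, 1, 100))
```

This is why a high-bandwidth but high-latency link can still feel slow for many small requests: each request pays the latency cost before any data flows.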

Updated: March 3, 2017

TechTerms - The Tech Terms Computer Dictionary
