Jitter
Jitter, in networking, refers to small intermittent delays during data transfers. It can be caused by a number of factors including network congestion, collisions, and signal interference.
Technically, jitter is the variation in latency — the delay between when a signal is transmitted and when it is received. All networks experience some amount of latency, especially wide area networks that span the Internet. This delay, typically measured in milliseconds, can be problematic for real-time applications, such as online gaming, streaming, and digital voice communication. Jitter makes this worse: because the delay varies unpredictably from packet to packet, applications cannot simply compensate for a fixed amount of lag.
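To make this concrete, jitter can be estimated from a series of latency measurements. One common, simple definition is the mean absolute difference between consecutive samples; RFC 3550 specifies a smoothed variant of the same idea for RTP streams. The Python sketch below assumes ping-style round-trip times in milliseconds, and the function name and sample values are illustrative, not a standard API.

```python
from statistics import mean

def average_jitter(latencies_ms):
    """Estimate jitter as the mean absolute difference between
    consecutive latency samples (one common, simple definition)."""
    diffs = [abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:])]
    return mean(diffs) if diffs else 0.0

# Hypothetical ping-style round-trip times in milliseconds
samples = [42.1, 45.8, 41.9, 60.3, 43.0]
print(f"average jitter: {average_jitter(samples):.1f} ms")
```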
Network jitter causes packets to arrive at irregular intervals. For example, there may be a delay after some packets are sent, and then several packets may arrive all at once. This can cause packet loss if the receiving system is unable to process all of the incoming packets. If this happens during a file download, the lost packets are simply resent, slowing down the transfer. In a real-time service, such as audio streaming, there is no time to wait for retransmission, so the data is simply lost, causing the audio signal to drop out or decrease in quality.
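One way to see this pattern is to look at the gaps between packet arrivals. If packets are sent every 20 ms, a jitter-free network delivers them roughly 20 ms apart; under jitter, the gaps stretch during a stall and then shrink when the backlog arrives in a burst. The short sketch below uses made-up arrival timestamps purely to illustrate that shape.

```python
def interarrival_gaps(arrival_times_ms):
    """Gaps between consecutive packet arrivals; a steady stream
    shows roughly constant gaps, a jittery one does not."""
    return [b - a for a, b in zip(arrival_times_ms, arrival_times_ms[1:])]

# Packets sent every 20 ms, but the network stalls and then
# delivers the backlog in a burst (timestamps are hypothetical)
arrivals = [0, 20, 41, 95, 96, 97, 120]
print(interarrival_gaps(arrivals))  # [20, 21, 54, 1, 1, 23]
```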
The standard way to compensate for network jitter is to use a buffer that stores data before it is used, such as a few seconds of an audio or video clip. This smooths out the playback of the media, since it gives the receiving computer time to collect packets that arrive late or out of order. While buffers are an effective solution, they must be kept very small in interactive applications such as online gaming and video conferencing, because every millisecond of buffering adds to the end-to-end delay; once the buffer grows beyond a few tens of milliseconds, the added lag becomes noticeable.
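As an illustration of the buffering idea, the toy sketch below re-orders packets by sequence number and waits until a minimum number have accumulated before releasing any, trading a small added delay for smooth, in-order playback. The class name, the depth parameter, and the packet format are assumptions made for this example; real jitter buffers typically adapt their depth to measured network conditions.

```python
import heapq

class JitterBuffer:
    """Toy playout buffer (illustrative only, not production code):
    holds and re-orders packets, and starts playback only after
    `depth` packets have accumulated."""

    def __init__(self, depth=3):
        self.depth = depth    # packets to collect before playback starts
        self.heap = []        # min-heap ordered by sequence number
        self.started = False

    def receive(self, seq, payload):
        heapq.heappush(self.heap, (seq, payload))

    def next_packet(self):
        """Return the next in-order packet, or None while still buffering."""
        if not self.started:
            if len(self.heap) < self.depth:
                return None   # still filling the buffer
            self.started = True
        return heapq.heappop(self.heap) if self.heap else None

# Packets arrive out of order because of jitter; playback is in order.
buf = JitterBuffer(depth=3)
for seq in (1, 3, 2):
    buf.receive(seq, f"frame-{seq}")
while (pkt := buf.next_packet()) is not None:
    print(pkt)  # (1, 'frame-1'), (2, 'frame-2'), (3, 'frame-3')
```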