Video Latency
Video latency refers to the delay between the moment a transfer of a video stream is requested and the moment that transfer actually begins. Networks that exhibit relatively small delays are known as low-latency networks, while those with larger delays are known as high-latency networks.
Fundamentally, video latency is the amount of time it takes a single frame of video to travel from the camera to the display, often called glass-to-glass latency.
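As a rough illustration, this camera-to-display delay can be modeled as the sum of the delays in each stage of the pipeline. The stage names and millisecond values in the sketch below are assumed for the sake of the example, not measurements of any particular system:

```python
# Hypothetical glass-to-glass latency budget; every value here is an
# assumed, illustrative figure rather than a real measurement.
PIPELINE_MS = {
    "capture": 17,   # sensor readout, roughly one frame at 60 fps
    "encode": 25,    # video compression
    "network": 40,   # transmission across the network
    "decode": 20,    # decompression at the receiver
    "display": 17,   # render and refresh, roughly one frame at 60 fps
}

glass_to_glass_ms = sum(PIPELINE_MS.values())
print(f"Estimated glass-to-glass latency: {glass_to_glass_ms} ms")  # 119 ms
```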
Low latency is typically defined as less than 100 milliseconds (ms) and is preferable when operating remote devices, video conferencing, and streaming live events. High latency, on the other hand, is considered acceptable for applications such as recording and streaming previously recorded events.
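A minimal sketch of that rule of thumb, using the 100 ms threshold cited above and a hypothetical classify_latency helper:

```python
LOW_LATENCY_THRESHOLD_MS = 100  # the rule-of-thumb threshold cited above

def classify_latency(latency_ms: float) -> str:
    """Label a measured latency as 'low' or 'high' per the 100 ms rule of thumb."""
    return "low" if latency_ms < LOW_LATENCY_THRESHOLD_MS else "high"

print(classify_latency(45))   # low  -> fine for remote control or conferencing
print(classify_latency(350))  # high -> acceptable for recorded playback
```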
Network routers are the devices most often responsible for latency on the end-to-end path. Satellite communications, meanwhile, add large amounts of latency simply because of the time it takes a packet to travel the long distance of the link.
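The satellite case is easy to see with a back-of-the-envelope calculation: even at the speed of light, a packet relayed through a geostationary satellite travels over 70,000 km one way. The figures below assume a ground station directly beneath the satellite, the shortest possible path:

```python
# Back-of-the-envelope propagation delay over a geostationary satellite link.
# Assumes the shortest path (ground station directly below the satellite);
# real links add processing and queuing delay on top of this.
SPEED_OF_LIGHT_KM_S = 299_792   # km/s in vacuum
GEO_ALTITUDE_KM = 35_786        # altitude of a geostationary orbit

# One way means ground -> satellite -> ground: two traversals of the altitude.
one_way_s = 2 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S
print(f"One-way propagation delay: {one_way_s * 1000:.0f} ms")  # ~239 ms
```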
Latency is typically measured cumulatively: every node that data must pass through adds to the total, so the more links and router hops a transmission traverses, the larger the end-to-end latency can be.
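One practical way to observe this cumulative effect is to time a connection that crosses the whole path. The sketch below uses a TCP handshake as a crude round-trip probe; example.com is only a placeholder host:

```python
import socket
import time

def tcp_connect_latency_ms(host: str, port: int = 443) -> float:
    """Crude latency probe: time a TCP handshake to the given host.

    The handshake crosses every link and router hop on the path, so the
    elapsed time reflects the cumulative end-to-end latency.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection established; we only care about the elapsed time
    return (time.perf_counter() - start) * 1000

# Placeholder host for illustration:
# print(f"{tcp_connect_latency_ms('example.com'):.1f} ms")
```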
Generally, users want latency to be as low as possible. For instance, an operator manipulating a remote device wants to avoid any lag between issuing a command and seeing the resulting frames, hearing the audio, and so on. Similarly, broadcasters conducting live remote interviews do not want delays between speaking and hearing a response.