What is Latency?
Latency is the time it takes for data to travel from your device to a server and back. Learn how latency affects gaming, video calls, and browsing, what jitter is, how to measure latency with the ping command, and how to reduce it.
Latency is the time delay between an action and its result on a network. When you click a link, latency is the wait before any data starts arriving. When you press a button in an online game, latency is the gap before the server registers the action. Measured in milliseconds (ms), latency is one of the two fundamental metrics of internet connection quality, alongside bandwidth.
While bandwidth determines how much data your connection can carry, latency determines how quickly any single piece of data makes the journey. A connection with massive bandwidth but high latency feels sluggish. A connection with modest bandwidth but low latency feels responsive. For interactive activities like gaming, video calls, and remote desktop sessions, latency matters more than bandwidth.
How Latency Works
Latency is the total round-trip time for data to travel from your device to a destination and back. It is composed of several delay components that add up along the path.
Propagation delay is the time it takes for a signal to physically travel through the medium (fibre optic cable, copper wire, or radio waves). Light through fibre covers roughly 200,000 km per second. A packet travelling from New York to London (5,500 km through undersea cable) incurs about 27 ms of one-way propagation delay. Round-trip, that is roughly 55 ms that no technology can reduce, because it is limited by the speed of light.
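The New York–London figure above is a straightforward calculation from the distance and the signal speed in fibre (both taken from the text):

```python
# Propagation delay: distance divided by signal speed in the medium.
SPEED_IN_FIBRE_KM_S = 200_000   # roughly two-thirds the speed of light in vacuum
distance_km = 5_500             # New York to London via undersea cable

one_way_ms = distance_km / SPEED_IN_FIBRE_KM_S * 1000
round_trip_ms = 2 * one_way_ms

print(f"one-way: {one_way_ms:.1f} ms, round-trip: {round_trip_ms:.1f} ms")
# one-way: 27.5 ms, round-trip: 55.0 ms
```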
Transmission delay is the time to push all the bits of a packet onto the wire. On high-bandwidth connections, this is negligible. On low-bandwidth connections (like old DSL), large packets take measurable time to transmit.
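The contrast between fast and slow links is easy to quantify. This sketch assumes a 1,500-byte packet (a typical Ethernet MTU); the link speeds are illustrative:

```python
# Transmission delay: time to push all of a packet's bits onto the link.
PACKET_BITS = 1500 * 8   # typical 1,500-byte Ethernet MTU

def transmission_delay_ms(link_bps: float) -> float:
    return PACKET_BITS / link_bps * 1000

print(f"1 Gbps fibre: {transmission_delay_ms(1e9):.3f} ms")  # ~0.012 ms, negligible
print(f"1 Mbps DSL:   {transmission_delay_ms(1e6):.3f} ms")  # ~12 ms, noticeable
```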
Processing delay is the time each router and switch along the path spends examining the packet, checking headers, and deciding where to forward it. Modern networking equipment processes packets in microseconds, but each hop adds a small increment.
Queuing delay occurs when packets wait in a buffer at a congested router or switch. This is the most variable component and the primary cause of latency spikes. When a link is saturated, incoming packets queue up like cars at a traffic light. Queuing delay can add anywhere from zero to hundreds of milliseconds depending on congestion.
The total latency you experience is the sum of all these delays across every hop between your device and the destination server. A typical home internet connection to a server in the same country produces 10-40 ms of latency. Cross-continent connections range from 50 to 150 ms. Connections involving satellite links can exceed 500 ms.
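The sum-of-components idea can be sketched as a toy per-hop model. Every number below is illustrative, not a measurement, but the structure is the point: each hop contributes all four delay types, and queuing at one congested hop can dominate the total.

```python
# Toy model: round-trip latency as the sum of per-hop delay components.
# All per-hop values are illustrative, not measurements.
hops = [
    # (propagation_ms, transmission_ms, processing_ms, queuing_ms)
    (0.1, 0.5, 0.05, 0.0),   # home router
    (2.0, 0.1, 0.05, 5.0),   # ISP access network (some queuing)
    (8.0, 0.0, 0.05, 0.0),   # regional backbone
    (1.0, 0.0, 0.05, 1.0),   # destination data centre
]

one_way_ms = sum(sum(hop) for hop in hops)
print(f"round-trip: {2 * one_way_ms:.1f} ms")
# round-trip: 35.8 ms -- within the typical 10-40 ms same-country range
```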
How Latency Affects Different Activities
Different internet activities have vastly different sensitivity to latency. What feels fine for one activity can be unusable for another.
Online gaming is the most latency-sensitive common activity. In fast-paced multiplayer games, every millisecond counts. A player with 20 ms latency sees and reacts to events 80 ms sooner than a player with 100 ms latency. In competitive shooters, that gap is the difference between hitting a target and missing. Fighting games and rhythm games are even more sensitive, with some becoming unplayable above 50 ms.
Video calls require consistent low latency for natural conversation. When latency exceeds 150 ms, participants start talking over each other because the delay makes it hard to judge when the other person has finished speaking. Latency above 300 ms makes real-time conversation extremely difficult. Video call platforms like Zoom and Teams buffer audio and video to smooth out small variations, but they cannot hide large delays.
Web browsing is affected by latency in a way most people do not realize. Loading a modern webpage requires dozens of sequential requests: the HTML, then CSS and JavaScript, then images, then fonts, then API calls. Each request incurs a round trip of latency. On a connection with 100 ms latency, a page requiring 30 sequential requests adds 3 full seconds of latency-induced delay before everything finishes loading, regardless of how fast your bandwidth is.
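The arithmetic above is worth making explicit: sequential requests multiply the round-trip time, and no amount of bandwidth removes that waiting.

```python
# Latency cost of sequential requests: each one pays a full round trip.
def sequential_latency_s(rtt_ms: float, requests: int) -> float:
    return rtt_ms * requests / 1000

print(sequential_latency_s(100, 30))  # 3.0 seconds of pure waiting
print(sequential_latency_s(20, 30))   # 0.6 seconds on a low-latency link
```

This is also why modern protocols such as HTTP/2 and HTTP/3 multiplex many requests over one connection: fewer sequential round trips means less latency-induced delay.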
File downloads and streaming are the least latency-sensitive. Once a large download or video stream starts, data flows continuously and latency only affects the initial buffer time. A movie on Netflix buffers for a second or two at the beginning and then plays smoothly even on a 100 ms connection, because the bandwidth keeps the buffer full.
What is Jitter?
Jitter is the variation in latency over time. If you send 100 packets and they take between 18 ms and 22 ms each, your latency is consistent and jitter is low (about 4 ms). If those packets take between 15 ms and 120 ms, jitter is high (105 ms) even though the average latency might look acceptable.
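The figures above use the simplest jitter measure, the spread between the fastest and slowest packets. Given a list of per-packet round-trip times (the sample values below are illustrative), it is a one-liner:

```python
# Jitter as the spread of per-packet latencies (max minus min),
# matching the examples in the text.
def jitter_ms(samples: list[float]) -> float:
    return max(samples) - min(samples)

steady = [18.2, 19.5, 21.0, 22.0, 18.0]    # consistent connection
bursty = [15.0, 22.0, 118.0, 30.0, 120.0]  # congested or interference-prone link

print(jitter_ms(steady))  # 4.0 ms
print(jitter_ms(bursty))  # 105.0 ms
```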
Jitter is particularly destructive for real-time communication. Voice and video calls send packets at regular intervals. The receiving device expects them to arrive at roughly the same intervals. When jitter is high, packets arrive in bursts: some too early, some too late. The application’s jitter buffer tries to smooth this out by holding packets briefly, but a large jitter buffer adds its own latency. If packets arrive too late even for the buffer, they are discarded, causing audio glitches, frozen video frames, and robotic-sounding voice.
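The jitter buffer's trade-off can be sketched as a sizing decision: hold packets long enough to cover most of the observed delay spread, accepting that anything slower is dropped. The percentile approach and the sample delays below are illustrative assumptions, not how any particular application implements its buffer.

```python
# Sketch of jitter-buffer sizing: pick a hold time that covers a chosen
# fraction of observed packet delays. Larger coverage = fewer drops,
# but more added latency.
def playout_delay_ms(delays: list[float], coverage: float) -> float:
    ordered = sorted(delays)
    idx = min(int(coverage * len(ordered)), len(ordered) - 1)
    return ordered[idx] - ordered[0]  # hold time beyond the fastest packet

delays = [20, 21, 22, 25, 24, 23, 90, 22, 21, 26]  # one 90 ms straggler

print(playout_delay_ms(delays, 0.80))  # 6 ms buffer -- drops the straggler
print(playout_delay_ms(delays, 1.00))  # 70 ms buffer -- keeps it, adds latency
```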
Gaming is similarly affected. Consistent 60 ms latency is playable because your brain adjusts to the fixed delay. Jittery latency that jumps between 20 ms and 150 ms causes unpredictable behaviour: characters rubber-band, shots that looked accurate miss, and the game feels broken.
Common causes of jitter include Wi-Fi interference (competing signals from neighbours), network congestion (competing traffic on your ISP’s infrastructure), and bufferbloat (oversized buffers in networking equipment that absorb bursts but create variable delays).
How to Measure Latency
The most basic latency measurement tool is the ping command, available on every operating system. Ping sends small ICMP packets to a destination and reports the round-trip time for each one.
Windows: Open Command Prompt and type ping google.com. The output shows the round-trip time for each packet in milliseconds, along with minimum, maximum, and average values.
macOS/Linux: Open Terminal and type ping google.com. Press Ctrl+C to stop. The results show the same round-trip statistics.
Ping gives you a quick snapshot. For more detail, use traceroute (macOS/Linux) or tracert (Windows). Traceroute shows the latency at every hop between your device and the destination. This identifies where delay is occurring: your local network, your ISP, a transit provider, or the destination’s network.
For ongoing monitoring, tools like PingPlotter, MTR (My Traceroute), and various online latency testing services provide continuous measurement with graphs that show latency and jitter over time. These are valuable for diagnosing intermittent problems that a single ping test might miss.
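If you want to script your own measurements, note that true ICMP ping requires raw sockets (administrator rights) in most languages. A common unprivileged approximation is to time a TCP connection handshake, which slightly overstates the raw round-trip time because it includes connection setup. A minimal sketch:

```python
# Rough latency probe without ICMP: time a TCP handshake to port 443.
# Includes connection setup, so it slightly overstates raw RTT.
import socket
import time

def tcp_latency_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000

try:
    samples = [tcp_latency_ms("example.com") for _ in range(5)]
    print(f"min {min(samples):.0f} ms, max {max(samples):.0f} ms, "
          f"jitter {max(samples) - min(samples):.0f} ms")
except OSError:
    print("host unreachable")
```

Collecting many samples and reporting the min/max spread gives you a crude jitter estimate as well as latency.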
To test your connection to game servers specifically, many games display real-time latency in their settings or HUD. This is the most relevant measurement for gaming because it reflects the actual path to the server you play on, which may differ from the path to Google’s server.
How to Reduce Latency
Some causes of latency are outside your control (physical distance to the server, ISP routing decisions), but several practical steps can reduce latency on your end.
Use a wired Ethernet connection. Wi-Fi adds 1-10 ms of latency under good conditions and significantly more under poor conditions. Switching from Wi-Fi to an Ethernet cable connected directly to your router provides the most consistent, lowest-latency connection available to a home user.
Reduce network congestion. When your bandwidth is fully utilized, queuing delay increases latency for everything. If someone is downloading a large file while you are gaming, the download saturates the connection and your game packets wait in queue. Quality of Service (QoS) settings on your router can prioritize latency-sensitive traffic (gaming, voice, video) over bulk transfers.
Choose closer servers. Physical distance is the largest fixed component of latency. Connecting to a game server or video call endpoint in your region rather than across the world can cut latency in half or more. Most services automatically select the nearest server, but you can often override this manually.
Upgrade your router. Old routers with slow processors create processing and queuing delays. Modern routers with better CPUs, more RAM, and features like SQM (Smart Queue Management) handle traffic more efficiently, reducing latency under load.
Restart your modem and router. Accumulated connection-tracking (NAT) entries, memory leaks, and stale connections can degrade performance over time. A periodic restart clears these issues. If latency has gradually increased over weeks, a restart often brings it back to baseline.
Contact your ISP. If traceroute shows high latency at a specific hop within your ISP’s network, the problem is on their end. Reporting consistent high latency with traceroute evidence gives the ISP’s support team actionable data to work with.
Frequently Asked Questions
What is a good latency for gaming?
For competitive online gaming, latency under 30 ms is excellent. Under 50 ms is good for most games. Between 50 and 100 ms is playable but noticeable. Above 100 ms causes visible lag that affects gameplay. The exact threshold depends on the game type. Fast-paced shooters are more sensitive to latency than turn-based or strategy games.
What is the difference between latency and ping?
Ping is a tool that measures latency. The ping command sends a small packet to a destination and measures the round-trip time in milliseconds. People often use 'ping' and 'latency' interchangeably. Technically, latency is the concept (delay) and ping is the measurement tool, but in casual usage they mean the same thing.
Does faster internet reduce latency?
Higher bandwidth does not directly reduce latency. A 1 Gbps connection and a 100 Mbps connection can have identical latency if they share the same physical path. However, if your connection is saturated (all bandwidth in use), packets queue up and latency increases. Upgrading bandwidth reduces congestion-related latency but not the base propagation delay.
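The distinction shows up clearly in transfer time, which is roughly one round trip to get started plus size divided by bandwidth (a simplification that ignores TCP slow start and protocol overhead). Small transfers are latency-bound; large ones are bandwidth-bound:

```python
# Rough transfer time: one round trip to start, then size / bandwidth.
# Simplification: ignores TCP slow start and protocol overhead.
def transfer_time_s(size_bytes: float, bandwidth_bps: float, rtt_ms: float) -> float:
    return rtt_ms / 1000 + size_bytes * 8 / bandwidth_bps

# A 10 kB API response at 100 ms RTT: latency dominates.
print(transfer_time_s(10_000, 1e9, 100))    # ~0.10 s on gigabit
print(transfer_time_s(10_000, 100e6, 100))  # ~0.10 s on 100 Mbps -- no better

# A 1 GB download at 100 ms RTT: bandwidth dominates.
print(transfer_time_s(1e9, 1e9, 100))       # ~8.1 s
print(transfer_time_s(1e9, 100e6, 100))     # ~80.1 s
```

This is why a bandwidth upgrade makes downloads dramatically faster but leaves page loads and game responsiveness almost unchanged.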
Why is my latency high even with fast internet?
High latency with fast internet usually indicates distance from the server, routing inefficiency, Wi-Fi interference, network congestion at a hop between you and the destination, or ISP routing issues. Running a traceroute can identify where the delay occurs. Switching from Wi-Fi to Ethernet often reduces latency by 5-15 ms.
What is jitter?
Jitter is the variation in latency between consecutive packets. If your latency fluctuates between 15 ms and 80 ms, the jitter is high. Consistent 50 ms latency with low jitter is often better than average 30 ms latency with high jitter. Video calls and VoIP are especially sensitive to jitter because they need packets to arrive at regular intervals.