When it comes to running a fully functional website, there are certain positives and negatives. Low bandwidth and high latency are certainly among the negatives. But which one is more crippling for your website? Let us start by explaining what high latency and low bandwidth actually mean and how they affect your website's performance.
What is latency?
Latency is one of the most important factors that impact the speed of a network. It is measured as the time it takes for a packet of data to travel from a client device to a website's server and back. A low-latency connection is one that generally experiences short delays, while a high-latency connection suffers from long ones. Latency is also referred to as the ping rate and is typically measured in milliseconds (ms).
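To make the round-trip idea concrete, here is a minimal Python sketch that approximates latency by timing a TCP handshake. This is an approximation, not a true ping: real ping uses ICMP, which typically requires elevated privileges, and `example.com` is just a placeholder host.

```python
import socket
import time

def tcp_connect_rtt(host: str, port: int = 443) -> float:
    """Approximate latency (in ms) by timing a TCP handshake to host:port.

    A rough stand-in for ping: one connection setup is roughly one
    network round trip.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        return (time.perf_counter() - start) * 1000  # milliseconds

print(f"Approximate RTT to example.com: {tcp_connect_rtt('example.com'):.1f} ms")
```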
Excessive latency creates bottlenecks that prevent data from filling the network pipe, thus decreasing effective bandwidth. The impact of latency on network bandwidth can be temporary (lasting a few seconds) or persistent (constant) depending on the source of the delays.
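This pipe-filling effect can be quantified with the bandwidth-delay product: a TCP sender can keep at most one receive window of unacknowledged data in flight per round trip, so achievable throughput is capped at window size divided by RTT, no matter how wide the link is. A small illustrative calculation (the 64 KB window is an assumed default, not a measured value):

```python
# Throughput ceiling imposed by latency: at most one window of data
# can be in flight per round trip, so throughput <= window / RTT.
window_bytes = 64 * 1024  # assumed TCP receive window

for rtt_ms in (10, 50, 200):
    max_mbps = (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000
    print(f"RTT {rtt_ms:>3} ms -> at most {max_mbps:.1f} Mbps")
```

With a 64 KB window, a 200 ms round trip caps throughput at roughly 2.6 Mbps, even on a gigabit link.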
Think of latency as a journey on a road. The longer the road, the longer it will take you to reach your destination; the shorter the road, the quicker the trip. Extending the same analogy, the width of the road can be compared to bandwidth: the wider the road, the more traffic can travel on it at the same time.
What Affects Latency?
Latency is affected by the type of connection (that is, the kind of service you use to access the Internet), the distance between the user and the server, and the amount of available bandwidth.
Let’s assume that you are browsing the Internet on different types of connections. This is how latency can affect your browsing:
Satellite Internet Connection (High Speed / Bandwidth, High Latency): You would click a link on a web page and, after a noticeable delay, the web page would start downloading and show up almost all at once.
Theoretical Connection (Low Speed / Bandwidth, Low Latency): You would click a link on a web page and the page would start loading immediately. However, it would take a long time to load completely: the text would appear quickly, while the images would load one by one, in a gradual, phased manner depending on their size.
Cable Internet Connection (High Speed / Bandwidth, Low Latency): You would click a link on a web page and the web page would appear almost immediately, downloading almost all at once.
Distance: The closer you are to the server, the faster the information reaches you. This is typically achieved by using Content Delivery Networks (CDNs), which enable enterprises to place servers close to where most of their users reside. CDNs are extremely popular with enterprises whose websites receive large amounts of traffic.
Bandwidth: If you have limited bandwidth, you are more likely to experience congestion, which means a slower Internet connection.
What causes Internet latency?
As the above shows, Internet latency can be caused by multiple factors. Besides the distance between the computer and the servers serving the information, latency also depends on the type of Internet connection used. The medium through which the Internet is accessed makes a big difference as well: a computer or laptop connected via an Ethernet cable will typically see significantly lower latency than one on a wireless (Wi-Fi) connection. Old routers, or the number of people connected to a particular router, can also contribute significantly to Internet latency.
How can latency be reduced?
Internet latency can be reduced in a big way by investing in a Content Delivery Network (CDN). By caching content on servers near the user, a CDN can cut latency significantly. Similarly, web administrators can take steps to reduce latency by optimizing images for faster loading and reducing file sizes. On the user end, enterprises should check whether any specific applications are consuming excessive bandwidth and putting pressure on the network.
How to measure network latency?
Network administrators typically use network monitoring tools to measure network latency, and a number of such tools are available on the market. Network latency is measured as the time required for a packet of data to travel across a network from a client (sender) to a website or server (receiver).
| Network latency metric | Description |
| --- | --- |
| Round Trip Time (RTT) | The time, measured in milliseconds, required for a packet of data to travel from the source to the destination and back to the source. |
| Time to First Byte (TTFB) | The time taken for a web browser to receive the first byte of a response from a website or server. |
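As a rough illustration of the TTFB metric above, the Python sketch below times how long it takes to receive the first byte of a response. The URL is a placeholder, and the figure includes DNS lookup and connection setup, so browser developer tools will report a more granular breakdown.

```python
import time
import urllib.request

def time_to_first_byte(url: str) -> float:
    """Approximate TTFB in milliseconds for the given URL."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read(1)  # block until the first byte of the body arrives
        return (time.perf_counter() - start) * 1000

print(f"TTFB: {time_to_first_byte('https://example.com'):.0f} ms")
```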
What is bandwidth?
Bandwidth is just one element of what a person perceives as the speed of a network. People often mistake bandwidth for Internet speed, mainly because Internet Service Providers (ISPs) claim to offer a fast '50 Mbps connection' in their advertising campaigns. True Internet speed is actually the amount of data you receive every second, and that has a lot to do with latency too.
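The gap between an advertised figure and perceived speed is easy to see with a little arithmetic: bandwidth is quoted in megabits per second, files are sized in megabytes, and every request also pays the latency cost before the first byte arrives. A simplified model (the numbers are illustrative):

```python
def transfer_time_s(size_mb: float, bandwidth_mbps: float, latency_ms: float) -> float:
    """Rough time to fetch one file: one round trip of latency,
    plus the raw transfer (size in megabytes, bandwidth in megabits/s)."""
    return latency_ms / 1000 + (size_mb * 8) / bandwidth_mbps

# A 5 MB file on an advertised "50 Mbps" line:
print(f"{transfer_time_s(5, 50, 20):.2f} s at 20 ms latency")    # ~0.82 s
print(f"{transfer_time_s(5, 50, 600):.2f} s at 600 ms latency")  # ~1.40 s
```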
What Affects Bandwidth?
Bandwidth issues can arise from a number of activities. Network monitoring technology can be used to identify and troubleshoot network-related issues; popular "flow-based" technologies include NetFlow and sFlow. Bandwidth issues can almost always be traced to one or two specific activities, which usually share two characteristics: large amounts of data and extended duration.
Some of the common activities that cause bandwidth problems include:
- Watching or streaming videos from the Internet
- Transferring large files
- Activities which require real-time monitoring (such as surveillance footage from CCTV cameras)
- Downloading large files from the Internet
All the above activities can contribute greatly to bandwidth issues in a network and should be carried out only when network traffic is light. Large file transfers or data streams within a network should ideally be placed on a separate network; this helps prevent a bottleneck for other users. Bandwidth matters most when you have a lot of data to send or receive and it doesn't need to arrive in real time, such as when transferring large amounts of data to an off-site backup. (You don't really care in what order the data arrives or how quickly the other side can respond; you just need all of it to get there.)
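When diagnosing whether one of these activities is saturating a link, a quick sanity check is to time a known download and compute the achieved throughput. The sketch below does this with Python's standard library; the URL is a placeholder, and purpose-built tools such as iperf give more controlled measurements.

```python
import time
import urllib.request

def measured_throughput_mbps(url: str, chunk_size: int = 64 * 1024) -> float:
    """Download url and return the achieved throughput in Mbps."""
    total_bytes = 0
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as response:
        while chunk := response.read(chunk_size):
            total_bytes += len(chunk)
    elapsed = time.perf_counter() - start
    return (total_bytes * 8) / elapsed / 1_000_000

print(f"{measured_throughput_mbps('https://example.com/large-file.bin'):.1f} Mbps")
```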
What is Ping?
In simple terms, Ping is a command-line utility used to test the latency of a network or Internet connection. It works by sending a packet of data over the network to a specific computer or device; if the packet is received successfully, the target sends a response back to the computer that sent it. The Ping command measures the time taken, in milliseconds, for a packet of data to reach another device and come back. It is also used to gauge the quality of the network, since it reports how many bytes were received in response, the time taken, and how many packets were lost (packet loss).
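Because Ping is a command-line utility, it is also easy to script. Here is a minimal sketch that shells out to the system ping (the `-c` packet-count flag is the Linux/macOS form; Windows uses `-n`):

```python
import subprocess

def ping(host: str, count: int = 4) -> str:
    """Run the system ping utility and return its raw text output."""
    result = subprocess.run(
        ["ping", "-c", str(count), host],  # on Windows, use "-n" instead of "-c"
        capture_output=True, text=True, timeout=30,
    )
    return result.stdout

# The summary lines report packets sent/received (packet loss)
# and min/avg/max round-trip times in milliseconds.
print(ping("example.com"))
```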
Quality of Service
QoS (Quality of Service) can be measured by the speed and reliability of the network. A service provider promising a good QoS has to demonstrate the capability of the network to deliver good and stable accessibility. Elements of network performance within the scope of QoS often include availability (uptime), bandwidth (throughput), latency (delay), and error rate.
While low latency and high bandwidth are the ideal to strive for, high latency has a deeper impact on load times than low bandwidth. At low latencies, data transfers almost instantaneously and no delay is noticeable; as latency increases, the delay becomes more and more apparent. You can measure the latency between your computer and a web address with the ping command.
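A back-of-the-envelope model shows why latency tends to dominate: loading a page involves many sequential round trips (DNS, connection setup, the HTML, then the assets it references), and each one pays the full latency cost. The numbers below are illustrative assumptions, not measurements:

```python
def load_time_s(round_trips: int, total_mb: float,
                bandwidth_mbps: float, latency_ms: float) -> float:
    """Naive page-load model: sequential round trips plus raw transfer time."""
    return round_trips * latency_ms / 1000 + (total_mb * 8) / bandwidth_mbps

page = dict(round_trips=30, total_mb=2.0)  # an assumed typical page

# High bandwidth, high latency (satellite-like):
print(f"{load_time_s(**page, bandwidth_mbps=100, latency_ms=600):.1f} s")  # ~18.2 s
# Modest bandwidth, low latency (cable-like):
print(f"{load_time_s(**page, bandwidth_mbps=20, latency_ms=20):.1f} s")    # ~1.4 s
```

Even with five times the bandwidth, the high-latency connection loads the page an order of magnitude more slowly.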
Conclusion
Understanding the different parameters that affect your content delivery is crucial to providing the best possible experience to your end-users. Choosing the right web performance strategy and solutions is your next step.
With over 17 years of experience, GlobalDots has unparalleled knowledge of today's leading web technologies. Our team knows exactly what a business needs to do to succeed in providing the best online presence for its customers. We can analyze your needs and challenges to provide you with a bespoke recommendation about which services you can benefit from.
Contact us to take this step with true experts beside you. Together, we’ll wow both your end-users and your competitors.