Surfing the web these days is all about speed.
Contemporary online experiences are built around rich media, and multimedia content is being uploaded at higher quality, and in greater volume, every day.
- Two years ago, YouTube announced that users were uploading 35 hours’ worth of video every minute. If that growth has continued, the figure is probably three or four times higher today. And this doesn’t include all of the other content being uploaded to Facebook, Vimeo, and countless other social media outlets.
- In addition to this, file sharing, online storage, syncing and online backup have also grown in popularity.
- Live streaming is also extremely popular, but it requires a clean, consistent connection to deliver a high-quality experience.
- Now, everything is about mobile. We want to access the web from smartphones, tablets and laptops, without being tethered to the wall by a big ugly cable. We expect connectivity at home, on the road, or anywhere else we go.
Top-quality bandwidth is no longer a luxury. If you want to enjoy everything that the Internet has to offer, you need to make sure your connection is as good as it can possibly be.
It’s now common to see people with 50 Mbps or 75 Mbps connections in their homes, and even mobile wireless connections commonly reach speeds of 25 Mbps.
But these numbers are only part of the story. When looking at broadband speeds, you also need to consider the latency of your connection.
So what is latency and how does it work?
When you transmit a piece of data over the Internet, it must bounce around and pass through many different computers and networks before arriving at its final destination. Each of these hops takes time and adds “lag” to the transmission.
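If you’re curious how much lag your own connection carries, one rough way to see it is to time how long a TCP handshake to a server takes. Here is a minimal Python sketch of that idea; example.com is just a placeholder for whatever server you care about, and the handshake time is only an approximation of the true round-trip latency, not a full traceroute.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Time a TCP handshake to host:port and return it in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the connection opened; all we care about is how long that took
    return (time.perf_counter() - start) * 1000

# example.com is a placeholder -- try the servers you actually use.
print(f"Handshake with example.com: {tcp_rtt_ms('example.com'):.1f} ms")
```

Tools like ping and traceroute give a fuller picture, including every hop the packets pass through along the way.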
In addition to this, things can sometimes go wrong. Due to electromagnetic interference and a number of other problems, packets can become corrupted or get dropped in transit. When this happens, the data has to be retransmitted.
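Protocols like TCP take care of this for you, using checksums to detect damage and retransmitting anything that doesn’t arrive intact. As a toy illustration of the idea (not how any particular network stack implements it), here is a small Python simulation that keeps re-requesting a chunk of data until its checksum matches:

```python
import hashlib
import random

GOOD_DATA = b"the payload we actually wanted"
GOOD_HASH = hashlib.sha256(GOOD_DATA).hexdigest()

def unreliable_fetch() -> bytes:
    """Stand-in for a network read that occasionally returns corrupted bytes."""
    if random.random() < 0.3:  # simulate corruption in transit about 30% of the time
        return b"the payload we actually w@nted"
    return GOOD_DATA

def receive_with_retries(max_attempts: int = 5) -> bytes:
    """Re-request the data until its checksum matches, the way corrupted
    packets have to be retransmitted."""
    for attempt in range(1, max_attempts + 1):
        data = unreliable_fetch()
        if hashlib.sha256(data).hexdigest() == GOOD_HASH:
            return data
        print(f"Attempt {attempt}: checksum mismatch, asking for the data again")
    raise IOError("data still corrupted after all retries")

print(receive_with_retries())
```

Every retry means another round trip, which is one more way a noisy line eats into your effective speed.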
The quality of the line itself also has an impact. If the signal has to travel a long distance – or through an area with heavy electrical interference – before reaching a connection point, there is a greater chance of it becoming corrupted along the way.
Finally, you must consider congestion and bottlenecks. If you join two wide pipes together with a connection that only has a half-inch opening, that narrow joint limits how much liquid can flow through the pipes. Likewise, data can only flow from point A to point B at the speed of the slowest connection between those points… regardless of the connection speed on either end.
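To put the pipe analogy in numbers, here is a toy calculation with made-up link speeds. Whatever the exact figures are on a real path, the end-to-end rate is capped by the slowest hop:

```python
# Hypothetical link speeds (in Mbps) for each hop between point A and point B.
link_speeds_mbps = [1000, 300, 25, 150]  # the 25 Mbps hop is the bottleneck

# End-to-end throughput can never exceed the slowest link on the path,
# no matter how fast the connections on either end are.
effective_throughput = min(link_speeds_mbps)
print(f"Effective throughput: {effective_throughput} Mbps")
```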
(There are other low-level physical effects that can influence your overall data transfer speeds, but I won’t go into detail about those today.)
These delays and risks compound as the distance and the number of connection points increase. This is why it helps to limit your latency by minimizing the distance travelled by each data packet.
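As a back-of-the-envelope illustration of why distance and hop count matter: light in fibre covers roughly 200 km per millisecond, and every router along the way adds a little processing and queuing delay. The numbers below (including the 1 ms per-hop figure) are assumptions for illustration, not measurements:

```python
FIBRE_KM_PER_MS = 200.0   # light in fibre covers roughly 200 km per millisecond
PER_HOP_DELAY_MS = 1.0    # assumed processing/queuing delay per router (illustrative)

def estimated_rtt_ms(distance_km: float, hops: int) -> float:
    """Very rough round-trip estimate: propagation there and back, plus per-hop delays."""
    propagation = 2 * distance_km / FIBRE_KM_PER_MS
    return propagation + hops * PER_HOP_DELAY_MS

# A nearby server versus one on the other side of the world (distances are illustrative).
print(f"Local server   (~500 km,    8 hops): {estimated_rtt_ms(500, 8):.0f} ms")
print(f"Distant server (~12,000 km, 20 hops): {estimated_rtt_ms(12000, 20):.0f} ms")
```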
- If you’re an online business, you can limit latency for your customers by using a Content Delivery Network (CDN), which creates geographically local caches of your content to ensure the fastest possible delivery times. Fast load times also help improve your search engine rankings.
- If you’re an individual looking to optimize your bandwidth usage, try to work with online services that have geographically local servers. If you download a large file from a server inside your own country, the data will travel a shorter distance, take fewer hops along the way, and be less exposed to damaging electromagnetic interference (see the sketch below for a simple way of picking the closest server).
This will ensure the greatest overall data transfer rate, and maximize the value you get from your fast broadband connection.
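Putting it all together, here is a small sketch that reuses the handshake-timing idea from earlier to pick whichever mirror of a download answers fastest. The hostnames are placeholders; substitute the actual servers your service offers:

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Time a TCP handshake to host:port, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000

# Placeholder mirrors of the same download -- substitute real hostnames.
mirrors = ["ca.mirror.example.com", "us.mirror.example.com", "eu.mirror.example.com"]

# Pick whichever mirror answers fastest; the geographically closest one usually wins.
fastest = min(mirrors, key=tcp_rtt_ms)
print(f"Downloading from {fastest}")
```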
About The Author: SaaSCanada.ca helps Canadians select online backup service providers with latency-optimized services.