So your fast internet is running super slow, dragging down your productivity and performance.
In this article we try to explain in more detail what slows down your internet speeds, and define what makes some internet connections the "fastest."
It turns out this is a surprisingly complex topic. First, it's important to know what "fast" means when talking about internet speed. It's a common misconception that an internet connection is fast if the advertised upload and download numbers are big and slow if they are small. However, that's only part of the story.
There are actually two different measurements that must be taken into account. But first, let’s get into the basics for a moment.
How does fast internet actually work?
When you open a website or check your email, your computer sends "packets" of information toward the destination. Your router receives each packet and forwards it to the next router, the first of those being your ISP's.

The same thing happens at every stage: each router re-transmits the packet to the next "hop" until it reaches the computer at the far end. In protocols like TCP, that receiving computer then sends a small packet back saying "I got it," an acknowledgment.

The reason for this send-and-acknowledge method is that sometimes a packet gets "dropped" or lost somewhere along the way. Maybe the transmission wasn't clear and the packet never arrives. If no acknowledgment comes back within a short window, your computer assumes the packet didn't make it and re-transmits until it does get that return signal.

This continues hop by hop until the packet finally gets where it's going. The process then runs in reverse: the computer at the other end receives your packet and replies with the requested information, which travels back across the same kinds of hops.
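The send-and-wait-for-acknowledgment loop described above can be sketched in a few lines of Python. This is a toy model, not real networking code: the `lossy_channel` function is an invention of this sketch that simulates a link which sometimes drops packets.

```python
import random

def lossy_channel(packet, drop_rate=0.3):
    """Simulates one round trip: returns an acknowledgment, or None if lost."""
    if random.random() < drop_rate:
        return None                       # packet (or its ack) was dropped
    return ("ACK", packet["seq"])

def send_with_retransmit(packet, max_tries=50):
    """Keep re-sending the packet until an acknowledgment comes back."""
    for attempt in range(1, max_tries + 1):
        ack = lossy_channel(packet)
        if ack is not None:
            return attempt                # how many transmissions it took
    raise RuntimeError("link appears to be down")

random.seed(42)
tries = [send_with_retransmit({"seq": i, "data": b"hello"}) for i in range(5)]
print(tries)  # each entry: how many attempts that packet needed
```

Notice that a lossier link doesn't just lose data; it costs extra round trips, which is one way loss turns into latency.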
The Fastest Internet Speeds Have Reduced Latency
So now that you understand the oversimplified version of how the "fastest" internet works, let's get into the speed vs. latency part of it.
Speed is the number you normally see advertised by most providers: that 100 down / 20 up or 300 down / 100 up, measured in megabits per second (Mbps). It is a measurement of how much data can be moving at the same time.

In a connection that supports 10 Mbps down, every second that passes, the ISP is capable of delivering up to 10 megabits of data. As a simple analogy, think of a tall skyscraper with an elevator. The packets are people trying to get in and out of the building. In this example, the speed/capacity is how many people the elevator can fit in one trip. If you have a huge cargo elevator that can hold 10,000 pounds, you could easily imagine fitting 20 people or more.
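The capacity arithmetic behind those advertised numbers is just division: bits to move, divided by bits per second. A quick sketch (the file size and connection speeds are illustrative):

```python
def transfer_seconds(size_megabytes, speed_mbps):
    """Time to move a file: size in bits divided by link capacity in bits/s."""
    bits = size_megabytes * 8 * 1_000_000       # MB -> megabits -> bits
    return bits / (speed_mbps * 1_000_000)

# A 500 MB file over a few common connection speeds:
for mbps in (10, 100, 300):
    print(f"{mbps:>4} Mbps: {transfer_seconds(500, mbps):6.1f} s")
```

A 500 MB download takes 400 seconds at 10 Mbps but only about 13 seconds at 300 Mbps, which is why raw speed matters for big files.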
So now we've got you thinking. You're probably saying, "Great! All I need to do is build a big enough elevator and I can move 5,000 people in and out of the building in one shot."
Not so Fast! There are Limitations to Your Fast Internet
Here is the second and most important aspect of what makes an internet connection feel fast: latency.
This number, harder for most consumers to understand and therefore less often talked about, is the time it takes for a packet to leave your computer, go through all the "hops" across the internet, and come back again.
The lower the latency (the time it takes to complete the journey) the faster the internet will feel.
Let's go back to our analogy of the tall skyscraper and elevators.
Latency is the measurement of how long it takes to push the button to call the elevator, get on the elevator, ride down to the bottom floor so you can leave the building, and then come back again. Having lower latency is akin to installing high-speed elevators that can zip down to the bottom floor and back again in a shorter amount of time.
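Latency is directly measurable: you time the round trip. A minimal sketch in Python that measures round trips over the loopback interface, using a toy echo server as a stand-in for the far end of the connection (real internet round trips are of course far longer than loopback's):

```python
import socket
import statistics
import threading
import time

def echo_server(server_sock):
    """Accept one connection and bounce every packet straight back."""
    conn, _ = server_sock.accept()
    while data := conn.recv(64):
        conn.sendall(data)
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.socket()
client.connect(server.getsockname())
rtts = []
for _ in range(100):
    t0 = time.perf_counter()
    client.sendall(b"ping")           # press the elevator button...
    client.recv(64)                   # ...and wait for it to come back
    rtts.append((time.perf_counter() - t0) * 1000)
print(f"median loopback round trip: {statistics.median(rtts):.3f} ms")
```

This is exactly what tools like `ping` report, except `ping` times the trip to a real remote host instead of your own machine.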
Ok, so what is the difference between speed (capacity) and latency? When do we want to optimize for one or the other?
Speed = capacity. Trying to download or upload a huge file? You are going to want more speed. When transmitting large datasets, speed is king. This is good for video, and more speed is also required as you scale up the number of people using the bandwidth simultaneously: 10 users don't need as much speed as 100 users, who don't need as much as 1,000 users.
Latency = how fast something feels. Are you primarily using "cloud" based applications or voice over IP? Those applications mostly send lots of small packets. This is where lower latency is king. If each packet takes less time to get where it's going and come back again, the internet is going to "feel" faster. You're waiting less. In general, lower latency means a higher-quality internet connection.
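To see why latency dominates the "feel" of chatty applications, consider a toy model in which requests go out one at a time, each costing one round trip plus its transfer time. The request counts, sizes, and link numbers below are illustrative assumptions, not measurements:

```python
def load_time_ms(num_requests, bytes_each, rtt_ms, speed_mbps):
    """Sequential requests: each costs one round trip plus transfer time."""
    transfer_ms = (bytes_each * 8) / (speed_mbps * 1000)   # bits / (bits per ms)
    return num_requests * (rtt_ms + transfer_ms)

# 50 small cloud-app requests of 2 KB each, sent one after another:
fat_pipe_slow_ping = load_time_ms(50, 2000, rtt_ms=80, speed_mbps=300)
thin_pipe_fast_ping = load_time_ms(50, 2000, rtt_ms=10, speed_mbps=20)
print(f"{fat_pipe_slow_ping:.0f} ms vs {thin_pipe_fast_ping:.0f} ms")
```

In this model the 300 Mbps connection with an 80 ms round trip takes roughly 4 seconds, while the 20 Mbps connection with a 10 ms round trip finishes in about half a second: for small packets, the round trip is almost the whole cost.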
What Causes Higher Latency?
Latency is the time it takes for a packet to go from point A to point B. Each time the packet gets re-transmitted to its next "hop," that adds latency. Interestingly, a microwave signal travels through the air at essentially the speed of light, which according to modern physics is a fundamental limit on how fast data can move from one place to another. Light inside glass fiber travels somewhat slower, at roughly two-thirds of that speed, because of the glass's refractive index.

Signals over electrical "metal" wires also propagate at around two-thirds of the speed of light, depending on the cable. The extra latency of copper-based systems mostly comes from the modulation and signal processing the equipment must do at each end, not from the wire itself being slow.
So now we know that latency depends on the medium the signal travels through. What else do we need to take into account to figure out which connection will have the smallest total time and therefore be the "fastest"? We have the distance, and the speed of light itself, to take into account.
In any system, the amount of time it takes a signal to go from point A to point B is the distance divided by the propagation speed. Yes, the speed of light is extremely fast, but on the time scales we are talking about, this time matters a great deal. So by this logic, a signal traveling a shorter distance will have lower latency, your waiting time will be less, and the connection will feel faster.
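That distance-divided-by-speed relationship is easy to put into numbers. A sketch using typical published propagation speeds as fractions of the speed of light (exact velocity factors vary by fiber and cable type, so treat these as rough assumptions):

```python
C = 299_792.458  # speed of light in vacuum, km/s

# Approximate propagation speed for each medium, as a fraction of c:
MEDIA = {"microwave (air)": 1.00, "fiber (glass)": 0.67, "copper coax": 0.66}

def one_way_ms(distance_km, fraction_of_c):
    """One-way propagation delay: distance divided by signal speed."""
    return distance_km / (C * fraction_of_c) * 1000

for medium, v in MEDIA.items():
    print(f"{medium:16} 1000 km one-way: {one_way_ms(1000, v):.2f} ms")
```

Over 1,000 km the microwave signal needs about 3.3 ms one-way while the fiber signal needs about 5 ms, before counting any extra path length or per-hop processing.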
Still With Us? You are Hardcore.
Now to get back to the claim we made at the beginning: "Microwave is the fastest internet available." We know that transmission times vary based on the propagation speed of the medium, the distance the signal has to travel, and the number of times it must be re-transmitted. Ok, great. In a nutshell, microwave wins on both distance and speed: a microwave link shoots in a straight line through the air at essentially the full speed of light, while a fiber route bends around corners along real-world rights-of-way and carries light through glass at roughly two-thirds of that speed. Putting a fiber connection side by side with a microwave link, the microwave, having the shortest possible path between point A and point B, will have the lowest latency and feel the fastest.
Fiber will be a close second, with latency that might be double that of microwave over the same route. A more distant third will be copper-based systems such as cable modems and DSL, which carry noticeably higher latency than light-based systems, mostly because of the signal processing their equipment performs rather than the speed of the wire.
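A back-of-the-envelope sketch makes the fiber-vs-microwave gap concrete. The 1,200 km straight-line route, the 25% extra fiber path length, and the two-thirds-of-c figure for glass are all illustrative assumptions here, not measurements of any real link:

```python
C_KM_PER_MS = 299.792458  # speed of light in vacuum, km per millisecond

def round_trip_ms(path_km, fraction_of_c):
    """Round-trip propagation delay over a path at a given signal speed."""
    return 2 * path_km / (C_KM_PER_MS * fraction_of_c)

# Microwave: straight line at ~c. Fiber: assume the cable snakes along
# rights-of-way and ends up ~25% longer, with light in glass at ~0.67c.
microwave = round_trip_ms(1200, 1.00)
fiber = round_trip_ms(1200 * 1.25, 0.67)
print(f"microwave RTT: {microwave:.1f} ms, fiber RTT: {fiber:.1f} ms")
```

Under these assumptions the fiber round trip comes out close to double the microwave one, which is the kind of gap the article describes.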
The most prominent example of this is the world of "high-frequency trading," where latency is at an absolute premium and milliseconds matter hugely; microwave is what they use.
So there you have it: microwave is the fastest internet available.
Obviously, this is a drastic oversimplification of how it actually works, and there are of course more factors to consider; however, the principle is correct. The fastest internet connection is made possible by microwave.
At ZTelco we are experts in delivering high-speed, high-performance applications to help your business grow. Let us know if you need help getting more performance out of your cloud-based setups or existing infrastructure technology. Visit our business internet page to learn more.