Editor’s Note: This article was last updated on March 27, 2019
Edge computing is somewhat of an enigma. It’s a buzzword that can mean 50 different things depending on who you ask.
“It’s a new kind of data center.”
“It’s a new way to build a network.”
“It’s a method to speed up the cloud.”
The proposed implications of it are just as nebulous.
“It’s going to make my YouTube videos load faster.”
“It’s a new way to improve bandwidth.”
“It helps location-based applications.”
The truth is, it’s all of those things and much, much more.
If you’re technically savvy and would like a full breakdown, there are quite a few 30-page publications like this one that can explain edge computing comprehensively.
For the rest of us, this article is like the CliffsNotes for the technology taking over a multi-billion-dollar industry.
What is Edge Computing In Plain English?
Edge computing is ultimately a method to improve the performance of internet-based electronic devices.
Think phones, computers, sensors, pressure gauges, smart meters, traffic management systems, autonomous vehicle controllers, etc.
Anything that has any connection to the internet whatsoever will be affected by edge computing.
With edge computing, Netflix won’t buffer, websites load faster, and in general, electronic devices do what they’re supposed to do more quickly.
Edge computing provides that boost in one simple way:
It reduces the distance that data has to travel from point A to point B.
For most devices, point A to point B can be far.
20 years ago, your videos were stored on VHS tapes.
Now those videos are stored miles away in a data center.
Our society depends on these data centers for virtually everything.
Data Centers & the Cloud Explained
Data centers are the bedrock of the internet. You may also know them as “the cloud,” which basically just means it’s someone else’s data center.
Through the 90s and early 2000s, many companies began outsourcing all of their computing needs to these “cloud” data centers.
A data center is a giant building stacked full of big computers, called servers, that handle all of the data from users like us.
Facebook needs to store billions of profiles and pictures from all its users somewhere. The best way to do that is to put all of that data in one place. But it’s not just Facebook.
This website? Loaded from a data center. Your Amazon profile? Data center. Your emails? You get the point.
Classically, the data your device receives travels from the data center through network cables until it reaches you.
While networks are getting faster with things like fiber optic cables and faster networking devices, Netflix can only load so fast if the video data from the data center has to travel across 12 state lines to get to your laptop.
That’s where edge computing comes in. With edge computing, data centers (the cloud) can be extended, and those distances are cut down by keeping data close to you.
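To get a rough sense of why distance matters, here’s a back-of-the-envelope sketch in Python. The distances and speeds are illustrative assumptions, not measurements; real-world latency also includes routing, queuing, and processing overhead on top of raw travel time.

```python
# Rough, illustrative estimate of network round-trip propagation delay.
# Treat these numbers as optimistic lower bounds: real latency is higher.

SPEED_IN_FIBER_KM_PER_MS = 200  # light travels roughly 200 km per millisecond in fiber

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time for a signal over the given one-way distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# A distant, centralized data center vs. a nearby edge location:
print(min_round_trip_ms(2000))  # ~20 ms round trip at 2,000 km
print(min_round_trip_ms(20))    # ~0.2 ms round trip at 20 km
```

Even in this best case, moving the data 100x closer cuts the floor on response time by 100x, and that’s before counting the congestion a long-haul route adds.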
Did you know that Microsoft has an underwater data center? Learn more from our blog on Everything you need to know about the Microsoft Underwater Data Center.
How Does Edge Computing Work?
Edge computing essentially works like modern grocery stores.
Bear with me.
Imagine if there were one massive grocery store in the middle of the country, but none anywhere else.
To buy groceries, everybody would have to drive all the way to the middle of the country and back. Not very efficient. But this is exactly how data centers operate. We would call this kind of network “centralized.”
But at least for our food, we don’t have one big “centralized” grocery store. Instead what we do have is grocery stores all across the country that people can drive just down the street to. This is what is known as “distributed.”
Edge Computing Devices
Edge data centers, edge locations, edge devices, etc., or the “grocery stores” in this analogy, can take many forms.
In the industry, many different buzzwords get thrown around in edge computing: micro data centers, prefabricated, modular, converged infrastructure, hyperconverged infrastructure, cloudlets, and the list goes on…
The important thing is simply that they’re all trying to do the same thing: match the growing needs of devices, spread the network out, and do more of the computing locally without relying on an external network.
As one example, if Alexa were retrofitted with an AI chip that allowed it to answer your questions without all the lag time of talking to Amazon’s main data center, then your Alexa could be considered an edge computing device.
That being said, an edge location doesn’t necessarily have to be a different format than a regular data center, though an “edge” location is almost surely going to be smaller by definition.
It could be a small, traditional data center facility managed by a colocation provider. Colocation is like having a commercial lease for a storefront in a mall, where owning the mall is like owning the data center.
An edge enclosure could also be a micro data center in a small storage unit.
Again, they’re still attempting to do the same thing.
Edge Computing Jargon Lesson: Definitions
With that in mind, we’ll break down some of the common jargon for edge computing devices here:
Fog computing: somewhat analogous to edge computing, but data is generally processed closer to the centralized cloud rather than local to the edge devices themselves.
Micro data center: blanket term for a “containerized” or “modular” data center. Essentially a portable rack that contains the same things as a typical data center, condensed into one unit.
Containerized data center: traditionally a literal shipping container that houses a data center. The container provides the backbone for the IT equipment, like the servers.
Modular data center: blanket term; can mean a containerized data center, or a data center made of “prefabricated” components that are assembled where they’re needed. Think Ikea chair.
Prefabricated data center: a data center made up of at least one pre-assembled unit composed of components that are usually installed separately.
Converged infrastructure: combines several IT sections, such as servers and storage, into one system that pools resources together dynamically.
Hyper-converged infrastructure: basically a more virtualized version of converged infrastructure; instead of having a bunch of physical units, one unit is split into several virtual units.
Cloudlet: a smaller, distributed cloud data center.
There are a number of ways to accomplish the same thing. But if all edge computing does is make stuff load faster, then why is there so much hype about it?
Well, it’s a bit more complex than that.
Long Term Significance of Edge Computing
Augmented reality (AR), virtual reality (VR), autonomous vehicles, the Internet of Things (IoT), artificial intelligence (AI), the tactile internet, and countless other technologies are impossible without edge/fog computing.
Take an autonomous car, for example.
It would probably need to know what traffic patterns are like in real time. In fact, your life might even depend on it.
Do you really want to wait for data to travel 1,000 miles before your car can decide whether to steer away from a fresh multi-car accident?
If there’s an edge device processing all of the traffic data from every car within a one-mile radius, it doesn’t have to wait.
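To put hypothetical numbers on the car example, here’s a small calculation of how far a vehicle travels while waiting on a network round trip. The speeds and delay figures below are assumptions chosen for illustration, not measured values from any real system.

```python
# How far does a car travel while waiting for a network round trip?
# The speed and delay values here are illustrative assumptions.

FT_PER_MILE = 5280
MS_PER_HOUR = 3_600_000

def feet_traveled(speed_mph: float, delay_ms: float) -> float:
    """Distance covered (in feet) at a given speed during a given delay."""
    return speed_mph * FT_PER_MILE / MS_PER_HOUR * delay_ms

# At 70 mph: a 100 ms round trip to a distant cloud vs. 5 ms to a nearby edge node.
print(round(feet_traveled(70, 100), 1))  # about 10.3 feet of "blind" travel
print(round(feet_traveled(70, 5), 2))    # about 0.51 feet
```

Those feet add up fast when the car is making dozens of decisions per second, which is why the processing has to happen nearby.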
While countless emerging technologies will likewise find a boon in edge computing, some older practices may find themselves struggling.
Many businesses opt not to outsource all of their computing to the cloud, preferring to keep applications that are bandwidth intensive (need a large volume of data transferred all at once) or latency sensitive (need the data to arrive quickly) in their own local data center.
This is known as a “hybrid cloud.”
The problem is, businesses need more and more out of their local data centers.
The total number of devices is exploding. Practically everything is digital and on the network now.
In a world where you can hack a network through the smart thermostat, there’s no room for error when managing your own data center.
And not only do companies have to account for added vulnerabilities, but they have to handle all of the data from all of these devices.
This device explosion has been dubbed the “Internet of Things” (IoT), which is just to say that every “thing” is on the internet these days.
The simultaneous need for low latency, high bandwidth, as well as sufficient volume and security is where edge computing comes in.
It’s been estimated that 5.6 billion IoT devices will be run through edge computing by 2020.
The Winners from Edge Computing
Big Tech Companies
Many would claim that the cloud giants such as Microsoft, Amazon, etc., would be hurt by this trend.
The reality is, these companies have pivoted in countless ways already, and have visionary leaders. Not only will this trend not hurt them, it will help their companies build even better relationships with consumers and grow more than they have already.
Even in specialized areas where they don’t have competencies, they can always just acquire new companies for things like micro data centers.
By processing things at the edge, companies like Google can leverage the technology they’ve developed in other areas.
For example, their massive machine learning investment, driven by data centers, has allowed them to create specialized chips that process image data locally on their new Pixel phones. The result is arguably one of the best camera phones on the market.
Nvidia and Autonomous Cars
Nvidia is poised to use everything it’s developed in AI hardware/software to dominate the autonomous car space. They’ve already created self-driving car supercomputers that can process close to half a billion commands per second while remaining compact enough to fit in the core of a vehicle.
And while that’s all very exciting for tech stakeholders, those wins should also extend to the consumer.
By processing more data at the edge, less of your private information is sent (theoretically) to Google HQ.
Everything works more quickly.
And beyond that, the technology it facilitates is groundbreaking.
Imagine if car accidents were practically eliminated. All of the time spent stuck in traffic could instead be spent reading or watching shows.
Imagine if an augmented reality eyepiece could transform your walk through the park into an adventure through Wonderland.
Imagine if cell phones could stream live 4K video with no buffering, no matter where you went.
These are all things poised to happen in the near future as a result of edge computing.
Regardless of how edge computing changes the landscape of IT, know that Exit Technologies will work to stay ahead of the curve to help IT companies recoup the maximum value from their assets!
Do you want to know more about the future of Amazon and Cisco? Come read and learn about Amazon Web Services and Cisco Partnership.
Have something to add? Let us know your thoughts in the comments below!