The Internet as we have all known it mirrors the design of old mainframes with dumb terminals: The data path is almost entirely geared toward data coming down the network from a central location. It doesn’t matter if it’s your iPhone or a green text terminal, the fast pipe has always been down, with relatively little data sent up.
The arrival of IoT threatens to turn that on its head. IoT will mean a massive flood of endpoint devices that are not consumers of data, but producers of it, data that must be processed and acted upon. That means sending lots of data back up a narrow pipe to data centers.
For example, an autonomous car may generate 4TB of data per day, mostly from its sensors, but 96% of that data is what is called true but irrelevant, according to Martin Olsen, vice president of global edge and integrated solutions at Vertiv, a data center and cloud computing solutions provider. “It’s that last 4% that’s not irrelevant that is the relevant piece. That’s the data we want to take somewhere else,” he said.
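A quick back-of-the-envelope check puts those figures in perspective. The sketch below simply multiplies out the numbers Olsen cites; the 4TB/day and 4% values are from the example above, not a vendor specification.

```python
# Rough arithmetic on Olsen's example: a car generating 4 TB/day,
# of which only ~4% is relevant enough to forward beyond the edge.
daily_tb = 4.0            # data generated per car per day, in TB
relevant_fraction = 0.04  # the ~4% worth sending upstream

relevant_gb = daily_tb * relevant_fraction * 1000  # TB -> GB
print(relevant_gb)  # 160.0 -> about 160 GB/day leaves the edge instead of 4 TB
```

Even in this simplified view, edge filtering cuts the upstream load per vehicle by a factor of 25.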
So does this mean a massive investment in rearchitecting your network for fatter pipes into the data center? Or can the advent of edge computing take the load off central data centers by doing much of the processing work at the edge of the network?
What is edge computing?
Edge computing is decentralized data processing specifically designed to handle data generated by the Internet of Things. In many cases, the compute equipment is stored in a physical container or module about the size of a cargo shipping container, and it sits at the base of a cell tower, because that’s where the data is coming from.
Edge computing has mostly been used to ingest, process, store and send data to cloud systems. The edge is where the wheat is separated from the chaff, and only the relevant data is sent up the network.
If the 4% Olsen talks about can be processed at the edge of the network rather than in a central data center, it reduces bandwidth needs and allows for faster response than sending it up to the central server for processing. All of the major cloud providers – like AWS, Azure or Google Compute Engine – offer IoT services and process what is sent to them.
In many cases, the edge can perform that processing and discard the unneeded data. Since cloud providers charge by how much data they process, it is in the customer’s financial interest to reduce the amount sent up for processing.
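The filtering described above can be sketched in a few lines. This is a hypothetical illustration, not any provider's API: the `is_relevant` test and the reading format are invented for the example, and a real deployment would use domain-specific logic (anomaly detection, deduplication, and so on) to decide what counts as "true but irrelevant."

```python
# Hypothetical edge-side filter: keep only readings that deviate from a
# known baseline, and upload just those to the cloud. All names here
# (is_relevant, filter_at_edge, the reading fields) are illustrative.

def is_relevant(reading: dict, threshold: float = 0.9) -> bool:
    """A reading is 'relevant' if it deviates meaningfully from baseline."""
    return abs(reading["value"] - reading["baseline"]) > threshold

def filter_at_edge(readings: list) -> list:
    """Discard the 'true but irrelevant' bulk before it leaves the edge."""
    return [r for r in readings if is_relevant(r)]

readings = [
    {"sensor": "lidar", "value": 10.0, "baseline": 10.1},  # near baseline
    {"sensor": "lidar", "value": 14.2, "baseline": 10.1},  # anomalous
    {"sensor": "radar", "value": 3.3,  "baseline": 3.25},  # near baseline
]

to_upload = filter_at_edge(readings)
print(len(to_upload))  # 1 -- only the anomalous reading is sent upstream
```

Because the cloud provider only ever sees `to_upload`, both the bandwidth consumed and the per-byte processing bill shrink with the filtered data.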
“We need much more compute out at the edge of the network. This drives profound change, but interesting in that while we’ll see far more data generated out at the edge, a very limited amount of it needs to travel very far,” said Olsen.
“Edge data centers tend to aggregate data, and perform actuation functions to give an answer in low latency,” said Jim Poole, vice president of business development for Equinix. “What most companies are still doing is aggregating metadata from all these edge locations at a central location to do machine learning and analytics.”
Prashanth Shenoy, Cisco’s vice president of marketing for enterprise networking and IoT, agrees that more computing should be pushed out to the edge.
“Compute has gotten cheaper and faster than the network, which suggests that compute should now be at the edge,” he said. “Also, in cases where bandwidth is at a premium or users are in remote locations, like offshore or a mine, and you don’t have connectivity, you need compute and analytics at the edge.”
Artificial intelligence in edge networks
Another important element to reducing the data load will be the use of artificial intelligence in edge networks, said Jeff Loucks, executive director at the Center for Tech, Media and Telecom at Deloitte.