Verizon’s engineers have successfully tested edge computing on a live 5G network in Houston, achieving latency of 15 milliseconds, significantly lower than they could have achieved otherwise, and validating the power of edge compute capabilities within the network.
The feat was achieved using multi-access edge computing (MEC) equipment and MEC platform software installed in a network facility close to the network edge, shortening the distance information needs to travel between a wireless device and the compute infrastructure.
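To see why moving compute closer matters, a back-of-the-envelope calculation of round-trip propagation delay is useful. The distances and the roughly 200 km-per-millisecond speed of light in fiber below are illustrative assumptions, not Verizon figures:

```python
# Back-of-the-envelope round-trip propagation delay for different
# compute placements. Distances and the fiber speed are illustrative
# assumptions, not figures from Verizon's Houston test.

SPEED_IN_FIBER_KM_PER_MS = 200  # light in fiber covers ~200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds (2x one-way)."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

for label, km in [("edge facility", 30),
                  ("regional data center", 500),
                  ("distant centralized data center", 2000)]:
    print(f"{label:32s} {round_trip_ms(km):5.1f} ms propagation alone")
```

Propagation is only one component of latency, but it sets a hard floor that no amount of faster hardware can remove, which is the core argument for edge placement.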
Of course, 5G is going to bring great advantages in latency—it’s built into the New Radio (NR) spec, after all. But with an edge architecture, Verizon will be able to create opportunities in the market it otherwise would not have, according to Adam Koeppe, senior vice president of network planning at Verizon, who chronicled the Houston experiment in a blog post last week.
How low the latency needs to be depends partly on the use case. In the Houston scenario, the engineers used an artificial intelligence (AI)-enabled facial recognition application to identify people, and they were able to identify individuals twice as fast as when they repeated the experiment using a centralized data center.
For virtual reality (VR), frame assembly has to occur within 20 milliseconds or less; otherwise, the lag makes the user nauseated and makes for an unpleasant experience. Verizon’s Intelligent Edge Network, its moniker for the technology, allows it to put the software that does the VR processing as close to the customer as it needs to be in order to meet that 20-milliseconds-or-less criterion.
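The 20-millisecond ceiling can be thought of as a budget that network transit, video encoding, and display output all draw from, with whatever remains left over for rendering. The 20 ms figure comes from the article; the component numbers below are illustrative assumptions:

```python
# Sketch of a motion-to-photon latency budget for cloud-rendered VR.
# The 20 ms ceiling is from the article; the individual component
# costs below are illustrative assumptions, not measured values.

VR_BUDGET_MS = 20.0

def remaining_render_budget(network_rtt_ms: float,
                            encode_decode_ms: float,
                            display_scanout_ms: float) -> float:
    """Milliseconds left for frame assembly after the fixed costs."""
    return VR_BUDGET_MS - network_rtt_ms - encode_decode_ms - display_scanout_ms

# A nearby edge site leaves room to render; a distant core site does not.
edge = remaining_render_budget(network_rtt_ms=5, encode_decode_ms=4, display_scanout_ms=3)
core = remaining_render_budget(network_rtt_ms=18, encode_decode_ms=4, display_scanout_ms=3)
print(f"render budget via edge: {edge:.0f} ms")  # positive: feasible
print(f"render budget via core: {core:.0f} ms")  # negative: budget blown
```

The arithmetic makes the trade-off concrete: once the network round trip alone approaches 20 ms, there is no time left to assemble the frame at all.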
By way of example, Koeppe points to how facial recognition might be used in a public safety situation. In the case of an Amber Alert, authorities would be given a photo of the missing child, and that photo would be synced with all the cameras installed around a town or a structure such as a shopping mall. The images the cameras capture then need to be compared against the photo of the child, and all of this has to happen very fast.
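The matching loop in that scenario can be sketched in a few lines: each camera frame is reduced to a face embedding and compared against the embedding of the child's photo. Everything here, including the function names, camera IDs, and the cosine-similarity threshold, is a hypothetical illustration, not a description of Verizon's actual system:

```python
# Hypothetical sketch of the Amber Alert matching loop: compare face
# embeddings from camera frames against the target photo's embedding.
# Names, camera IDs, and the 0.9 threshold are illustrative assumptions.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.9  # assumed cutoff for declaring a match

def scan_frames(target_embedding, camera_frames):
    """Yield (camera_id, similarity) for frames that match the target."""
    for camera_id, embedding in camera_frames:
        score = cosine_similarity(target_embedding, embedding)
        if score >= MATCH_THRESHOLD:
            yield camera_id, score

target = [0.9, 0.1, 0.4]                        # embedding of the child's photo
frames = [("mall-cam-3", [0.88, 0.12, 0.41]),   # near-identical face
          ("mall-cam-7", [0.10, 0.90, 0.20])]   # different face
print(list(scan_frames(target, frames)))
```

Running this comparison at the edge, near the cameras, is what keeps the loop fast enough to matter in a live search.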
With edge compute capabilities, Verizon is looking at emerging use cases in both the consumer and business landscapes. One might be facial recognition for public safety; another might be VR for the consumer, Koeppe suggested.
Verizon built its cloud platform a few years ago, and it comprises core and edge elements, so the company doesn’t have to go out and buy a bunch of data centers. If Verizon were building a network from scratch, it would have to install all new hardware and more, but “we already have our Verizon cloud platform for network functions deployed all over the globe,” and it houses the virtual network functions Verizon uses to run its network, he said. “Those locations are already there,” and the infrastructure is already in place to support whatever software Verizon decides to deploy.
Koeppe described the test in Houston as kind of a rudimentary network slice because it was preprogrammed, but it is certainly related to the kind of network slicing the industry is talking about for 5G.
In the future, with network slicing and edge compute, when Verizon has to activate a scenario in which it must feed a lot of facial recognition information to public safety, for example, it can automatically allocate the resources needed for that particular situation. When the event is over, it can essentially tear that slice down and return the network resources to normal, he said.
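The allocate-then-tear-down lifecycle Koeppe describes can be modeled as a simple resource pool: capacity is carved out for an event-driven slice and returned when the event ends. This toy class and its resource units are illustrative assumptions; real 5G slicing is orchestrated inside the network, not by application code:

```python
# Toy model of the slice lifecycle described above: allocate dedicated
# capacity for an event, then tear the slice down and return the
# resources. The class and unit model are illustrative assumptions.

class NetworkSlicePool:
    def __init__(self, total_capacity_units: int):
        self.free = total_capacity_units  # capacity for ordinary traffic
        self.slices = {}                  # active slice name -> units held

    def allocate(self, name: str, units: int) -> bool:
        """Carve out dedicated capacity for an event-driven slice."""
        if units > self.free:
            return False  # not enough spare capacity for this slice
        self.free -= units
        self.slices[name] = units
        return True

    def teardown(self, name: str) -> None:
        """Return the slice's capacity to the general pool."""
        self.free += self.slices.pop(name)

pool = NetworkSlicePool(total_capacity_units=100)
pool.allocate("amber-alert-facial-recognition", units=40)  # event begins
print(pool.free)  # 60 units remain for ordinary traffic
pool.teardown("amber-alert-facial-recognition")            # event ends
print(pool.free)  # back to 100
```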