Why the Internet of Things and artificial intelligence will reinvent network connectivity

Next-generation wireless networks will deliver faster speeds, broader coverage, and less lag. These features promise to breathe new life into existing applications and spawn entirely new businesses and services with tremendous implications for jobs and the economy.

“The industry is poised to deliver high-capacity, high-throughput, low-latency networks,” says Trey Hanbury, a partner at Hogan Lovells in the Washington, D.C. office. “Each of these features — capacity, throughput, and latency — is important because different applications need different functions from a broadband network.”

“Operators can also divide — or ‘slice’ — the next-generation 5G networks to customize performance for a variety of services,” explains Hanbury. “For example, autonomous vehicles and remote sensors need different sets of applications. You want an autonomous vehicle to be self-aware and have a lot of intelligence about its surroundings. But it’s also important for that vehicle to communicate back to a central processor to update it with the latest information about how to respond to a changing environment. The 5G air interface can support both of these functions and many more.”

Other communications infrastructure is advancing too. Satellite innovations, such as smallsats, are making connectivity in remote areas more affordable while opening lower-cost ways to provide high-performance remote sensing, weather observation, and global tracking. In this hoganlovells.com interview, Hanbury explores the future of terrestrial and satellite networks and the capabilities that will be required for IoT to continue its forward momentum.

What sort of infrastructure is necessary for AI’s future needs? For example, is there a network that can support fully autonomous vehicles?

Trey Hanbury: The demands of the automotive environment require lightning-fast responses of less than one millisecond. Much of the necessary intelligence for autonomous vehicles will be placed onboard the vehicle, but some network interaction will need to occur and, over time, even certain types of mission-critical functions may have to interact with the core network. 

Today’s 4G networks can’t supply the sub-millisecond latency, or the degree of reliability, that vehicle makers would want to offer to their customers. Tomorrow’s 5G networks will support that kind of speedy response, and it’s that type of low-latency environment that will be needed for autonomous vehicles of all sorts, not just cars, but drones and other types of transportation networks, too.
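A rough calculation shows why a sub-millisecond budget forces intelligence toward the edge of the network. The fiber propagation factor below is a standard rule of thumb, not a figure from the interview:

```python
# Rough bound on how far a signal can travel within a 1 ms latency budget.
# Illustrative figures; real budgets also include processing and queuing delays.

C_VACUUM_KM_PER_MS = 299_792.458 / 1000   # speed of light: ~300 km per ms
FIBER_FACTOR = 2 / 3                      # light in fiber travels at roughly 2/3 of c

def max_one_way_km(budget_ms: float, round_trip: bool = True) -> float:
    """Max distance to a server if the entire budget were propagation delay."""
    hops = 2 if round_trip else 1
    return budget_ms * C_VACUUM_KM_PER_MS * FIBER_FACTOR / hops

# With a 1 ms round-trip budget, the server can be at most ~100 km away,
# before counting any processing time at all.
print(round(max_one_way_km(1.0)))  # 100
```

Even under these generous assumptions, a 1 ms round trip caps the server at roughly 100 km, which is why mission-critical logic has to sit onboard the vehicle or at nearby edge sites.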

Another feature of this new 5G New Radio architecture is the ability to change configurations and support numerous devices with limited data throughput. 5G networks will be able to support as many as 100,000 connections per square kilometer. This increase in density will enable an expansion of IoT to include access points in vehicles, homes, factories, and drones — throughout cities, on farms, and elsewhere. These billions of devices spread through the economy will generate massive data traffic, as will data-intensive applications such as connected cars — all of which 5G cellular networks are being designed to accommodate. The new network infrastructure will be so affordable and pervasive that we anticipate companies will embed sensors in asphalt or sow them in fields and have them report back local environmental or infrastructure conditions. Robust network coverage and carefully designed network infrastructure mean those sensors can last ten years or more.
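To put the density target in perspective, a quick illustrative calculation; the city area is a hypothetical example, not a figure from the interview:

```python
# Illustrative: what the quoted 5G density target implies city-wide.
CONNECTIONS_PER_KM2 = 100_000   # density figure quoted for 5G networks

def citywide_devices(area_km2: float) -> int:
    """Total connections a city could host at the 5G density target."""
    return int(area_km2 * CONNECTIONS_PER_KM2)

# A mid-sized city of ~100 square kilometers could, in principle,
# host ten million simultaneous connections.
print(citywide_devices(100))  # 10000000
```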

You’ve talked about another way to slice the network. What other benefits might emerge from that capacity?

Hanbury: Yes. Another primary application would be multichannel video or immersive 4K or 8K video. So if you just bought your 4K television, get ready, because 8K televisions are on their way. That’s at the very limits of what the human eye can perceive, though I’m sure we’ll find some reason to be drawn to 12K or 16K televisions soon enough.

In any case, high-resolution video requires a great deal of bandwidth to transmit. These new 5G networks will be able to support the throughput necessary to deliver an immersive experience, whether it’s 4K or 8K television, or virtual or augmented reality, where you put on glasses to help you puzzle out the function of a complex instrument panel. Augmented reality could put labels on all the buttons and knobs, or guide you through how to press the buttons, turn the right levers, and restore power. This feature would open up numerous commercial and social applications.

How will a 5G network serve as a platform for AI?

Hanbury: Some AI will be incorporated into devices, but much of the intelligence is likely to be remotely located. The end-user product will often be a relatively simple device, with the intelligence that supports it located hundreds of miles away. Establishing high-throughput, low-latency connectivity is an important part of deploying AI because you’ve got to have a way for the brains of the operation, which are probably located in server farms around the country or across the world, to interact with the devices in the field.

If you have a virtual assistant at home, for example, all the intelligence of that product is not located inside it; it’s way off in the distance. It’s got connectivity through your Wi-Fi network, and probably through your wired home broadband provider, to a remote location. So the better we can make that connectivity, the more intelligent, lower-latency, more responsive, more real-time, and more visually engaging our AI can be. The networks are important in the sense that they connect the widgets we’re going to use to interact with the world to the intelligence, which is going to be situated remotely.

These concepts are also being embedded in the network itself so that it can respond in real time to changing conditions. There are, for example, self-healing networks, which can identify problem areas and then re-route communications. So if a sector goes bad, you can configure your system to cover that area without human intervention, at least as an interim measure. Other configurations let the network respond dynamically to the environmental conditions being imposed on it.

If we have a lot of IoT applications, the network is going to push more signaling out into the field and focus less on delivering high-capacity video support. If we’ve got a lot of demand for high-capacity video, it’s going to reconfigure itself to support that type of traffic. But this will happen depending on demand, on almost a user-by-user basis, and that’s more than any human brain could possibly process in real time. So embedding a lot of intelligence into the network is an important component of running this very complex network. It’s very agile, but that agility wouldn’t be possible without intelligence built in.

What role do satellites play in bandwidth delivery?

Hanbury: Satellites are a very important component of connectivity, but the most familiar satellite infrastructure is located in geostationary Earth orbit, about 22,000 miles above the surface of the Earth. The time it takes a signal to travel from Earth to space and back introduces a lag of roughly a quarter of a second, which is not so good if you need to apply the brakes on a car in real time. So putting the satellite closer to the end user is an important component of delivering lower-latency performance.

Even the closest satellite is farther away than the most distant terrestrial base station. But nongeostationary satellites, particularly “small sats,” offer enormous promise for covering areas of low population density, and they are particularly well suited to data-services delivery.
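The distance argument can be checked with simple propagation arithmetic. The altitudes below are typical published figures for each orbit and cell type, not numbers from the interview:

```python
# One-bounce (ground -> satellite -> ground) propagation delay at different
# altitudes, assuming the transmitter is straight overhead. Illustrative only:
# real links are slant paths and add processing delay.

C_KM_PER_S = 299_792.458  # speed of light in vacuum

def bounce_delay_ms(altitude_km: float) -> float:
    """Up-and-down propagation delay in milliseconds."""
    return 2 * altitude_km / C_KM_PER_S * 1000

for name, alt_km in [
    ("GEO satellite", 35_786),          # geostationary altitude
    ("LEO small sat", 550),             # a common low Earth orbit altitude
    ("terrestrial small cell", 0.5),    # a few hundred meters away
]:
    print(f"{name}: {bounce_delay_ms(alt_km):.3f} ms")
```

The GEO case comes out near a quarter of a second, the LEO case under 4 ms, and the small cell in microseconds, which is the latency spread the interview describes.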

Similar principles apply to terrestrial infrastructure. Smaller cells mean a shorter distance for the signal to travel, only a few thousand meters at most. The most important benefit of small cells, though, is the added capacity they offer. Instead of one big cell covering a huge chunk of territory with ten megahertz of spectrum, operators deploy many smaller cells and reuse that same ten megahertz in each one, gaining much more capacity as a result.

One aspect of 5G terrestrial networks is deploying very small cells, maybe on every street corner or above every bus shelter. That will allow us to reuse the same spectrum over and over again, which not only increases capacity, but also helps reduce transit time relative to space-based systems.
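The capacity gain from spectrum reuse is roughly linear in the number of cells. A minimal sketch, where the spectral efficiency and cell count are assumed example values rather than figures from the interview:

```python
# Illustrative spectrum-reuse arithmetic: the same 10 MHz delivers far more
# aggregate capacity when reused across many small cells.

def aggregate_capacity_mbps(bandwidth_mhz: float, bits_per_hz: float, cells: int) -> float:
    """Total capacity when every cell reuses the same band.

    bandwidth_mhz * bits_per_hz gives Mbps per cell (MHz x bits/s/Hz);
    each additional cell reuses the identical spectrum.
    """
    return bandwidth_mhz * bits_per_hz * cells

macro = aggregate_capacity_mbps(10, 3.0, cells=1)     # one big macro cell
small = aggregate_capacity_mbps(10, 3.0, cells=100)   # 100 small cells, same 10 MHz
print(macro, small)  # 30.0 3000.0
```

Under these assumptions, splitting one macro cell into a hundred small cells multiplies aggregate capacity a hundredfold without acquiring any new spectrum, which is the economic logic behind dense small-cell deployment.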

One advantage of satellites is that they can provide broad area coverage at very low cost. Please explain.

Hanbury: There are areas of the country and the world that are very expensive to reach, have poor infrastructure, are environmentally sensitive, have constraints on power and tower availability, or don’t have sufficient numbers of trained technicians to support the infrastructure during an outage. Satellites can solve those types of challenges because we don’t need to have a terrestrial facility in those locations to support communications. We can just take that space-based connection and establish connectivity on the ground. 

A couple of other things are happening in the satellite space. One, we’re seeing fewer geostationary Earth orbit satellites, which are fairly distant from the Earth, and more emphasis on nongeostationary low Earth orbit (LEO) satellites, which orbit just a few hundred miles up. They might have a life of only two years or less, but they’re much closer to the user, which reduces latency — a good thing — and because they’re smaller and simpler, they don’t take nearly as long to assemble or cost nearly as much to get into space as larger systems.

You can also design them to be almost disposable. A geostationary satellite is a 10-year development process, so what you put into space can be dated by the time it actually gets there. Geostationary satellites tend to be big, and there are fuel requirements, launch risks, and so on. Small sats in LEO can be just a few inches long and a few inches wide. They’ve been standardized into what’s called the U format, a little cube measuring 10 x 10 x 10 centimeters, known as a CubeSat. That kind of standardization has brought costs down quite a bit.

Because of the rapid turnaround cycle, you don’t have that 10-year lag; it may be only six months or a year, so you can put much more sophisticated technology onboard these satellites. Instead of the 10 years and US$2 billion it takes to develop and launch a geostationary satellite, these CubeSats can go for US$150,000 each. You need many more of them because they’re closer to the Earth, but the cost of a traditional satellite system will buy you a lot of CubeSats. You can literally just throw them out on your way to a higher Earth orbit, like from a Falcon 9 rocket on the way to the International Space Station.
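The cost figures quoted above make the trade-off concrete. Using only the two numbers from the interview:

```python
# Cost comparison using the figures quoted in the interview.
GEO_PROGRAM_COST_USD = 2_000_000_000   # ~US$2bn to develop and launch one GEO satellite
CUBESAT_COST_USD = 150_000             # quoted per-CubeSat cost

# How many CubeSats one GEO program budget buys (integer division).
cubesats_per_geo_budget = GEO_PROGRAM_COST_USD // CUBESAT_COST_USD
print(cubesats_per_geo_budget)  # 13333
```

One geostationary budget buys more than thirteen thousand CubeSats, far more than even a large LEO constellation needs, which is why near-disposable small sats can be economical despite their short lifespans.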

But a lot of the small sats and different satellite structures are not about providing connectivity for your cell phone. If you’re indoors with a satellite phone, any meaningful obstruction is likely to prevent you from getting a clear line of sight to a nongeostationary satellite. There are a lot of obstacles, and it’s hard to get a signal in an urban canyon. You probably can’t ride in an elevator and maintain your connection with a terrestrial system, so you definitely won’t be able to do that with a satellite.

But there are a lot of things you can do with a satellite. It’s great for maritime coverage and terrific for applications like radio occultation, measuring the time of arrival of a signal from point A to point B. You can beam from one satellite to another and pick up differences in performance caused by things like depleted ozone, higher concentrations of sulfur in the atmosphere, or storm clouds. You can do interesting things with weather prediction and environmental measurement as a result. These features could be important to industries like Earth observation and even shipping, which depends on a precise understanding of environmental conditions; that understanding comes from measuring the time of arrival of signals, how they pass through the atmosphere, and how the atmosphere’s density changes with weather conditions.

Can satellite services compete with terrestrial providers?

Hanbury: It’s hard to compete because you can’t bring satellites much closer; at some point they stop orbiting. There are attempts to get around that, like the High-Altitude Platform Station, or HAPS, which is basically a plane that flies at 60,000 feet in the stratosphere. Google has Loon, a fleet of high-altitude balloons that beam connectivity down. But all of these suffer from the same problems: they’re costly to maintain, they still have a long route to the end user, and you’ve got to have scale at the retail level. Someone has to actually buy the handset or product that connects to these links. And terrestrial operators have a huge leg up in that they can deploy where and when they need it.

Satellites generally can’t offer the capacity you need in urban areas; and where coverage would be easy, there are not enough users to justify a standalone business. So a lot of the satellite and alternative infrastructure has been finding the applications it is especially good at delivering: weather measurements, shipping, tracking, or one-to-many communications. These are not necessarily 5G applications, but it’s about using the comparative advantages of these different types of infrastructure for different types of services.

What is the U.S. Federal Communications Commission’s (FCC) role in this?

Hanbury: The FCC has an important role to play because this all takes place using radio frequencies that are in very short supply. There are bands where, for instance, systems can look down at different frequencies and pick up mineral content in the soil or identify oil deposits. If we allow certain frequencies to be used for, say, broadband connectivity, the window may start to close on our ability to see into the soil, or under the sea, or where uranium deposits are, depending on the frequencies involved.

The stewardship of those frequencies is important because you want to create opportunities for that type of subsurface radar and for broadband connectivity, as well as for applications like public safety and GPS, all of which are vying for the same real estate. Anything with an electrical current, even light bulbs and computer monitors, emits radiofrequency radiation, so you have to manage the noise that comes out of consumer electronics. The bands used by the Federal Aviation Administration (FAA) or for space research have to be pretty quiet; if there’s a lot of noise in a band, you can’t really listen to what’s coming in from outer space.

So there are trade-offs among these radio frequencies, and that’s where the FCC and the National Telecommunications and Information Administration (NTIA) come into play. It might be very desirable to push all of that spectrum toward terrestrial infrastructure, but then we lose the opportunity to have redundant infrastructure for connectivity and to do Earth imaging and other services that add value to different industries.

About Trey Hanbury:

With more than 20 years in the field, Trey Hanbury has a wealth of experience helping clients tackle their most challenging technology, media, and telecommunications policy issues. He recognizes that companies operating in the TMT sector face increasing competition and an ever-changing regulatory environment. Whether his clients require assistance with spectrum auctions, licensing, and allocation; mergers and acquisitions; regulatory compliance; procurement; or competition policy, Hanbury brings his extensive background and a deep understanding of technology policy to help solve their most pressing and complex issues.
