5G: Can You Hear Me Now?

The first 5G silicon has been announced. 5G network field trials are underway. The next generation of wireless everything is just about ready. Maybe.

If you listen to other voices, standards groups are still wrestling with important technical issues. Big questions—small cells or large, smart base stations or streamlined ones, centralized computing or distributed—still hang over network development. And the first “real” 5G might not go live until 2021. So where are we really?

The question is clouded by the existence of several legitimate, but quite different, visions of what 5G really means (Figure 1). For most cellular service subscribers, the term is obvious: just like 3G or 4G, 5G will be an incremental improvement in service. The handset will deliver more consistent service and a more compelling media experience—like 4K UHD video—with imperceptible delays. But ask a network operator, an automotive engineer, or a power engineer, and you may hear very different answers.

Figure 1. 5G is likely to be deployed in a series of nearly discrete phases over time.

The Other 5Gs

Better mobile broadband is an easy concept to convey. Not so for some of the other applications that are betting on 5G, many of which wouldn’t appear to the non-technical user to have anything at all to do with smartphones.

One such application is fixed broadband: establishing a broadband, wireless link between a base station and a set of non-moving remote terminals. This might seem rather pointless, as the function is already performed by cable connections and telco digital subscriber lines. But there are two reasons why fixed wireless is important. The first and more obvious is the substantial pool of potential subscribers—many in developing world cities or sparse rural areas—who still have no wireline connection capable of supporting broadband. Fixed wireless can extract fees from them even when running cable or fiber to them would be impractical.

The second reason is that fixed wireless gives wireless carriers a way to compete directly against cable companies and telcos for the huge access business of existing customers. It is no coincidence that one of the first fixed-wireless 5G trials will be held by US cellular heavyweight Verizon Wireless.

So if this is 5G, why not just let customers use their smart phones as local WiFi hubs? Because the base station—or the central data center—knows where each customer is, and knows they will stay in place, many of the complexities of mobile service go away. You don’t need beam tracking, or on-the-fly cell handoffs, or continuous negotiation of transmitter power. Not that the channel is static: changing weather, passing trucks, even waving trees can alter the channel characteristics. But it is easier. And compared to packing all the computing necessary for 5G into a handset, a box with an AC supply is a far easier power management challenge. “Fixed wireless is a strong candidate for early deployment,” says Intel® general manager for 5G business and technology Robert Topol.

Easier, however, does not mean trivial. Even with its simplifications, fixed broadband will be a great platform for field testing the new technologies that define 5G. These include new carrier frequencies; carrier aggregation; new radio transceivers; new channel coding, subcarrier spacing, and forward error correction; massive multiple-input multiple-output (MIMO) antennas; and low-latency frames (Figure 2).

Figure 2. Several significant technical challenges together define the 5G development program.

None of these is trivial, although potential solutions to each of them are ready for trials. Take frequency bands, for example. 5G can use both bands below 6 GHz and bands at 28 GHz. These carrier frequencies are far higher than all but a few of the 4G LTE frequencies, and have the potential to support many more channels. But as frequency increases, propagation gets worse. By 28 GHz, connections are almost limited to line-of-sight by the beam’s inability to penetrate solids. And free-air attenuation can limit line-of-sight range significantly. So transceivers will have to rely on beamforming to pick out a specific path from antenna to antenna, and cells will have to be closer together.
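The falloff with frequency is easy to quantify with the Friis free-space formula. A quick sketch, using illustrative numbers rather than figures from any trial, compares a 200-meter line-of-sight hop at a typical LTE carrier with the same hop at 28 GHz:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss (Friis), in dB."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# A 200-meter line-of-sight hop at a typical LTE carrier vs. 28 GHz.
for f_ghz in (1.9, 28.0):
    print(f"{f_ghz:4.1f} GHz: {fspl_db(200, f_ghz * 1e9):5.1f} dB")
```

The 28 GHz link gives up roughly 23 dB before weather, foliage, or blockage even enter the picture, which is exactly why the beamforming gain and the denser cells are not optional.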

Instead of dedicating wide channels within each band to handle the maximum data rate for a client, 5G uses carrier aggregation. It can scatter data packets across several different channels, maybe in different bands with different propagation characteristics and antennas, in order to get all the needed bandwidth. So a 5G hub downloading at 10 Gbps is actually more like a cluster of multi-antenna radios running in parallel, at different frequencies.
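In capacity terms, aggregation is just addition: each component carrier contributes whatever its bandwidth and link quality allow. A minimal sketch, using the Shannon bound and invented carrier parameters:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon bound C = B * log2(1 + SNR): an upper limit, not a promised rate."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# Hypothetical component carriers: (bandwidth in Hz, link SNR in dB).
carriers = [
    (100e6, 20),  # 100 MHz sub-6 GHz carrier, strong signal
    (400e6, 12),  # 400 MHz mmWave carrier, weaker link
    (400e6, 15),  # second mmWave carrier on another antenna
]
aggregate = sum(shannon_capacity_bps(bw, snr) for bw, snr in carriers)
print(f"aggregate bound: {aggregate / 1e9:.2f} Gbps")
```

No single carrier in the list could carry the load alone; the multi-gigabit figure only emerges from the sum, which is the whole point of scattering the data across bands.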

MIMO antennas and their RF front ends will be a key part of this picture. Many designs for trials are using 4-by-2-element antenna arrays on the client side. For base stations the antennas will probably be 64- or 128-element arrays. The mathematics of MIMO at this scale are well understood, as are the signal-processing implementations—given adequate power. Of course there is always the possibility of breakthrough changes in algorithms to upend the well-understood.

Similarly, 5G channel coding and error correction are well-understood algorithms. And the standards proposals are solidifying. Topol says the 5G air interface Release 15, expected to be sufficiently stable to commit to silicon, appears to be on track for late 2018.

Center or Edge?

Given the steady progress in air interface, transceiver, and baseband technology, it is surprising that one major issue still seems to be a point of contention: the network’s computing architecture. Some parties advocate nearly complete centralization, with waveforms from base-station transceivers digitized and sent all the way back to metro data centers for baseband processing and switching. This centralized radio access network (CRAN) architecture virtualizes baseband processing and backhaul: it replaces specialized hardware in the base stations with software in the service provider’s data center.

At the other extreme, some parties advocate pushing as much processing as possible—not just baseband, but packet processing and some switching and management—as far as possible toward the network edge. These advocates are often thinking about the expected proliferation of small cells in 5G networks, and about the latency-constrained needs of Internet of Things (IoT) and machine-to-machine connections.

The discussion may end on middle ground. “There is no question that it makes sense to virtualize the Evolved Packet Core (EPC),” observes Intel Programmable Solutions Group director of access and wireless technology Mike Fitton. After all, every packet must traverse the EPC and receive its attention. And setting aside some conjectured services like machine-learning-based deep packet inspection, adequate packet speed is achievable in existing data centers in software.

But not every 5G task is so accommodating. “It is hard to do all the Layer 1 processing in software,” Fitton cautions, “especially as you start asking for shorter latencies. You may need hardware acceleration on the servers. And some low-latency connections may need processing or caching at the network edge.”


The solidifying standards and the continuing debate should both benefit from early deployment of fixed broadband wireless service. The knowledge from that effort will increase the momentum toward 5G’s most recognizable goal, mobile broadband service.

The challenges are many. Mobility implies small form factors, and they in turn imply aggressive power management. If the mobile device is a luxury sedan, the constraint is not so serious. Auto vendors are resigned to electronics modules dissipating tens of watts. But if the device is a notebook or tablet, managing modem power calls for all the tools at the SoC designer’s disposal. And if the device is a handset, today’s state of the art is not enough.

There is a further issue with the handset. The 5G MIMO antenna will strive to keep beams formed toward the antennas of the base stations it is using. But what if the path between the antennas happens to pass through the user’s ear? Not only is the human brain rather opaque at these frequencies; it is also not a place through which most users will want a beam formed. So the mobile devices will have a further, quite dynamic constraint on their MIMO algorithms: don’t irradiate the user.

This is part of a larger issue. Mobile devices are mobile. Cars reach 100 km/hr on urban freeways, passing between tall buildings and highway signs. Pedestrians scurry from cozy coffee shops down tree-lined sidewalks and into subway entrances. Especially at 5G’s higher frequencies, such antics mean base stations will have to continually beam-track each active mobile client and hand each one off crisply when another cell has a better line of sight or more available capacity. And this on-the-fly juggling must be done simultaneously across multiple carriers.

This challenge is helped—and exacerbated—by the proliferation of small cells. The miserable propagation characteristics of 28 GHz and the need to rely on the tight directionality of MIMO antennas make small cells mandatory in dense urban and campus settings. Add in carrier aggregation, and a girl walking around city sidewalks watching a movie is going to be targeted by an army of small-cell MIMO antennas, all cooperating to keep enough beams aimed at her handset to keep the movie from freezing when she turns her head or walks around a corner. Building interiors will have similar issues, probably requiring indoor small cells to be as dense as WiFi hubs are today.

On the road the situation will be different. Longer sight lines and distances make conventional macrocells a better solution. “I think we will see a mix of macrocells, small cells, and local clusters in buildings,” Fitton conjectures.


So far we have described a remarkably heterogeneous network. We have tiny, room-sharing cells, small cells, and macrocells. We have conventional base stations, CRAN, and enhanced edge processing. And it is all supposed not just to coexist, but to interoperate seamlessly. Now let’s make things a lot more complicated with three new letters: I, O, and T.

Almost from the beginning some IoT partisans have claimed that 5G would provide the missing link of connectivity between the Internet and the Things. Different people meant different things by that claim, though. Some merely saw 5G broadband links between Internet access points and IoT hubs. Some saw direct 5G connections to Things. And some have advocated machine-to-machine links through the 5G network.

All of these ideas bring with them new requirements. Any sort of IoT connectivity that involves interaction over the channel requires low latency. Accordingly, 5G imposes shorter frames than LTE, and hence a shorter transmission time interval (TTI). Implied is a guarantee on round-trip latency for some connections.
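To make the shorter-TTI point concrete: one approach in the 5G New Radio proposals is a scalable numerology that widens subcarrier spacing by powers of two, shrinking the slot, and so the minimum scheduling interval, in proportion. The arithmetic below follows that scheme; the figures are illustrative, not a latency guarantee:

```python
# LTE's baseline: 15 kHz subcarrier spacing, 1 ms transmission time interval.
# Scaling the spacing up by 2**mu shrinks the slot by the same factor.
for mu in range(4):
    scs_khz = 15 * 2 ** mu
    slot_ms = 1.0 / 2 ** mu
    print(f"spacing {scs_khz:3d} kHz -> slot {slot_ms:5.3f} ms")
```

Three doublings take the slot from LTE’s 1 ms down to 125 microseconds, the kind of headroom a round-trip latency guarantee needs.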

But this latter point contradicts trends we’ve already discussed, such as virtualization, centralization, and the agility necessary to keep mobile devices connected. The answer to the contradiction is network slicing: providing several very different virtual networks on one physical medium. Most experts believe that slicing, in turn, requires software-defined networking as well as virtualization of many functions.
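In software terms, a slice is just a set of service guarantees bound to shared infrastructure. A toy model, with slice names and numbers invented purely for illustration, of how an orchestrator might match a connection request to a slice:

```python
# Toy network-slicing model: one physical network, several virtual networks
# with different guarantees. Names and numbers are illustrative only.
SLICES = {
    "broadband":   {"peak_gbps": 10.0,  "max_latency_ms": 20.0},
    "iot_sensor":  {"peak_gbps": 0.001, "max_latency_ms": 1000.0},
    "low_latency": {"peak_gbps": 0.1,   "max_latency_ms": 1.0},
}

def choose_slice(need_gbps: float, need_latency_ms: float) -> str:
    """Pick the least-capable slice that still satisfies the request."""
    fits = [(name, params) for name, params in SLICES.items()
            if params["peak_gbps"] >= need_gbps
            and params["max_latency_ms"] <= need_latency_ms]
    # Prefer the smallest peak rate that fits, leaving fat slices for fat flows.
    return min(fits, key=lambda item: item[1]["peak_gbps"])[0]

print(choose_slice(0.05, 5.0))   # a latency-sensitive machine link
print(choose_slice(4.0, 50.0))   # a video download
```

The point of the sketch is the separation: the machine link and the video download traverse the same radios and fiber, but each is admitted against its own slice’s guarantees.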

So now we may have broadband, restricted-data-rate, and limited-latency virtual networks all running over the same air interface and equipment. But it gets better. Many IoT endpoints, such as remote sensors, have very low connection duty cycles and extremely tight power or energy budgets. The cost for them to set up a connection, transfer a block of data, and disconnect would be prohibitive. So we may see dedicated IoT networks such as Sigfox layered into the 5G network as well—either as a separate network with gateways, or as an actual slice.

A Roadmap?

When does this all happen? “Pretty much everyone has test benches now,” Fitton says. Intel and Qualcomm have announced modem chipsets for delivery this year. Verizon is commencing fixed wireless field trials in the 28 GHz band early this year, too. AT&T, Ericsson, and Intel are mounting a field trial providing a range of broadband services to an Intel office building on the 15 GHz and 28 GHz bands. If these trials go well, there could be large-scale fixed wireless deployments sometime after the Release 15 spec comes out next year, or at least by 2020.

Full mobile service is further away, as more formidable questions are still being answered. “We may see mobile service in 2021,” Fitton suggests. In the meantime, many of the individual pieces of 5G technology—carrier aggregation, massive MIMO, and perhaps exploitation of the 5 GHz ISM band, where WiFi lives today—will be backfilled into 4G LTE networks. This should give users a taste of 5G performance and coverage well before real 5G mobile. And it will, not incidentally, provide a fall-back network for 5G devices once they are in operation, much as current LTE devices can drop back to 3G networks when conditions require.

So yes, there is significant progress, and the stated goal of deploying something with the letters 5 and G on it in 2020 remains plausible. But it will still be a while before you can pick up your new smart phone and hear something over a 5G network.

CATEGORIES: IoT, Wireless / AUTHOR: Ron Wilson
