Ecclesiastes 1:9 reads “What has been will be again, what has been done will be done again; there is nothing new under the sun.” Or in other words, 5G is mostly marketing nonsense, just like 4G, 3G, and 2G were before it. Let’s not forget LTE, 4G LTE, LTE Advanced, and EDGE.
Technically, 5G means that providers could, if they wanted to, install some EHF antennas; the same kind we’ve been using forever to do point-to-point microwave internet in cities. These frequencies are too lazy to pass through a wall, so we’d have to install these antennas in a grid at ground level. The promised result is that we’ll all get slightly lower latency tiered internet connections that won’t live up to the hype at all. From a customer perspective, about the only thing it will do is let us hit the 8 GB ceiling twice as fast on our “unlimited” plans before they throttle us. It might be nice on a laptop, but it would be a historically ridiculous assumption that Verizon is going to let us tether devices to their shiny new network without charging us a million Yen for the privilege.
So, what’s the deal? From a practical standpoint we’ve already maxed out what a phone needs. For example, here’s a dirty secret of the phone world: you can’t tell the difference between 1080p and 720p video on a tiny screen. I know of more than one company where the 1080p setting in their app really means 640p or 720p displayed on the device, with the true 1080p recorded to the cloud somewhere for download. Not a single user has noticed or complained. Oh, maybe if you’re looking hard you can feel that one picture is sharper than the other, but past that what are you doing? Likewise, what’s the point of 60fps 8K video on a phone? Or even a laptop for that matter?
Are we really going to max out a mobile webpage? Since our device’s ability to present information exceeds our ability to process it, is there a theoretical maximum to the size of an app? Even if we had gigabit internet to every phone in the world, from a user standpoint it would be a marginal improvement at best. Unless you’re a professional mobile game player (is that a thing yet?), latency is meaningless to you. The buffer buffs the experience until it shines.
So why should we care about billion dollar corporations racing to have the best network for sending low resolution advertising gifs to our distracto cubes? Because 5G is for robots.
Brute Force, But With Antennas
Yes, 5G was always for robots and never for human-smartphone hybrids. More specifically, a massive grid of microwave antennas delivering low latency gigabit connections to cities and farms is for robots. Right now if you want to hook up to an EHF internet band you need to send a team of wizards onto multiple rooftops to align antennas and do all sorts of other chanting, dancing, cursing, and runework to get the damn things to go. Every time a breeze knocks the antennas out of alignment or a tree grows too tall, they need to go back up there and do it again. It’s not that 5G’s FR2 band is going to rock our world by itself; it’s that an agreed-upon hardware standard that handles all the difficult things on this band, like swapping antennas, is what’s going to do that.
I remember my robot team in university digging through catalog after catalog for GPS antennas. We finally discovered that a growing market for reliable antenna modules in agriculture had driven the price down to a ridiculously low $1,500. At the time, the iPhone had come out just the year before and the Motorola Razr was still the cool kids’ phone. A few years later I ditched my Nokia 3310 equivalent for an HTC Thunderbolt. Fast forward to today and a GPS module is fifteen bucks with Prime shipping. That’s a 100x drop in price. We can expect these microwave modems and antennas to fall similarly.
Today’s Robots Store Their Memories Somewhere Else
Let’s look at a modern robot. We’ve had a machine learning boom these last few years, and it has changed the model for robots dramatically.
Before, you absolutely had to have a ton of compute sitting on the robot itself. The robot also ran on human-generated algorithms and didn’t necessarily need a lot of storage space to improve its performance; performance was tied more directly to human engineering time.
Now what you need on board is just enough compute to run a fairly deterministic model for real-time decisions. The rest of the data is sent to server farms somewhere in the world, where it’s fed into the learning algorithms that improve the results. Some of the more expensive but less time-sensitive decisions run there too. This flips the equation: how much data you collect becomes the primary driver of how well your robot can perform.
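That on-robot versus cloud split can be sketched as a simple latency-budget decision. This is a toy illustration, not anyone’s actual architecture; the 50 ms threshold and the task names are assumptions made up for the example.

```python
# Toy sketch of the edge/cloud split: time-critical decisions run on the
# robot's deterministic model; everything else (plus the raw sensor data)
# ships off to the server farm. The 50 ms budget is an assumed threshold.

LOCAL_LATENCY_BUDGET_MS = 50

def route_decision(task: str, latency_budget_ms: int) -> str:
    """Run tasks with tight latency budgets locally, defer the rest."""
    if latency_budget_ms <= LOCAL_LATENCY_BUDGET_MS:
        return f"{task}: on-robot model"
    return f"{task}: cloud"

print(route_decision("obstacle avoidance", 20))  # needs an instant answer
print(route_decision("route replanning", 2000))  # can wait for the cloud
```

The point of the split is exactly what makes the low latency of 5G interesting: every millisecond you shave off the round trip moves more decisions from the left branch to the right one.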
A concrete example: there’s quite a market for the hard drives that live inside an autonomous car. If you have a fleet of Cruise, Zoox, Waymo, Uber, or Tesla test vehicles, how the heck do you get the data off the car and into the cloud for processing? Especially when a single run can rack up terabytes. The current solution is a very expensive hot-swappable cartridge full of hard drives. When you drive your test car back into the garage, they pull out the spent clip and slot in another one. It’s pretty dang cyberpunk, but inherently inefficient.
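Some back-of-the-envelope arithmetic shows why the clip of hard drives wins today. The run size and link speeds below are rough assumptions for illustration, not measured figures.

```python
# Back-of-the-envelope: hours to offload one test run's sensor logs over
# different links. Run size and sustained link speeds are assumptions.

def offload_hours(terabytes: float, link_mbps: float) -> float:
    """Hours to move `terabytes` of data at a sustained `link_mbps`."""
    megabits = terabytes * 8 * 1_000_000  # TB -> megabits (decimal units)
    return megabits / link_mbps / 3600

RUN_TB = 10  # assumed size of one test run's logs

for name, mbps in [("4G LTE (~50 Mbps)", 50),
                   ("Gigabit 5G (~1,000 Mbps)", 1000)]:
    print(f"{name}: {offload_hours(RUN_TB, mbps):,.1f} hours")
```

At an assumed 50 Mbps the run takes weeks to upload, so you swap drives instead; at a sustained gigabit it drops to about a day, and streaming the data off the car as it’s generated starts to look plausible.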
5G networks are the perfect solution for this kind of problem. Not only do they have enough bandwidth to pull all this data straight off the device as it’s being generated, but their extremely low latency also lets those cloud-processed decisions get back to the robot even faster. My bet is that over the next twenty years we’ll see a few billion-dollar companies spring up doing things like building interconnect layers, making hardware modules, and setting up massive agricultural 5G networks for robots.
All in all, this technology will likely be the weight that tips the scale for tech like the drone deliveries the universe has been threatening us with for the past few years. It will make useful augmented reality more likely, and it will dramatically boost the capabilities of robots living in the 5G grids. It just won’t make our phone experience any better.