Last week, ARM — the company sitting at the very headwaters of the wireless endpoint ecosystem, the architectural well from which most mobile phones (and many other devices) spring — gathered its minions in San Jose, California, for the British firm’s annual technical conference, TechCon. Press and analysts mingled with engineers and scientists, developers and partners from around the world for several days of philosophizing and explaining, panoramic visions and micro details, summarizing and forecasting, revealing and stoking. In short, we basked in the dawn of the Internet of Things (IoT), the next big digital market opportunity.
As those keeping track will recall, in the first great wave of computing, Intel’s x86 architecture accounted for more than 90% of the personal computer market. However, in the second wave — the smartphone era, spearheaded by Apple and its iPhone — the market shifted to ARM. Both Apple and Google (Android) mobile phones are based on ARM architecture. Even other smartphone variants fielded by companies like Samsung and Huawei trace their technical heritage back to ARM.
The great divide between x86 and ARM really falls on the cusp of wireless. When a device is untethered — that is, when it relies on a battery rather than an AC outlet — power consumption matters so much it ranks ahead of performance. Whereas the x86 world always viewed performance as outweighing power consumption, ARM saw the problem from the other direction: get as much performance as possible out of a severe power-consumption constraint. This philosophy has led ARM to a highly disciplined approach toward introducing performance improvements to its designs. But as chip geometries have gotten smaller, ARM has been able to squeeze ever greater performance out of any given power envelope, however highly constrained.
To make a long story short, in the PC and smartphone markets — the two great sequential opportunities in computing in the past 30 years — the first was addressed by x86 and the second by ARM. The interesting question is: What happens next? The IoT is generally viewed as the next great opportunity, but that rubric covers a host of sins, from smart doorknobs to instrumented factories, from intelligent water meters to autonomous vehicles, from urban infrastructure to precision farming systems, from comprehensive security monitoring to human geofencing and animal tracking. It’s not really one thing, but many things.
One prediction I think no one will dispute is that ARM, with its low-power but increasingly high-performance systems, will likely anchor most of the perimeter IoT devices, the billions of smart sensors and other devices communicating inward toward the edge node aggregators and beyond. The company licenses its intellectual property for such modest rates and in such an open manner that the entire industry has already adopted the architecture. No one thinks ARM is taking more out of the pot than it is delivering in value, which makes for good customer relations. In addition to its silicon designs, the company provides tools and custom services to large customers like Qualcomm, Apple, and Samsung and various degrees of self-service to others.
The battle lines for the IoT market, then, are being drawn in a zone the industry is calling the “edge.” The edge concept is still fluid, potentially encompassing a smartphone, a gateway, or a smartcam. The definition gets more twisted when a thing out there at the perimeter that may have smarts and a radio as well as one or more sensors is downclassed to a peripheral or some other tethered concept. But both the x86 and the ARM camps have a clear idea of the edge gateway, a device one step up from the perimeter that may aggregate field sensor data, analyze it, access models and databases from the cloud, and make local decisions.
So, while ARM will likely capture most of the as many as one trillion devices expected at the perimeter by 2025, plus quite a few inward toward the edge of the cloud, x86 will be coming the other way, from its deep entrenchment at the core of the cloud out toward its edge. There will be a battle royale over the IoT edge.
Meanwhile, the ARM ecosystem is forging ahead with its multicolored IoT program. I love all my IoT children equally, but I have a special place in my heart for automotive — the biggest, most coherent, and most exciting market in the near term. Dipti Vachani, ARM’s senior vice president of automotive and IoT, gave the keynote crowd the company’s overview. She cautioned enthusiasts that fully autonomous driving is still a ways off, noting that “the [safety] driver is still the backup plan” in today’s prototypes. I spoke with one of the safety drivers on the floor in the automobile zone. She told of her journey from Michigan to San Jose. “It’s more work than actually driving a car,” she said, relieved that one of her colleagues would be the safety driver on the return trip.
But fully autonomous cars are somewhere on the horizon. Ian Bratt, ARM’s senior director of machine learning, made a good case in his keynote that neural networks have begun to replicate the behavior of biological systems. Bratt told of brain-imaging technology that has mapped networks of neurons that represent physical space, like an actual map in the brain that looks in some way like the terrain it represents. These same discernible patterns have also been derived through training of machine-learning models. In short, machines using mathematical algorithms have come up with the same patterns as animal brains trying to learn and remember where they are. ARM’s contribution is in processors optimized for machine-learning operations, including a new instruction — MatMul (matrix multiply) — introduced by Ian Smythe, ARM’s vice president of marketing, last week at the show. This type of artificial intelligence technology will have a direct effect on self-driving cars, which will make use of all the senses available to humans as well as others humans don’t have — like lidar, radar, radio transceivers, cameras all around, and unwavering attention. At some point, autonomous cars will be safer than vehicles driven by humans.
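Why a dedicated matrix-multiply instruction matters becomes clear when you see how much of neural-network inference reduces to exactly that operation. The sketch below is purely illustrative — the layer sizes and random weights are invented, and NumPy’s `@` operator stands in for whatever the hardware instruction accelerates:

```python
import numpy as np

# A fully connected neural-network layer boils down to one matrix multiply
# plus a bias add and an activation. This is the inner-loop workload that a
# hardware MatMul instruction is designed to speed up.
def dense_layer(x, weights, bias):
    return np.maximum(0, x @ weights + bias)  # ReLU activation

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 4))   # one input vector of 4 features (illustrative)
w = rng.standard_normal((4, 3))   # 4-in, 3-out weight matrix (illustrative)
b = np.zeros(3)

y = dense_layer(x, w, b)
print(y.shape)  # (1, 3)
```

A deep network is just many such layers stacked, which is why accelerating the multiply dominates the overall speedup.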
The silicon line that ARM is developing for the automotive industry has at its center — not surprisingly — safety, since that is the number-one criterion in the automotive market. Among other special features, the purpose-built processors include dual-core lockstep execution, so that if one processor fails the other can take up its load instantly. Vachani, the automotive exec, highlighted ARM’s partnership with Swift Navigation, a firm whose demo on the show floor illustrated how the company uses satellite data combined with on-board camera feedback and adjustment to obtain centimeter-accurate, precision locations in non-urban areas, thus opening up the way for Level 2+ driver assistance between cities.
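In software terms, lockstep execution can be caricatured as running the same computation on two redundant cores and comparing the results at every step. Real automotive lockstep happens in hardware, cycle by cycle; the sketch below, with an invented accumulator task, is only a conceptual analogy:

```python
# Conceptual analogy of dual-core lockstep: run each step on two redundant
# "cores" and flag any divergence immediately. Hardware lockstep compares
# the cores' outputs every clock cycle; this sketch compares per step.
def lockstep_run(step_fn, inputs):
    state_a, state_b = 0, 0
    for x in inputs:
        state_a = step_fn(state_a, x)
        state_b = step_fn(state_b, x)
        if state_a != state_b:  # divergence signals a fault on one core
            raise RuntimeError("lockstep mismatch: fault detected")
    return state_a

total = lockstep_run(lambda acc, x: acc + x, [1, 2, 3, 4])
print(total)  # 10
```

The point of the comparison is fault *detection*; the safety architecture around it decides how to fail over once a mismatch is flagged.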
According to Danny Shapiro, nVidia’s senior director of automotive, “Level 2 … provide[s] steering and brake/acceleration support, as well as lane centering and adaptive cruise control,” but requires a human driver at the wheel who monitors and supervises the automated features. “Level 2+ adds in surround perception and AI to improve … safety and convenience. … While the driver is still responsible …, the platform can perform automated maneuvers … such as making highway entrances and exits, lane changes and merges. Level 2+ also includes intelligent cockpit services, such as driver monitoring, AI copilot technology using voice and gesture recognition, and advanced in-cabin visualization of the vehicle’s perception.”
Personally, I can’t wait for some Google-like entity to take over my driving completely. How else can an ordinary middle-class guy like me afford a chauffeur? I would like nothing better than to sit back and enjoy the ride, make phone calls, eat and drink (yes! Drinking while driving!), spoon with my sweetie, or whatever else comes to mind while my cloud-connected Jeeves takes me wherever I want to go. Oh, and the added benefit will be that all these Jeeveses will be extraordinarily skilled and polite, and traffic will move that much faster and more smoothly.
Meanwhile, there’s a question of how all these new IoT devices will roll out. Some will be nodes (e.g., home thermostat, home router, automobile), some will be simple perimeter devices (e.g., cam-activated door locks, certain industrial robots), and some will also be endpoints (e.g., kiosks, POS devices). In addition, they break down into two large categories along another dimension: green field vs. brown field. Green field applications are brand-new, never-before-attempted ways to harness information obtained at the perimeter to effect an outcome. A self-driving car is a good example of a green field application. Brown field applications involve the instrumentation of an existing piece of equipment to make it smarter. Many areas in industrial management are currently experiencing investment in this type of instrumentation. For example, a turbine in a power generating station can be outfitted with a vibration sensor to help detect incipient failure.
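A brown field retrofit like that turbine vibration sensor often begins with something as simple as an RMS (root-mean-square) threshold check on the sensor signal. The window contents and threshold below are invented for illustration; a real deployment would commission the baseline from the healthy machine:

```python
import math

# Flag incipient turbine trouble when the RMS vibration level over a sampling
# window exceeds a commissioned baseline threshold (values here are invented).
THRESHOLD_G = 0.5  # illustrative RMS limit, in g

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def vibration_alarm(window):
    return rms(window) > THRESHOLD_G

quiet = [0.01, -0.02, 0.015, -0.01]  # healthy machine: RMS well under limit
rough = [0.7, -0.8, 0.65, -0.75]     # worn bearing: RMS exceeds limit
print(vibration_alarm(quiet), vibration_alarm(rough))  # False True
```

Even this crude check captures the economic logic of brown field IoT: a cheap sensor plus a simple rule can catch a failure long before it takes the turbine offline.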
Initially, there will be plenty of activity in both arenas. But over time, all the brown fields that can be will be instrumented, and the proportional shift will swing over to green field. The x86 camp, with its advantage in existing wired networks, will likely dominate brown field applications. But the ARM faction, with its pole position in wireless networks, will have the inside track on green field. Exciting times we live in.