robotaxis in the desert

The car of the future is taking shape—and it will know how we feel about it

Love it or hate it, CES is where you see the future of the automobile.

Jonathan M. Gitlin
Mercedes-Benz at CES 2018: as part of the show, Mercedes-Benz presented the smart vision EQ fortwo show car in Las Vegas. Credit: Daimler
Aptiv and Lyft's robotaxi, CES 2018
Credit: Aptiv

Few people want to go to Las Vegas immediately after the New Year. Never a fan of the place at the best of times, I dutifully boarded the plane anyway. Like it or not, if one wants to see everyone's ideas for the car of the near future, there's no better time and place to do that than CES.

There's an irony to hearing about smart mobility at CES, considering all the dumb reality outside. The show has grown so much that getting from the convention center to anything offsite now takes an hour if you're unlucky. Figure in a lot of needed—but unwanted—rain that caused havoc with self-driving demos and electrical transformers, and the whole thing became an ordeal.

Chips ahoy!

That ordeal kicked off days before the main exhibit hall even opened. It's fitting that Nvidia started the proceedings on Sunday; its graphics chips bear more responsibility than most for the blossoming of autonomy. The latest of these is called Xavier, and if things go Nvidia's way, it'll be found under every robo-taxi's access panel. Nvidia is forming big partnerships: Baidu, Uber, and Volkswagen Group are three of the latest names to be announced.

Fellow silicon seller Intel has similar designs on the car of the future—it wouldn't have paid more than $15 billion for Mobileye otherwise. Intel's new EyeQ4 system-on-chip is now fully able to do the near-real-time, crowd-sourced map updating that we've written about, and the technology is soon to hit the roads in new BMWs, Nissans, and Volkswagens.
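To make the concept concrete, here's a toy sketch of how crowd-sourced map updating can work in principle. This illustrates the general approach, not Mobileye's actual pipeline—the class, threshold, and landmark names are all invented: cars report the landmarks they perceive, and the server only commits a change once enough independent vehicles agree on it.

```python
from collections import defaultdict

CONFIRMATIONS_NEEDED = 5  # hypothetical threshold, not Mobileye's

class MapServer:
    def __init__(self):
        self.map_landmarks = {}          # landmark_id -> committed position
        self.pending = defaultdict(set)  # (id, position) -> reporting cars

    def report(self, car_id, landmark_id, position):
        """A vehicle reports a landmark observation. Positions are assumed
        quantized so near-identical reports from different cars match up."""
        key = (landmark_id, position)
        self.pending[key].add(car_id)
        if len(self.pending[key]) >= CONFIRMATIONS_NEEDED:
            self.map_landmarks[landmark_id] = position  # commit the update
            del self.pending[key]

server = MapServer()
for car in range(5):
    server.report(f"car-{car}", "speed-limit-sign-17", (36.12, -115.17))
print(server.map_landmarks)  # {'speed-limit-sign-17': (36.12, -115.17)}
```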

Automotive names don't get much bigger than Ferrari, and that, too, was a card Intel plucked from its hat. The two are going to use AI to enhance the racing experience, something I'm going to have to dig into real soon.

Johnny Cabs

As it happened, my first robo-taxi ride of 2018 was several hours old by the time the lights dimmed and the embargoes ended. You might not have heard of Aptiv yet, but you probably know its work: until recently, the company was known as Delphi, but it spun off its powertrain division under the old name and rebranded the rest (hence the new moniker). Aptiv and Lyft had teamed up for CES to offer a real-life Johnny Cab service.

"We need to build trust in the technology," explained Jada Tapley, VP of advanced engineering at Aptiv. How better to do so than putting that tech straight into service?

Aptiv's fleet of BMW 540s, studded and jeweled with sensors but more immediately noticeable for their safety orange wheels, spent CES ferrying people from the convention center to any of around 20 hotels. Pulling into and out of parking lots was a job for Patrick, who was the safety monitor on my ride with Tapley. Private property concerns meant only humans were allowed to drive the final few hundred feet.

Aptiv had several of these driverless BMW 540i test vehicles, which were integrated into Lyft's network for CES. They wouldn't take you everywhere, but you could get a ride from the convention center to one of the big hotels.
See, it says self-driving ride!
The infotainment screen in the Aptiv car showed a stripped-down version of the car's lidar scans. As befits a Tier 1 supplier, you wouldn't have known you were in an R&D hack were it not for the red "stop" button in one of the cupholders.
A Navya vehicle.

Once on the public roads, Patrick had a boring old time, poised to take the wheel but denied a reason. From the front passenger seat, Tapley talked us through the 30-minute ride to Caesars Palace and back. (By Tuesday, this route must have taken twice as long.) The BMW's infotainment screen displayed a stripped-down graphical representation of the environment around us as sensed and then perceived by the car.

With her attention toward us in the back—I was sharing this ride with another correspondent—Tapley missed the sole moment of excitement. As we traveled up the Strip, a bus one lane over had stopped to disgorge itself of passengers. The driver had chosen to stop at a jaunty angle rather than neatly parallel to the curb on one side and the flow of traffic on the other. So although the back-left corner of the bus was still actually in its lane, had I (or you) been driving, we'd probably have slowed a little and moved over to give it added room.

The BMW—with its Intel silicon brain, nine lidars, 10 radars, two GPS antennas, and the rest of it—isn't you or me. As if to confirm the superior accuracy possible with the latest in sensor fusion and high-accuracy localization, it knew exactly how much free space there was between its rightmost extremity and the corner of the bus, and it didn't see fit to slow or alter our line.

Tapley's first inkling that anything out of the ordinary had taken place came when she saw the pair of us in the back instinctively flinch. That level of machine-enabled precision is one of the main selling points for autonomous driving, but it's probably going to take some getting used to.
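The math behind that unnerving confidence is simple once perception and localization are solved. Here's a toy illustration—not Aptiv's actual planner; the dimensions and margin are invented—of how "should I move over?" reduces to a lateral-gap check:

```python
# Toy clearance check: with the car's footprint and the bus corner's
# position known to within a few centimeters, the decision is arithmetic.

CAR_HALF_WIDTH_M = 0.95  # hypothetical vehicle half-width
COMFORT_MARGIN_M = 0.30  # hypothetical minimum gap before reacting

def lateral_clearance(car_center_y, obstacle_y):
    """Free space between the car's side and an obstacle, in meters."""
    return abs(obstacle_y - car_center_y) - CAR_HALF_WIDTH_M

gap = lateral_clearance(car_center_y=0.0, obstacle_y=1.35)
print(f"gap = {gap:.2f} m")  # 0.40 m
print("hold the line" if gap >= COMFORT_MARGIN_M else "slow and move over")
```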

Beyond those few seconds of excitement, I would have had no idea Patrick hadn't done all the driving if I had not been watching his hands.

"We want it to feel remarkably unremarkable," Tapley told me—consider that mission achieved. As I would confirm by week's end, even if Lyft doubled its effective fleet that week with thousands of such vehicles, they wouldn't have helped. The real holdups—the 15 minutes either side of the actual driving—were spent being poorly directed around Vegas-sized parking lots.

That's why a later demo of Hitachi's Auto Valet Parking system found such a receptive audience. Vehicles equipped with Auto Valet authenticate with a central parking control center, which then directs them around the multilevel garage. The system knows where every car and every free space is, and it knows how best to get all the cars into spaces and then back to their owners as efficiently as possible. (Of course, the drawback is that Auto Valet only works if every vehicle can be directed by central control.)
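In code, the core of such a scheme is little more than centralized bookkeeping. The sketch below is a minimal illustration of the concept—the class, method names, and space labels are hypothetical, not Hitachi's API:

```python
class ParkingControlCenter:
    """Toy model of a central valet controller: it owns the list of free
    spaces and hands each authenticated car a destination."""

    def __init__(self, spaces):
        self.free_spaces = list(spaces)  # e.g. ["L2-014", "L3-002"]
        self.assignments = {}            # car_id -> assigned space

    def check_in(self, car_id):
        """Authenticate an arriving car and direct it to a free space."""
        if not self.free_spaces:
            raise RuntimeError("garage full")
        space = self.free_spaces.pop(0)
        self.assignments[car_id] = space
        return space  # the car drives itself there

    def retrieve(self, car_id):
        """Summon a parked car back to its owner, freeing its space."""
        space = self.assignments.pop(car_id)
        self.free_spaces.append(space)
        return f"car {car_id} leaving {space} for the pickup zone"

center = ParkingControlCenter(["L2-014", "L3-002"])
print(center.check_in("demo-car-001"))  # L2-014
print(center.retrieve("demo-car-001"))
```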

Late night, bright lights

The flashiest ride was undoubtedly a (very) late-night trip down the Strip in Daimler's idea of a fully autonomous Smart car. It's called the Smart Vision EQ, and it's a charming little thing. Some bits—like the fabulous clamshell doors—will probably never see production. Others might need a quick rethink; an all-white interior looks great under the lights of an auto show, but after just a handful of rides the carpet showed every single footprint.

On the outside, the Smart Vision EQ emotes at passersby, using LED displays that replace unneeded headlights. On the inside, the UI encourages passengers to interact with it and maybe even each other; thought has gone into how shared ride-hailing could make us more social. I can feel some of you rolling your eyes pretty hard right now, but think of the EQ like any bus or metro train—some people just like being chatty. At least with this concept, all it takes is the flick of a setting on my phone to let my fellow passenger know I want to be left alone.

The Daimler demo didn't start until 2am—the company's permission from the city to close down some of the Strip only covered a few late-night hours—and the combined effects of sleep deprivation, jet lag, and bright-as-day illumination from the signs and screens of the Strip lent the ride a surreal quality. Perhaps for the best—the "autonomous" part really is still conceptual—this vehicle was actually remote-controlled. But the real thing is coming, and Daimler is one of the better-placed car companies to adapt, thanks to its ongoing experience with Car2Go.

The display in the Smart Vision EQ has now been outstripped by the production screens in the forthcoming A-Class.
This UI was just an animated demo, but it showed the design team has been thinking about how AI cars can encourage human interaction.
The one downside to the interior? Those carpets show every footprint.

Sensors making sense

CES is such a big show you get to see not just the sausage itself, but everything that goes into it. For self-driving sausages, that means sensors. Everyone agrees that autonomous vehicles will need multiple, redundant ways of sensing the world around them. And just about everyone agrees that one very good way to do that is with lidar.

Lidar has definitely seen a lot of action recently—this excellent feature from Tim Lee is a great jumping-off point. Someone new seemingly enters that market every day, but for the time being, the market still belongs to Velodyne.

Velodyne sees a split coming in the market. Robo-taxis and other fleet-managed vehicles get a spinning array on the roof, the latest of which has now been doubled to 128 channels. The optics and motors need regular maintenance, see, and that's not something you want to leave to the general public. For customer-owned self-drivers, there's the Velarray. It's solid-state—well, almost. There is some motion involved, but it's frictionless, with no motor or bearing. Each Velarray has a 120-degree field of view, so you'd want to embed at least three of them around a vehicle for all-around coverage.
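The arithmetic behind that "at least three" is straightforward; here's a quick sketch (the evenly spaced mounting angles are illustrative, not Velodyne's guidance):

```python
import math

FOV_DEG = 120  # per-Velarray field of view, from Velodyne's figures

def mounting_yaws(fov_deg=FOV_DEG):
    """Minimum unit count and illustrative yaw angles for 360-degree
    coverage with no gaps (ignoring overlap and body occlusion)."""
    n = math.ceil(360 / fov_deg)
    return n, [i * 360 / n for i in range(n)]

count, yaws = mounting_yaws()
print(count, yaws)  # 3 [0.0, 120.0, 240.0]
```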

Because it has been designed to ASIL B, Velodyne says Velarray is also suitable for use in ADAS, the collection of advanced driver assists that keep you straight and narrow when cruising and forewarned when backing out of a busy parking lot. Order a big enough batch—a million should do it—and the price drops down into the hundreds per unit, where it starts to make more sense. And don't worry too much about potential miscreants spoofing your laser returns or one lidar-equipped car blinding another: these units are pulse-encoded and will ignore all but their own signals.
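Velodyne hasn't published the details of that encoding, but the concept is easy to sketch: each unit stamps its outgoing pulses with its own signature and rejects returns that don't carry it. Everything below—the class and the hex-token signature—is an invented illustration, not the real scheme:

```python
import secrets

class EncodedLidar:
    """Toy pulse-encoded lidar: spoofed signals and other units'
    lasers fail the signature check and are ignored."""

    def __init__(self):
        self.code = secrets.token_hex(4)  # per-unit pulse signature

    def fire(self):
        return {"code": self.code}  # outgoing pulse carries the code

    def accept_return(self, pulse):
        """Keep only returns generated by our own pulses."""
        return pulse.get("code") == self.code

mine, theirs = EncodedLidar(), EncodedLidar()
print(mine.accept_return(mine.fire()))    # True: our own return
print(mine.accept_return(theirs.fire()))  # False: another unit's pulse
```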

AEye's R&D vehicle, a Chevy Bolt EV.
AEye's iDAR sensors, which fuse true-color optical and lidar together.
AdaSky is bringing the world of far infrared to the automobile.
FIR won't substitute for lidar, but the two would complement each other.

AEye is one of the many young challengers to Velodyne. It had the honor of being my last demo of the trip, and its team gamely met me at 8am on Thursday morning to show off iDAR. AEye's contribution to the field is an integrated lidar (an agile micro-opto-electro-mechanical system, or MOEMS) and low-light camera, plus embedded AI that fuses inputs from the two sensors, classifying what to keep and what to ignore before sending it on up the stack. And because latency is so low, AEye says it's able to "selectively revisit any chosen object twice within 30 microseconds."
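AEye's actual pipeline is proprietary, but the prioritization idea it describes can be sketched simply: camera-derived detections score objects of interest, and the steerable lidar re-scans the highest-scoring ones first rather than sweeping the scene uniformly. The function, scores, and revisit budget below are hypothetical:

```python
import heapq

def schedule_revisits(detections, budget):
    """detections: list of (priority, object_id), higher = more urgent.
    Returns the objects the agile lidar should re-scan this cycle."""
    heap = [(-priority, obj) for priority, obj in detections]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(min(budget, len(heap)))]

scene = [(0.9, "pedestrian"), (0.2, "parked car"), (0.7, "cyclist")]
print(schedule_revisits(scene, budget=2))  # ['pedestrian', 'cyclist']
```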

True-color information from the camera also proves important, providing an abundance of extra context, like knowing that the SUV stopped by the side of the road is black and white or what color a traffic light happens to be. (For the curious, Aptiv handled that problem by equipping all the traffic lights along its pre-mapped routes with DSRC vehicle-to-infrastructure [V2I] access points.) As we stood behind AEye's sensor-rigged Chevy Bolt EV, my eyes remained glued to the display as the team cycled through its various modes. Unlike the huge false-color point clouds I've seen in the past, this was an easily recognizable depiction of the world around us. Was I mesmerized or just tired and ready to leave town?

At some point in between, one other new sensor system caught my eye. AdaSky comes from a growing cohort of startups out of Israel, incubated indirectly by the conscription of smart young minds into the world of intelligence, surveillance, and reconnaissance. Think swords into plowshares, but in AdaSky's case, the sword is the far infrared and the plowshare is called Viper. It's the first far IR camera system designed for an automotive application, and although AdaSky wouldn't characterize it as a replacement for lidar, the company did think Viper a good partner.

Because Viper detects emitted light, not reflected light, it won't get blinded driving into the sun or out of a tunnel, and it doesn't care if it's midnight or midday. You probably saw reports that the rain put a temporary stop to the driving demos. As rainstorms go, it was a good one and the first precipitation Las Vegas had seen in many months. (The exact interval differed with every taxi or Lyft driver I spoke to, citizen meteorology having yet to bloom among these drivers.) But the storm didn't present any challenges for the FIR camera, which was as happy highlighting the cracked state of Las Vegas' road surfaces in the rain and drizzle as it would have been in shining sun.

Mercedes-Benz brought some of these camo-wrapped preproduction A-Classes to CES. Given that the car won't be on sale in the US, I think that was a little mean, but the car maker wanted to show us its new infotainment system, MBUX. Credit: Mercedes-Benz

Trend spotting and pattern recognition

Sometime around my third guided tour through a Tier 1 supplier's setup, several trends became impossible to miss. Multi-screen, high-bandwidth infotainment systems will be the most outwardly visible sign. Five-screen demos were the norm—one for the main instrument display, then another for the center stack, and one each for front and rear passengers. These should have sufficient throughput to pipe different HD cartoon feeds to two different screens, but that's not the really interesting part.

New cockpit domain controllers get that honor. It hasn't gone unnoticed that the bundles of discrete controllers, and all their associated wiring, that go into our current cars make them heavier and more complicated than they need to be. These new cockpit domain controllers do away with that; instead of a distributed brain with separate black boxes to run the instruments, infotainment system, heads-up display, and so on, everything happens in a single digital brain. Multiple operating systems run alongside each other in hypervisors: locked-down, ASIL D-spec real-time ones for mission-critical systems, with buckets of anomaly detection and firewalls surrounding the less important bits, which can touch (and be touched by) the Internet.
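Here's a rough model of that split—the partition names, ASIL assignments, and firewall rule are my own illustration, not any supplier's spec:

```python
# One SoC, several OSes in hypervisor partitions; only the
# low-criticality partitions are allowed to touch the Internet.
PARTITIONS = {
    "instrument-cluster": {"os": "RTOS", "asil": "D", "internet": False},
    "heads-up-display":   {"os": "RTOS", "asil": "B", "internet": False},
    "infotainment":       {"os": "Linux", "asil": "QM", "internet": True},
}

def firewall_allows(src, dst):
    """Block traffic initiated from an Internet-facing partition
    toward a locked-down, safety-critical one."""
    return not (PARTITIONS[src]["internet"] and not PARTITIONS[dst]["internet"])

print(firewall_allows("instrument-cluster", "infotainment"))  # True
print(firewall_allows("infotainment", "instrument-cluster"))  # False: blocked
```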

Those "less important" bits will almost certainly be running on Android or Linux. Volvo and Audi are both in the former camp for their new systems, but don't count Linux out. Automotive Grade Linux is picking up steam, particularly with Japanese OEMs. Android has its advantages, but more than once I heard someone worry about being dependent upon an OS that's continually improved and updated with smartphones and tablets—and not automobiles—in mind.

MBUX is built on Linux, and it's surprisingly good. Credit: Mercedes-Benz

Mercedes-Benz used Linux to build MBUX, its first legitimately good infotainment system. The German company had several camouflaged A-Class hatchbacks on hand a few weeks ahead of that car's public reveal. The move was slightly unfair for those of us based in the US, as that car won't go on sale here. But MBUX will make it over eventually. The pre-CES buzz surrounding MBUX was about the integration of what3words, a natural-language-based map-addressing platform. But the moment I realized Stuttgart had gotten it right was its use of Nuance's voice recognition. Nuance's tech continues to wow me in BMWs, and soon drivers of the three-pointed star will be able to get in on that action.

Speaking of voice recognition, Alexa was a near-ubiquitous presence. Amazon's virtual assistant is gaining acceptance among the OEMs and will begin showing up in Toyotas and Lexuses with the most current infotainment systems after a software update later this year. Further off in the future, there's still plenty of talk of AI digital assistants trained on our interactions with our cars. As long as they play nice with the AI helpers that will no doubt also run our homes, I'm sure everything will be fine.

Augmented reality is coming along in leaps and bounds. Heads-up displays in cars are going to get brighter and more detailed—Pioneer had my favorite—and even headlights are getting the AR treatment. It's probably not coming to an airplane window any time soon, though. From what I can gather, the problem is not one of technology but of persuading someone to add the cost and weight to their planes. Still, you have to admit that would be a pretty cool feature to play with on a clear day at 37,000 feet.

Do I really want my car to know how I feel?

The other thing everyone wanted to show off was solutions to distracted or drowsy driving. I expect we're going to see a lot more haptic feedback through the seat of our pants; used sparingly, it's a highly effective way to alert a driver.

Panasonic's emotion monitoring was surprisingly detailed. Credit: Jonathan Gitlin

Driver-monitoring systems were shown with varying degrees of sophistication. Some went beyond simple gaze tracking and facial expression analysis to include skin temperature, pulse and respiration rate, and more. Interpretations of that data showed a similar spread: for some, a simple scale of how alert the driver was sufficed, but others could chart emotion on multi-axis graphs.
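To give a flavor of what "multi-axis" means, here's an invented toy that maps the kinds of signals shown in the demos onto the common two-axis valence/arousal model. The weights and baselines are made up for illustration, not any vendor's algorithm:

```python
def driver_state(pulse_bpm, respiration_rpm, skin_temp_c, smile_score):
    """Return (valence, arousal), each clamped to [-1, 1].
    Crude normalization around typical resting values."""
    arousal = min(1.0, max(-1.0, (pulse_bpm - 70) / 50
                                 + (respiration_rpm - 14) / 20))
    valence = min(1.0, max(-1.0, smile_score * 2 - 1
                                 + (skin_temp_c - 33) / 10))
    return round(valence, 2), round(arousal, 2)

# An agitated driver: elevated pulse and breathing, no smile.
print(driver_state(pulse_bpm=95, respiration_rpm=20,
                   skin_temp_c=32.5, smile_score=0.1))  # (-0.85, 0.8)
```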

I'm not quite sure how I'm going to feel if my car knows my mood better than I do. I can certainly see the value: the car detects rising anger or frustration and then cues up some relaxing music, maybe changes the interior lighting to a more calming shade. But will the flip side be an AI that constantly monitors me, correlating my emotional state to everything I'm looking at? Then again, that world is almost certainly coming to us via our smartphones anyway.

Is that a robot on a motorbike?

CES wouldn't be CES if you couldn't find a few jewels among the thousands of exhibitors. Two that left an impression on me both involve motorbikes. The first was from Bosch and is meant to address the problem of distracted riding. (Foolishly, I had assumed that this was more of a problem on four wheels, but the urge to check one's smartphone is apparently independent of wheel count.)

Bosch, knowing that people just want to use their smartphones in cars, has been working on a platform called MySpin that lets you do just that by casting apps from an Android or iOS phone to the infotainment screen. It's like Android Auto or CarPlay, only cross-platform, and it's showing up in systems from Jaguar Land Rover, among others.

Now you can get MySpin for 2Wheelers on your bike. The UI was intuitive even to me, and the idea seems like a much better alternative than more injured or dead bikers.

That sentiment was also on display in Yamaha's Motobot, a collaboration with SRI International. The idea is very straightforward, although it's easier said than done: build a bike-riding robot that's as fast around a track as Valentino Rossi. That's seven-time MotoGP World Champion Valentino Rossi.

Well, it wouldn't be CES without at least one robot, right?
Well, it wouldn't be CES without at least one robot, right? Credit: Jonathan Gitlin

Unlike all the other autonomous vehicles I came across, this one has no commercial application. Yamaha isn't planning on selling robots that will ride your bike for you, and it's not even talking about putting self-driving systems on bikes so they can sometimes ride themselves with you on board. No, Yamaha's doing this for the hell of it, although one possible application would be helping the factory test new motorbikes. Repeatable precision and accuracy would be useful in a test program, and robots don't complain that it's cold, too early, or that the tea's run out.

That Motobot can ride a sport bike at all is good enough for me. Unlike human riders, it can't move around on the bike, and weight transfer is such a core component of making those things handle. Right now, Motobot is about 30 seconds slower than Rossi at California's Thunderhill Raceway Park. I have to imagine my lap time deficit would be measured in minutes...

Jonathan M. Gitlin Automotive Editor
Jonathan is the Automotive Editor at Ars Technica. He has a BSc and PhD in Pharmacology. In 2014 he decided to indulge his lifelong passion for the car by leaving the National Human Genome Research Institute and launching Ars Technica's automotive coverage. He lives in Washington, DC.