Here’s What It’s Like to Ride In Uber’s Self-Driving Car
As I climb into the backseat, there’s not much to indicate this Uber is the latest front in the race to kick humans out of the driver’s seat.
Sure, there’s an engineer in the passenger seat, reading streams of data off a laptop, and the car’s roof is wearing a stack of sensors that would make Carmen Miranda jealous. But back here, there’s just a tablet.
The size of an iPad and mounted between the front seats, facing the passengers in back, this screen is the signature element of Uber’s latest disruption: a fleet of autonomous cars in downtown Pittsburgh. As of Wednesday, when pre-selected customers in the City of Bridges order an Uber, they may be greeted by a self-driving Ford Fusion instead of the standard guy or gal making extra money off their Prius. An engineer will ride in the driver’s seat, ready to take over whenever things get tricky—but it’s this tablet that will teach Uber’s riders about this new mode of transportation.
“We’re working towards developing a transparent experience that provides riders with enough information about the trip and the vehicle system to feel safe and confident,” says Emily Bartel, a product manager at Uber’s Advanced Technologies Center, in Pittsburgh.
The touchscreen relays the key details of the ride: Whether the car’s in autonomous or human-operated mode (noted by a small blue box at the top of the tablet). The car’s speed, its steering angle, whether it’s braking, miles driven autonomously, and time left in your trip. The map view shows where you’ve been and where you’re going. There’s even a “request stop” button, for anyone too freaked out to finish the trip.
An FAQ section addresses half a dozen basic concerns: I thought this was supposed to be a Volvo? Those are coming early next year. How fast does the car go? 25 to 30 mph usually (and yes, the car speeds when Uber deems it safer to match the pace of traffic). How long before you finally ditch the bag of flesh in the driver’s seat? The cars will come with handlers for the foreseeable future.
That’s the basic rundown, the stuff you’ll want to know on your first ride. The more compelling element of the system is the live view of the world. The lower two thirds of the screen depict the car’s surroundings as seen by the spinning LIDAR system on the roof. When pedestrians cross the street, you can see them. When the car’s following another vehicle, you see it.
It’s not necessary info, but it’s reassuring. You know the way you can watch a human driver and have a good idea of whether she sees that jaywalker up ahead? The screen lets you do the same thing, letting you confirm the car is aware of the threat you’ve spotted.
The players working on semi-autonomous cars, including Tesla, Volvo, Audi, and Delphi, have addressed a similar problem. Audi’s A7 prototype, which can drive itself on the highway, pulls back the steering wheel to signal when it has the controls. Volvo’s “Concept 26” does the same thing, adding a driver’s seat that reclines nearly flat, with a footrest. The center screen in Delphi’s semi-autonomous concept provides a camera feed of the road ahead, with the car’s path highlighted in blue and orange arrows to indicate turns.
But Uber’s key customer is in the back, not the driver’s seat, a design problem few have addressed. Google hasn’t revealed any plans for getting the public into its cars, but rigged the interior of its autonomous prototype (the one without a steering wheel or pedals) with a wide LCD screen that shows images of pedestrians and other obstacles crossing the car’s path.
This summer, Local Motors launched a self-driving shuttle dubbed Olli, which uses IBM’s Watson supercomputer to have conversations with riders. It fields questions like, “Will this traffic make me late?” and “Why are we stopping?” (Answers, respectively: Yup. Do you want me to cream that old lady?)
It’s all about trust, Carnegie Mellon researcher Raj Rajkumar said of the Olli project. “Passengers need to know that the vehicle is indeed functioning correctly, operating safely.” If Uber’s users are ill at ease in this early wave of autonomy, it’s going to be way harder to convince the masses to switch over to robot chauffeurs down the line.
Of course, if this tablet can’t answer their questions, riders in Pittsburgh can still pepper the engineer up front (though Uber would prefer they not). And if the company’s researchers do their jobs right, those engineers won’t be in the car forever. All they’ll leave behind is the tablet.