U.S. Startup Aims to Take Autonomous Cars to Next Level

Driverless cars need to be capable not only of maneuvering safely in traffic but also of communicating their movements to other drivers or pedestrians, the U.S. startup says.

Drive.ai, a U.S.-based technology company staffed by experts from Stanford University’s Artificial Intelligence Laboratory, has become the latest entrant in the autonomous vehicle race, with an assist from a prominent ex-Detroit 3 executive.

Drive.ai is launching a fleet of pilot vehicles — made by various carmakers — retrofitted with roof-mounted communications systems capable not only of sensing the surrounding environment but also of communicating to those in the area what the vehicle will do.

The company said Tuesday that its aim is to use artificial intelligence to “create a robust new language of human-robot interactions — essential for making people trust and welcome self-driving vehicles.”

The company said former General Motors senior executive Steve Girsky will serve on its board of directors. Girsky stepped down from the GM board in June this year after serving the carmaker in several capacities, including vice chairman and chairman of its Opel unit.

“We all know that the automotive industry is in the midst of a foundational shift,” Girsky said in a statement. “The emergence of self-driving technology, and deep learning in particular, brings an incredible opportunity to save countless lives, transform the transportation landscape, and shift the way we think about cars and technology. The team at Drive.ai has the vision and expertise to lead this new era.”

In a statement, the privately held company said Girsky brings “deep expertise within corporate management and the automobile industry.”

Carol Reiley, co-founder and president of Drive.ai, said the company was founded in April 2015 by a team of artificial intelligence experts dedicated to taking autonomous vehicles to the next level. A number of Ph.D. candidates, including Reiley herself, have suspended their doctoral work to take part in the project, she said in an interview.

“We’re pushing deep learning more end-to-end than it’s been done before,” Reiley said. “We use it for perception all the way through to decision making — how a car should maneuver and drive” and also employ “artificial intelligence inside and outside the car.”
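
Reiley did not describe the architecture in detail, but an end-to-end approach of this kind is often sketched as a single network that maps raw camera pixels straight to control outputs, with no hand-built pipeline in between. The following is a minimal, hypothetical illustration in PyTorch; the `EndToEndDriver` name, the layer sizes and the two-value control output are assumptions made for the sketch, not Drive.ai's design.

```python
import torch
import torch.nn as nn

class EndToEndDriver(nn.Module):
    """Hypothetical end-to-end network: camera frame in, controls out."""

    def __init__(self):
        super().__init__()
        # Perception: convolutional layers learn visual features from pixels.
        self.perception = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
            nn.Flatten(),
        )
        # Decision making: fully connected layers turn features into commands.
        self.control = nn.Sequential(
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, 2),  # [steering angle, throttle]
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.control(self.perception(frame))

model = EndToEndDriver()
frame = torch.randn(1, 3, 160, 320)  # one dummy RGB frame
steering, throttle = model(frame)[0]
print(f"steering={steering.item():.3f}, throttle={throttle.item():.3f}")
```

In a conventional pipeline, perception, prediction and planning would be separate hand-engineered modules; training one network on recorded driving data lets the intermediate representations be learned instead.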

In traffic situations, there’s often an unwritten code between drivers and pedestrians. Driverless cars need to be capable not only of maneuvering safely in traffic but also of communicating their movements to other drivers or pedestrians, she said.

“When a pedestrian is crossing, there’s this nonverbal communication that takes place. When you remove the driver, how do you understand what the (vehicle’s) intention is? We want to communicate intent through several different indications: lights, emojis, sounds. ... The worst-designed feature of the car is the horn. It’s a monotone.”
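
As a hypothetical illustration of that idea, a vehicle could map each high-level intent to a paired visual and audible cue, so bystanders learn what the car is about to do. Everything in this sketch, including the `Intent` names and the signal choices, is invented for illustration; Drive.ai has not published such an interface.

```python
from enum import Enum, auto

class Intent(Enum):
    """High-level maneuvers a bystander might need to know about."""
    YIELDING_TO_PEDESTRIAN = auto()
    PULLING_OVER = auto()
    PROCEEDING = auto()

# Each intent pairs a visible cue with an audible one, replacing the
# single monotone horn with signals that say what the car will do next.
SIGNALS = {
    Intent.YIELDING_TO_PEDESTRIAN: ("roof display: 'Waiting for you'", "soft chime"),
    Intent.PULLING_OVER: ("amber sweep on the right side", "low tone"),
    Intent.PROCEEDING: ("green bar moving forward", "rising tone"),
}

def announce(intent: Intent) -> None:
    """Emit the visual and audible cues for the current intent."""
    visual, audible = SIGNALS[intent]
    print(f"{intent.name}: show [{visual}], play [{audible}]")

announce(Intent.YIELDING_TO_PEDESTRIAN)
```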

Drive.ai will “build retrofitted kits for business fleets” that may deliver either people or cargo, she said. “We retrofit your vehicles. We are working with partners to develop a route-based approach.”

The company aims to keep costs low and minimize the complexity of the kits, she said.

“We’re using off-the-shelf embedded hardware,” she said. “We’ve been able to develop software to run on low-cost, low-powered chips.”

Reiley declined to name the suppliers of the sensors and other equipment that will be used in the kits, saying Drive.ai would identify its partners and the automakers it is working with later. The company is “agnostic” about the car brands that will be used, she said.

“We will work on select makes and models to start,” she said.

Reiley said Drive.ai is geared to Level 4 and Level 5 automation, as defined by the National Highway Traffic Safety Administration. At Level 4, vehicles are able to take full control of all safety-critical functions in certain traffic situations, while the driver can resume control with a reasonable transition time. Level 5 means full autonomy.

The company is starting its tests near its home base in California.

“Right now we have a test license to operate in California,” she said. Other states, including Nevada, Florida, Louisiana, Michigan, North Dakota, Tennessee and Utah, as well as Washington, D.C., have passed laws allowing autonomous-vehicle research.

Reiley, who has spent the last 15 years of her career researching the use of robots in fields as diverse as surgery and undersea exploration, is a firm believer that computers with deep learning capability can be better drivers than humans.

“I just had my Uber driver texting while he was in traffic. I know nothing about his driving history. Humans are inherently terrible drivers,” she said. “With deep learning, computers can get better and improve through time.”

Deep learning is “the closest algorithm we have to how the human brain works,” she said.

Earlier efforts at computer vision were based on hand-written rules, she said, citing the example of a chair.

“You would have to write a set of rules to identify a chair — four legs, a seat and a back. Those rules start to break when it’s three-legged or a bar stool. But humans learn really quickly” that chairs come in many forms, she said. “Deep learning works in that same way. You present it with a variety of examples and it makes up the rules, so you don’t need to hard-code it.”
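
As a toy contrast, assuming scikit-learn, the sketch below pits a hand-written rule against a model that infers its own rules from labeled examples. A decision tree stands in for a deep network to keep the example tiny; the features, the labels and the `is_chair_by_rules` function are all invented for illustration.

```python
from sklearn.tree import DecisionTreeClassifier

def is_chair_by_rules(legs: int, has_seat: bool, has_back: bool) -> bool:
    """Hand-coded definition of a chair: brittle outside its assumptions."""
    return legs == 4 and has_seat and has_back

# A three-legged stool is still something to sit on, but the rule rejects it.
print(is_chair_by_rules(legs=3, has_seat=True, has_back=False))  # False

# A learned model derives its own decision rules from labeled examples.
# Feature vector: [number of legs, has a seat, has a back].
examples = [
    [4, 1, 1],  # classic four-legged chair
    [3, 1, 0],  # three-legged stool
    [1, 1, 0],  # bar stool on a pedestal
    [4, 1, 0],  # backless bench
    [4, 0, 0],  # table: four legs but nothing to sit on
    [0, 0, 0],  # not furniture at all
]
labels = [1, 1, 1, 1, 0, 0]  # 1 = chair-like, 0 = not

model = DecisionTreeClassifier().fit(examples, labels)
print(model.predict([[3, 1, 0]]))  # [1] -- the stool is now recognized
```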

Deep learning means “a new language between people and cars, and how do we shape that,” she said.

Sameep Tandon, CEO and co-founder of Drive.ai, summed it up in a statement: “Deep learning is the most sophisticated form of artificial intelligence, and the best one capable of responding intelligently to the infinite situations cars face on the roads.”

Source: Automotive News Europe

You can reach Bradford Wernle at bwernle@crain.com.