posted on February 20, 2013
You sink deep into the comfort of the driver’s seat and pull out your phone to check your e-mail while sipping a macchiato. It’s a pretty lousy day, all things considered, with grey skies and rain beating down on the windows. Fortunately, you’re quite dry inside the car. And even more fortunately, you can take both hands off the wheel.
Thanks to recent advances in self-driving technology, this scenario might not be as dangerous as you’d think.
Following in the footsteps of Audi, Toyota, and Google, a team of 22 researchers at Oxford’s Department of Engineering Science, led by Professor Paul Newman, has developed a new self-driving car called, rather straightforwardly, the RobotCar.
The RobotCar is a specially adapted Nissan Leaf electric car that uses small cameras and 3D laser scanners as its ‘eyes’, building a 3D model of its surroundings. The model is fed into a computer stored in the trunk, which allows the car to ‘remember’ roads and suburbs and drive itself along familiar routes.
If the car detects an obstacle or pedestrian ahead, it brakes automatically. The driver can also tap the brake pedal to regain control from the computer at any time.
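The hand-off described above can be sketched in a few lines of code. This is purely an illustrative model of the behaviour the article describes, not the RobotCar’s actual software; the function name and return values are invented for the example.

```python
# Hypothetical sketch of the control hand-off: the car brakes
# automatically when an obstacle is detected, and a tap on the brake
# pedal returns control to the driver at any time.

def control_step(obstacle_ahead: bool, driver_brake_tapped: bool,
                 autonomous: bool) -> tuple[bool, str]:
    """Return (autonomous_after, action) for one control cycle."""
    if driver_brake_tapped:
        # The driver always wins: a brake tap ends autonomous mode.
        return False, "hand control to driver"
    if autonomous and obstacle_ahead:
        return True, "brake automatically"
    if autonomous:
        return True, "continue driving"
    return False, "driver in control"
```

For instance, `control_step(obstacle_ahead=False, driver_brake_tapped=True, autonomous=True)` ends autonomous mode regardless of what the sensors report.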
Oxford’s RobotCar differs from Google’s car in having fewer sensors and relying more heavily on an on-board 3D street map. Newman’s team believes that GPS doesn’t offer enough accuracy for robots to decide how and when to move safely, and even if it did, it would say nothing about the robot’s immediate surroundings. They therefore used a 3D laser scanning system to map the structure of the environment and locate the car on the road. Two lasers, mounted at the front and rear, scan an area of about a foot around the car 13 times per second. A pair of stereo cameras on the roof detects the car’s position relative to the road.
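To see why a prior map beats raw GPS, here is a toy sketch of map-based localization: the car finds the shift that best aligns its current laser scan with the stored map. This is a deliberately simplified grid search, not Oxford’s actual algorithm; the landmark coordinates and candidate offsets are made up for the example.

```python
import math

def match_scan(map_points, scan_points, candidates):
    """Grid-search candidate (dx, dy) offsets; return the one that
    minimises the mean nearest-neighbour distance from shifted scan
    points to the stored map points."""
    def cost(dx, dy):
        total = 0.0
        for (sx, sy) in scan_points:
            px, py = sx + dx, sy + dy
            # Distance from each shifted scan point to its nearest landmark.
            total += min(math.hypot(px - mx, py - my)
                         for (mx, my) in map_points)
        return total / len(scan_points)
    return min(candidates, key=lambda c: cost(*c))

# A scan taken one metre ahead of the mapped position is best aligned
# by shifting it back by (-1, 0) -- recovering the car's drift.
landmarks = [(0.0, 0.0), (2.0, 1.0), (4.0, 0.5)]
scan = [(x + 1.0, y) for (x, y) in landmarks]   # car drifted +1 m in x
offsets = [(dx * 0.5, 0.0) for dx in range(-4, 5)]
best = match_scan(landmarks, scan, offsets)      # (-1.0, 0.0)
```

The point of the exercise: the match is computed entirely from what the lasers see and what the map already contains, with no satellite fix involved.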
There are a total of three on-board computers, which constantly monitor and adapt to road conditions: an iPad on the dashboard acts as the car’s user interface, a low-level controller manages the car’s electronics, and the Main Vehicle Computer controls everything from the steering to the indicators.
So far, the RobotCar has only driven on the private roads around Begbroke Science Park, but the team is currently in talks with the Department for Transport to see how and when it could be used on public roads. A significant amount of testing is still planned, mainly to help the car understand traffic flows and make the long-term decisions needed to avoid traffic jams.
I can only hope that these problems are solved within my lifetime. I really want that macchiato.