Boston Dynamics’ Atlas Tackles Parkour and the Tesla Android

This Week in Engineering explores the latest in engineering from academia, government and industry.


Episode Summary:

Advanced robot maker Boston Dynamics is famous for four- and two-legged robots that mimic the movements of dogs and people, respectively. The bipedal humanoid robot Atlas is a YouTube sensation, capable of very sophisticated motion that mimics advanced human movements such as dancing. In its latest development, the company puts Atlas through a parkour course, demonstrating how its sensor suite and algorithms adapt to the local environment.

Elon Musk’s much-anticipated AI Day was held August 19, and two key announcements were noteworthy. One is the development of a scalable, modular processor array for the company’s Dojo supercomputer program, but the other is perhaps more significant: a humanlike robot program. Musk announced that the firm is hiring talent for the robotics program immediately and expects to have a prototype of a human-sized, bipedal and useful machine ready in 2022. It’s a very ambitious timeline, and much depends on the success of Dojo.

Access all episodes of This Week in Engineering on engineering.com TV along with all of our other series.

Transcript of this week’s show:

To see any graphs, charts, graphics, images, and/or videos to which the transcript may be referring, watch the above video.

Segment 1: Boston Dynamics’ Atlas, a bipedal robot, has become a YouTube favourite for its advanced capability and almost human range of motion. The company has released a new video of the latest iteration of Atlas running an improvised parkour course, demonstrating an even more lifelike performance. How the team achieves this has been little discussed, but the company has recently opened up about some of the computational strategies involved.

Boston Dynamics engineer Pat Marion describes the problems of effective bipedal robot motion as “rapid behaviour creation, dynamic locomotion and connections between perception and control”. Perception algorithms convert data generated by cameras and lidar into information usable by higher-level decision-making code. Atlas uses inertial platforms along with joint position and force sensors for balance, but the machine must still see and perceive its surroundings to be useful. According to Marion, Atlas uses time-of-flight sensing to generate point clouds of the robot’s immediate environment at 15 frames per second, then processes this point cloud data to create virtual surfaces. The surfaces are then fed into a mapping system that builds a virtual model of the ground environment and objects within the camera’s field of view. The system plans a route forward, then assesses the local environment, modifying the motion control instruction set in real time.
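The pipeline Marion describes (points to surfaces to map to plan) can be sketched in miniature. This toy example is not Boston Dynamics’ code; all function names, grid sizes and thresholds are illustrative assumptions. It bins a point cloud into a height map and accepts a straight-line path only while the step height stays within the robot’s limit:

```python
def heightmap(points, cell=0.5, size=8):
    # Bin 3-D sensor points (x, y, z) into a grid of maximum heights,
    # a crude stand-in for the "virtual surfaces" stage described above.
    grid = [[0.0] * size for _ in range(size)]
    for x, y, z in points:
        i, j = int(x // cell), int(y // cell)
        if 0 <= i < size and 0 <= j < size:
            grid[i][j] = max(grid[i][j], z)
    return grid

def plan_row(grid, row, max_step=0.4):
    # Walk straight along one grid row, accepting each cell only while the
    # height change stays within the assumed step limit; a real planner
    # would replan around the obstacle instead of simply stopping.
    path = [0]
    for j in range(1, len(grid[row])):
        if abs(grid[row][j] - grid[row][j - 1]) <= max_step:
            path.append(j)
        else:
            break
    return path
```

On a flat map with a 1 m block dropped a few cells ahead, `plan_row` stops at the cell before the block, which is where a real controller would replan or, in Atlas’s case, choose a parkour-style move instead.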

While a future, perfected system would work anywhere, anytime, the current Boston Dynamics technology uses predefined maps of the general area of operation, then lets the machine fill in close-range details. The strategy is similar to that employed by Waymo for self-driving auto technology. Is this the way forward for bipedal multipurpose robotics? Time will tell, but it looks like a practical compromise that will get useful machines into the field quickly.

Segment 2: In what may be the least surprising news announcement of the year, Tesla CEO Elon Musk has announced that the firm will develop its own bipedal, android-like general-purpose robot. Announced at Tesla’s artificial intelligence day event on August 19, the Tesla ‘bot will stand 5’8” tall, weigh 125 pounds and use 40 actuators, 12 of them in the hands alone. The unit will be built from unspecified lightweight materials, very likely carbon fiber, and will use end effectors shaped like human hands.

The considerable computational power needed to develop a robot of this complexity will require a serious supercomputer, which Tesla is building with the Dojo program. Dojo will use Tesla’s in-house D1 chip, which is built on 7 nm technology and can operate at 362 teraflops (TFLOPS), certainly in the league of supercomputer speeds. Multiple chips are integrated into modules that Tesla calls Dojo training tiles.

The company reports that each tile is capable of nine petaflops (PFLOPS), roughly 25 times the capability of a single chip. By scaling the tiles in a single assembly, Tesla predicts that early 12-tile configurations will handle more than 100 PFLOPS, with room for further growth. The first use of the technology will be to develop Tesla’s self-driving program for vehicles, but it will certainly be needed to achieve the much more difficult task of operating a true human-scale bipedal robot.
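The quoted figures can be cross-checked with simple arithmetic. Assuming 25 D1 chips per training tile, a count implied by the per-tile rating rather than stated here, the tile and 12-tile system throughputs work out as follows:

```python
CHIP_TFLOPS = 362        # D1 chip rating quoted above
CHIPS_PER_TILE = 25      # assumption: 9,000 TFLOPS / 362 TFLOPS is roughly 25
TILES = 12               # early configuration mentioned above

tile_pflops = CHIP_TFLOPS * CHIPS_PER_TILE / 1000   # about 9.05 PFLOPS per tile
system_pflops = tile_pflops * TILES                 # about 108.6 PFLOPS total
```

The 12-tile total lands just over the 100 PFLOPS figure, which is consistent with the reported per-tile number.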

Musk announced that Tesla will begin recruiting talent for the robot project immediately, and predicts the first prototype will be ready in 2022. That is a very ambitious timeline, and much depends on the success of the Dojo project and the ability to find the necessary engineering talent in a highly competitive employment market. If you’re an AI or robotics engineer, this industry is your iron rice bowl. We’ll report back as Tesla reveals more about the project.

Written by

James Anderton

Jim Anderton is the Director of Content for ENGINEERING.com. Mr. Anderton was formerly editor of Canadian Metalworking Magazine and has contributed to a wide range of print and on-line publications, including Design Engineering, Canadian Plastics, Service Station and Garage Management, Autovision, and the National Post. He also brings prior industry experience in quality and part design for a Tier One automotive supplier.