Singapore researchers used off-the-shelf industrial robot products for this project.
Robot programming is getting easier and easier, with lead-through path teaching, user-friendly plugins for grippers and other peripherals, and even programming training courses from the major manufacturers. But if you still can’t figure out how to write a motion program for your assembly task, you may not have to worry for long.
According to a recent release from Science Robotics, researchers at Nanyang Technological University in Singapore have shown how commercial off-the-shelf robotic hardware, including Denso robots and Robotiq grippers, can assemble an IKEA chair outside of factory settings – a demonstration of independent movement that has thus far been restricted to elementary tasks. Their findings highlight the ability of manufacturing robots to perform tasks that require human-like dexterity, suggesting they may soon be ready for use in a wider range of applications beyond the factory assembly line.
Though often second nature to humans, dexterity involves the mastery of various skills, including hand-eye coordination, the detection of forces, and fine control of multiple simultaneous movements. Here, researcher Francisco Suárez-Ruiz and colleagues presented randomly scattered chair parts to a setup of industrial robot arms equipped with parallel grippers, six-axis force-torque sensors at the wrists, and 3-D cameras.
Using visual and tactile cues, the robots successfully assembled the chair in around 20 minutes. Three major operations in particular allowed for completion of the task: the robots quickly and reliably identified the correct parts in a randomly cluttered environment; coordinated fast, collision-free motions to construct the chair; and detected force changes as they gripped the chair pieces (to verify that pins slid into the correct holes, for example). While substantial coding was used to program the robots’ movements, the authors hope that combining the capabilities of these robots with advanced AI could lead to fully autonomous function.
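To make that last point concrete, here is a minimal sketch of what a force-based check for pin seating might look like. The sensor API and threshold values are invented for illustration; they are not from the paper or from any particular controller.

```python
# Hypothetical sketch: verifying that a pin has seated by watching the
# wrist force-torque sensor. The sensor API and thresholds are invented
# for illustration and do not come from the paper.

SEATED_FORCE_N = 15.0   # axial force expected when the pin bottoms out
MAX_LATERAL_N = 5.0     # lateral force above this suggests a jam or miss

def pin_is_seated(ft_sensor) -> bool:
    fx, fy, fz, tx, ty, tz = ft_sensor.read()  # forces (N), torques (Nm)
    lateral = (fx ** 2 + fy ** 2) ** 0.5
    # A clean insertion shows a sharp rise in axial force (the pin
    # bottoming out) without a large lateral component, which would
    # instead indicate the pin pressing against the edge of the hole.
    return abs(fz) >= SEATED_FORCE_N and lateral <= MAX_LATERAL_N
```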
The researchers published the following video of the assembly task.
Note how the robots use their force sensors to insert the wooden dowels with a circular motion. It’s also interesting that the parts are all fixtured in place, from the dowels to the assembled side of the chair on the work surface (see the white material between the chair and the tabletop). Furthermore, at around 0:25 in the video, the footage skips forward with two pieces placed upright in the work envelope, presumably by a person.
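That circular motion is a common compliant-insertion strategy, often called a spiral search: the robot presses lightly along the insertion axis while sweeping an expanding circle until the dowel drops into the hole. Here is a rough sketch of the idea, assuming an invented robot API (`move_to`, `depth`) rather than the actual controller used in the study:

```python
import math

# Hypothetical sketch of a spiral ("circular") search for dowel insertion.
# robot.move_to() and robot.depth() are invented stand-ins for whatever
# motion and sensing API a real controller exposes.

def spiral_search_insert(robot, center_xy, pitch_mm=0.5, max_radius_mm=5.0,
                         contact_force_n=3.0, drop_depth_mm=2.0):
    """Sweep an expanding spiral under light axial pressure until the
    dowel drops into the hole, detected as a sudden depth increase."""
    theta = 0.0
    start_depth = robot.depth()
    while True:
        radius = pitch_mm * theta / (2 * math.pi)  # Archimedean spiral
        if radius > max_radius_mm:
            return False  # hole not found within the search window
        x = center_xy[0] + radius * math.cos(theta)
        y = center_xy[1] + radius * math.sin(theta)
        robot.move_to(x, y, push_force_n=contact_force_n)
        # The dowel falling into the hole shows up as a depth jump
        # while the axial force stays low.
        if robot.depth() - start_depth > drop_depth_mm:
            return True
        theta += 0.1  # radians per step; step size is a tuning knob
```

The fixturing matters here: a spiral search only has to cover the small positional uncertainty left after vision, which is why holding the parts rigidly in place makes the insertion so much more reliable.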
While this project certainly shows the potential of robot technology to perform complex tasks, the experiment doesn’t go very far beyond the current capabilities of robots on the market today. In fact, assembly tasks like this one could be done much faster and more efficiently without sensors or vision, fully programmed. So, what’s the application for independent movement in the robotics industry?
Potential applications for this technology include high-mix, low-volume assembly, packaging, machine tending, or palletizing tasks. Hypothetically, with a fully autonomous, independent robot, you would be able to plug the robot into a machine tending task at a new machine, with new parts, and have it figure out how to perform the task on its own. Combine this capability with concepts of generative design and path optimization, and you could return the next day to find the robot performing a completely new motion, tending the machine more efficiently than before.
The team also published a “blooper reel” of mistakes and crashes that the robots made. Check it out below:
Note that at around 0:15 in the video, the robot seems to execute its peg-placing program properly, but it incorrectly ‘thinks’ that the space it ‘feels’ at the edge of the leg is the hole. Programmers, how would you address this problem in the motion programming? Let us know in the comments below.
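To get the discussion started: one naive safeguard would be to sanity-check where the robot “feels” a hole against where the part model says the holes should be, and reject detections that fall outside a small tolerance. A minimal sketch, with invented helper names (`robot.tool_xy()`, `EXPECTED_HOLES`) standing in for a real setup:

```python
import math

# Hypothetical sketch: sanity-checking a force-based hole detection
# against the expected hole positions from the part model. All names
# and coordinates here are invented for illustration.

HOLE_TOLERANCE_MM = 2.0
EXPECTED_HOLES = [(120.0, 45.0), (120.0, 85.0)]  # from CAD / part drawing

def detection_is_plausible(robot) -> bool:
    x, y = robot.tool_xy()  # where the robot currently "feels" a hole
    return any(
        math.hypot(x - hx, y - hy) <= HOLE_TOLERANCE_MM
        for hx, hy in EXPECTED_HOLES
    )
```

A drop at the edge of the leg would then be rejected, because it falls outside every expected hole location, and the robot could resume its search instead of committing to a bad insertion.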