A CMU graduate student researcher developed an algorithm that teaches robots to perceive transparent liquids.
Researchers from Carnegie Mellon University (CMU) have developed a novel way of teaching a robot to pour water into a glass. According to Gautham Narasimhan, a former graduate student at the university who worked on the project, the system used artificial intelligence and image translation to improve robots' pouring techniques. This could eventually see robots functioning as servers, gardeners or even pharmacists, helping measure and mix medicines. The research was performed at CMU's Robots Perceiving and Doing Lab.
David Held, an assistant professor in the Robotics Institute who advised Narasimhan, noted that labeling data can be tedious and time-consuming. When it comes to water, this could potentially entail labeling individual water droplets in an image. “You need some way of telling the algorithm what the right and wrong answers are during the training phase of learning,” Held said.
To train the AI, the team used contrastive learning, a technique that improves vision models by teaching them which examples belong together and which do not. Using image translation algorithms, the team presented various images and asked the AI to convert images from one style to another. This enabled them to teach the AI to distinguish between particular objects.
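To illustrate the general idea of contrastive learning, here is a minimal sketch of an InfoNCE-style loss in NumPy. This is a generic formulation, not the team's actual training objective: matched pairs (two views of the same scene) should score higher than mismatched pairs drawn from the rest of the batch.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Generic contrastive (InfoNCE) loss: pull each anchor toward its
    matching positive and push it away from the other batch samples."""
    # L2-normalize embeddings so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    # Similarity of every anchor to every candidate positive
    logits = a @ p.T / temperature
    # Row-wise log-softmax; the matching pair sits on the diagonal
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(a))
    return -log_probs[idx, idx].mean()

# Toy embeddings: positives are slightly perturbed copies of the anchors
rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 8))
positives = anchors + 0.05 * rng.normal(size=(4, 8))
matched = info_nce_loss(anchors, positives)
mismatched = info_nce_loss(anchors, positives[::-1].copy())
print(matched < mismatched)
```

When the pairing is correct the loss is low; shuffling the positives raises it, which is the signal a contrastive objective trains on.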
Transparent liquids are particularly tricky for robots to see, since whatever is in the background influences how the liquid reflects, refracts and absorbs light.
In one example, the team asked the AI to translate an image of a horse into an image of a zebra. As Held explained, the same can be done with an image of colored liquid, translating it into an image of transparent liquid. Through this method, the robot can better understand and identify transparent liquids.
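The appeal of colored liquid is that it can be labeled almost for free: a dyed liquid stands out from the background, so a simple color rule can produce a pixel mask, and that mask can then be paired with a translated "transparent" version of the same image as training data. The sketch below is a hypothetical illustration of that first step, assuming red dye and a crude threshold; the team's actual pipeline is more sophisticated.

```python
import numpy as np

def colored_liquid_mask(rgb, red_margin=40):
    """Segment dyed (red) liquid with a simple color threshold.
    A pixel counts as liquid when its red channel dominates both the
    green and blue channels by more than `red_margin` (hypothetical rule)."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r - np.maximum(g, b)) > red_margin

# Toy 2x2 image: left column is red "liquid", right column is background
img = np.array([[[200, 30, 30], [120, 120, 120]],
                [[210, 40, 35], [10,  10,  10]]], dtype=np.uint8)
print(colored_liquid_mask(img))
```

Pairing such automatically generated masks with translated transparent-liquid images sidesteps the droplet-by-droplet hand labeling Held described.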
To achieve this, the team played videos behind a transparent glass full of water. According to Narasimhan, this helped train the system so the robot could pour water against different backgrounds. The results showed that the robot could pour water to a target amount in glasses of different shapes and sizes. The team hopes to expand the study by examining the effect of different lighting conditions. They also hope to explore more advanced capabilities, such as having the robot pour water from one container to another and estimate the volume.
For Narasimhan and his team, the study offered an opportunity to present their findings to robotics experts at the recent IEEE International Conference on Robotics and Automation. Projects like these are a great stepping stone for students to work on and showcase solutions to real engineering problems. Besides encouraging innovation, they expose students to effective design and collaborative, interdisciplinary thinking.
“People in robotics really appreciate it when research works in the real world and not just in simulation,” said Narasimhan.
Similarly, such work can open up career opportunities for students in their desired field. Narasimhan currently works as a computer vision engineer at Path Robotics, a company that designs automated robots for welding in manufacturing.