MIT engineers have developed a method that enables robots to make intuitive, task-relevant decisions about which parts of their surroundings matter for the tasks they need to complete.
The Clio method enables a robot to identify the parts of a scene that matter, given the immediate tasks at hand. The robot takes in a list of tasks described in natural language and, based on that information, determines the level of granularity needed to “interpret” its surroundings and “remember” only the parts that are relevant.
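To make the idea concrete, here is a minimal, hypothetical sketch of task-driven filtering, not the MIT team's implementation: scene segments and natural-language tasks are mapped into a shared embedding space, and only segments that score above a similarity threshold against some task are kept in the map. The embed() function, the segment names and the threshold are all placeholders; a real system would use a vision-language encoder such as CLIP.

```python
# Minimal sketch (not Clio itself): keep only the parts of a scene map that are
# relevant to a list of natural-language tasks. embed() is a hashed bag-of-words
# placeholder standing in for a learned vision-language encoder.

import numpy as np


def embed(text: str, dim: int = 256) -> np.ndarray:
    """Hypothetical encoder returning a unit-norm vector for a piece of text."""
    v = np.zeros(dim)
    for token in text.lower().split():
        v[hash(token) % dim] += 1.0  # hashing-trick bag of words, placeholder only
    return v / (np.linalg.norm(v) or 1.0)


def relevant_segments(segments: dict[str, np.ndarray],
                      tasks: list[str],
                      threshold: float = 0.5) -> dict[str, str]:
    """Return the scene segments whose embedding is close to at least one task."""
    task_vecs = {t: embed(t) for t in tasks}
    kept = {}
    for name, seg_vec in segments.items():
        # Cosine similarity against every task; remember the best-matching task.
        best_task, best_sim = max(
            ((t, float(seg_vec @ v)) for t, v in task_vecs.items()),
            key=lambda pair: pair[1],
        )
        if best_sim >= threshold:
            kept[name] = best_task
    return kept


# Illustrative usage with made-up scene segments:
scene = {name: embed(name) for name in
         ["rack of magazines", "first aid kit", "pile of office supplies", "dog toy"]}
tasks = ["move rack of magazines", "get first aid kit"]
print(relevant_segments(scene, tasks))
# -> keeps the magazine rack and first aid kit, drops the unrelated clutter
```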
The researchers conducted a number of experiments in settings ranging from a cluttered cubicle to a five-story building on MIT’s campus.
They used Clio to automatically segment a scene at different levels of granularity, based on a set of tasks specified in natural-language prompts such as “move rack of magazines” and “get first aid kit.”
In addition, they ran Clio on a Boston Dynamics quadruped robot which, as it explored the building, identified and mapped only those parts of the scene that directly related to its tasks, which included retrieving a dog toy while ignoring piles of office supplies in its way.
On the potential applications of a project like this, Luca Carlone, associate professor in MIT’s Department of Aeronautics and Astronautics (AeroAstro), principal investigator in the Laboratory for Information and Decision Systems (LIDS), and director of the MIT SPARK Laboratory, said: “Search and rescue is the motivating application for this work, but Clio can also power domestic robots and robots working on a factory floor alongside humans.”