created by researchers at technical university munich, the robot TUM-rosie prepares a traditional bavarian breakfast

led by dr. michael beetz and dr. bernd radig, a team of researchers in the IAS (intelligent autonomous systems) group at germany’s technical university munich demonstrates advances in the field of robotics with two robots that go shopping for food and then prepare and serve a traditional bavarian breakfast. the event was organized by the german research cluster CoTeSys (‘cognition for technical systems’).

the two robots are the US-designed TUM-james and german-engineered TUM-rosie. in the two-part demonstration (a follow-up to last year’s event, during which the pair prepared pancakes), TUM-james first simulates shopping, examining a shelf of products, making a selection, and then carrying and organizing the groceries at ‘home’. then TUM-rosie collects and boils sausages, skims them out, and transfers them to a serving bowl, while TUM-james uses an electric bread slicer to cut a french baguette.

video, sped up 15x, of the robots preparing breakfast and grocery-shopping (full-length, annotated videos are embedded below)

much of the robots’ activity is learning-based. their vision relies on output from kinect sensors, and surface-based matching differentiates objects (for example, which colour bowl is the destination for a particular task) by ‘learning’ the 3D models of these objects, so that they can be pinpointed even when their location changes. likewise, in the shopping task, TUM-james recognizes objects on the grocery shelf using 3D perception algorithms from the open-source ‘point cloud library’ (PCL) combined with the ‘objects of daily use finder’.
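the idea of recognizing a learned 3D model regardless of where it sits can be sketched very simply: describe each stored point cloud in a translation-invariant way, then match an observed cluster to the closest stored model. the snippet below is a toy illustration of that principle only, not the actual PCL / ‘objects of daily use finder’ pipeline, and the ‘plate’ and ‘pot’ models are made-up stand-ins:

```python
import numpy as np

def shape_descriptor(points, bins=8):
    """translation- and rotation-invariant descriptor: a normalized
    histogram of point distances from the cloud's own centroid."""
    centroid = points.mean(axis=0)
    d = np.linalg.norm(points - centroid, axis=1)
    hist, _ = np.histogram(d, bins=bins, range=(0, d.max() + 1e-9), density=True)
    return hist

def recognize(observed, models):
    """match an observed point cluster against stored 3D models by
    descriptor distance; returns the label of the best match."""
    obs = shape_descriptor(observed)
    return min(models, key=lambda name: np.linalg.norm(obs - shape_descriptor(models[name])))

# hypothetical stored models: a flat 'plate' and a tall 'pot'
rng = np.random.default_rng(0)
plate = rng.normal(scale=[1.0, 1.0, 0.05], size=(500, 3))
pot   = rng.normal(scale=[0.3, 0.3, 1.0], size=(500, 3))
models = {"plate": plate, "pot": pot}

# a shifted, slightly noisy view of the plate should still match 'plate',
# because the descriptor ignores where the object is in the scene
observed = plate + np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.02, size=plate.shape)
print(recognize(observed, models))
```

real systems use far richer surface features and full 6D pose estimation, but the invariance-to-location property works the same way.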

both robots evaluate their grasp of objects using torque and position sensors. any failures in task execution are handled using learned knowledge about the process; for example, if TUM-rosie fails to get the sausage out of the pot, it immediately reattempts the action.
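that monitor-and-retry pattern (execute, check the grasp through force feedback, reattempt on failure) can be sketched as below. the sensor trace, threshold, and function names here are all hypothetical, chosen only to illustrate the control loop the article describes:

```python
def attempt_skim():
    """stand-in for the arm motion that lifts a sausage out of the pot."""
    pass

# hypothetical torque trace: the first two grasps slip, the third holds
_torque_readings = iter([0.0, 0.05, 0.8])

def read_gripper_torque():
    """stand-in for the gripper's torque sensor."""
    return next(_torque_readings)

def skim_sausage(max_attempts=5, torque_threshold=0.2):
    """run the action, verify the grasp via torque feedback, and
    immediately reattempt when the feedback indicates failure."""
    for attempt in range(1, max_attempts + 1):
        attempt_skim()
        if read_gripper_torque() > torque_threshold:
            return attempt  # grasp verified: the sausage is held
    raise RuntimeError("grasp kept failing after repeated attempts")

attempts_needed = skim_sausage()
print(attempts_needed)  # → 3
```

the key point is that success is judged from sensor readings rather than assumed, which is what lets the robot notice a dropped sausage at all.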

after ‘shopping’, TUM-james determines where in the kitchen the object should be placed using the IAS ‘knowrob’ (‘knowledge processing for robots’) system, which provides a way for the robot to calculate where the most semantically similar object already in the kitchen is located. this method reduces the amount of information that must be encoded about each object; instead, the robot is able to use information about things it has already seen or what is already available in the environment, and make ‘logical’ decisions when handling new materials.
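one simple way to compute ‘most semantically similar’ is to walk a taxonomy and score pairs by how deep their lowest common ancestor sits (a wu-palmer-style measure). the toy ontology and item names below are invented for illustration and are not knowrob’s actual knowledge base or API:

```python
# hypothetical ontology: each item maps to its parent class
PARENT = {
    "corn-flakes": "breakfast-cereal",
    "muesli": "breakfast-cereal",
    "breakfast-cereal": "dry-food",
    "pasta": "dry-food",
    "dry-food": "food",
    "milk": "dairy",
    "dairy": "food",
    "food": "thing",
}

def ancestors(item):
    """chain of classes from the item up to the ontology root."""
    chain = [item]
    while chain[-1] in PARENT:
        chain.append(PARENT[chain[-1]])
    return chain

def similarity(a, b):
    """wu-palmer-style score: the deeper the lowest common
    ancestor of a and b, the more similar they are."""
    anc_a, anc_b = ancestors(a), ancestors(b)
    lcs = next(x for x in anc_a if x in anc_b)  # lowest common subsumer
    return 2 * len(ancestors(lcs)) / (len(anc_a) + len(anc_b))

def place_like(new_item, kitchen_contents):
    """pick the already-stored item whose location the new object
    should share, by semantic similarity."""
    return max(kitchen_contents, key=lambda item: similarity(new_item, item))

best = place_like("corn-flakes", ["milk", "pasta", "muesli"])
print(best)  # → muesli
```

a new box of corn flakes ends up next to the muesli rather than the milk, without anyone having encoded a storage location for corn flakes specifically, which is exactly the economy the article points out.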

breakfast making robots at TUM
TUM-james uses an electric slicer to cut bread for breakfast

TUM-james goes grocery-shopping, using object recognition databases to select what is needed

the inset view shows the robot’s vision as it adjusts its arm to grasp the shopping basket and fold down the handle

diagram depicting the functioning of the ‘objects of daily use finder’ object recognition software

full-length, annotated video of the robots preparing breakfast

full-length, annotated video of TUM-james ‘shopping’ for groceries

via IEEE spectrum