Robot hand holds cup

Robots with Common Sense


Author: Jörg Heeren

Professor Philipp Cimiano from Bielefeld University and Professor Michael Beetz from the University of Bremen have established a long-term collaboration with the goal of making knowledge from many different sources accessible in a machine-readable format. This information can help robots to gain the necessary knowledge for performing everyday tasks.

When tackling the tasks of everyday life, we humans draw on a large base of general knowledge without realizing it. For example, if we want to slice fruit, we know which knife to use and how much pressure to apply. If we encounter difficulties, we can consult guidebooks or watch explanatory videos on YouTube. Then we combine this new information with our manipulation skills and common sense to complete the task.

As part of their research collaboration, Professor Michael Beetz from the University of Bremen and Professor Philipp Cimiano from Bielefeld University are examining whether this human approach to tackling new tasks can be transferred to robots. Cimiano leads the Semantic Databases research group in Bielefeld and is the coordinator as well as a member of the executive board of the Center for Cognitive Interaction Technology (CITEC); Beetz leads the Institute for Artificial Intelligence (IAI) and the Collaborative Research Center EASE (Everyday Activity Science and Engineering) in Bremen.

How Robots Can Close Gaps in Their Knowledge

The detailed knowledge robots need for a seemingly simple task like slicing fruit is quite extensive unless the task can be completed under standardized conditions every time. If a futuristic household robot were standing in a kitchen and received the order to slice a few apples, it would first need to select and recognize the appropriate objects (apples, knife, cutting board), then correctly grab and position them, choose a strategy for slicing, and finally divide the apple into appropriately sized pieces – and each of these steps would need to be broken down into a large number of additional smaller tasks.
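
A hierarchical decomposition of this kind can be illustrated with a small sketch. The following Python example is purely hypothetical – the task names and structure are invented for illustration and are not taken from the researchers' systems – and simply shows how a vague order like ‘slice a few apples’ expands into nested sub-tasks that a control program could step through.

```python
# Hypothetical sketch: expanding a vague kitchen order into nested sub-tasks.
# Task names and structure are illustrative only, not the researchers' actual system.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Task:
    """A task that may be broken down into smaller sub-tasks."""
    name: str
    subtasks: List["Task"] = field(default_factory=list)

    def flatten(self, depth: int = 0) -> List[str]:
        """Return an indented list of all tasks, depth-first."""
        lines = ["  " * depth + self.name]
        for sub in self.subtasks:
            lines.extend(sub.flatten(depth + 1))
        return lines


# "Slice a few apples" decomposed into the steps mentioned in the text.
slice_apples = Task("slice a few apples", [
    Task("select and recognize objects", [
        Task("find apples"), Task("find knife"), Task("find cutting board"),
    ]),
    Task("grab and position objects", [
        Task("place cutting board on counter"),
        Task("place apple on cutting board"),
        Task("grasp knife by the handle"),
    ]),
    Task("choose slicing strategy"),
    Task("divide apple into appropriately sized pieces"),
])

print("\n".join(slice_apples.flatten()))
```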

Portrait photo of Professor Dr Philipp Cimiano.
‘One challenge is to teach the robots to read and interpret internet sources, draw appropriate conclusions, and implement these in their planning processes to take the best action in a given situation’, says Professor Dr Philipp Cimiano.

The future robot will perhaps not immediately know what the order ‘slice a few apples’ means and what it needs to do, but it will be able to quickly close this gap in knowledge. Professors Beetz and Cimiano are laying an important foundation for this in their project, which serves as the basis for a larger collaboration. Simply put, they want to enable robots to independently acquire necessary knowledge from a wide variety of sources, such as cookbooks or YouTube videos, and to translate this into detailed, machine-understandable task directives. Because using a guidebook such as a cookbook requires a lot of basic knowledge, the robots must also be able to acquire common sense.

To make this possible, three essential elements need to operate together: first, ontologies (machine-readable dictionaries that capture, for example, how an apple can be recognized and which concrete attributes it has); second, knowledge graphs, which store relational information about objects (e.g. the apples are in the bowl); and third, the robots’ control programs.
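
As a rough illustration only – not the project's actual data model – the sketch below shows how the first two elements might look in Python: a toy ontology entry describing what an apple is, and a toy knowledge graph of subject–predicate–object triples recording where particular objects currently are, which a control program could query before acting. All concept names, attributes, and triples here are invented for the example.

```python
# Rough illustration: a toy ontology entry and a toy knowledge graph.
# All names and facts are invented for this sketch.

# Ontology: machine-readable description of the concept "Apple".
ontology = {
    "Apple": {
        "is_a": "Fruit",
        "typical_color": ["red", "green", "yellow"],
        "typical_shape": "round",
        "can_be": ["sliced", "peeled", "eaten"],
    }
}

# Knowledge graph: relational facts about concrete objects,
# stored as (subject, predicate, object) triples.
knowledge_graph = {
    ("apple_1", "instance_of", "Apple"),
    ("apple_2", "instance_of", "Apple"),
    ("apple_1", "located_in", "bowl_1"),
    ("apple_2", "located_in", "bowl_1"),
    ("knife_1", "located_in", "drawer_2"),
}


def query(graph, subject=None, predicate=None, obj=None):
    """Return all triples matching the given (possibly partial) pattern."""
    return [
        (s, p, o) for (s, p, o) in graph
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]


# A control program could ask: which objects are apples, and where are they?
apples = [s for s, _, _ in query(knowledge_graph, predicate="instance_of", obj="Apple")]
for a in apples:
    print(a, "->", query(knowledge_graph, subject=a, predicate="located_in"))
```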

Cooperation between Researchers from Bremen, Bielefeld, and Paderborn

The cooperation between the universities in Bremen and Bielefeld combines the extensive expertise necessary for a smooth interaction of the three elements. Professor Cimiano and CITEC provide important experience in researching AI systems that interact with and learn from different environments – for example, from text-based sources. ‘The internet provides a constantly growing number of sources and resources for common sense knowledge,’ Cimiano explains. ‘One challenge is to teach robots how to read and interpret these sources, draw appropriate conclusions, and implement these in their planning processes to take the best action in a given situation.’

The researchers in Professor Beetz’s team have been working for many years on integrating diverse AI and robotics components into holistic systems. One aspect of this research is the machine evaluation of videos, both to glean theoretical knowledge and to derive movement patterns.

Michael Beetz interacts with a robot
Professor Dr Michael Beetz leads the Institute for Artificial Intelligence (IAI) and the Collaborative Research Center EASE (Everyday Activity Science and Engineering).

A crucial component of the collaboration is enabling other researchers to use and extend these infrastructures and results – an approach known as ‘open research’ or ‘open science.’ To this end, Bremen and Bielefeld have founded the Joint Research Center on Cooperative and Cognition-enabled AI (CoAI JRC) together with the University of Paderborn. At the heart of this research center is the Virtual Research and Training Building. In addition to researchers from the three participating universities, who install their virtual laboratories there, the worldwide AI community can participate as well, both by using the existing infrastructure and by providing their own resources.

A Virtual Research Building for Users Worldwide

The virtual building is still under construction, but it already contains several labs in which researchers can operate robots under real conditions, as if they were on site. These include an almost completely furnished apartment with a kitchen, as well as a retail store with shelves and products. Different types of robots are available for the experiments. Researchers across the globe can thus use the on-site labs at the Institute for Artificial Intelligence as realistic semantic twins without having to travel to Bremen.

Additionally, the virtual building provides numerous teaching and learning materials for the field of artificial intelligence and robotics. ‘The components are to serve as the nucleus of an ecosystem, which will continuously grow over time and serve as a model throughout the EU,’ Professor Beetz explains.

The professors already have a further goal in mind with their partners in Paderborn: they want to enable robots to learn interactively with humans and accomplish tasks together with them. ‘We want them not just to perform a task, but to do it in a way that satisfactorily supports their partners,’ Beetz states. ‘Doing so is predicated on acquiring a common understanding of the activity and being able to interact and dynamically adapt one’s actions.’ At the end of this development would be robots that are able not just to follow a set of instructions, but also to learn and cooperate constructively with humans.