
How robots adapt in communication


Author: Julia Bömer

In a new study by the Universities of Bielefeld and Bremen, scientists are testing how robots can adapt their communication so that humans can better follow their instructions. The aim is to improve human-machine interaction. Several researchers from Transregio 318 and the University of Bremen are involved in the study.

In the study, participants' brain waves were recorded using electroencephalography (EEG) while they interacted with a robot via a screen. For the experiment, laboratories in Bielefeld and Bremen were linked together using the LabLinking method developed in Bremen. The study was a collaboration between the research groups of Professor Dr. Britta Wrede of the Medical School OWL at Bielefeld University and Professor Dr. Tanja Schultz of the Faculty of Computer Science and Mathematics at the University of Bremen. Wrede's group specializes in medical assistance systems; Schultz's group focuses on cognitive systems.
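The study's core question, whether distraction can be read out of EEG activity, can be illustrated with a minimal sketch. This is not the researchers' actual analysis pipeline; it assumes a simple spectral feature (alpha-band power, which is often elevated during inattention and mind wandering) computed from synthetic signals with NumPy:

```python
import numpy as np

def alpha_band_power(signal, fs, band=(8.0, 12.0)):
    """Mean spectral power in the alpha band (8-12 Hz) of a 1-D signal."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

# Synthetic one-second "EEG" traces sampled at 256 Hz (illustration only):
fs = 256
t = np.arange(fs) / fs
focused = np.sin(2 * np.pi * 20 * t)          # beta-dominated activity
distracted = 3 * np.sin(2 * np.pi * 10 * t)   # strong 10 Hz alpha rhythm

print(alpha_band_power(focused, fs) < alpha_band_power(distracted, fs))  # True
```

In a real setup, such features would be extracted from each electrode of the EEG cap and fed to a classifier, so that the robot could react, for example by hesitating, when the signal suggests the listener has drifted off.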

In the new research_tv video, the scientists explain their research:

Research-TV: "How Robots Adapt in Communication. Improving Human-Machine Interaction"




Hello, I am Pepper.
I will explain to you how to set the table.
So in this study we would like to establish
a humanoid robot
that is able to keep people on track
despite the fact that they get distracted
by the environment or by mind wandering.
So we wanted to know if we can find
EEG patterns that distinguish
between an utterance
that provides a scaffold
in terms of a hesitation or not.
So we basically have a connection
with the lab in Bielefeld.
And what we do is
we interchange, interconnect
with video, with audio
and with bio signals that we capture,
in particular with EEG signals
that give us information
about the brain activity of a person.
So we send off brain activity signals,
and the lab in Bielefeld
sends off the interaction in social cues of a robot.
The silver fork is placed horizontally on the plate.
In this case, the robot is the teacher
and the human is the learner.
And what has to happen
is that the robot, as the teacher,
understands how much has been understood
by the human learner.
The EEG measures, in a nutshell,
electrical activity in the brain
and gives us insight
about the neural activity.
And the EEG cap contains a lot of electrodes
to capture brain activity
at different parts of the brain.
We need to put that cap on to measure
where exactly the different electrodes go,
so there’s a placement system to find out
which electrode needs to be placed,
and then the electrical activity
which comes through the skull
is actually very, very tiny in voltage.
So what we need to do
is we need to apply a conductive gel
so that we get the best signal quality
at the electrodes.
The scientific goal of our study
is to find out, on the one hand,
how can we from the EEG signal determine
when a person is distracted?
And on the other hand, how can the robot
then respond to such a distraction
to get the attention of the user back
and to give them all the information they need.
The white napkin is placed rolled up in the glass.
If it is possible for the robot
to really get an understanding
when the human interaction partner
isn’t able to follow anymore,
so that would already be
a great breakthrough because currently
robots, they are actually just continuing
in their process
in the way they have represented the task to proceed.
But when the robot now
is able to understand
that something is going wrong,
then it should be able to adapt.
Good job.
In the CoAI Center, we joined forces with
the Universities of Bielefeld and Paderborn
to establish a co-construction of AI,
that means we would like to provide
scaffolding and support such
that robots can learn from human beings
by observing them.
So with this Joint Research Center,
we want to foster and support research like this.
And there are many, many other labs
that would be really great
to link together.
What I think is really cool about this, this setup,
is that it really allows us
to bring together expertise and technology
from all kinds of places.
So I think now we have a really great starting point
to routinely do such experiments
where experts from all kinds of fields
can come together and do experiments,
which would be impossible
for each individual to perform.
And I think that is a great chance for science.

Further Information

  • Website of Transregio Constructing Explainability (Erklärbarkeit konstruieren) TRR 318
  • Website of the CoAI Joint Research Center