
Humans explaining self-explaining machines

Author: Bielefeld University

The processes behind machine learning are incomprehensible to many. At the Collaborative Research Centre/Transregio (CRC/TRR) 318 “Constructing Explainability” at Bielefeld University and Paderborn University, researchers are working to develop ways of designing explanatory processes and enabling users to take control of explanations from artificial intelligence (AI). The first international conference organised by TRR 318 will focus on social-science research on explanatory AI. The conference, entitled ‘Explaining Machines’, will take place from June 23–24 in the CITEC Building of Bielefeld University and will be held in English.

Currently, a key question in AI research is how to arrive at comprehensible explanations of underlying machine processes: Should humans be able to explain how machines work, or should machines learn to explain themselves?

Dr. Elena Esposito (left) and Dr. Tobias Matzner (right) have conceptualised and organised the conference. Dr. Esposito is a professor of sociology and its interdisciplinary integration at Bielefeld University, and Dr. Matzner is a professor of media, algorithms, and society at Paderborn University. Photo (left): Bielefeld University/Michael Adamski; Photo (right): Paderborn University

‘The double meaning of the name of our conference, “Explaining Machines,” expresses these various possibilities: machines explaining themselves, humans explaining machines – or maybe both at the same time,’ says Professor Dr. Elena Esposito. The Bielefeld sociologist is heading a subproject at TRR 318 and is organising the conference together with her colleague Professor Dr. Tobias Matzner from Paderborn University. Dr. Matzner is a media studies researcher who is also heading a Transregio subproject. ‘If explanations from machines are to have an impact socially and politically, it’s not enough that explanations are comprehensible to computer scientists,’ says Matzner. ‘Different socially situated people must be included in explanatory processes – from doctors to retirees and schoolchildren.’

The Technical and Social Challenges of AI Projects

The organising team emphasises the interdisciplinary focus of the conference. ‘It’s not enough to develop AI systems solely with the expertise of a single discipline. The computer scientists who design the machines must work together with social scientists who study how humans and AI interact and under what conditions this interaction takes place,’ explains Esposito. ‘Today more than ever, the challenge of AI projects is both technical and social. With this conference, we hope to encourage the inclusion of perspectives and insights from the social sciences in the debates surrounding explanatory machines.’

Previously, the ‘explainability’ of artificial intelligence was largely the domain of computer scientists. ‘In this research approach, the main view is that explainability and comprehension arise from transparency – that is, having as much information available as possible. An alternative view to this is that of co-construction,’ says Professor Dr. Katharina Rohlfing, a linguist at Paderborn University and the spokesperson of the Transregio. ‘In our research, we do not consider humans to be passive partners who simply receive explanations. Instead, explanations emerge at the interface between the explainer and the explainee. Both actively shape the process of explanation and work towards achieving agreement on common ideas and conceptions. Cross-disciplinary collaboration is therefore essential to the study of explainability.’

Conference Talks from Media, Philosophy, Law, and Sociology

The conference includes 10 short talks. Among the renowned international guests in attendance will be Professor Dr. Frank Pasquale, an expert on legal aspects of artificial intelligence, Professor Dr. Mireille Hildebrandt, a jurist and philosopher, and Dr. David Weinberger, a philosopher of the Internet, along with sociologist Dr. Dominique Cardon, legal scholar Dr. Antoinette Rouvroy, and media researcher Dr. Bernhard Rieder. Following the keynote talks, participants will have the opportunity to discuss the conference research articles directly with the presenters. These articles will be sent to conference attendees by email after they register for the event.

How can social science research enrich the debate on explanatory machines? The first conference from Transregio 318 will delve into this question.

The conference ‘Explaining Machines’ is Transregio 318’s first major scientific event. Information on the programme can be found here. In the next three funding years, further conferences are planned that will focus on the concept of explainability from the perspectives of different scientific disciplines.

Members of the press are welcome to report on the conference; registration is required in advance by email. The conference organisers and the Transregio spokesperson will be available during the conference to answer any questions from the press.

Collaborative Research Centre/Transregio (TRR) 318

The strongly interdisciplinary research programme entitled ‘Constructing Explainability’ goes beyond the question of algorithmic decision-making as the basis for the explainability of AI, taking an approach that requires the active participation of humans in socio-technical systems. The goal is to enhance human–machine interaction by focusing on the comprehension of algorithms and examining it as the product of a multimodal explanatory process. The German Research Foundation (DFG) is providing approximately 14 million euros in funding for this project through July 2025.