Project KOSMOX: Development of a novel local counterfactual interface

Welcome to the web presence of the project "Development of a novel local counterfactual explanation method and interface considering cognitive modeling approaches - KOSMOX". Here you will find information about the project, current events, and contact persons.

  • Explainability of AI
  • Quality control through AI 


Approach

Due to the enormous progress in AI in recent years, these technologies are being used more and more; especially in machine learning, they serve as decision support for users. Analyzing the data requires downstream verification, validation, and interpretation of the results by the user, yet the black-box character of many ML models currently leaves them lacking transparency and explainability. A robust explainability system should support decision makers in understanding the decisions made and alert them to the changes needed to achieve a desired result in the future, based on the ML model used.

To provide these functions, the project aims to develop a local post-hoc explanation system characterized by the semantic integration of two complementary explanatory approaches: local rule-based and simulation-based causal. Within the KOSMOX project, an explanation interface will be developed that incorporates insights from cognitive science, information systems (IS), organizational science, HCI, and behavioral economics, enabling interactive communication between the end user and the applied AI techniques and their explanations. This is intended to support experts in finding relevant explanations and understanding the ML results for the decision-making process.
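To illustrate the core idea of a local counterfactual explanation (this is a generic sketch, not KOSMOX's actual method), the following toy example searches for the smallest change to an input that flips a model's decision. The quality-control model, feature names, step size, and thresholds are all invented for illustration:

```python
from itertools import product

def model(x):
    """Toy black-box classifier for quality control:
    1 = part rejected, 0 = part accepted (thresholds invented)."""
    temperature, pressure = x
    return 1 if (temperature > 80 or pressure < 2.0) else 0

def counterfactual(x, step=0.5, max_steps=20):
    """Brute-force search over a small perturbation grid for the
    input closest to x (in L1 distance) that flips the prediction."""
    target = 1 - model(x)
    best = None
    for di, dj in product(range(-max_steps, max_steps + 1), repeat=2):
        cand = [x[0] + di * step, x[1] + dj * step]
        if model(cand) == target:
            dist = abs(di) + abs(dj)
            if best is None or dist < best[0]:
                best = (dist, cand)
    return best[1] if best else None

rejected_part = [85.0, 2.5]        # rejected: temperature too high
cf = counterfactual(rejected_part)  # smallest change that gets it accepted
print(cf)  # → [80.0, 2.5]
```

The counterfactual ("had the temperature been 80 instead of 85, the part would have been accepted") is local: it explains one decision for one input, which is the kind of actionable explanation the project targets. Real methods replace the brute-force grid with more scalable search and add plausibility constraints.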


Technological objective

  1. KOSMOX aims to develop a local counterfactual post-hoc explanation approach. This will be done by integrating two complementary explanatory approaches, local rule-based and simulation-based causal, together with relevant techniques from complex systems research. In particular, new approaches will be developed to identify the explanatory locality and to eliminate implicit, method-specific cognitive biases.
  2. Within the KOSMOX project, an explanation interface will be developed based on insights from cognitive science, organizational science, HCI, and behavioral economics, which will enable interactive communication between the end-user and the applied AI techniques or their explanations. This is intended to support experts in finding relevant explanations or understanding ML results for the decision-making process.

Profile

Project title: KOSMOX
Duration: 24 months
Funding: Bundesministerium für Bildung und Forschung (BMBF)
Project partners:
  • Deutsches Forschungszentrum für Künstliche Intelligenz GmbH (DFKI)
  • ContiTech AG
  • Lenze SE
  • Villeroy & Boch AG
Objective: Explainability of AI systems