Project KOSMOX: Development of a novel local counterfactual explanation method and interface.

Welcome to the web presence of the project "Development of a novel local counterfactual explanation method and interface considering cognitive modeling approaches - KOSMOX". Here you will find information about the project as well as news on current events and the relevant contact persons.

  • Explainability of AI
  • Quality control through AI


Approach

Due to the enormous progress in AI in recent years, these technologies are being adopted ever more widely; in machine learning in particular, they serve as decision support for users. Analyzing the data requires downstream verification, validation, and interpretation of the results by the user, yet the black-box character of many models currently leads to a lack of transparency and explainability. A robust explanation system should support decision makers in understanding the decisions made and point out which changes would lead to a desired outcome in the future, based on the ML model used. To provide these functions, the project aims to develop a local post-hoc explanation system characterized by the semantic integration of two complementary explanatory approaches: local rule-based and simulation-based causal explanations.

Within the KOSMOX project, an explanation interface will be developed that incorporates insights from cognitive science, information systems, organizational science, HCI, and behavioral economics and enables interactive communication between the end user and the applied AI techniques or their explanations. This is intended to support experts in finding relevant explanations and understanding the ML results for the decision-making process.
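As a simplified illustration of what a local counterfactual explanation is, the following Python sketch searches near a given instance for a nearby point that a trained classifier labels differently. This is a minimal example only: the dataset, the random forest model, the random perturbation search, and the helper find_counterfactual are illustrative assumptions and not the KOSMOX method.

    # Minimal, illustrative sketch of a local counterfactual search on a tabular
    # classifier. The dataset, model, and random perturbation strategy are
    # assumptions for demonstration; they are not the KOSMOX method itself.
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier

    X, y = load_breast_cancer(return_X_y=True)
    model = RandomForestClassifier(random_state=0).fit(X, y)

    def find_counterfactual(x, model, feature_std, n_samples=5000, scale=0.1, seed=0):
        """Search close to x for a point that the model classifies differently."""
        rng = np.random.default_rng(seed)
        original_class = model.predict(x.reshape(1, -1))[0]
        # Perturb each feature proportionally to its standard deviation.
        noise = rng.normal(0.0, scale * feature_std, size=(n_samples, x.size))
        candidates = x + noise
        flipped = candidates[model.predict(candidates) != original_class]
        if len(flipped) == 0:
            return None  # no counterfactual found at this perturbation scale
        # Return the class-flipping candidate closest to x, i.e. the most minimal change.
        distances = np.linalg.norm((flipped - x) / feature_std, axis=1)
        return flipped[np.argmin(distances)]

    x = X[0]
    cf = find_counterfactual(x, model, X.std(axis=0))
    if cf is not None:
        print("prediction flips from", model.predict(x.reshape(1, -1))[0],
              "to", model.predict(cf.reshape(1, -1))[0])
        print("features with the largest changes:", np.argsort(-np.abs(cf - x))[:3])

KOSMOX aims to go beyond such a naive perturbation search by semantically integrating rule-based and simulation-based causal explanations and by addressing explanatory locality and method-specific cognitive biases, as described in the technological objectives below.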


Technological objective

  1. KOSMOX aims to develop a local counterfactual post-hoc explanation approach. This will be done by integrating two complementary explanatory approaches, local rule-based and simulation-based causal, together with relevant techniques from complex systems research. In particular, new approaches for identifying explanatory locality and for eliminating implicit, method-specific cognitive biases will be developed.
  2. Within the KOSMOX project, an explanation interface will be developed based on insights from cognitive science, organizational science, HCI, and behavioral economics, enabling interactive communication between the end user and the applied AI techniques or their explanations. This is intended to support experts in finding relevant explanations and understanding ML results for the decision-making process.

Profile

Project title: KOSMOX
Duration: 24 months
Funding: Bundesministerium für Bildung und Forschung (BMBF, Federal Ministry of Education and Research)
Project partners: Deutsches Forschungszentrum für Künstliche Intelligenz GmbH (DFKI)
ContiTech AG
Lenze SE
Villeroy & Boch AG
Goal: Explainability of AI systems