Journal article, Artificial Intelligence, 2021

Spatial Relation Learning for Explainable Image Classification and Annotation in Critical Applications

Abstract

With the recent successes of black-box models in Artificial Intelligence (AI) and the growing interactions between humans and AIs, explainability issues have arisen. In this article, in the context of high-stakes applications, we propose an approach for explainable classification and annotation of images. It is based on a transparent model, whose reasoning is accessible and human-understandable, and on interpretable fuzzy relations that make it possible to express the vagueness of natural language. The knowledge about relations is set beforehand by an expert, so training instances do not need to be annotated. The most relevant relations are extracted using a fuzzy frequent itemset mining algorithm in order to build rules for classification and constraints for annotation. We also present two heuristics that speed up the evaluation of relations. Since the strengths of our approach are the transparency of the model and the interpretability of the relations, an explanation in natural language can be generated. Supported by experimental results, we show that, given a segmentation of the input, our approach successfully performs the target task and generates explanations that were judged consistent and convincing by a panel of participants.
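To illustrate the kind of extraction step the abstract refers to, here is a minimal sketch of fuzzy frequent itemset mining, not the authors' implementation. It assumes each image yields a "transaction" mapping relation names (e.g. "A left_of B") to membership degrees in [0, 1], uses the min t-norm to combine degrees within an itemset, and defines fuzzy support as the mean combined degree over the transactions; the actual definitions in the paper may differ. The relation names and the support threshold below are hypothetical.

```python
from itertools import combinations

def fuzzy_support(itemset, transactions):
    """Mean over transactions of the min t-norm of the items' membership degrees."""
    total = 0.0
    for t in transactions:
        total += min(t.get(item, 0.0) for item in itemset)
    return total / len(transactions)

def fuzzy_frequent_itemsets(transactions, min_support=0.5):
    """Apriori-style level-wise search for fuzzy frequent itemsets."""
    items = sorted({item for t in transactions for item in t})
    frequent = {}
    # Level 1: single relations whose fuzzy support reaches the threshold.
    current = [frozenset([i]) for i in items
               if fuzzy_support([i], transactions) >= min_support]
    k = 1
    while current:
        for s in current:
            frequent[s] = fuzzy_support(s, transactions)
        k += 1
        # Candidate generation: unions of frequent (k-1)-itemsets that have size k.
        candidates = {a | b for a, b in combinations(current, 2) if len(a | b) == k}
        current = [c for c in candidates
                   if fuzzy_support(c, transactions) >= min_support]
    return frequent

# Hypothetical example: degrees to which spatial relations hold in each image.
transactions = [
    {"A left_of B": 0.9, "B above C": 0.8, "A near C": 0.4},
    {"A left_of B": 0.7, "B above C": 0.9},
    {"A left_of B": 0.8, "B above C": 0.6, "A near C": 0.2},
]
for itemset, support in fuzzy_frequent_itemsets(transactions, 0.5).items():
    print(sorted(itemset), round(support, 2))
```

Under these assumptions, the frequent itemsets returned here are the candidates from which classification rules or annotation constraints could then be built.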
Main file: spatial_relation_learning_for_explainable_image_classification_and_annotation_in_critical_applications.pdf (2.52 MB)
Origin: files produced by the author(s)

Dates and versions

hal-03418311, version 1 (10-11-2021)

Identifiers

Cite

Régis Pierrard, Jean-Philippe Poli, Céline Hudelot. Spatial Relation Learning for Explainable Image Classification and Annotation in Critical Applications. Artificial Intelligence, 2021, 292, 103434 (30 p.). ⟨10.1016/j.artint.2020.103434⟩. ⟨hal-03418311⟩