Emb-trattunet: a novel edge loss function and transformer-CNN architecture for multi-classes pneumonia infection segmentation in low annotation regimes
Journal article in Artificial Intelligence Review, Year: 2024

Emb-trattunet: a novel edge loss function and transformer-CNN architecture for multi-classes pneumonia infection segmentation in low annotation regimes

Abstract

One of the primary challenges in applying deep learning to medical imaging is the limited availability of annotated data. This stems from factors such as data privacy concerns and the need for expert radiologists to perform the time-consuming, labor-intensive task of labeling, particularly for segmentation. There is therefore a critical need for approaches that can learn from few annotated examples in this domain. In this work, we propose a novel CNN-Transformer fusion scheme to segment multi-class pneumonia infection from limited CT-scan data. The work makes three main contributions: (i) a fusion of CNN and Transformer encoders, which extracts and combines richer features in the encoding phase, covering local, global, and long-range dependencies; (ii) a Multi-Branches Skip Connection (MBSC) that extracts and fuses richer features from the encoder and integrates them into the decoder layers, with MBSC blocks capturing higher-level features related to the finer details of the different infection types; and (iii) a Multi-classes Boundary Aware Cross-Entropy (MBA-CE) loss function that handles fuzzy boundaries, enhances the separability between classes, and gives more attention to minority classes. The proposed approach is evaluated in two scenarios and compared with baseline and state-of-the-art segmentation architectures for multi-class Covid-19 segmentation. The results show that our approach outperforms the comparison methods on both Ground-Glass Opacity (GGO) and Consolidation segmentation. Furthermore, its performance remains consistent when the training data is reduced by half, demonstrating its efficiency in few-shot learning, whereas the performance of the comparison methods drops in this scenario. Our approach also copes with imbalanced classes. These advantages demonstrate the effectiveness and efficiency of the proposed EMB-TrAttUnet approach in a pandemic scenario where time is critical to save patient lives.
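To make the third contribution more concrete, the sketch below illustrates one plausible form of a boundary-aware, class-weighted cross-entropy loss in PyTorch. It is a minimal illustration under stated assumptions, not the paper's exact MBA-CE formulation: the function name `boundary_aware_ce`, the morphological-gradient edge extraction via max pooling, and the `boundary_weight` / `class_weights` parameters are hypothetical choices introduced here only to show how boundary pixels and minority classes can receive extra attention.

```python
# Minimal sketch of a boundary-aware, class-weighted cross-entropy loss.
# NOTE: an illustrative assumption, not the exact MBA-CE loss of the paper.
import torch
import torch.nn.functional as F

def boundary_aware_ce(logits, target, num_classes,
                      boundary_weight=2.0, class_weights=None, kernel_size=3):
    """logits: (B, C, H, W) raw scores; target: (B, H, W) integer labels."""
    # One-hot encode the ground truth: (B, C, H, W)
    one_hot = F.one_hot(target, num_classes).permute(0, 3, 1, 2).float()

    # Morphological gradient per class (dilation minus erosion via max pooling):
    # pixels on class boundaries get value 1, all others 0.
    pad = kernel_size // 2
    dilated = F.max_pool2d(one_hot, kernel_size, stride=1, padding=pad)
    eroded = -F.max_pool2d(-one_hot, kernel_size, stride=1, padding=pad)
    boundary = (dilated - eroded).amax(dim=1)          # (B, H, W)

    # Per-pixel weights: boundary pixels are up-weighted to sharpen fuzzy edges.
    pixel_weights = 1.0 + boundary_weight * boundary

    # Class-weighted cross-entropy gives extra attention to minority classes.
    ce = F.cross_entropy(logits, target, weight=class_weights, reduction="none")
    return (pixel_weights * ce).mean()

# Example usage with random tensors (e.g. background, GGO, consolidation):
if __name__ == "__main__":
    logits = torch.randn(2, 3, 64, 64)
    target = torch.randint(0, 3, (2, 64, 64))
    w = torch.tensor([0.2, 1.0, 1.5])                  # heavier weight on rarer classes
    print(boundary_aware_ce(logits, target, num_classes=3, class_weights=w))
```

In the actual approach, such a term would be computed over the EMB-TrAttUnet outputs alongside the rest of the training objective described in the paper.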

Dates and versions

hal-04520691 , version 1 (25-03-2024)

License

Attribution (CC BY)

Identifiers

Cite

Fares Bougourzi, Fadi Dornaika, Amir Nakib, Abdelmalik Taleb-Ahmed. Emb-trattunet: a novel edge loss function and transformer-CNN architecture for multi-classes pneumonia infection segmentation in low annotation regimes. Artificial Intelligence Review, 2024, 57 (4), pp.90. ⟨10.1007/s10462-024-10717-2⟩. ⟨hal-04520691⟩