Enhancing Trust in AI-Driven Dermatology: CLIP for Explainable Skin Lesion Diagnosis
Abstract
Skin cancer is the most common cancer worldwide, with treatment costs exceeding $8 billion annually. Early diagnosis is vital: the five-year survival rate for melanoma rises from 23% when detected late to 99% when detected early. Deep neural networks show promising results in classifying skin lesions as benign or malignant, but clinicians are often reluctant to trust black-box methods. In this paper we use the CLIP (Contrastive Language-Image Pretraining) model, trained on skin lesion datasets, to capture meaningful relationships between visual features and diagnostic terms, with the goal of increasing explainability. We also use a gradient-based visual explanation method for CLIP, known as Grad-ECLIP, which highlights the image regions most strongly linked to specific diagnostic descriptions. This pipeline not only classifies skin lesions and generates corresponding descriptions but also adds a layer of visual explanations.
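The paper itself does not include code; the following is a minimal sketch of the kind of zero-shot classification step the abstract describes, written against the open-source open_clip library. The checkpoint choice, the image path, and both prompt strings are illustrative assumptions, not the authors' actual fine-tuned model or diagnostic vocabulary.

# Minimal sketch (not the authors' code): zero-shot benign/malignant
# scoring of a lesion image with CLIP. Checkpoint and prompts are assumed.
import torch
import open_clip
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="openai"
)
model.eval()
tokenizer = open_clip.get_tokenizer("ViT-B-32")

# Hypothetical diagnostic descriptions pairing visual features with terms.
prompts = [
    "a dermoscopy image of a benign skin lesion with uniform color",
    "a dermoscopy image of a malignant melanoma with irregular borders",
]
image = preprocess(Image.open("lesion.jpg")).unsqueeze(0)  # assumed path
text = tokenizer(prompts)

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    image_features = image_features / image_features.norm(dim=-1, keepdim=True)
    text_features = text_features / text_features.norm(dim=-1, keepdim=True)
    # Cosine similarities, softmaxed into a score per description.
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

for prompt, p in zip(prompts, probs[0].tolist()):
    print(f"{p:.3f}  {prompt}")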
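Grad-ECLIP itself derives its heat maps from a decomposition of CLIP's attention layers, which is beyond a short snippet. As a much simpler stand-in that conveys the same idea (gradient-based highlighting of the regions tied to a diagnostic description), here is plain input-gradient saliency of the image-text similarity, reusing the names defined in the sketch above:

# Simplified stand-in, NOT the actual Grad-ECLIP algorithm: the gradient
# of the image-text similarity w.r.t. the input pixels marks the regions
# that most influence the match with the chosen diagnostic description.
image = image.detach().requires_grad_(True)
image_features = model.encode_image(image)
text_features = model.encode_text(tokenizer([prompts[1]]))  # malignant prompt
similarity = torch.nn.functional.cosine_similarity(image_features, text_features)
similarity.backward()

# Collapse color channels into a single HxW heat map over the input image.
heatmap = image.grad.abs().sum(dim=1).squeeze(0)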
Cite
Text
Kamal and Oates. "Enhancing Trust in AI-Driven Dermatology: CLIP for Explainable Skin Lesion Diagnosis." NeurIPS 2024 Workshops: AIM-FM, 2024.

Markdown
[Kamal and Oates. "Enhancing Trust in AI-Driven Dermatology: CLIP for Explainable Skin Lesion Diagnosis." NeurIPS 2024 Workshops: AIM-FM, 2024.](https://mlanthology.org/neuripsw/2024/kamal2024neuripsw-enhancing/)

BibTeX
@inproceedings{kamal2024neuripsw-enhancing,
  title     = {{Enhancing Trust in AI-Driven Dermatology: CLIP for Explainable Skin Lesion Diagnosis}},
  author    = {Kamal, Sadia and Oates, Tim},
  booktitle = {NeurIPS 2024 Workshops: AIM-FM},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/kamal2024neuripsw-enhancing/}
}