Medical Manifestation-Aware De-Identification

Abstract

Face de-identification (DeID) has been widely studied for common scenes but remains under-researched for medical scenes, largely due to the lack of large-scale patient face datasets. In this paper, we release MeMa, a dataset of over 40,000 photo-realistic patient faces re-generated from a large collection of real patient photos. By carefully modulating the generation and data-filtering procedures, MeMa avoids breaching real patient privacy while preserving rich and plausible medical manifestations. We recruit expert clinicians to annotate MeMa with both coarse- and fine-grained labels, building the first medical-scene DeID benchmark. Additionally, we propose a baseline approach for this new medical-aware DeID task that integrates data-driven medical semantic priors into the DeID procedure. Despite its conciseness and simplicity, our approach substantially outperforms previous methods.

Cite

Text

Tian et al. "Medical Manifestation-Aware De-Identification." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/aaai.v39i25.34835

Markdown

[Tian et al. "Medical Manifestation-Aware De-Identification." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/tian2025aaai-medical/) doi:10.1609/aaai.v39i25.34835

BibTeX

@inproceedings{tian2025aaai-medical,
  title     = {{Medical Manifestation-Aware De-Identification}},
  author    = {Tian, Yuan and Wang, Shuo and Zhai, Guangtao},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {26363--26372},
  doi       = {10.1609/aaai.v39i25.34835},
  url       = {https://mlanthology.org/aaai/2025/tian2025aaai-medical/}
}