FedMeNF: Privacy-Preserving Federated Meta-Learning for Neural Fields

Abstract

Neural fields provide a memory-efficient representation of data that can effectively handle diverse modalities and large-scale data. However, learning neural fields often requires large amounts of training data and computation, which can be prohibitive for resource-constrained edge devices. One approach to tackle this limitation is to leverage Federated Meta-Learning (FML), but traditional FML approaches suffer from privacy leakage. To address these issues, we introduce a novel FML approach called FedMeNF. FedMeNF utilizes a new privacy-preserving loss function that regulates privacy leakage during local meta-optimization. This enables the local meta-learner to optimize quickly and efficiently without exposing the client's private data. Our experiments demonstrate that FedMeNF achieves fast optimization and robust reconstruction performance, even with few-shot or non-IID data across diverse data modalities, while preserving client data privacy.

Cite

Text

Yun et al. "FedMeNF: Privacy-Preserving Federated Meta-Learning for Neural Fields." International Conference on Computer Vision, 2025.

Markdown

[Yun et al. "FedMeNF: Privacy-Preserving Federated Meta-Learning for Neural Fields." International Conference on Computer Vision, 2025.](https://mlanthology.org/iccv/2025/yun2025iccv-fedmenf/)

BibTeX

@inproceedings{yun2025iccv-fedmenf,
  title     = {{FedMeNF: Privacy-Preserving Federated Meta-Learning for Neural Fields}},
  author    = {Yun, Junhyeog and Hong, Minui and Kim, Gunhee},
  booktitle = {International Conference on Computer Vision},
  year      = {2025},
  pages     = {2161--2171},
  url       = {https://mlanthology.org/iccv/2025/yun2025iccv-fedmenf/}
}