Babakniya, Sara

7 publications

NeurIPS 2025. Escaping Collapse: The Strength of Weak Data for Large Language Model Training. Kareem Amin, Sara Babakniya, Alex Bie, Weiwei Kong, Umar Syed, Sergei Vassilvitskii.

ICLRW 2025. Escaping Collapse: The Strength of Weak Data for Large Language Model Training. Kareem Amin, Sara Babakniya, Alex Bie, Weiwei Kong, Umar Syed, Sergei Vassilvitskii.

NeurIPS 2023. A Data-Free Approach to Mitigate Catastrophic Forgetting in Federated Class Incremental Learning for Vision Tasks. Sara Babakniya, Zalan Fabian, Chaoyang He, Mahdi Soltanolkotabi, Salman Avestimehr.

ICMLW 2023. Don't Memorize; Mimic the Past: Federated Class Incremental Learning Without Episodic Memory. Sara Babakniya, Zalan Fabian, Chaoyang He, Mahdi Soltanolkotabi, Salman Avestimehr.

TMLR 2023. Revisiting Sparsity Hunting in Federated Learning: Why Does Sparsity Consensus Matter? Sara Babakniya, Souvik Kundu, Saurav Prakash, Yue Niu, Salman Avestimehr.

NeurIPSW 2023. SLoRA: Federated Parameter Efficient Fine-Tuning of Language Models. Sara Babakniya, Ahmed Roushdy Elkordy, Yahya H. Ezzeldin, Qingfeng Liu, Kee-Bong Song, Mostafa EL-Khamy, Salman Avestimehr.

NeurIPSW 2022. Federated Sparse Training: Lottery Aware Model Compression for Resource Constrained Edge. Sara Babakniya, Souvik Kundu, Saurav Prakash, Yue Niu, Salman Avestimehr.