Dovrolis, Constantine

11 publications

TMLR 2025. Before Forgetting, There's Learning: Representation Learning Challenges in Online Unsupervised Continual Learning. Cameron Ethan Taylor, Shreyas Malakarjun Patil, Constantine Dovrolis.
TMLR 2025. How Can Knowledge of a Task’s Modular Structure Improve Generalization and Training Efficiency? Shreyas Malakarjun Patil, Cameron Ethan Taylor, Constantine Dovrolis.
ICML 2025. PEAKS: Selecting Key Training Examples Incrementally via Prediction Error Anchored by Kernel Similarity. Mustafa Burak Gurbuz, Xingyu Zheng, Constantine Dovrolis.
CVPR 2024. NICE: Neurogenesis Inspired Contextual Encoding for Replay-Free Class Incremental Learning. Mustafa Burak Gurbuz, Jean Michael Moorman, Constantine Dovrolis.
CoLLAs 2024. Patch-Based Contrastive Learning and Memory Consolidation for Online Unsupervised Continual Learning. Cameron Ethan Taylor, Vassilis Vassiliades, Constantine Dovrolis.
NeurIPS 2023. Neural Sculpting: Uncovering Hierarchically Modular Task Structure in Neural Networks Through Pruning and Network Analysis. Shreyas Malakarjun Patil, Loizos Michael, Constantine Dovrolis.
CoLLAs 2022. Model-Free Generative Replay for Lifelong Reinforcement Learning: Application to Starcraft-2. Zachary Alan Daniels, Aswin Raghavan, Jesse Hostetler, Abrar Rahman, Indranil Sur, Michael Piacentino, Ajay Divakaran, Roberto Corizzo, Kamil Faber, Nathalie Japkowicz, Michael Baron, James Smith, Sahana Pramod Joshi, Zsolt Kira, Cameron Ethan Taylor, Mustafa Burak Gurbuz, Constantine Dovrolis, Tyler L. Hayes, Christopher Kanan, Jhair Gallardo.
ICML 2022. NISPA: Neuro-Inspired Stability-Plasticity Adaptation for Continual Learning in Sparse Networks. Mustafa B Gurbuz, Constantine Dovrolis.
ICML 2021. PHEW: Constructing Sparse Networks That Learn Fast and Generalize Well Without Training Data. Shreyas Malakarjun Patil, Constantine Dovrolis.
IJCAI 2021. Unsupervised Progressive Learning and the STAM Architecture. James Seale Smith, Cameron E. Taylor, Seth Baer, Constantine Dovrolis.
ICLRW 2019. Unsupervised Continual Learning and Self-Taught Associative Memory Hierarchies. James Smith, Seth Baer, Zsolt Kira, Constantine Dovrolis.