Rolnick, David

44 publications

ICLRW 2025 A Joint Space-Time Encoder for Geographic Time-Series Data. David Mickisch, Konstantin Klemmer, Mélisande Teng, David Rolnick
ICML 2025 Alberta Wells Dataset: Pinpointing Oil and Gas Wells from Satellite Imagery. Pratinav Seth, Michelle Lin, Brefo Dwamena Yaw, Jade Boutot, Mary Kang, David Rolnick
NeurIPS 2025 Bringing SAM to New Heights: Leveraging Elevation Data for Tree Crown Segmentation from Drone Imagery. Mélisande Teng, Arthur Ouaknine, Etienne Laliberté, Yoshua Bengio, David Rolnick, Hugo Larochelle
NeurIPS 2025 Causal Climate Emulation with Bayesian Filtering. Sebastian Hickman, Ilija Trajković, Julia Kaltenborn, Francis Pelletier, Alexander T Archibald, Yaniv Gurwicz, Peer Nowack, David Rolnick, Julien Boussard
AAAI 2025 FoMo: Multi-Modal, Multi-Scale and Multi-Task Remote Sensing Foundation Models for Forest Monitoring. Nikolaos-Ioannis Bountos, Arthur Ouaknine, Ioannis Papoutsis, David Rolnick
ICML 2025 Galileo: Learning Global & Local Features of Many Remote Sensing Modalities. Gabriel Tseng, Anthony Fuller, Marlena Reil, Henry Herzog, Patrick Beukema, Favyen Bastani, James R Green, Evan Shelhamer, Hannah Kerner, David Rolnick
NeurIPS 2025 GreenHyperSpectra: A Multi-Source Hyperspectral Dataset for Global Vegetation Trait Prediction. Eya Cherif, Arthur Ouaknine, Luke A. Brown, Phuong D. Dao, Kyle R Kovach, Bing Lu, Daniel Mederer, Hannes Feilhauer, Teja Kattenborn, David Rolnick
NeurIPS 2025 Open-Insect: Benchmarking Open-Set Recognition of Novel Species in Biodiversity Monitoring. Yuyan Chen, Nico Lang, B. Christian Schmidt, Aditya Jain, Yves Basset, Sara Beery, Maxim Larrivée, David Rolnick
CVPRW 2025 Task-Informed Meta-Learning for Remote Sensing. Gabriel Tseng, Hannah Kerner, David Rolnick
ICML 2025 The Butterfly Effect: Neural Network Training Trajectories Are Highly Sensitive to Initial Conditions. Gül Sena Altıntaş, Devin Kwok, Colin Raffel, David Rolnick
JMLR 2024 Fourier Neural Operators for Arbitrary Resolution Climate Data Downscaling. Qidong Yang, Alex Hernández-García, Paula Harder, Venkatesh Ramesh, Prasanna Sattigeri, Daniela Szwarcman, Campbell D. Watson, David Rolnick
ICMLW 2024 Improving Molecular Modeling with Geometric GNNs: An Empirical Study. Ali Ramlaoui, Théo Saulus, Basile Terver, Victor Schmidt, David Rolnick, Fragkiskos D. Malliaros, Alexandre AGM Duval
ECCV 2024 Insect Identification in the Wild: The AMI Dataset. Aditya Jain, Fagner Cunha, Michael J Bunsen, Juan Sebastián Cañas, Léonard Pasi, Nathan Pinoy, Flemming Helsing, JoAnne Russo, Marc S Botham, Michael Sabourin, Jonathan Fréchette, Alexandre Anctil, Yacksecari Lopez, Eduardo Navarro, Filonila Pérez, Ana C Zamora, Jose Alejandro Ramirez-Silva, Jonathan Gagnon, Tom A August, Kim Bjerge, Alba Gomez Segura, Marc Bélisle, Yves Basset, Kent P McFarland, David B Roy, Toke T Høye, Maxim Larrivée, David Rolnick
TMLR 2024 Linear Weight Interpolation Leads to Transient Performance Gains. Gaurav Iyer, Gintare Karolina Dziugaite, David Rolnick
ICMLW 2024 Linear Weight Interpolation Leads to Transient Performance Gains. Gaurav Iyer, Gintare Karolina Dziugaite, David Rolnick
JMLR 2024 PhAST: Physics-Aware, Scalable, and Task-Specific GNNs for Accelerated Catalyst Design. Alexandre Duval, Victor Schmidt, Santiago Miret, Yoshua Bengio, Alex Hernández-García, David Rolnick
ICML 2024 Position: Application-Driven Innovation in Machine Learning. David Rolnick, Alán Aspuru-Guzik, Sara Beery, Bistra Dilkina, Priya L. Donti, Marzyeh Ghassemi, Hannah Kerner, Claire Monteleoni, Esther Rolf, Milind Tambe, Adam White
ECML-PKDD 2024 Simultaneous Linear Connectivity of Neural Networks Modulo Permutation. Ekansh Sharma, Devin Kwok, Tom Denton, Daniel M. Roy, David Rolnick, Gintare Karolina Dziugaite
ICML 2024 Stealing Part of a Production Language Model. Nicholas Carlini, Daniel Paleka, Krishnamurthy Dj Dvijotham, Thomas Steinke, Jonathan Hayase, A. Feder Cooper, Katherine Lee, Matthew Jagielski, Milad Nasr, Arthur Conmy, Eric Wallace, David Rolnick, Florian Tramèr
ICMLW 2024 The Butterfly Effect: Tiny Perturbations Cause Neural Network Training to Diverge. Gül Sena Altıntaş, Devin Kwok, David Rolnick
AAAI 2023 Bugs in the Data: How ImageNet Misrepresents Biodiversity. Alexandra Sasha Luccioni, David Rolnick
NeurIPS 2023 ClimateSet: A Large-Scale Climate Model Dataset for Machine Learning. Julia Kaltenborn, Charlotte Lange, Venkatesh Ramesh, Philippe Brouillard, Yaniv Gurwicz, Chandni Nagda, Jakob Runge, Peer Nowack, David Rolnick
ICMLW 2023 Deep Networks as Paths on the Manifold of Neural Representations. Richard D Lange, Devin Kwok, Jordan Kyle Matelsky, Xinyue Wang, David Rolnick, Konrad Kording
ICML 2023 FAENet: Frame Averaging Equivariant GNN for Materials Modeling. Alexandre AGM Duval, Victor Schmidt, Alex Hernández-García, Santiago Miret, Fragkiskos D. Malliaros, Yoshua Bengio, David Rolnick
JMLR 2023 Hard-Constrained Deep Learning for Climate Downscaling. Paula Harder, Alex Hernández-García, Venkatesh Ramesh, Qidong Yang, Prasanna Sattigeri, Daniela Szwarcman, Campbell Watson, David Rolnick
ICML 2023 Hidden Symmetries of ReLU Networks. Elisenda Grigsby, Kathryn Lindsey, David Rolnick
ICML 2023 Maximal Initial Learning Rates in Deep ReLU Networks. Gaurav Iyer, Boris Hanin, David Rolnick
NeurIPS 2023 Normalization Layers Are All That Sharpness-Aware Minimization Needs. Maximilian Mueller, Tiffany Vlaar, David Rolnick, Matthias Hein
NeurIPSW 2023 On the Importance of Catalyst-Adsorbate 3D Interactions for Relaxed Energy Predictions. Alvaro Carbonero, Alexandre AGM Duval, Victor Schmidt, Santiago Miret, Alex Hernández-García, Yoshua Bengio, David Rolnick
NeurIPS 2023 SatBird: A Dataset for Bird Species Distribution Modeling Using Remote Sensing and Citizen Science Data. Mélisande Teng, Amna Elmustafa, Benjamin Akera, Yoshua Bengio, Hager Radi, Hugo Larochelle, David Rolnick
TMLR 2022 Clustering Units in Neural Networks: Upstream vs Downstream Information. Richard D Lange, David Rolnick, Konrad Kording
ICLR 2022 Deep ReLU Networks Preserve Expected Length. Boris Hanin, Ryan Jeong, David Rolnick
ICLRW 2022 Inductive Biases for Relational Tasks. Giancarlo Kerg, Sarthak Mittal, David Rolnick, Yoshua Bengio, Blake Aaron Richards, Guillaume Lajoie
NeurIPSW 2022 PhAST: Physics-Aware, Scalable, and Task-Specific GNNs for Accelerated Catalyst Design. Alexandre AGM Duval, Victor Schmidt, Santiago Miret, Yoshua Bengio, Alex Hernández-García, David Rolnick
NeurIPS 2022 Understanding the Evolution of Linear Regions in Deep Reinforcement Learning. Setareh Cohan, Nam Hee Kim, David Rolnick, Michiel van de Panne
ICLR 2021 DC3: A Learning Method for Optimization with Hard Constraints. Priya L. Donti, David Rolnick, J Zico Kolter
NeurIPS 2021 Techniques for Symbol Grounding with SATNet. Sever Topan, David Rolnick, Xujie Si
ICML 2020 Reverse-Engineering Deep ReLU Networks. David Rolnick, Konrad Kording
ICML 2019 Complexity of Linear Regions in Deep Networks. Boris Hanin, David Rolnick
NeurIPS 2019 Deep ReLU Networks Have Surprisingly Few Activation Patterns. Boris Hanin, David Rolnick
NeurIPS 2019 Experience Replay for Continual Learning. David Rolnick, Arun Ahuja, Jonathan Schwarz, Timothy Lillicrap, Gregory Wayne
ICLR 2019 Measuring and Regularizing Networks in Function Space. Ari Benjamin, David Rolnick, Konrad Kording
NeurIPS 2018 How to Start Training: The Effect of Initialization and Architecture. Boris Hanin, David Rolnick
ICLR 2018 The Power of Deeper Networks for Expressing Natural Functions. David Rolnick, Max Tegmark