Foster, Dean P.

28 publications

ALT 2023. Linear Reinforcement Learning with Ball Structure Action Space. Zeyu Jia, Randy Jia, Dhruv Madeka, Dean P. Foster.
NeurIPS 2022. A Few Expert Queries Suffices for Sample-Efficient RL with Resets and Linear Value Approximation. Philip Amortila, Nan Jiang, Dhruv Madeka, Dean P. Foster.
NeurIPS 2021. The Benefits of Implicit Regularization from SGD in Least Squares Problems. Difan Zou, Jingfeng Wu, Vladimir Braverman, Quanquan Gu, Dean P. Foster, Sham Kakade.
NeurIPS 2019. Dynamic Local Regret for Non-Convex Online Forecasting. Sergul Aydore, Tianhao Zhu, Dean P. Foster.
COLT 2016. Online Sparse Linear Regression. Dean P. Foster, Satyen Kale, Howard J. Karloff.
JMLR 2015. Eigenwords: Spectral Word Embeddings. Paramveer S. Dhillon, Dean P. Foster, Lyle H. Ungar.
COLT 2015. Variable Selection Is Hard. Dean P. Foster, Howard J. Karloff, Justin Thaler.
AISTATS 2014. A Level-Set Hit-and-Run Sampler for Quasi-Concave Distributions. Shane T. Jensen, Dean P. Foster.
UAI 2014. Adaptive Monotone Shrinkage for Regression. Zhuang Ma, Dean P. Foster, Robert A. Stine.
UAI 2014. Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent. Yichao Lu, Dean P. Foster.
NeurIPS 2014. Large Scale Canonical Correlation Analysis with Iterative Least Squares. Yichao Lu, Dean P. Foster.
JMLR 2014. Spectral Learning of Latent-Variable PCFGs: Algorithms and Sample Complexity. Shay B. Cohen, Karl Stratos, Michael Collins, Dean P. Foster, Lyle H. Ungar.
JMLR 2013. A Risk Comparison of Ordinary Least Squares vs Ridge Regression. Paramveer S. Dhillon, Dean P. Foster, Sham M. Kakade, Lyle H. Ungar.
NeurIPS 2013. Faster Ridge Regression via the Subsampled Randomized Hadamard Transform. Yichao Lu, Paramveer S. Dhillon, Dean P. Foster, Lyle H. Ungar.
NeurIPS 2013. New Subsampling Algorithms for Fast Least Squares Regression. Paramveer S. Dhillon, Yichao Lu, Dean P. Foster, Lyle H. Ungar.
NeurIPS 2013. One-Shot Learning and Big Data with N=2. Lee H. Dicker, Dean P. Foster.
NeurIPS 2012. A Spectral Algorithm for Latent Dirichlet Allocation. Anima Anandkumar, Dean P. Foster, Daniel J. Hsu, Sham M. Kakade, Yi-Kai Liu.
ICML 2012. Using CCA to Improve CCA: A New Spectral Method for Estimating Vector Models of Words. Paramveer S. Dhillon, Jordan Rodu, Dean P. Foster, Lyle H. Ungar.
COLT 2011. Complexity-Based Approach to Calibration with Checking Rules. Dean P. Foster, Alexander Rakhlin, Karthik Sridharan, Ambuj Tewari.
NeurIPS 2011. Multi-View Learning of Word Embeddings via CCA. Paramveer S. Dhillon, Dean P. Foster, Lyle H. Ungar.
NeurIPS 2011. Stochastic Convex Optimization with Bandit Feedback. Alekh Agarwal, Dean P. Foster, Daniel J. Hsu, Sham M. Kakade, Alexander Rakhlin.
ECML-PKDD 2009. Multi-Task Feature Selection Using the Multiple Inclusion Criterion (MIC). Paramveer S. Dhillon, Brian Tomasik, Dean P. Foster, Lyle H. Ungar.
COLT 2007. Multi-View Regression via Canonical Correlation Analysis. Sham M. Kakade, Dean P. Foster.
JMLR 2006. Streamwise Feature Selection. Jing Zhou, Dean P. Foster, Robert A. Stine, Lyle H. Ungar.
AISTATS 2005. Streaming Feature Selection Using IIC. Lyle H. Ungar, Jing Zhou, Dean P. Foster, Robert A. Stine.
NeurIPS 2005. Worst-Case Bounds for Gaussian Process Models. Sham M. Kakade, Matthias W. Seeger, Dean P. Foster.
COLT 2004. Deterministic Calibration and Nash Equilibrium. Sham M. Kakade, Dean P. Foster.
ICML 1997. Characterizing the Generalization Performance of Model Selection Strategies. Dale Schuurmans, Lyle H. Ungar, Dean P. Foster.