Timezone: America/Los_Angeles
MON 28 MAR
Community Activities and Mentorship 12:00-1:30 AM
Orals 1:30-2:30 AM
[1:30] Denoising and change point localisation in piecewise-constant high-dimensional regression coefficients
[1:45] Noise Regularizes Over-parameterized Rank One Matrix Recovery, Provably
[2:00] Survival regression with proper scoring rules and monotonic neural networks
[2:15] Multivariate Quantile Function Forecaster
Orals 2:30-3:30 AM
[2:30] Differentiable Bayesian inference of SDE parameters using a pathwise series expansion of Brownian motion
[2:45] Nonparametric Relational Models with Superrectangulation
[3:00] Robust Bayesian Inference for Simulator-based Models via the MMD Posterior Bootstrap
[3:15] Unifying Importance Based Regularisation Methods for Continual Learning
Posters 4:30-6:00 AM
Orals 6:00-7:00 AM
[6:00] Almost Optimal Universal Lower Bound for Learning Causal DAGs with Atomic Interventions
[6:15] Variance Minimization in the Wasserstein Space for Invariant Causal Prediction
[6:30] On the Assumptions of Synthetic Control Methods
[6:45] Differentially Private Densest Subgraph
Community Activities and Mentorship 7:00-8:00 AM
Orals 7:00-8:00 AM
[7:00] Optimal Rates of (Locally) Differentially Private Heavy-tailed Multi-Armed Bandits
[7:15] Nonstochastic Bandits and Experts with Arm-Dependent Delays
[7:30] Towards Agnostic Feature-based Dynamic Pricing: Linear Policies vs Linear Valuation with Unknown Noise
[7:45] Towards an Understanding of Default Policies in Multitask Policy Optimization
Remarks 9:00-9:15 AM
Posters 10:15-11:45 AM
TUE 29 MAR
Posters 1:00-2:30 AM
Orals 2:30-3:30 AM
[2:30] Adversarially Robust Kernel Smoothing
[2:45] A Single-Timescale Method for Stochastic Bilevel Optimization
[3:00] Lifted Primal-Dual Method for Bilinearly Coupled Smooth Minimax Optimization
[3:15] Generative Models as Distributions of Functions
Orals 3:30-4:30 AM
[3:30] Amortized Rejection Sampling in Universal Probabilistic Programming
[3:45] Adaptive Importance Sampling meets Mirror Descent: a Bias-variance Tradeoff
[4:00] Loss as the Inconsistency of a Probabilistic Dependency Graph: Choose Your Model, Not Your Loss Function
[4:15] On the Consistency of Max-Margin Losses
Community Activities and Mentorship 5:30-7:00 AM
Orals 8:00-9:00 AM
[8:00] Many processors, little time: MCMC for partitions via optimal transport couplings
[8:15] Rapid Convergence of Informed Importance Tempering
[8:30] Projection Predictive Inference for Generalized Linear and Additive Multilevel Models
[8:45] Density Ratio Estimation via Infinitesimal Classification
Award Ceremony 9:45-10:00 AM
Community Activities and Mentorship 11:00 AM-12:00 PM
WED 30 MAR
Orals 12:00-1:00 AM
[12:00] Sampling from Arbitrary Functions via PSD Models
[12:15] Orbital MCMC
[12:30] Hardness of Learning a Single Neuron with Adversarial Label Noise
[12:45] Data-splitting improves statistical performance in overparameterized regimes
Orals 1:00-2:00 AM
[1:00] Beta Shapley: a Unified and Noise-reduced Data Valuation Framework for Machine Learning
[1:15] Faster Rates, Adaptive Algorithms, and Finite-Time Bounds for Linear Composition Optimization and Gradient TD Learning
[1:30] A general class of surrogate functions for stable and efficient reinforcement learning
[1:45] A Complete Characterisation of ReLU-Invariant Distributions
Invited Talk 2:30-3:30 AM: Mihaela van der Schaar
Posters 3:30-5:00 AM
Orals 6:00-7:00 AM
[6:00] Minimax Optimization: The Case of Convex-Submodular
[6:15] Doubly Mixed-Effects Gaussian Process Regression
[6:30] Fast and Scalable Spike and Slab Variable Selection in High-Dimensional Gaussian Processes
[6:45] Debiasing Samples from Online Learning Using Bootstrap
Orals 7:00-8:00 AM
[7:00] Entropy Regularized Optimal Transport Independence Criterion
[7:15] Two-Sample Test with Kernel Projected Wasserstein Distance
[7:30] Estimating Functionals of the Out-of-Sample Error Distribution in High-Dimensional Ridge Regression
[7:45] Heavy-tailed Streaming Statistical Estimation
Posters 8:30-10:00 AM
Community Activities and Mentorship 10:00 AM-12:00 PM