Statistical inference with stochastic gradient algorithms

Authors: Jeffrey Negrea; Jun Yang; Haoyue Feng; Daniel Roy; Jonathan Huggins
Type: Preprint
Dates: 2023-07-25; 2022-11; 2022-11-14; 2023-02-09
DOI: 10.48550/arXiv.2207.12395
Handle: https://hdl.handle.net/2144/46492
Citation: J. Negrea, J. Yang, H. Feng, D. Roy, J. Huggins. 2022. "Statistical Inference with Stochastic Gradient Algorithms." https://doi.org/10.48550/arXiv.2207.12395
Language: en-US
License: This article is distributed under the terms of the Creative Commons Attribution 4.0 International license (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/

Abstract: The tuning of stochastic gradient algorithms (SGAs) for optimization and sampling is often based on heuristics and trial-and-error rather than generalizable theory. We address this theory–practice gap by characterizing the large-sample statistical asymptotics of SGAs via a joint step-size–sample-size scaling limit. We show that iterate averaging with a large fixed step size is robust to the choice of tuning parameters and asymptotically has covariance proportional to that of the MLE sampling distribution. We also prove a Bernstein–von Mises-like theorem to guide tuning, including for generalized posteriors that are robust to model misspecification. Numerical experiments validate our results and recommendations in realistic finite-sample regimes. Our work lays the foundation for a systematic analysis of other stochastic gradient Markov chain Monte Carlo algorithms for a wide range of models.
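To make the abstract's central recommendation concrete, here is a minimal sketch (not the authors' code) of constant step-size SGD with Polyak–Ruppert iterate averaging on a hypothetical toy linear-regression problem. All names, the step size, and the data-generating setup are illustrative assumptions; the sketch only shows the general pattern in which the averaged iterate is expected to be less sensitive to the step-size choice than the final iterate.

```python
# Illustrative sketch, assuming a toy least-squares model; this is not
# the implementation from Negrea et al. (2022).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = X @ theta_true + noise (hypothetical toy setup).
n, d = 10_000, 5
theta_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ theta_true + rng.normal(size=n)

def sgd_iterate_average(step_size, n_steps, batch_size=32):
    """Run constant step-size SGD on the least-squares loss; return
    both the final iterate and the average of all iterates."""
    theta = np.zeros(d)
    running_sum = np.zeros(d)
    for _ in range(n_steps):
        # Minibatch stochastic gradient of the squared-error loss.
        idx = rng.integers(0, n, size=batch_size)
        grad = X[idx].T @ (X[idx] @ theta - y[idx]) / batch_size
        theta -= step_size * grad
        running_sum += theta
    return theta, running_sum / n_steps

# A comparatively large *fixed* step size: per the paper's asymptotics,
# the averaged iterate should be robust to this tuning choice, while the
# final iterate keeps fluctuating at a scale set by the step size.
theta_last, theta_avg = sgd_iterate_average(step_size=0.1, n_steps=50_000)
print("final iterate error:   ", np.linalg.norm(theta_last - theta_true))
print("averaged iterate error:", np.linalg.norm(theta_avg - theta_true))
```

In this sketch the averaged iterate typically lands much closer to the true parameter than the last iterate, which is the qualitative behavior the abstract's robustness claim describes; the paper's quantitative statement (covariance proportional to that of the MLE sampling distribution) is proved in the preprint itself.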