BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//132.216.98.100//NONSGML kigkonsult.se iCalcreator 2.20.4//
BEGIN:VEVENT
UID:20250707T073741EDT-49612IguEM@132.216.98.100
DTSTAMP:20250707T113741Z
DESCRIPTION:Approximate Cross-Validation for Large Data and High Dimensions\n\nAbstract:\n\nThe error or variability of statistical and machine learning algorithms is often assessed by repeatedly re-fitting a model with different weighted versions of the observed data. The ubiquitous tools of cross-validation (CV) and the bootstrap are examples of this technique. These methods are powerful in large part due to their model agnosticism but can be slow to run on modern\, large data sets due to the need to repeatedly re-fit the model. We use a linear approximation to the dependence of the fitting procedure on the weights\, producing results that can be faster than repeated re-fitting by orders of magnitude. This linear approximation is sometimes known as the “infinitesimal jackknife” (IJ) in the statistics literature\, where it has mostly been used as a theoretical tool to prove asymptotic results. We provide explicit finite-sample error bounds for the infinitesimal jackknife in terms of a small number of simple\, verifiable assumptions. Without further modification\, though\, we note that the IJ deteriorates in accuracy in high dimensions and incurs a running time roughly cubic in dimension. We additionally show\, then\, how dimensionality reduction can be used to successfully run the IJ in high dimensions when data is sparse or low rank. Simulated and real-data experiments support our theory.\n\nSpeaker\n\nTamara Broderick is an Associate Professor in the Department of Electrical Engineering and Computer Science at MIT. She is a member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL)\, the MIT Statistics and Data Science Center\, and the Institute for Data\, Systems\, and Society (IDSS). She completed her Ph.D. in Statistics at the University of California\, Berkeley in 2014. Previously\, she received an AB in Mathematics from Princeton University (2007)\, a Master of Advanced Study for completion of Part III of the Mathematical Tripos from the University of Cambridge (2008)\, an MPhil by research in Physics from the University of Cambridge (2009)\, and an MS in Computer Science from the University of California\, Berkeley (2013). Her recent research has focused on developing and analyzing models for scalable Bayesian machine learning. She has been awarded an Early Career Grant (ECG) from the Office of Naval Research (2020)\, an AISTATS Notable Paper Award (2019)\, an NSF CAREER Award (2018)\, a Sloan Research Fellowship (2018)\, an Army Research Office Young Investigator Program (YIP) award (2017)\, Google Faculty Research Awards\, an Amazon Research Award\, the ISBA Lifetime Members Junior Researcher Award\, the Savage Award (for an outstanding doctoral dissertation in Bayesian theory and methods)\, the Evelyn Fix Memorial Medal and Citation (for the Ph.D. student on the Berkeley campus showing the greatest promise in statistical research)\, the Berkeley Fellowship\, an NSF Graduate Research Fellowship\, a Marshall Scholarship\, and the Phi Beta Kappa Prize (for the graduating Princeton senior with the highest academic average).\n\nZoom Link
DTSTART:20201113T203000Z
DTEND:20201113T213000Z
SUMMARY:Tamara Broderick (MIT)
URL:/mathstat/channels/event/tamara-broderick-mit-326204
END:VEVENT
END:VCALENDAR
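
For readers who want a concrete picture of the IJ linear approximation the abstract describes, here is a minimal sketch of approximate leave-one-out cross-validation. It assumes an L2-regularized logistic regression fit by Newton's method; all function names, the model choice, and the fitting loop are illustrative assumptions, not code from the talk.

import numpy as np

# Sketch: infinitesimal-jackknife (IJ) approximation to leave-one-out CV.
# Setting observation n's weight from 1 to 0 changes the fitted theta by
# approximately +H^{-1} g_n, where g_n is observation n's loss gradient at
# the full fit and H is the Hessian of the full regularized objective.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lam=1.0, n_iter=25):
    """Fit theta minimizing sum_n loss_n(theta) + (lam/2)||theta||^2 by Newton."""
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(n_iter):
        p = sigmoid(X @ theta)
        grad = X.T @ (p - y) + lam * theta
        H = X.T @ (X * (p * (1 - p))[:, None]) + lam * np.eye(d)
        theta -= np.linalg.solve(H, grad)
    return theta

def ij_loo_estimates(X, y, theta, lam=1.0):
    """Approximate the n leave-one-out fits with one linear solve per point,
    instead of n full re-fits."""
    n, d = X.shape
    p = sigmoid(X @ theta)
    H = X.T @ (X * (p * (1 - p))[:, None]) + lam * np.eye(d)
    G = X * (p - y)[:, None]            # row n is g_n at the full fit
    dtheta = np.linalg.solve(H, G.T).T  # row n is H^{-1} g_n
    return theta[None, :] + dtheta      # approximate leave-one-out fits

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (sigmoid(X @ rng.normal(size=5)) > rng.uniform(size=500)).astype(float)
theta = fit_logistic(X, y)
thetas_loo = ij_loo_estimates(X, y, theta)
loo_probs = sigmoid(np.sum(X * thetas_loo, axis=1))  # approximate held-out predictions

The solve against the d-by-d Hessian H is where the roughly cubic-in-dimension running time mentioned in the abstract arises, which is what motivates the dimensionality-reduction step for sparse or low-rank data.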