2025-26 Frederick L. Hovde Distinguished Lecturer
Jianqing Fan
Frederick L. Moore Professor at Princeton University

Date: March 11, 2026
Location: DSAI
Time: To be determined
Classification- and diffusion-induced neural density estimators and simulators for generative AI and society
Abstract
Deep learning has tremendous applications in generative AI and society. After a brief introduction to its applications to various societal problems, the focus will be on generative AI. Neural network-based methods for conditional density estimation have recently gained substantial attention, as various neural density estimators have outperformed classical approaches in real-data experiments. Despite these empirical successes, implementation can be challenging due to the need to ensure non-negativity and unit-mass constraints, and theoretical understanding remains limited. In particular, it is unclear whether such estimators can adaptively achieve faster convergence rates when the underlying density exhibits a low-dimensional structure. This talk addresses these gaps by proposing a structure-agnostic neural density estimator, called the classification-induced neural density estimator and simulator (CINDES), that is straightforward to implement and provably adaptive, attaining faster rates when the true density admits a low-dimensional composition structure. Another key contribution of our work is to show that the proposed estimator integrates naturally into generative sampling pipelines, most notably score-based diffusion models, where it achieves provably faster convergence when the underlying density is structured. We validate its performance through extensive simulations and a real-data application. In a separate work, we also prove the optimality of score-based diffusion models for density estimation when the target density admits a factorizable, low-dimensional, nonparametric structure. The main challenge there is that the low-dimensional, factorizable structure no longer holds at most diffused timesteps, and it is difficult to show that the diffused score functions can be well approximated without a significant increase in the number of network parameters.
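The abstract does not spell out the construction of CINDES, but its name points to the general classification-based route to density estimation: train a classifier to separate data drawn from the target density p from samples drawn from a known reference density q; for a balanced, Bayes-optimal classifier the logit equals log p(x) - log q(x), so exponentiating it and multiplying by q yields a density estimate that is non-negative by construction. The sketch below illustrates only that generic idea under stated assumptions (PyTorch, a Gaussian reference density, a toy target, and a small MLP are all choices made here for illustration); it is not the procedure or theory presented in the talk.

```python
# A minimal sketch (not the CINDES implementation) of classification-based
# density estimation: a classifier separates data x ~ p (label 1) from
# reference samples z ~ q (label 0); at the balanced Bayes optimum,
# logit(x) = log p(x) - log q(x), so p_hat(x) = q(x) * exp(logit(x)).
import torch
import torch.nn as nn

torch.manual_seed(0)
dim = 2

# Toy target: a 2-d Gaussian mixture standing in for real data (assumption).
def sample_target(n):
    centers = torch.tensor([[-2.0, 0.0], [2.0, 0.0]])
    idx = torch.randint(0, 2, (n,))
    return centers[idx] + 0.5 * torch.randn(n, dim)

# Known reference density q: a wide isotropic Gaussian (assumption).
ref_scale = 3.0
ref = torch.distributions.MultivariateNormal(
    torch.zeros(dim), (ref_scale ** 2) * torch.eye(dim))

# Small MLP producing a single logit.
net = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    x = sample_target(256)          # label 1: target samples
    z = ref.sample((256,))          # label 0: reference samples
    inputs = torch.cat([x, z])
    labels = torch.cat([torch.ones(256, 1), torch.zeros(256, 1)])
    loss = bce(net(inputs), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()

def log_density(x):
    """Estimated log p(x) = log q(x) + logit(x); non-negativity of the
    density is automatic because the correction enters through exp."""
    with torch.no_grad():
        return ref.log_prob(x) + net(x).squeeze(-1)

print(log_density(torch.tensor([[-2.0, 0.0], [0.0, 0.0]])))
```

A density estimate of this exponential-tilting form sidesteps the non-negativity constraint mentioned in the abstract, and (as the abstract notes for CINDES) a learned density or score can in turn be plugged into score-based diffusion sampling pipelines.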
Bio
Jianqing Fan, Academician of Academia Sinica and member of the Royal Academy of Belgium, is the Frederick L. Moore Professor at Princeton University. He is a past president of the Institute of Mathematical Statistics and the International Chinese Statistical Association. He is the joint editor of the Journal of the American Statistical Association and was the co-editor of The Annals of Statistics, Probability Theory and Related Fields, Econometrics Journal, Journal of Econometrics, and Journal of Business and Economic Statistics. His research interests include high-dimensional statistics, data science, machine learning, mathematics of AI, financial economics, and computational biology. He has coauthored four books and published over 300 papers with more than 100,000 Google Scholar citations. His published work has been recognized by the 2000 COPSS Presidents' Award, the Morningside Gold Medal of Applied Mathematics, a Guggenheim Fellowship, the P. L. Hsu Prize, the Guy Medal in Silver, the Noether Distinguished Scholar Award, the Le Cam Award and Lecture, the Frontiers of Science Award, and the Wald Memorial Award and Lecture. He is a fellow of the American Association for the Advancement of Science, the Institute of Mathematical Statistics, the American Statistical Association, and the Society for Financial Econometrics.