
Lam Nguyen

Vietnamese computer scientist and applied mathematician From Wikipedia, the free encyclopedia


Lam M. Nguyen is a Vietnamese computer scientist and applied mathematician known for his contributions to optimization algorithms for machine learning, and in particular for proposing and developing the SARAH stochastic recursive gradient method.[1] He is a Research Scientist at IBM Research, Thomas J. Watson Research Center, New York, USA, where his research focuses on the intersection of optimization and machine learning.[2] He is an INFORMS Senior Member and a member of the Beta Gamma Sigma business honor society.[3][4]


Education and career

Nguyen earned a Bachelor of Science degree in Applied Mathematics and Computer Science from Lomonosov Moscow State University (2008), an M.B.A. from McNeese State University (2013), and a Ph.D. in Industrial and Systems Engineering from Lehigh University (2018). His dissertation, "A Service System with On-Demand Agents, Stochastic Gradient Algorithms and the SARAH Algorithm", received the university's Elizabeth V. Stout Dissertation Award.[5] His doctoral advisor was Katya Scheinberg.

He joined IBM Research in 2018 as a Research Scientist.[2] Since 2020, Nguyen has been a Principal Investigator at the MIT-IBM Watson AI Lab, leading projects on safe and interpretable learning for time-series data. He was appointed Adjunct Faculty at Lehigh University in 2024.[6]


Research and contributions

Nguyen's research centers on optimization methods for machine learning and stochastic optimization. He is recognized as the lead inventor of the SARAH (Stochastic Recursive Gradient) algorithm, introduced at ICML 2017, which has influenced a wide class of variance-reduced optimization methods.[1][7][8][9][10]
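The core of SARAH is a recursive gradient estimator: each outer iteration starts from a full gradient, and each inner step updates the estimator as v_t = ∇f_i(w_t) − ∇f_i(w_{t−1}) + v_{t−1} for a randomly sampled component i, then takes a gradient step along v_t. A minimal NumPy sketch on a toy least-squares problem (the function names, step size, and loop counts here are illustrative choices, not values from the original paper):

```python
import numpy as np

def sarah(grad_i, w0, n, eta=0.05, outer=20, inner=50, seed=0):
    """Minimal SARAH sketch: the outer loop computes a full gradient,
    the inner loop applies the recursive estimator
    v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}."""
    rng = np.random.default_rng(seed)
    w = np.array(w0, dtype=float)
    for _ in range(outer):
        # full gradient at the current outer iterate
        v = np.mean([grad_i(w, i) for i in range(n)], axis=0)
        w_prev, w = w, w - eta * v
        for _ in range(inner):
            i = int(rng.integers(n))
            # recursive stochastic gradient estimator
            v = grad_i(w, i) - grad_i(w_prev, i) + v
            w_prev, w = w, w - eta * v
    return w

# toy problem: f(w) = (1/n) * sum_i 0.5 * (a_i . w - b_i)^2
rng = np.random.default_rng(42)
A = rng.normal(size=(100, 5))
b = A @ rng.normal(size=5)
grad_i = lambda w, i: (A[i] @ w - b[i]) * A[i]
w_hat = sarah(grad_i, np.zeros(5), n=100)
```

Unlike SVRG, which always anchors the correction to a fixed snapshot point, SARAH's estimator is built recursively from the previous inner iterate, which is what yields its distinctive convergence guarantees.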

He is the co-editor of the book Federated Learning: Theory and Practice (Elsevier, 2024), which provides a unified treatment of the theoretical and practical aspects of federated learning.[11]


Editorial and professional service

Nguyen serves as an Action Editor for the Journal of Machine Learning Research and Machine Learning, and as an Associate Editor of the Journal of Optimization Theory and Applications.[12][13][14] He has served on the Organizing Committee of the Conference on Neural Information Processing Systems (NeurIPS) from 2023 to 2025,[15] and as a Senior Area Chair for the International Conference on Machine Learning (ICML), the International Conference on Learning Representations (ICLR), NeurIPS, and the International Conference on Artificial Intelligence and Statistics (AISTATS). He has organized workshops at NeurIPS 2021 and AAAI 2023.

Invited and plenary talks

Nguyen has delivered invited lectures at major conferences, including multiple INFORMS Annual Meetings. He is a plenary speaker at the International Conference on Modeling, Computation and Optimization (MCO 2025), held at the University of Lorraine, France, presenting "Advances in Non-Convex Optimization: Shuffling Methods and Momentum Techniques for Machine Learning".[16]

Selected publications

  • Nguyen, L. M. et al. (2017). "SARAH: A Novel Method for Machine Learning Problems Using Stochastic Recursive Gradient." In Proceedings of ICML 2017.[1]
  • Nguyen, L. M. et al. (2021). "A Unified Convergence Analysis for Shuffling-Type Gradient Methods." Journal of Machine Learning Research.[17]
  • Nguyen, L. M. et al. (2024). Federated Learning: Theory and Practice. Elsevier.[11]

References
