
John D. Lafferty

From Wikipedia, the free encyclopedia


John D. Lafferty is an American scientist, Professor at Yale University, and a leading researcher in machine learning. He is best known for proposing conditional random fields (CRFs) with Andrew McCallum and Fernando C. N. Pereira.[4]
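In the standard linear-chain form introduced in the 2001 paper, a CRF models the conditional distribution of a label sequence y given an observation sequence x as a normalized exponential of weighted feature functions:

```latex
p(y \mid x) \;=\; \frac{1}{Z(x)} \exp\!\left( \sum_{t=1}^{T} \sum_{k} \lambda_k \, f_k(y_{t-1}, y_t, x, t) \right)
```

Here the $f_k$ are feature functions over adjacent labels and the observations, the $\lambda_k$ are learned weights, and $Z(x)$ is the input-dependent normalizing constant; conditioning on $x$ directly avoids the label-bias problem of locally normalized sequence models.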


Biography


In 2017, Lafferty was appointed the John C. Malone Professor of Statistics and Data Science at Yale University.[6] He previously taught at the University of Chicago as Louis Block Professor of Statistics and Computer Science,[6] and has held positions at the University of California, Berkeley and the University of California, San Diego. His research interests lie in statistical machine learning,[2][3] information retrieval,[5] and natural language processing,[7] with a focus on computational and statistical aspects of nonparametric methods, high-dimensional data and graphical models.

Before moving to the University of Chicago in 2011, Lafferty had been on the faculty of Carnegie Mellon University since 1994, where he helped to found the world's first machine learning department. Before CMU, he was a research staff member at the IBM Thomas J. Watson Research Center, where he worked on speech and natural language processing in the group led by Frederick Jelinek. Lafferty received a Ph.D. in mathematics from Princeton University in 1986, where he was a member of the Program in Applied and Computational Mathematics, advised by Edward Nelson. He was an assistant professor in the Mathematics Department at Harvard University before joining IBM.[8]

He was elected a Fellow of the IEEE in 2007 "for contributions to statistical pattern recognition and statistical language processing".[1]


Academic career

Lafferty has held many positions, including: 1) program co-chair and general co-chair of the Neural Information Processing Systems (NIPS) conferences; 2) co-director of CMU's Machine Learning Ph.D. program; 3) associate editor of the Journal of Machine Learning Research[9] and the Electronic Journal of Statistics; and 4) member of the Committee on Applied and Theoretical Statistics (CATS) of the National Research Council.[10]

He has also received numerous awards, including Test-of-Time Awards at the International Conference on Machine Learning (ICML) in 2011 and 2012,[2][3] the ICML 2013 Classic Paper Prize,[4] and a Test-of-Time Award at the Special Interest Group on Information Retrieval (SIGIR) conference in 2014.[5]


Selected works

  • 1990. A statistical approach to machine translation.[7]
The idea of statistical machine translation was born in the labs of IBM Research.[11]
  • 2001. Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data.
Test-of-Time Award of ICML 2011.[2]
  • 2002. Diffusion Kernels on Graphs and Other Discrete Input Spaces.
Test-of-Time Award of ICML 2012.[3]
  • 2003. Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions.
Classic Paper Prize of ICML 2013.[4]
  • 2003. Beyond independent relevance: methods and evaluation metrics for subtopic retrieval.
Test-of-Time Award of SIGIR 2014.[5]
  • 2006. Dynamic topic models. ICML'06.
