Luke Zettlemoyer
Luke Zettlemoyer is a Professor in the Paul G. Allen School of Computer Science & Engineering at the University of Washington and a Senior Research Director at Meta. His research focuses on empirical methods for natural language semantics and involves designing machine learning algorithms, introducing new tasks and datasets, and, most recently, studying how best to develop self-supervision signals for pre-training. His honors include election as ACL President, being named an ACL Fellow, a PECASE award, an Allen Distinguished Investigator Award, and multiple best paper awards. Luke was an undergraduate at NC State, received his PhD from MIT, and was a postdoc at the University of Edinburgh.
AI2050 Project
Frontier language models (LMs) are monolithic: they are designed under the assumption that a single very large model, trained by a centralized authority on all available data, will always perform best on every new task or use case. Zettlemoyer’s project challenges this assumption, arguing that breaking the LM monolith can improve performance, open up the development process, unlock new levels of scaling, and enable better model specialization and more responsible data use. The project fundamentally rethinks the relationship between data and model parameters by developing new language models and training mechanisms that scale far beyond current methods.
Hard Problem: Capabilities