Tri Dao
Tri Dao is an Assistant Professor at Princeton University and Chief Scientist of Together AI. He completed his PhD in Computer Science at Stanford. He works at the intersection of machine learning and systems, and his research interests include hardware-aware algorithms and sequence models with long-range memory. His notable work includes FlashAttention and Mamba. His work has received the COLM 2024 Outstanding Paper award, the MLSys 2025 Outstanding Paper Honorable Mention, and the ICML 2022 Outstanding Paper Runner-up award.
AI2050 Project
Current AI systems excel at tasks with abundant training data but struggle in specialized fields requiring deep expertise, such as advanced engineering and scientific research. Dao’s project develops AI that learns through experimentation rather than by simply copying examples, using computational tools like compilers and simulators to provide feedback on proposed solutions. By combining this approach with new architectures that can understand entire technical systems at once, the project aims to enable AI to discover genuinely new solutions in domains where human experts are scarce. This approach could automate complex technical work across engineering, science, and medicine, dramatically accelerating innovation in fields critical to society.
Assistant Professor, Princeton University
Hard Problem: Capabilities