Noam Shazeer
Noam Shazeer is a renowned AI researcher and co-author of the seminal 2017 paper “Attention Is All You Need,” which introduced the Transformer architecture. He left Google in 2021 to co-found Character.AI, then returned to Google DeepMind in 2024 as part of a high-profile deal in which Google licensed Character.AI’s technology, and he now helps lead the development of Google’s large-scale language models, including the Gemini family.