Seungyeon Kim

Research Scientist


Seungyeon Kim is a Research Scientist at Meta, where he is part of the Llama Research team advancing the frontier of large language models, reporting to Manohar Paluri. He holds a Ph.D. in Computer Science from the Georgia Institute of Technology. His research focuses on improving the capabilities and efficiency of machine learning systems, with particular interests in memorization in LLMs, adaptive computation, extreme classification, and model distillation. Prior to joining Meta, he was a Research Engineer at Google DeepMind under the supervision of Rob Fergus.