Profile

Intro

I'm a Research Scientist at Meta GenAI. Before joining Meta GenAI, I was a Postdoctoral Researcher at Meta AI Research (FAIR), where I worked with Dr. Yuandong Tian. I received my Ph.D. from the Korea Advanced Institute of Science and Technology (KAIST) in 2023, advised by Prof. Sung Ju Hwang, and received the Best Ph.D. Dissertation Award from the KAIST College of Engineering and School of Computing. During my Ph.D. studies, I was a research intern at Meta AI, and my research was supported by a Google Ph.D. Fellowship.

My research focuses on developing efficient large-scale AI models for real-world applications. I specialize in model optimization techniques, including AutoML, Neural Architecture Search (NAS), meta-learning, hyperparameter optimization, and model compression through Knowledge Distillation (KD). Recently, I have been particularly interested in designing, analyzing, and evaluating large language models (LLMs) for efficient long-context generation and optimizing Retrieval-Augmented Generation (RAG) techniques, enabling practical deployment in real products.

Related keywords: LLM Efficiency, RAG, AutoML, NAS, KD, Meta-learning

Mentoring

Sohyun An, M.S. Student at KAIST, now Ph.D. Student at UCLA
  Apr. 2022 - Aug. 2024
  • Topic: Neural Architecture Search, Diffusion Models
  • Paper: AutoML 2022 [W1], ICLR 2023 [C6], ICLR 2024 [C9]

Sewoong Lee, Ph.D. Student at KAIST
  Feb. 2021 - Dec. 2021
  • Topic: Neural Architecture Search
  • Paper: NeurIPS 2021 [C4]

Eunyoung Hyung, M.S. Student at KAIST, now AI Researcher at Samsung Research
  Sep. 2019 - Jan. 2021
  • Topic: Neural Architecture Search
  • Paper: ICLR 2021 [C2], NeurIPS 2021 [C3]