About Me
I recently received my Ph.D. from the School of Computing at KAIST, where I was advised by Prof. Sung Ju Hwang. I was a research intern at Meta AI (Seattle, WA), and my research has been supported by a Google Ph.D. Fellowship. My Ph.D. dissertation won the Best Ph.D. Dissertation Award from both the School of Computing and the College of Engineering at KAIST. I am currently a postdoctoral researcher at the Fundamental AI Research (FAIR) Labs at Meta AI, working with Yuandong Tian. Here is my CV (last update: 2024. 01. 31).
My research interests include:
- Efficient LLMs
- Neural Architecture Search
- Meta-learning
News
2024. 02. Best Ph.D. Dissertation Award from both the KAIST College of Engineering and the School of Computing
2024. 01. One paper accepted to ICLR 2024
2023. 10. One paper accepted to Findings of EMNLP 2023
2023. 09. I joined FAIR Labs at Meta AI as a postdoctoral researcher, working with Yuandong Tian.
2023. 09. Keynote at the AutoML Conference 2023, Germany
2023. 05. Hayeon successfully defended her Ph.D.
2023. 05. One paper accepted to Findings of ACL 2023
2023. 04. Google Travel Grant for ICLR 2023
2023. 02. Selected as Online Experience Chair for the AutoML Conference 2023
2023. 01. AI/CS/EE Rising Stars 2023 supported by Google Explore Computer Science Research
2023. 01. One paper accepted to ICLR 2023 as Notable-top-25% - Spotlight Presentation
2022. 10. NeurIPS 2022 Top Reviewer
2022. 09. Google Ph.D. Fellowship
2022. 08. Research internship at Meta AI, Seattle, WA, USA
2022. 02. AI/CS/EE Rising Stars 2022 supported by Google Explore Computer Science Research
2022. 01. One paper accepted to ICLR 2022 as Spotlight Presentation
Education
Ph.D. in School of Computing, Korea Advanced Institute of Science and Technology (KAIST)
Mar. 2018 - Aug. 2023
Dissertation Title: “Efficient and Generalizable Neural Architecture Search for the Real World”
(Best Ph.D. Dissertation Award from both School of Computing & College of Engineering)
Committee: Sung Ju Hwang, Eunho Yang, Se-Young Yun, Frank Hutter, Cho-Jui Hsieh

M.S. in School of Computing, Korea Advanced Institute of Science and Technology (KAIST)
Mar. 2016 - Feb. 2018

B.S. in Computer Science, Sungkyunkwan University
Mar. 2012 - Feb. 2016
Awards & Honors
- Best Ph.D. Dissertation Award
  For the top 9 out of 756 (1%) Ph.D. dissertations, College of Engineering, KAIST, 2024
- Best Ph.D. Dissertation Award
  For the top 1 out of 23 (4%) Ph.D. dissertations, School of Computing, KAIST, 2024
- Google Travel Grant
  ICLR, 2023
- Keynote Speaker
  AutoML Conference, 2023
- Online Experience Chair
  AutoML Conference, 2023
- AI/CS/EE Rising Stars Award 2023
  Google Explore Computer Science Research, 2023
- Top Reviewer
  NeurIPS, 2022
- Google Ph.D. Fellowship
  One of the five recipients from the Republic of Korea, 2022
- AI/CS/EE Rising Stars Award 2022
  Google Explore Computer Science Research, 2022
- The Best Presentation Award
  KAIST-Korea Agency for Defense Development Workshop, 2021
- NAVER Ph.D. Fellowship
  One of the ten Ph.D. students with outstanding research outcomes in the KAIST CS Dept., 2020
- Outstanding Reviewer (Top 30%)
  ICML, 2020
- Kyunghyun Cho Travel Grant
  ICLR, 2020
Work Experiences
- Postdoctoral Researcher, FAIR (Meta AI), Menlo Park, CA (Sep 2023 - Present)
- Research Intern, Meta AI, Seattle, WA (Aug 2022 - Dec 2022)
- Research Intern, AITRICS, Seoul, South Korea (Dec 2020 - Feb 2021)
- Research Intern, National AI Research Institute, Daejeon, South Korea (Jul 2015 - Jun 2015)
- Developer Intern, Samsung Electronics, Suwon, South Korea (Jan 2015 - Feb 2015)
Preprints
- SuperNet in Neural Architecture Search: A Taxonomic Survey
[paper]
Stephen Cha, Taehyeon Kim, Hayeon Lee, Se-Young Yun
arXiv 2022
International Conference Publications
DiffusionNAG: Predictor-guided Neural Architecture Generation with Diffusion Models
[paper] [code]
Sohyun An*, Hayeon Lee*, Jaehyeong Jo, Seanie Lee, Sung Ju Hwang (*: equal contribution)
ICLR 2024

Co-training and Co-distillation for Quality Improvement and Compression of Language Models
[paper]
Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Hongbo Zhang, Sung Ju Hwang and Alexander Min
Findings of EMNLP 2023 (long paper)

A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models
[paper]
Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Sung Ju Hwang and Alexander Min
Findings of ACL 2023

Meta-Prediction Model for Distillation-aware NAS on Unseen Datasets
[paper] [code]
Hayeon Lee*, Sohyun An*, Minseon Kim, Sung Ju Hwang (*: equal contribution)
ICLR 2023, Spotlight (notable-top-25%)

Online Hyperparameter Meta-Learning with Hypergradient Distillation
[paper]
Hae Beom Lee, Hayeon Lee, Jaewoong Shin, Eunho Yang, Timothy Hospedales, Sung Ju Hwang
ICLR 2022, Spotlight (acceptance = 176 / 3391 = 5.1%)

HELP: Hardware-Adaptive Efficient Latency Prediction for NAS via Meta-Learning
[paper] [code]
Hayeon Lee*, Sewoong Lee*, Song Chong, Sung Ju Hwang (*: equal contribution)
NeurIPS 2021, Spotlight (acceptance < 3%)

Task-Adaptive Neural Network Search with Meta-Contrastive Learning
[paper] [code]
Wonyong Jeong*, Hayeon Lee*, Gun Park*, Eunyoung Hyung, Jinheon Baek, Sung Ju Hwang (*: equal contribution)
NeurIPS 2021, Spotlight (acceptance < 3%)

Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets
[paper] [code]
Hayeon Lee*, Eunyoung Hyung*, Sung Ju Hwang (*: equal contribution)
ICLR 2021

Learning to Balance: Bayesian Meta-Learning for Imbalanced and Out-of-distribution Tasks
[paper] [code]
Hae Beom Lee*, Hayeon Lee*, Donghyun Na*, Saehoon Kim, Minseop Park, Eunho Yang, Sung Ju Hwang (*: equal contribution)
ICLR 2020, Oral (acceptance = 48/2594 = 1.9%)
Domestic Conference Publication
- Learning Spatial Relationships for Cross Modal Retrieval
Hayeon Lee*, Wonjun Yoon*, Jinseok Park, Sung Ju Hwang (*: equal contribution)
CKAIA 2020
Workshop Publication
- Lightweight Neural Architecture Search with Parameter Remapping and Knowledge Distillation
[paper]
Hayeon Lee*, Sohyun An*, Minseon Kim, Sung Ju Hwang (*: equal contribution)
AutoML 2022 Workshop
Keynote
- “Transferable Neural Architecture Search with Diffusion Models for the Real World”
- @ AutoML Conference 2023, Germany, September 2023
Invited Talks
- “Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets”
- @ Samsung Electronics DS DIT Center, Korea, April 2021
- @ Korea Agency for Defense Development, Korea, October 2021
- “Task-Adaptive Neural Network Search with Meta-Contrastive Learning”
- @ NeurIPS Social: ML in Korea, Dec 2021
- @ Hanbat National University in Republic of Korea, April 2022
- @ KAIST Programming Language Research Group in Republic of Korea, May 2022
- @ Electronic & Information Research Information Center in Republic of Korea, May 2022
- “HELP: Hardware-Adaptive Efficient Latency Prediction for NAS via Meta-Learning”
- @ NeurIPS Social: ML in Korea, Dec 2021
- @ Hanbat National University in Republic of Korea, April 2022
- @ KAIST Programming Language Research Group in Republic of Korea, May 2022
- @ Electronic & Information Research Information Center in Republic of Korea, May 2022
- @ Ewha University in Republic of Korea, June 2022
- “DiffusionNAG: Task-guided Neural Architecture Generation with Diffusion Models”
- @ Hanbat National University in Republic of Korea, May 2023
News Articles
- Hayeon Lee (School of Computing) and Yujeong Choi (School of Electrical Engineering) selected as 2022 Google Ph.D. Fellows, AI Times, 2022.09.08
- Prof. Sung Ju Hwang's team at KAIST reveals at NeurIPS how it solved a Big Tech AutoML problem, AI Times, 2021.12.08
- "Which Korean research results were presented at NeurIPS?" EIRIC to hold review seminars starting in May, AI Times, 2022.04.27
Academic Services
Area Chair
- AutoML 2024
Online Experience Chair
- AutoML 2023
Conference Reviewer
- ICML 2020, 2021 (expert), 2022, 2023, 2024
- NeurIPS 2020, 2021, 2022, 2023
- ICLR 2021, 2022, 2023, 2024
- ACL Rolling Review (ARR), December 2022
- CVPR 2023, 2024
- AAAI 2021
- ACML 2020
Journal Reviewer
- Transactions on Machine Learning Research (TMLR)
Projects
Human-Inspired Large-Scale Visual Recognition System, Samsung Electronics, 2019-2022
AutoML with Large-scale Hyperparameter Meta-Learning, Google, 2022-2023