About Me

I am a Ph.D. candidate at the Machine Learning and Artificial Intelligence (MLAI) Lab at KAIST, advised by Prof. Sung Ju Hwang (expected graduation: Aug. 2023). I was a research intern at Meta AI (Seattle, WA), and my research has been supported by a Google Ph.D. Fellowship. I will join the FAIR team at Meta as a postdoc under the supervision of Yuandong Tian.

My research interests include:

  • Neural architecture search
  • Meta-learning
  • AutoML

News

May. 2023. One paper accepted to Findings of ACL 2023
Apr. 2023. Google Travel Grant for ICLR 2023
Feb. 2023. Selected as a Keynote Speaker for the AutoML Conference 2023
Feb. 2023. Selected as the Online Experience Chair for the AutoML Conference 2023
Jan. 2023. AI/CS/EE Rising Star 2023, supported by Google Explore Computer Science Research
Jan. 2023. One paper accepted to ICLR 2023 as a Spotlight Presentation (notable-top-25%)
Oct. 2022. NeurIPS 2022 Top Reviewer
Sep. 2022. Google Ph.D. Fellowship
Aug. 2022. Research internship at Meta AI, Seattle, WA, USA
Feb. 2022. AI/CS/EE Rising Star 2022, supported by Google Explore Computer Science Research
Jan. 2022. One paper accepted to ICLR 2022 as Spotlight Presentation
Oct. 2021. Best presentation award at KAIST-ADD workshop
Sep. 2021. Two papers accepted to NeurIPS 2021 as Spotlight Presentations
Apr. 2021. Invited talk at Samsung, Suwon, South Korea
Jan. 2021. One paper accepted to ICLR 2021
Dec. 2020. Research internship at AITRICS, Seoul, South Korea
Dec. 2020. ICML 2020 Outstanding Reviewer (Top 30%)
Nov. 2020. NAVER Ph.D. Fellowship
Dec. 2019. One paper accepted to ICLR 2020 as Oral Presentation

Education

Awards & Honors

Work Experiences

Preprints

  • DiffusionNAG: Task-guided Neural Architecture Generation with Diffusion Models
    [paper]
    Sohyun An*, Hayeon Lee*, Jaehyeong Jo, Seanie Lee, Sung Ju Hwang (*: equal contribution)
    arXiv 2023
  • SuperNet in Neural Architecture Search: A Taxonomic Survey
    [paper]
    Stephen Cha, Taehyeon Kim, Hayeon Lee, Se-Young Yun
    arXiv 2022

International Conference Publications

  • A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models
    [paper]
    Hayeon Lee, Rui Hou, Jongpil Kim, Davis Liang, Sung Ju Hwang, Alexander Min
    Findings of ACL 2023

  • Meta-Prediction Model for Distillation-aware NAS on Unseen Datasets
    [paper] [code]
    Hayeon Lee*, Sohyun An*, Minseon Kim, Sung Ju Hwang (*: equal contribution)
    ICLR 2023, Spotlight Presentation (notable-top-25%)

  • Online Hyperparameter Meta-Learning with Hypergradient Distillation
    [paper]
    Hae Beom Lee, Hayeon Lee, Jaewoong Shin, Eunho Yang, Timothy Hospedales, Sung Ju Hwang
    ICLR 2022, Spotlight Presentation (acceptance = 176 / 3391 = 5.1%)

  • HELP: Hardware-Adaptive Efficient Latency Prediction for NAS via Meta-Learning
    [paper] [code]
    Hayeon Lee*, Sewoong Lee*, Song Chong, Sung Ju Hwang (*: equal contribution)
    NeurIPS 2021, Spotlight Presentation (acceptance < 3%)

  • Task-Adaptive Neural Network Search with Meta-Contrastive Learning
    [paper] [code]
    Wonyong Jeong*, Hayeon Lee*, Gun Park*, Eunyoung Hyung, Jinheon Baek, Sung Ju Hwang (*: equal contribution)
    NeurIPS 2021, Spotlight Presentation (acceptance < 3%)

  • Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets
    [paper] [code]
    Hayeon Lee*, Eunyoung Hyung*, Sung Ju Hwang (*: equal contribution)
    ICLR 2021

  • Learning to Balance: Bayesian Meta-Learning for Imbalanced and Out-of-distribution Tasks
    [paper] [code]
    Hae Beom Lee*, Hayeon Lee*, Donghyun Na*, Saehoon Kim, Minseop Park, Eunho Yang, Sung Ju Hwang (*: equal contribution)
    ICLR 2020, Oral Presentation (acceptance = 48/2594 = 1.9%)

Domestic Conference Publication

  • Learning Spatial Relationships for Cross Modal Retrieval
    Hayeon Lee*, Wonjun Yoon*, Jinseok Park, Sung Ju Hwang (*: equal contribution)
    CKAIA 2020

Workshop Publication

  • Lightweight Neural Architecture Search with Parameter Remapping and Knowledge Distillation
    [paper]
    Hayeon Lee*, Sohyun An*, Minseon Kim, Sung Ju Hwang (*: equal contribution)
    AutoML 2022 Workshop

Invited Talks

  • “Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets”
    • @ Samsung Electronics DS DIT Center, Republic of Korea, April 2021
    • @ Korea Agency for Defence Development, Republic of Korea, October 2021
  • “Task-Adaptive Neural Network Search with Meta-Contrastive Learning”
    • @ NeurIPS Social: ML in Korea, Dec 2021
    • @ Hanbat National University, Republic of Korea, April 2022
    • @ KAIST Programming Language Research Group, Republic of Korea, May 2022
    • @ Electronic & Information Research Information Center, Republic of Korea, May 2022
  • “HELP: Hardware-Adaptive Efficient Latency Prediction for NAS via Meta-Learning”
    • @ NeurIPS Social: ML in Korea, Dec 2021
    • @ Hanbat National University, Republic of Korea, April 2022
    • @ KAIST Programming Language Research Group, Republic of Korea, May 2022
    • @ Electronic & Information Research Information Center, Republic of Korea, May 2022
    • @ Ewha University, Republic of Korea, June 2022
  • “DiffusionNAG: Task-guided Neural Architecture Generation with Diffusion Models”
    • @ Hanbat National University, Republic of Korea, May 2023

News Articles

Academic Services

Conference Reviewer

  • ICML 2020, 2021 (expert), 2022, 2023
  • NeurIPS 2020, 2021, 2022, 2023
  • ICLR 2021, 2022, 2023
  • ACL Rolling Review (ARR), Dec. 2022
  • CVPR 2023
  • AAAI 2021
  • ACML 2020

Journal Reviewer

  • Transactions on Machine Learning Research (TMLR)

Chair

  • Online Experience Chair, AutoML Conference 2023

Projects

  • Human-Inspired Large-Scale Visual Recognition System, Samsung Electronics, 2019-2022

  • AutoML with Large-scale Hyperparameter Meta-Learning, Google, 2022-Present