I'm a Research Scientist at the
My research focuses on developing efficient large-scale AI models for real-world applications. I specialize in model optimization techniques, including AutoML, Neural Architecture Search (NAS), meta-learning, hyperparameter optimization, and model compression via Knowledge Distillation (KD). Recently, I have been particularly interested in designing, analyzing, and evaluating large language models (LLMs) for efficient long-context generation, and in optimizing Retrieval-Augmented Generation (RAG) techniques to enable practical deployment in real products.
Related keywords: LLM Efficiency, RAG, AutoML, NAS, KD, Meta-learning