Hong-Yu Chen | Northwestern University

Department of Computer Science.
Address: Mudd Hall 3205, 2233 Tech Drive, Third Floor, Evanston, IL 60208.
Email: charlie.chen@u.northwestern.edu.


Hi, I’m a second-year PhD student at Northwestern University in the MAGICS Lab, advised by Prof. Han Liu. I focus on advancing the understanding and application of foundation models. My research explores their theoretical foundations, including their universal approximation capabilities, their ability to emulate complex algorithms, and their connections to associative memory models such as the Hopfield model. I also work on extending their applications beyond language and vision, particularly to time series analysis and other scientific domains such as astrophysics. I received my B.S. degree in Physics from National Taiwan University, advised by Prof. Hsi-Sheng Goan.

news

Apr 30, 2026 Four co-first-author papers accepted to ICML 2026!
Mar 20, 2026 Honored to receive the Lambda Research Grant.
Mar 04, 2026 Co-presented a work-in-progress talk at SkAI Institute on StarEmbed: developing and benchmarking time series foundation models for irregularly sampled astronomical light curves.
Oct 08, 2025 Gave a talk at Open Accelerated Computing (OAC) Summit 2025 on project StarEmbed-GPT.
Sep 04, 2025 Gave a talk at Open SkAI 2025 on project StarEmbed-GPT.

selected publications

  1. Universality, Function Composition, and Algorithm Emulation All In-Context
    Hong-Yu Chen*, Po-Chiao Lin*, Maojiang Su, Jerry Yao-Chieh Hu, and Han Liu
    In ICML, 2026
    *Equal Contribution
  2. Universal Approximation with Softmax Attention
    Jerry Yao-Chieh Hu*, Hude Liu*, Hong-Yu Chen*, Weimin Wu, and Han Liu
    In ICML, 2026
    *Equal Contribution
  3. Chain-of-Thought Gradient Descent
    Hong-Yu Chen*, Venkat Sripad Ganti*, Jerry Yao-Chieh Hu, Hude Liu, and Han Liu
    In ICML, 2026
    *Equal Contribution
  4. StarEmbed: Benchmarking Time Series Foundation Models on Astronomical Observations of Variable Stars
    Weijian Li*, Hong-Yu Chen*, Nabeel Rehemtulla*, Ved G. Shah, Dennis Wu, Dongho Kim, Qinjie Lin, Adam A. Miller, and Han Liu
    In ICML, 2026
    *Equal Contribution
  5. Learning Spectral Methods by Transformers
    Yihan He, Yuan Cao, Hong-Yu Chen, Dennis Wu, Jianqing Fan, and Han Liu
    arXiv preprint arXiv:2501.01312, 2025
  6. Outlier-Efficient Hopfield Layers for Large Transformer-Based Models
    Jerry Yao-Chieh Hu, Pei-Hsuan Chang, Haozheng Luo, Hong-Yu Chen, Weijian Li, Wei-Po Wang, and Han Liu
    In ICML, 2024