* denotes equal contribution.
2026
-
Universality, Function Composition, and Algorithm Emulation All In-Context
Hong-Yu Chen*, Po-Chiao Lin*, Maojiang Su, Jerry Yao-Chieh Hu, and Han Liu
In ICML, 2026
-
Universal Approximation with Softmax Attention
Jerry Yao-Chieh Hu*, Hude Liu*, Hong-Yu Chen*, Weimin Wu, and Han Liu
In ICML, 2026
-
Chain-of-Thought Gradient Descent
Hong-Yu Chen*, Venkat Sripad Ganti*, Jerry Yao-Chieh Hu, Hude Liu, and Han Liu
In ICML, 2026
-
StarEmbed: Benchmarking Time Series Foundation Models on Astronomical Observations of Variable Stars
Weijian Li*, Hong-Yu Chen*, Nabeel Rehemtulla*, Ved G. Shah, Dennis Wu, Dongho Kim, Qinjie Lin, Adam A. Miller, and Han Liu
In ICML, 2026
2025
-
Learning Spectral Methods by Transformers
Yihan He, Yuan Cao, Hong-Yu Chen, Dennis Wu, Jianqing Fan, and Han Liu
arXiv preprint arXiv:2501.01312, 2025
-
2024
-
Outlier-Efficient Hopfield Layers for Large Transformer-Based Models
Jerry Yao-Chieh Hu, Pei-Hsuan Chang, Haozheng Luo, Hong-Yu Chen, Weijian Li, Wei-Po Wang, and Han Liu
In ICML, 2024