Yuanhe Zhang

My name is Yuanhe Zhang ('Russell' for short). I'm broadly interested in scaling laws of Large Language Models (LLMs) and efficient fine-tuning. Currently, I'm working with Dr. Fanghui Liu and Prof. Chenlei Leng on the learning theory of neural networks. Previously, I worked with Dr. Fanghui Liu and Dr. Yudong Chen (University of Wisconsin-Madison) on revealing the training dynamics and generalization of LoRA. I hold the Chancellor's International Scholarship.
Previously, I obtained an MMathStat from Warwick with first-class honours. My master's thesis, supervised by Dr. Ritabrata Dutta, was on density estimation with statistical guarantees in the high-dimensional, small-data regime. The resulting work was accepted at NeurIPS 2024.
I'm a huge fan of Denver Nuggets and Russell Westbrook.
Preprints:
Zhang, Y., Liu, F., & Chen, Y. (2025). One-step full gradient suffices for low-rank fine-tuning, provably and efficiently. arXiv preprint arXiv:2502.01235.
Publications:
Huk, D., Zhang, Y., Dutta, R., & Steel, M. (2024). Quasi-Bayes meets Vines. In The Thirty-eighth Annual Conference on Neural Information Processing Systems. Available at https://openreview.net/forum?id=gcpeEg88R3.
Services:
Program Associate, Reviewer @ NeurIPS 2024 Workshop on Fine-Tuning in Modern Machine Learning: Principles and Scalability (FITML).
Teaching:
24/25 Term 2: CS416 Optimisation