Hi there! I'm Qi Li, a CS PhD student at Tsinghua University and a visiting student at HKUST-GZ. I received my Master's degree from Tsinghua University and my Bachelor's degree in computer science from Lanzhou University.
My current research interests include:
- Efficient algorithms for LLMs: PEFT, knowledge editing
- Post-training of LLMs
- Machine learning systems for LLMs
- Understanding LLMs from both theoretical and empirical perspectives
I'm open to collaboration and discussion. Feel free to reach out via email!
Email: lqinfdim AT 163.com
🔥 News
- 2026.05: 🎉🎉 One paper accepted at ICML 2026.
- 2026.01: 🎉🎉 One paper accepted at ICLR 2026.
- 2025.09: 🎉🎉 One paper accepted at NeurIPS 2025.
- 2025.05: 🎉🎉 One paper accepted at ACL 2025.
- 2025.04: 🎉🎉 One paper accepted at ISIT 2025.
- 2024.09: 🎉🎉 One paper accepted at NeurIPS 2024.
- 2024.05: 🎉🎉 One paper accepted at ACL 2024.
📖 Education
- 2025.08 - present, Tsinghua University, PhD student.
- 2019.08 - 2022.08, Tsinghua University, Master of Engineering.
- 2014.09 - 2018.06, Lanzhou University, Bachelor of Engineering.
📝 Selected Publications
- Zhang Q, Chu C, Peng T, et al. LoRA-DA: Data-Aware Initialization for Low-Rank Adaptation via Asymptotic Analysis. Forty-Third International Conference on Machine Learning (ICML 2026, CCF-A).
- Li Q, Wu J, Liu X, et al. Reasoning Language Model Inference Serving Unveiled: An Empirical Study. Long paper, the Fourteenth International Conference on Learning Representations (ICLR 2026, CCF-A).
- Li Q, et al. AdaEdit: Advancing Continuous Knowledge Editing for Large Language Models. Long paper, the 63rd Annual Meeting of the Association for Computational Linguistics (ACL 2025, CCF-A).
- Zhang Q, Fu H, Huang G, Liang Y, Chu C, Peng T, Wu Y, Li Q, Li Y, Huang S-L. A High-Dimensional Statistical Method for Optimizing Transfer Quantities in Multi-Source Transfer Learning. Main track paper, the Thirty-Ninth Annual Conference on Neural Information Processing Systems (NeurIPS 2025, CCF-A).
- Peng T, Li Q, Huang S-L. On the Optimal Second-Order Convergence Rate of Minimax Estimation Under Weighted MSE. IEEE International Symposium on Information Theory, June 2025 (ISIT 2025 Oral, Tsinghua B).
- Li Q, et al. Should We Really Edit Language Models? On the Comprehensive Evaluation of Edited Language Models. Main track paper, the Thirty-Eighth Annual Conference on Neural Information Processing Systems (NeurIPS 2024, CCF-A).
- Li Q, et al. Can We Continually Edit Language Models? On the Knowledge Attenuation in Sequential Model Editing. Long paper, the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024, CCF-A).
- Li Q, et al. Harnessing the Power of Pre-trained Vision-Language Models for Efficient Medical Report Generation. Long paper, the 32nd ACM International Conference on Information and Knowledge Management (CIKM 2023 Oral, CCF-B).
💻 Projects
(Click the images for more details.)
🎖 Honors and Awards
- 2015.12, First Class Scholarship for Outstanding Students, Lanzhou University.
- 2017.12, First Class Scholarship for Outstanding Students, Lanzhou University.
- 2015.12, Outstanding Student at Lanzhou University.
- 2016.12, Outstanding Student at Lanzhou University.
- 2017.12, Outstanding Student at Lanzhou University.
- 2020.12, Scholarship for Excellent Students, Tsinghua Shenzhen International Graduate School.
📋 Academic Services
- Conference Reviewer: ACL ARR'25,26, ICML'25,26, ICLR'25,26, NeurIPS'24,25,26, CVPR'26, EMNLP'23,24,25, ICASSP'23,24,25,26, ECAI'23,24, ICME'24,25,26
- Journal Reviewer: TMLR, DMLR

