Hi there! I’m Qi Li, a CS PhD student at Tsinghua University and a visiting student at HKUST-GZ. I received my Master’s degree in Engineering from Tsinghua University and my Bachelor’s degree in Engineering from Lanzhou University (LZU).
My current research interests focus on:
- Efficient algorithms for LLMs: PEFT, knowledge editing
- Post-training of LLMs
- Machine learning systems for LLMs
- Understanding LLMs from both theoretical and empirical perspectives
I’m open to collaboration and discussion. Feel free to reach out via email!
Email: lqinfdim AT 163.com
🔥 News
- 2026.01: 🎉🎉 One paper accepted to ICLR 2026.
- 2025.09: 🎉🎉 One paper accepted to NeurIPS 2025.
- 2025.05: 🎉🎉 One paper accepted to ACL 2025.
- 2025.04: 🎉🎉 One paper accepted to ISIT 2025.
- 2024.09: 🎉🎉 One paper accepted to NeurIPS 2024.
- 2024.05: 🎉🎉 One paper accepted to ACL 2024.
📖 Education
- 2025.08 - Present, Tsinghua University, PhD Student.
- 2019.08 - 2022.08, Tsinghua University, Master of Engineering.
- 2014.09 - 2018.06, Lanzhou University, Bachelor of Engineering.
📝 Selected Publications
- Q. Li, J. Wu, X. Liu, et al. Reasoning Language Model Inference Serving Unveiled: An Empirical Study. Long paper of the Fourteenth International Conference on Learning Representations (ICLR 2026, Tsinghua-A).
- Q. Li, et al. AdaEdit: Advancing Continuous Knowledge Editing for Large Language Models. Long paper of the 63rd Annual Meeting of the Association for Computational Linguistics (ACL 2025, CCF-A).
- Q. Zhang, H. Fu, G. Huang, Y. Liang, C. Chu, T. Peng, Y. Wu, Q. Li, Y. Li, S.-L. Huang. A High-Dimensional Statistical Method for Optimizing Transfer Quantities in Multi-Source Transfer Learning. Main track paper of the Thirty-ninth Annual Conference on Neural Information Processing Systems (NeurIPS 2025, CCF-A).
- T. Peng, Q. Li, S.-L. Huang. On the Optimal Second-Order Convergence Rate of Minimax Estimation Under Weighted MSE. IEEE International Symposium on Information Theory, June 2025 (ISIT 2025 Oral, Tsinghua-B).
- Q. Li, et al. Should We Really Edit Language Models? On the Comprehensive Evaluation of Edited Language Models. Main track paper of the Thirty-eighth Annual Conference on Neural Information Processing Systems (NeurIPS 2024, CCF-A).
- Q. Li, et al. Can We Continually Edit Language Models? On the Knowledge Attenuation in Sequential Model Editing. Long paper of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024, CCF-A).
- Q. Li, et al. Harnessing the Power of Pre-trained Vision-Language Models for Efficient Medical Report Generation. Long paper of the 32nd ACM International Conference on Information and Knowledge Management (CIKM 2023 Oral, CCF-B).
💻 Projects
(Click the images for more details.)
🎖 Honors and Awards
- 2015.12, First Class Scholarship for Outstanding Students, Lanzhou University.
- 2017.12, First Class Scholarship for Outstanding Students, Lanzhou University.
- 2015.12, Outstanding Student, Lanzhou University.
- 2016.12, Outstanding Student, Lanzhou University.
- 2017.12, Outstanding Student, Lanzhou University.
- 2020.12, Scholarship for Excellent Students, Tsinghua Shenzhen International Graduate School.
👔 Academic Services
- Conference Reviewer: ACL ARR’25-’26, ICML’25-’26, ICLR’25-’26, NeurIPS’24-’25, CVPR’26, EMNLP’23-’25, ICASSP’23-’26, ECAI’23-’24, ICME’24-’26

