I am currently an Assistant Professor at the School of Artificial Intelligence, Shanghai Jiao Tong University, where I lead the EPIC (Efficient and Precision Intelligent Computing) Lab and am qualified to supervise both master's and Ph.D. students. I received my Ph.D. from the Institute for Interdisciplinary Information Sciences, Tsinghua University, in June 2024, advised by Associate Professor Kaisheng Ma. During my Ph.D. studies, I received the Microsoft Research Asia Fellowship (one of twelve recipients in the Asia-Pacific region that year), was named a Beijing Outstanding Graduate, and received the Tsinghua University Outstanding Doctoral Dissertation Award, the Gold Prize of the Tsinghua University Qihang Award, and the Tsinghua University Jiang Nanxiang Scholarship. I have published more than twenty papers at high-quality venues, thirteen of them as first author, with over 2,200 citations in total (as of November 2024). My research results have been adopted by companies and research institutes including 北极雄芯, Huawei, and 交叉信息核心技术研究院. I joined the School of Artificial Intelligence at Shanghai Jiao Tong University as an Assistant Professor in July 2024.
The lab is currently recruiting postdoctoral researchers, undergraduate and graduate research assistants, and students entering in 2026/2027. If you are interested, please see our recruitment post.
1. Lightweight and efficient large language / multimodal models: Today's generative models contain tens of billions of parameters, which leads to extremely high training and inference costs and causes many practical problems; for example, OpenAI once had to restrict users from paying for ChatGPT-4 because it could not afford the computational cost. By studying compression and acceleration methods for generative models, we can lower the cost of deploying large models and make them more usable in the real world. At the same time, how to give small models the same representational power as large models is one of the fundamental core questions of artificial intelligence.
2. Lightweight and efficient AIGC models: Text-to-image and text-to-video models such as Stable Diffusion and Sora have set off a wave of AIGC (AI-generated content). However, generating high-resolution images and long videos is often extremely expensive in computation, which prevents these models from being truly adopted in industry. To address this problem, we work on efficient visual generative models and push AIGC toward industrial deployment.
3. Data-efficient AI: Current AI models must be trained on enormous amounts of data, which greatly increases the training cost of large models. We study how to use data more efficiently, how to clean and synthesize data in a more principled way, and how to use synthetic data to further improve generative models, moving toward data-efficient artificial intelligence.
Reviewer for the following conferences and journals: NeurIPS, ICML, ICLR, CVPR, ECCV, ICCV, AAAI, IJCAI, AISTATS, IEEE TPAMI, IEEE TCSVT, IEEE TIP, IEEE TMI, IJCV, Pattern Recognition, TACO, Scientific Reports, and others.
Area chair or guest editor for the following conferences and journals: IJCNN 2025, Big Data and Cognitive Computing, ACL 2025.
2024.12, Northeastern University, Shenyang. Talk: Training-Free Inference Acceleration for Diffusion Models.
2024.12, Huawei Intelligent Automotive Solution BU AI Youth Forum, Shanghai. Talk: Generative Model Compression from a Token Perspective.
2024.12, Shanghai University of Finance and Economics, Shanghai. Talk: Generative Model Compression from a Token Perspective.
2024.11, China Agricultural University, Shanghai. Talk: Inference Acceleration for Diffusion-Based AIGC Models.
2024.4, Huawei Computing Product Line Youth Forum, Hangzhou. Talk: Model Compression via Knowledge Distillation.
Linfeng Zhang received his bachelor's degree from Northeastern University and his Ph.D. from Tsinghua University. He currently leads the Efficient and Precision Intelligent Computing (EPIC) Lab at Shanghai Jiao Tong University.
Shaobo Wang is a Ph.D. candidate in the EPIC Lab at SAI, Shanghai Jiao Tong University, starting in 2024. Building on a strong background in efficient AI, explainable AI, and deep learning theory, he focuses his research on data synthesis and data reduction. He is particularly interested in foundation models, striving to understand their intrinsic behavior while making them more data-efficient, lightweight, and cost-effective in both training and inference.
Yifeng Gao is a master's student in the EPIC Lab at Shanghai Jiao Tong University. His research interests focus on developing capable, reliable, and efficient AI through algorithm-computing co-design. Currently, he focuses on efficient inference for multi-step reasoning in large language models as well as their trustworthiness.
Zichen Wen is a Ph.D. student in the EPIC Lab at Shanghai Jiao Tong University, under the supervision of Prof. Linfeng Zhang. He holds a B.S. degree in Computer Science from the University of Electronic Science and Technology of China. During his undergraduate studies, he published multiple research papers at prestigious AI conferences, including AAAI and ACM MM. His research interests lie in Efficient Multi-Modal Large Models and Trustworthy AI, focusing on advancing the efficiency, reliability, and ethical aspects of artificial intelligence systems.
Xuelin Li will begin pursuing a Ph.D. degree at the EPIC Lab in 2025. He is expected to graduate with a bachelor's degree from the University of Electronic Science and Technology of China (UESTC), where he achieved a perfect GPA of 4.0/4.0 across all courses in the School of Software. During his undergraduate studies, he received numerous awards, including the National Scholarship. His research interests focus on developing efficient inference paradigms for trustworthy multimodal large language models.
Zexuan Yan is currently a senior student majoring in Computer Science and Technology at the University of Science and Technology of China. He will join Prof. Linfeng Zhang's EPIC Lab at the School of Artificial Intelligence, Shanghai Jiao Tong University, in the fall of 2025. His research interests include multimodal models, AIGC, and diffusion model acceleration.
Chang Zou is currently an undergraduate student at Yingcai Honors College, University of Electronic Science and Technology of China (UESTC), expected to complete his bachelor's degree in 2026. Originally from Chengdu, Sichuan, he doesn’t eat spicy food despite his hometown’s reputation. His primary research focus is on the efficient acceleration of AIGC, particularly Diffusion Models, and he has a solid background in mathematics and physics. In 2024, he began his internship at the EPIC Lab, where, under the guidance of his advisor, Linfeng Zhang, he contributed to submissions for ICLR and CVPR.
Xuyang Liu is currently pursuing his M.S. degree at the College of Electronics and Information Engineering, Sichuan University. He is also a research intern at Taobao & Tmall Group, where he focuses on efficient multi-modal large language models. In 2024, he joined the EPIC Lab as a research intern under the guidance of Prof. Linfeng Zhang, contributing to the development of a comprehensive collection of resources on token-level model compression. His research interests include Efficient AI, covering areas such as discrimination, adaptation, reconstruction, and generation.