I am currently a fourth-year Ph.D. candidate in the DB4AI group at The Hong Kong University of Science and Technology (HKUST), supervised by Professor Lei Chen (ACM & IEEE Fellow). My research focuses on building efficient systems for popular deep learning models, such as Large Language Models, Graph Neural Networks, and Recommendation Models. I received my Bachelor's degree (2016-2020) from the School of EECS at Peking University (PKU), where I was advised by Professor Bin Cui (IEEE Fellow) and Professor Sujian Li. I have done several research and engineering internships at MSRA, Tencent, and Baidu.

The pronunciation of my Chinese name is similar to Sheen (for Xin/鑫) Jung (for Zhang/张) in English :)

Publications

Apt-Serve: Adaptive Request Scheduling on Hybrid Cache for Scalable LLM Inference Serving
Shihong Gao, Xin Zhang, Yanyan Shen, Lei Chen.
Accepted to SIGMOD 2025. (regular research track)

Efficient Training of Graph Neural Networks on Large Graphs.
Yanyan Shen, Lei Chen, Jingzhi Fang, Xin Zhang, Shihong Gao, Hongbo Yin.
VLDB 2024. [paper] [code] (tutorial track)

SIMPLE: Efficient Temporal Graph Neural Network Training at Scale with Dynamic Data Placement.
Shihong Gao, Yiming Li, Xin Zhang, Yanyan Shen, Yingxia Shao, Lei Chen.
SIGMOD 2024. [paper] [code] (regular research track)

DUCATI: A Dual-Cache Training System for Graph Neural Networks on Giant Graphs with the GPU.
Xin Zhang, Yanyan Shen, Yingxia Shao, Lei Chen.
SIGMOD 2023. [paper] [code] (regular research track)

Feature-Oriented Sampling for Fast and Scalable GNN Training.
Xin Zhang, Yanyan Shen, Lei Chen.
ICDM 2022. [paper] [code] (acceptance rate: 85/870)

HET-GMP: A Graph-based System Approach to Scaling Large Embedding Model Training.
Xupeng Miao, Yining Shi, Hailin Zhang, Xin Zhang, Xiaonan Nie, Zhi Yang, Bin Cui.
SIGMOD 2022. [paper] [code]

Machine Reading Comprehension: A Literature Review.
Xin Zhang, An Yang, Sujian Li, Yizhong Wang.
Preprint on arXiv, 2019. [paper]

Last Update: Feb 2025