I am a Senior Research Scientist at the Tencent AI Seattle Lab. My primary research interests lie in natural language processing and machine learning. Specifically, I work on using knowledge (e.g., text retrieval and knowledge graphs) to enhance the factuality and reasoning capability of (large) language models.
I earned my Ph.D. in Computer Science and Engineering from the University of Notre Dame in 2023, advised by Prof. Meng Jiang. My research during my Ph.D. was generously supported by a Bloomberg Ph.D. Fellowship. Prior to my Ph.D., I received my Bachelor's degree in Computer Science and Technology from Sichuan University in 2019.
I am actively seeking highly motivated interns who share my research interests. Please reach out to me at email@example.com with your resume!
Check out my latest posts on Twitter!
- [2023.09] Four papers are accepted at EMNLP 2023, on question answering, instruction tuning, math reasoning, and comparative reasoning.
- [2023.09] One paper is accepted at NeurIPS 2023.
- [2023.05] Three papers (two main and one findings) are accepted at ACL 2023.
- [2023.05] One paper on open-domain QA is accepted at TACL 2023.
- [2023.01] Two papers are accepted at ICLR 2023, on large language models for open-domain QA and multi-task pre-training.
- [2023.01] One survey paper is accepted at EACL 2023 on multi-task learning in NLP.
- [2022.11] Our paper “Empowering Language Models with Knowledge Graph Reasoning for Question Answering” [link] won the best paper award at SoCal NLP Symposium 2022! The paper is also accepted to EMNLP 2022!