Bio
Hi, I am a research engineer at Google, working on developer assistance and the coding capabilities of large language models. Previously, I completed my Ph.D. at The Ohio State University, advised by Dr. Huan Sun, and received a B.Eng. in Computer Science from USTC. Along the way, I have had the opportunity to intern and work with many wonderful industry collaborators at Microsoft Research, Amazon, and Google Research.
Research
My research interests lie in NLP and artificial intelligence in general, with an emphasis on utilizing knowledge from heterogeneous sources and developing practical AI applications. The aim is to build AI-powered systems and agents that can assist regular users as well as domain experts with decision-making and daily tasks in the digital era. Specifically, I am interested in the following directions:
- Large-scale pretraining and representation learning for data from heterogeneous sources (plain text, structured data, code, images, etc.), for both general and domain-specific applications, as well as learning beyond next-token prediction.
- Natural language agents that operate over varied data, services, and environments: building practical agents that are accessible and collaborative for users, and generalist agents that are efficient and robust.
My recent work has focused on large language models: how to understand and improve their capabilities at different stages of training, how to ground them and enable them to act in the real world, and how to build LLM-powered applications that are collaborative, trustworthy, and accessible to users.
News
- [June, 2023]: Our preprint “Mind2Web: Towards a Generalist Agent for the Web” is online. Check out the paper and explore the dataset!
- [May, 2023]: Two works accepted to ACL’23! “Towards Understanding Chain-of-Thought Prompting: An Empirical Study of What Matters” led by Boshi Wang, and “Don’t Generate, Discriminate: A Proposal for Grounding Language Models to Real-World Environments” led by Yu Gu.
- [May, 2023]: I will attend WWW’23 to present my internship work at Google Research, “What Do LLMs Know about Financial Markets? A Case Study on Reddit Market Sentiment Analysis”, as well as my ongoing project “A More Accessible Web with Natural Language Interface” (Preview).
- [November, 2022]: I was awarded the Presidential Fellowship from the OSU Graduate School! I am super grateful to my advisor Prof. Huan Sun and all my collaborators! (“The Presidential Fellowship is the most prestigious award given by the Graduate School. Recipients of this award embody the highest standards of scholarship in the full range of Ohio State’s graduate programs.”)
- [October, 2022]: Our work “Iteratively Prompt Pre-trained Language Models for Chain of Thought” led by Boshi Wang was accepted to EMNLP 2022!
- [June, 2022]: Our OSU TacoBot team earned third-place honors in the first Alexa Prize TaskBot Challenge! Out of 125 initial applications worldwide, 10 teams were selected to participate in the challenge in May 2021, and 5 teams advanced to the finals in April 2022. Check out our report.
- [May, 2022]: I will join Google Research NYC this summer as a research intern, working on financial social media analysis.
- [January, 2022]: Our VLDB'21 paper "TURL: Table Understanding through Representation Learning" was selected for the 2022 ACM SIGMOD Research Highlight Award! Check out the paper, the technical perspective, and the report on OSU CSE News.
- [August, 2021]: Our work "ReasonBert: Pre-trained to Reason with Distant Supervision" was accepted to EMNLP 2021! Find the paper and try the pre-trained model.
- [August, 2021]: I will join the OSU TacoBot team for the Alexa Prize TaskBot Challenge.
- [May, 2021]: I will join the Amazon Product Graph Team this summer as an Applied Scientist Intern, working on information extraction from structured web pages.
- [March, 2021]: Our work on “Structure-Grounded Pretraining for Text-to-SQL” was accepted to NAACL 2021!
- [October, 2020]: Our work on “Table Understanding through Representation Learning” was accepted to VLDB 2021!
- [May, 2020]: I will join Microsoft Research this summer as a Research Intern, working on text-to-SQL.
- [August, 2019]: Our work on “Relation Extraction with 2-hop Distant Supervision” was accepted to EMNLP!