Prospective Students and Visitors
Dr. Zhijiang Guo will join the Hong Kong University of Science and Technology (Guangzhou) as an Assistant Professor in Spring 2025. The research group hopes to recruit several PhD students for enrollment in 2025/2026. Interested applicants are encouraged to read the guidelines below and submit their applications via email as instructed.
About the Mentor
Zhijiang is currently a researcher at Huawei Noah’s Ark Lab and a visiting researcher at the University of Cambridge. His research focuses on natural language processing and machine learning, with his current interests centered on large language models (LLMs). He has published over 30 papers at leading venues such as NeurIPS, ICLR, COLM, TACL, ACL, EMNLP, and NAACL, several of which were selected for Oral or Spotlight presentations, and he has over 2,300 citations on Google Scholar.
He has served as an Area Chair (AC) for ICLR, ACL, EMNLP, NAACL, and COLING, as well as a Senior Program Committee (SPC) member for AAAI and IJCAI. He has also been an Action Editor (AE) for the ACL Rolling Review and co-organized multiple FEVER workshops at ACL, EMNLP, and EACL. He earned his bachelor’s degree from Sun Yat-sen University, and then completed his PhD under Prof. Wei Lu at the Singapore University of Technology and Design. He was a visiting researcher in Prof. Shay Cohen’s group at the University of Edinburgh. He conducted postdoctoral research in Prof. Andreas Vlachos’s group at the University of Cambridge before joining Huawei Noah’s Ark Lab in 2023.
About the University
Hong Kong University of Science and Technology (HKUST) is one of the leading research institutions in Asia and globally. In 2025, HKUST ranked 47th in the QS World University Rankings and 3rd in the Times Higher Education Young University Rankings. HKUST (Guangzhou) was established in September 2022 with a focus on interdisciplinary development that complements the Clear Water Bay campus. Its curriculum is organized around four hubs: Information, Function, Systems, and Society, with research covering emerging areas such as data science, robotics and autonomous systems, artificial intelligence, and advanced materials. HKUST (Guangzhou) is located in Nansha, Guangzhou. The university offers doctoral students a monthly scholarship of 18,000 HKD, and instruction is entirely in English.
Research Interests
Early in my career, I focused primarily on semantic representations such as AMR, exploring how to derive them (semantic parsing) and how to apply them to various downstream tasks. Recently, I have shifted my attention toward knowledge and reasoning, delving into intriguing questions at the intersection of these two areas in the context of LLMs.
- Reasoning: This area has become increasingly important in the era of LLMs, and I am thrilled to collaborate with insightful researchers around the world. Some of my recent projects include AutoPSV (efficient automated process supervision), LeCo (efficient self-refinement and improvement), MR-BEN (evaluating the reasoning processes and reflection of LLMs), EffiLearner (self-optimization with external feedback), and DQ-LoRe (retrieval-augmented in-context learning). I am particularly interested in mathematical tasks (PDA and FormalAlign) and coding tasks (MHPP).
- Knowledge: I was glad to conduct research on knowledge-intensive tasks in Andreas’s group during my postdoc. I mainly focus on how to leverage external knowledge (through retrieval-augmented methods) and internal knowledge (the LLMs themselves) to solve problems, with applications in question answering and fact-checking. If you’re unfamiliar with fact-checking, I invite you to check out our TACL survey. Some recent works in this area include FEVEROUS (exploring structured and unstructured knowledge), AVeriTeC (addressing real-world fact-checking with external knowledge), Pinocchio (investigating how LLMs encode factual knowledge), CtrlA (enhancing adaptive retrieval-augmented generation), and work on knowledge conflicts (between the internal and external knowledge of LLMs).
Other Interests:
- Long Contexts: While earlier methods often struggled with long inputs, there has been significant progress in the last one to two years. My recent work has focused on knowledge-intensive long-context tasks, including ProxyQA (evaluating long-form generation) and Long2RAG (retrieval-augmented generation under long contexts).
- Efficient Methods: Given the cost of LLMs, I have been particularly interested in developing efficient approaches. Beyond the efficient reasoning and coding efforts mentioned above, I have also been involved in research on efficient fine-tuning, such as HydraLoRA (an efficient asymmetric LoRA structure).
- LLMs-as-Judges: Given their capabilities, LLMs are increasingly used as evaluators, for example to assess natural language generation or to serve as reward models. Recent projects in this area include PairS (enhancing alignment with human preferences).
Experience in Mentoring
I have mentored over ten students at Cambridge and Huawei Noah’s Ark Lab, most of whom have published papers at top conferences, including their first papers. I respect students’ interests and am dedicated to helping them grow in all aspects. I have written recommendation letters for many of my collaborating students (the application season is quite exhausting though). You can find the students I have mentored here. For more about my mentoring style, feel free to contact them; I have obtained their consent for this XD.
For PhD Students
As I will be joining the university in the spring of 2025, I am currently looking for students to start in the fall of 2025. Here are some key requirements I hope potential candidates will consider:
- Passion for fundamental research on LLMs and a desire to contribute to impactful scientific work. While it’s perfectly fine to begin with shorter projects, I hope that as we progress, we can collaborate on higher-quality, more influential research.
- Relevant research or engineering experience in LLMs. After two weeks of screening and interviews, I have updated this criterion: at this stage, I will give preference to applicants whose research experience is relevant to my research directions.
- Strong mathematical and programming skills, and a willingness to learn and explore new ideas. Given the rapid advancements in our field, staying proactive is essential to keep up.
- A solid foundation in English writing and publication. The admission requirements for HKUST (GZ) include an IELTS score of 6.5 or a TOEFL score of 80. However, I believe that strong communication skills are equally important. Good writing can make a significant difference in sharing your ideas, and effective communication is vital for networking at conferences. I would be happy to attend conferences with you and can help introduce you to others in the field.
I will give priority to applicants who meet the following criteria:
- Experience in large language models, machine learning, or natural language processing, with publications at key venues in ML (ICML/NeurIPS/ICLR), NLP (ACL/EMNLP/NAACL/EACL), or LLMs (COLM). I focus on these conferences, as well as certain journals (TACL/TMLR).
- Strong programming abilities and involvement in meaningful open-source projects related to LLMs.
- Previous experience as a research assistant in our lab or with our collaborators. I value working with people with whom I have had positive experiences, as our time together during the PhD will be collaborative and intensive.
For Research Assistants/Interns/Visitors
I have already welcomed multiple interns, research assistants, and visitors, and at this time I do not plan to recruit additional team members. I will update this post when new positions become available.
- My main goal is for the team to focus on a few key research areas. If our interests diverge significantly, it may be challenging for me to provide effective guidance. To ensure I can engage with all students’ research, I plan to limit the number of recruits.
- You can start in the spring without waiting until the fall. If our collaboration goes well, I would be delighted to have you continue as a PhD student. If you’re interested in exploring opportunities elsewhere, I would be more than happy to recommend you to other institutions abroad; this year, I’ve already submitted over 30 recommendation letters for two students based on our joint work.
- Both interns and visitors can work remotely, and time zone differences are not an issue for me, as I’m accustomed to them. We can discuss the specifics and come to an agreement; these roles are more flexible than PhD positions.
- I strongly encourage active discussion and collaboration within the group, and I hope all students will participate actively in our meetings and share their research findings.
For MPhil Students
I am unable to recruit any MPhil students. Please apply through the university’s application system. If you have already been admitted, feel free to email me if you’re interested in exploring any research directions within our group.
Some Key Points
- I hope that applicants’ research interests align well with mine, as I want to provide hands-on guidance to each student: brainstorming paper ideas, designing experiments, writing and submitting papers, and handling rebuttals. While working directly on code is more demanding, I am willing to assist with that as well.
- I do not place much emphasis on applicants’ educational background; instead, I value their attitude toward learning and work, as well as their ability to handle pressure. In my view, most research on LLMs is not rocket science; it mainly requires a rigorous mindset and perseverance. When I refer to the ability to handle pressure, I do not mean that I will impose stress on students; rather, you may need to be self-motivated. The current AI research community is vast, the review process can be uncertain, and our submissions may not always receive recognition from reviewers, yet publishing papers is crucial for job searches after earning a PhD. Even if an earlier submission fails, I hope we can keep working together until we ultimately publish quality work.
- I strongly encourage collaboration with researchers in academia and industry worldwide. This includes exchange visits during the PhD to universities in North America, Europe, and Singapore, as well as internships at companies. Having benefited from global collaborations myself, I have established connections that I can leverage to help you build your own network.
- Regarding my mentoring style, I hold at least one one-on-one discussion each week. If you need to discuss something, feel free to reach out or leave a message, and I will arrange a short meeting as soon as possible. Additionally, I organize weekly group meetings to help students understand each other’s research directions, and I regularly invite researchers from around the world to give talks during paper-sharing sessions.
- As HKUST (GZ) is a new university, if you are highly concerned about the institution’s reputation and ranking, it may not be the right fit for you.
Application Process
Please send your email to zhijiangguo@hkust-gz.edu.cn. To make the process more efficient, kindly include the following key points in your message:
- Research Direction: Share your research area and any publication records. If you don’t have publications, feel free to outline your research interests instead.
- Interest in My Work: Specify which of my research directions or papers interest you, and what particular area you wish to pursue if you join my team.
- Additional Information: Attach any relevant documents, such as your resume or transcripts. Please format the title of the email as “Year-Position-Your Name-Affiliation” (e.g., “25Fall-PhD-Wukong-SYSU”).
I typically respond to most emails quickly, whether to schedule an interview or to indicate that we may not be a good fit. If you don’t hear back from me for a while, it means I’m still considering your application and may need additional time. Thank you for your understanding!