Four Students Receive Honorable Mention in CRA Undergraduate Research Awards
Brian Chen, Kevin Hayes, Sean Rhee, and Marko Veljanovski made significant contributions to undergraduate research projects
Four Northwestern Computer Science students — Brian Chen, Kevin Hayes, Sean Rhee, and Marko Veljanovski — received honorable mentions in the Computing Research Association (CRA) 2024-2025 Outstanding Undergraduate Researcher Award competition.
The nationwide award program, supported by the US Department of Energy’s Sandia National Laboratories and Lawrence Berkeley National Laboratory, recognizes undergraduate students who demonstrate exemplary potential in computing research.
"I am really delighted to see the success of our undergraduate researchers, with several being recognized by CRA's national competition,” said Samir Khuller, Peter and Adrienne Barris Chair of Computer Science at Northwestern Engineering. “Also, a big congratulations to the faculty mentors who played a significant role in guiding the students towards exciting research topics."
Brian Chen
Adviser: Jill Fain Lehman, Carnegie Mellon University
Chen is a second-year student in computer science at the McCormick School of Engineering. This summer, he participated in the Human-Computer Interaction Institute (HCII) Summer Undergraduate Research Program at Carnegie Mellon University.
With HCII adviser Jill Fain Lehman, Chen contributed to the development of a multimodal digital assistant that guides patients through at-home, postoperative care procedures. He investigated the question-answering capabilities of GPT-3.5 Turbo, building a pipeline with OpenAI’s API. After 25 rounds of prompt revision, the team evaluated the system’s performance against six error categories.
Chen also used natural language data to classify and output probabilities for a patient’s current step in a wound care procedure.
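The article doesn’t describe the classifier’s internals, so the following is only a toy illustration of the general idea: mapping a patient’s utterance to a probability distribution over procedure steps. The step names, keyword scoring, and function names here are hypothetical stand-ins for the team’s actual models:

```python
import math

# Hypothetical wound-care steps; the real procedure and model differ.
STEPS = ["remove old dressing", "clean the wound", "apply ointment", "place new dressing"]

# Toy keyword evidence per step, standing in for a learned language model.
KEYWORDS = {
    "remove old dressing": ["remove", "old", "peel"],
    "clean the wound": ["clean", "rinse", "saline"],
    "apply ointment": ["apply", "ointment", "cream"],
    "place new dressing": ["new", "bandage", "cover"],
}

def step_probabilities(utterance: str) -> dict[str, float]:
    """Score each step by keyword matches, then softmax into probabilities."""
    words = utterance.lower().split()
    scores = [sum(w in words for w in KEYWORDS[s]) for s in STEPS]
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return {s: e / total for s, e in zip(STEPS, exps)}

probs = step_probabilities("I just rinsed it with saline to clean it")
print(max(probs, key=probs.get))  # most likely current step: "clean the wound"
```

A real system would replace the keyword scores with a language model’s output, but the final step is the same: normalize per-step scores into probabilities and report the most likely current step.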
At Northwestern, Chen is a member of the Sensing, Perception, Interactive Computing and Experiences (SPICE) Lab, led by Karan Ahuja, the Lisa Wissner-Slivka and Benjamin Slivka Assistant Professor of Computer Science and (by courtesy) assistant professor of electrical and computer engineering at the McCormick School of Engineering.
Kevin Hayes
Adviser: Peter Dinda
Hayes is a third-year student in computer science at Northwestern Engineering. A member of the Prescience Lab, Hayes is advised by Peter Dinda, professor of computer science and (by courtesy) electrical and computer engineering at Northwestern Engineering.
Hayes’s research focuses on the design and performance of novel operating systems, leveraging compiler techniques and architectural features to advance innovation in the kernel.
During the past year, Hayes has collaborated with computer science PhD student Kirill Nagaitsev on the “BEANDIP” project. The team aims to replace hardware interrupts with software polls automatically distributed across the kernel and user-space applications. By allowing the CPU to ask devices if they need attention at well-defined locations, BEANDIP reduces the non-determinism and high delivery cost of hardware interrupts, while potentially improving performance of interrupt-driven applications such as database engines or web servers.
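BEANDIP itself operates at the compiler and kernel level, but the core idea — the CPU asking a device whether it needs attention at predictable points in the program, rather than being interrupted asynchronously — can be sketched with a toy simulation. All names and interfaces below are illustrative, not BEANDIP’s actual ones:

```python
from collections import deque

class Device:
    """Toy device that queues work instead of raising hardware interrupts."""
    def __init__(self):
        self.pending = deque()

    def poll(self):
        """Answer the CPU's question: do you need attention right now?"""
        return self.pending.popleft() if self.pending else None

def run(app_iterations, device):
    """Application loop with a poll inserted at a well-defined location."""
    handled = []
    for i in range(app_iterations):
        # ... application work for iteration i would happen here ...
        # Compiler-inserted poll site: the CPU checks the device at a
        # known program point instead of being preempted at random ones.
        event = device.poll()
        if event is not None:
            handled.append(event)
    return handled

dev = Device()
dev.pending.extend(["packet-0", "packet-1"])
events = run(5, dev)
print(events)  # events handled deterministically at poll sites
```

Because every device check happens at a known program point, the timing of event handling becomes deterministic — which is precisely the non-determinism of asynchronous interrupts that BEANDIP aims to eliminate.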
“One of the huge benefits of undergraduate research is that it teaches you the subject you're researching in a way a class never could,” Hayes said. “This award is an indicator that I'm on the right track from ‘student’ to ‘researcher,’ and that means the world to me.”
Sean Rhee
Adviser: Peter Dinda
A fourth-year student in computer science, Rhee is a member of the multi-institution Constellation Project led by Dinda and co-PI Umut Acar (Carnegie Mellon University). The project aims to achieve high productivity and high performance on heterogeneous systems through parallelism.
Parallel workloads use multi-device and often multi-node computer clusters to improve performance. As applications scale up, as with large language models and other massive artificial intelligence workloads, communication between devices becomes the bottleneck for computation. The problem is exacerbated by the networks themselves, which often consist of complex, multi-layer topologies of varying performance.
Working closely with Mike Wilkins (PhD ’23) and Peizhi Liu (BS/MS ’23), Rhee developed high-performance communication algorithms that do not incur immense synthesis costs. The team implemented their algorithm as CUDA kernels placed directly into the NVIDIA Collective Communication Library (NCCL) and achieved up to 2x performance over NCCL on the test machine.
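The article doesn’t specify which collectives the team targeted, but libraries like NCCL are built around primitives such as allreduce. A minimal pure-Python simulation of the classic ring allreduce (no GPUs or CUDA involved) gives a sense of what such communication kernels compute:

```python
def ring_allreduce(bufs):
    """Simulate allreduce (elementwise sum) over a ring of devices.

    Each of the n devices starts with a buffer of n chunks and ends with
    the elementwise sum of all buffers, exchanging only one chunk with its
    ring neighbor per step instead of communicating all-to-all.
    """
    n = len(bufs)
    assert all(len(b) == n for b in bufs), "one chunk per device, for simplicity"
    chunks = [list(b) for b in bufs]  # local copies

    # Reduce-scatter phase: each step, device d sends one partial chunk to
    # its right neighbor, which accumulates it. After n-1 steps, device d
    # holds the fully reduced chunk (d + 1) % n.
    for step in range(n - 1):
        sends = [(d, (d - step) % n, chunks[d][(d - step) % n]) for d in range(n)]
        for d, c, val in sends:
            chunks[(d + 1) % n][c] += val

    # Allgather phase: circulate the completed chunks around the ring,
    # overwriting stale partial sums.
    for step in range(n - 1):
        sends = [(d, (d + 1 - step) % n, chunks[d][(d + 1 - step) % n]) for d in range(n)]
        for d, c, val in sends:
            chunks[(d + 1) % n][c] = val

    return chunks

result = ring_allreduce([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(result)  # every device ends with [12, 15, 18]
```

Real implementations like NCCL’s run these exchanges as GPU kernels over NVLink and network fabrics; the research challenge the team tackled is choosing or synthesizing algorithms that match the cluster’s actual multi-layer topology without paying a huge synthesis cost.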
“Through my research journey, I've learned how to critically approach research problems, test hypotheses, and aggregate results to inform next steps,” Rhee said. “And hopefully, with continued dedication and effort, I'll be equipped to keep making progress not only on this work that lies at the heart of the tools that enable scientific and machine learning research, but on all of my future research pursuits as well.”
Marko Veljanovski
Adviser: Zach Wood-Doughty
Veljanovski is a fourth-year student pursuing a double major in computer science and mathematics. He is interested in the field of natural language processing (NLP), with a particular focus on large language models (LLMs). Veljanovski works on a variety of theoretical and applied projects, ranging from the design of causal estimators to improvements in abstractive summarization to creating more efficient LLM transfer learning methods.
Veljanovski conducted research last summer on invariant risk minimization (IRM), an approach to out-of-distribution generalization that seeks a data representation whose optimal classifier remains consistent across training environments.
“While IRM has been extensively tested with image data, text-based datasets remain underexplored, despite out-of-distribution generalization being crucial for large language model performance,” Veljanovski said.
Veljanovski and his team designed and adapted a flexible synthetic text data generating process to evaluate IRM for natural language processing.
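The article doesn’t give the exact formulation the team evaluated; for reference, the widely used IRMv1 objective from the IRM literature (Arjovsky et al.) trains a representation $\Phi$ while penalizing any environment in which a fixed “dummy” classifier $w = 1.0$ is not already optimal:

```latex
\min_{\Phi} \; \sum_{e \in \mathcal{E}_{\mathrm{tr}}}
    R^{e}(\Phi)
    + \lambda \, \bigl\| \nabla_{w \mid w = 1.0} \, R^{e}(w \cdot \Phi) \bigr\|^{2}
```

Here $R^{e}$ is the risk in training environment $e$, and the gradient-penalty term is small only when the same classifier is simultaneously optimal in every environment — the invariance property that synthetic text environments like the team’s can be constructed to test.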
In addition, Veljanovski was first author of the paper “DoubleLingo: Causal Estimation with Large Language Models,” which focused on the design of a causal estimator that adjusts for textual confounders. On a novel text-based randomized controlled trial (RCT) dataset, the estimator achieved the lowest estimation error among estimators in the literature.
Veljanovski is advised by Zach Wood-Doughty, assistant professor of instruction in Northwestern Engineering.