Over 100 Primary 4 to Secondary 3 students from 24 schools participated in the HKUST Underwater Robot Competition 2025 on March 29-30. The competition concluded with an award presentation ceremony on May 10, 2025 (pictured), which also celebrated the 10th anniversary of the event.
The three winning teams of the USEL-ASMPT Technology Award 2025 alongside judging panel members: Prof. Mow Wai-Ho (third right), Associate Dean of Engineering (Undergraduate Studies); Prof. Robin Ma (second right), Associate Professor of Engineering Education in the Department of Mechanical and Aerospace Engineering; and Mr. Paul Lavigne (first right), Teaching Associate of the Center for Engineering Education Innovation (E2I).
Chair Professor Fan Zhiyong (center) was honored with a Scientist Award at ICFM 2025, while postdoctoral fellow Dr. Ding Yucheng (left) and PhD student Cao Yang (right) received Graduate Student Awards.
(Front row from left) Dr. Kyle Wong, Chief Executive Officer and Co-founder of PanopticAI Limited; Dr. Kenneth Tsang, Regional Chief Executive Officer of IHH Healthcare North Asia and Chief Executive Officer of Gleneagles Hospital Hong Kong; and Dr. Justin Cheng, Chief Executive Officer and Co-founder of SmartCare, sign a tripartite collaboration agreement on the development of an AI-powered smart clinic solution, witnessed by (back row from left) Prof. Nancy Ip, President of The Hong Kong University of Science and Technology; Dr. Prem Kumar Nair, Group Chief Executive Officer of IHH Healthcare; and Dr. Patrick Lau, Deputy Executive Director of the Hong Kong Trade Development Council.
Professor Sun Qingping (left) and Research Assistant Professor Li Qiao (right), both from the Department of Mechanical and Aerospace Engineering at HKUST, demonstrate their newly developed Ti₇₈Nb₂₂ elastic alloy.
Prof. Wang Wei’s co-authored paper, titled “SpInfer: Leveraging Low-Level Sparsity for Efficient Large Language Model Inference on GPUs”, is one of only two Best Papers recognized at ACM EuroSys 2025 out of around 700 submissions.
Making Large Language Models More Efficient for Real-World Deployment