Asiagraphics Young Researcher Award (2024)

The 2024 Asiagraphics (AG) Young Researcher Award was presented to Dr. Minhyuk Sung from KAIST, South Korea. The winner of this award was selected by the award jury, co-chaired by Prof. Ming Lin (UMD College Park) and Prof. Leif Kobbelt (RWTH Aachen).

Minhyuk Sung

Dr. Minhyuk Sung is an assistant professor at KAIST, South Korea. Before joining KAIST, he spent one year as a research scientist at Adobe Research. He received his Ph.D. from Stanford University in 2019, and his B.S. and M.S. degrees from KAIST in 2008 and 2010, respectively. From 2010 to 2013, he was a researcher at KIST. His research focuses on machine learning for geometry processing.

Dr. Sung has pioneered research in understanding the compositional structure of 3D objects and applying this understanding to various geometry processing tasks, including segmentation (NeurIPS 2018, CVPR 2019, CVPR 2021 oral, CVPR 2022 oral, WACV 2023, 3DV 2024), generation/completion (SIGGRAPH Asia 2015, SIGGRAPH Asia 2017, SGP 2018, ECCV 2022), and reconstruction (CVPR 2019 oral, ECCV 2020, ICCV 2021, CVPR 2022). Notably, his research on ComplementMe (SIGGRAPH Asia 2017, selected as one of the six papers featured in an ACM press release) was the first to introduce learning the relationships across unlabelled parts of 3D objects to generate a novel 3D object through assembly. His work on SPFN (CVPR 2019 oral) also made significant strides in reconstructing raw 3D geometry using primitives, combining classical optimization approaches with the predictive power of neural networks. Recently, PartGlot (CVPR 2022 oral) introduced a novel 3D segmentation method based on natural language descriptions of 3D objects, demonstrating how holistic 3D linguistic descriptions can be used for segmentation through attention mechanisms.

His research has also pioneered new directions in learning the deformability of 3D objects and applying this to 3D object retrieval and editing. DeformSyncNet (SIGGRAPH Asia 2020) introduced the concept of learning synchronized dictionaries of possible deformations for 3D objects, which enables the transfer of deformation from one 3D object to another. DeepMetaHandles (CVPR 2021 oral) extended this idea by proposing the learning of meta-handles that capture deformation correlations across different parts of an object, facilitating more intuitive 3D object deformation.

Dr. Sung’s recent work extends to generative modeling techniques for various forms of visual content, including the introduction of a novel 3D generative model (ICCV 2023), expanding the capabilities of pretrained image diffusion models to arbitrary image generation (NeurIPS 2023), and editing meshes (CVPR 2024) and NeRF/Gaussian Splats (CVPR 2024).

Dr. Sung actively serves the graphics community as a member of the technical program committees for SIGGRAPH Asia (2022, 2023), Pacific Graphics (2023), and Eurographics (2022, 2024). Additionally, he has served as an associate editor of Graphical Models since 2022 and is a member of the Asiagraphics Webinar working team.