

Poster

RepKPU: Point Cloud Upsampling with Kernel Point Representation and Deformation

Yi Rong · Haoran Zhou · Kang Xia · Cheng Mei · Jiahao Wang · Tong Lu


Abstract:

In this work, we present RepKPU, an efficient network for point cloud upsampling. We propose to improve upsampling performance by exploiting a better shape representation and point generation strategy. Inspired by KPConv, we propose a novel representation called RepKPoints to effectively characterize the local geometry; its advantages over prior representations are that it is (1) density-sensitive, (2) equipped with large receptive fields, and (3) position-adaptive, which makes RepKPoints a generalized form of previous representations. Moreover, we propose a novel paradigm for point generation, namely Kernel-to-Displacement generation, in which point cloud upsampling is reformulated as the deformation of kernel points. Specifically, we propose KP-Queries, a set of kernel points with predefined positions and learned features, to serve as the initial state of upsampling. Using cross-attention mechanisms, we model interactions between RepKPoints and KP-Queries; the KP-Queries are then converted to displacement features, and an MLP predicts the new positions of the KP-Queries, which serve as the generated points. Extensive experimental results demonstrate that RepKPU outperforms state-of-the-art methods on several widely used benchmark datasets with high efficiency.
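To make the Kernel-to-Displacement idea concrete, below is a minimal PyTorch sketch, not the authors' implementation: a set of KP-Queries (kernel points with fixed initial positions and learned features) attends to local shape features standing in for RepKPoints via cross-attention, and an MLP maps the resulting displacement features to offsets that deform the kernel points into the generated points. All names, dimensions, and the use of a plain nn.MultiheadAttention layer are illustrative assumptions.

import torch
import torch.nn as nn


class KernelToDisplacement(nn.Module):
    """Hypothetical sketch of kernel-to-displacement point generation."""

    def __init__(self, num_queries: int = 4, feat_dim: int = 64):
        super().__init__()
        # KP-Queries: predefined positions (fixed here) and learned features.
        self.query_pos = nn.Parameter(torch.randn(num_queries, 3), requires_grad=False)
        self.query_feat = nn.Parameter(torch.randn(num_queries, feat_dim))
        # Cross-attention: queries attend to the local shape representation.
        self.cross_attn = nn.MultiheadAttention(feat_dim, num_heads=4, batch_first=True)
        # MLP that turns displacement features into 3D offsets.
        self.to_offset = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(), nn.Linear(feat_dim, 3)
        )

    def forward(self, patch_center: torch.Tensor, local_feats: torch.Tensor) -> torch.Tensor:
        # patch_center: (B, 3) center of each local patch
        # local_feats:  (B, K, feat_dim) per-patch features (stand-in for RepKPoints)
        B = local_feats.shape[0]
        q = self.query_feat.unsqueeze(0).expand(B, -1, -1)           # (B, Q, C)
        disp_feat, _ = self.cross_attn(q, local_feats, local_feats)  # (B, Q, C)
        offsets = self.to_offset(disp_feat)                          # (B, Q, 3)
        # Generated points = initial kernel positions around the patch center + predicted offsets.
        init = patch_center.unsqueeze(1) + self.query_pos.unsqueeze(0)
        return init + offsets


if __name__ == "__main__":
    model = KernelToDisplacement()
    centers = torch.zeros(8, 3)         # 8 local patches
    feats = torch.randn(8, 16, 64)      # 16 neighbor features per patch
    print(model(centers, feats).shape)  # torch.Size([8, 4, 3]) -> 4 new points per patch

In this sketch the upsampling ratio is simply the number of KP-Queries per patch, reflecting the abstract's framing of upsampling as deforming a fixed set of kernel points rather than replicating and perturbing input points.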
