Poster
High-quality Point Cloud Oriented Normal Estimation via Hybrid Angular and Euclidean Distance Encoding
Yuanqi Li · Jingcheng Huang · Hongshen Wang · Peiyuan Lv · Yansong Liu · Jiuming Zheng · Jie Guo · Yanwen Guo
The proliferation of Light Detection and Ranging (LiDAR) technology has facilitated the acquisition of three-dimensional point clouds, which are integral to applications in VR, AR, and digital twins. Oriented normals, critical for 3D reconstruction and scene analysis, cannot be directly extracted from LiDAR scans due to the sensor's operational principles. Previous traditional and learning-based methods, which depend on local geometric features, are prone to inaccuracies under uneven point distributions and noise. This paper addresses the challenge of estimating oriented point normals by introducing a point cloud normal estimation framework via hybrid angular and Euclidean distance encoding (HAE). Our method overcomes the limitations of local geometric information by combining angular and Euclidean spaces to extract features from both point cloud coordinates and light rays, leading to more accurate normal estimation. The core of our network consists of an angular distance encoding module, which leverages both ray directions and point coordinates for unoriented normal refinement, and a noise-robust ray feature fusion module for normal orientation. We also provide a point cloud dataset with ground truth normals, generated by a virtual scanner, which reflects real scanning distributions and noise profiles.
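To make the idea of combining angular and Euclidean cues concrete, the following is a minimal sketch (not the authors' implementation): for each point, its k nearest neighbors are encoded with both a Euclidean distance term and an angular term derived from the scanner ray directions. The function name `hybrid_encoding`, the parameter `k`, and the toy scanner origin are illustrative assumptions.

```python
import numpy as np

def hybrid_encoding(points, scanner_origin, k=16):
    """Return per-point neighbor features of shape (N, k, 2):
    [:, :, 0] = Euclidean distance to the neighbor,
    [:, :, 1] = angular distance (radians) between the rays that hit
                the point and its neighbor."""
    # Unit ray directions from the scanner to each point.
    rays = points - scanner_origin
    rays /= np.linalg.norm(rays, axis=1, keepdims=True)

    # Pairwise Euclidean distances (brute force; a KD-tree would be used at scale).
    diff = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    knn = np.argsort(dists, axis=1)[:, 1:k + 1]  # skip the point itself

    # Euclidean-space term: distance to each neighbor.
    eucl = np.take_along_axis(dists, knn, axis=1)          # (N, k)
    # Angular-space term: angle between the two scanning rays.
    cos = np.clip(np.einsum('nd,nkd->nk', rays, rays[knn]), -1.0, 1.0)
    ang = np.arccos(cos)                                    # (N, k)

    return np.stack([eucl, ang], axis=-1)

# Toy usage: random points "scanned" from the origin.
pts = np.random.rand(100, 3) + 1.0
feats = hybrid_encoding(pts, scanner_origin=np.zeros(3), k=8)
print(feats.shape)  # (100, 8, 2)
```

In the paper's framework such hybrid features would feed the angular distance encoding module for unoriented normal refinement, while the ray directions additionally inform the orientation stage; the sketch only illustrates the per-neighbor encoding itself.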