Title:
Global–local articulation pattern-based pedestrian detection using 3D Lidar data.
Source:
Remote Sensing Letters; July 2016, Vol. 7, Issue 7, pp. 681-690, 10 pp.
Database:
Complementary Index

Abstract:

Highly variable human poses and pedestrian occlusion make light detection and ranging (Lidar)-based pedestrian detection challenging. This letter proposes a novel framework to address these issues. Rather than dividing humans into an arbitrary number of parts and using the same features for all part detectors, we represent humans with global–local articulated parts and formulate new features tailored to each part's own character. Articulated parts are effective because each usually maintains a relatively consistent shape across a broad range of body poses. In addition, to extract visible human segments from cluttered surroundings in the presence of pedestrian occlusion, both 3D information and 2D spatial information are used in a coarse-to-fine manner, allowing the interaction between a human part and its neighbouring objects to be better analysed. The algorithm is evaluated on a busy street dataset and is shown to be competitive with state-of-the-art Lidar-based algorithms. Remarkably, even at long distances of up to 20 m, it handles pedestrian occlusion efficiently and effectively. [ABSTRACT FROM AUTHOR]
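
The abstract only outlines the framework, so as a reading aid the following is a minimal Python sketch of a coarse-to-fine, part-based pipeline of the kind it describes: coarse 3D segmentation, a fine 2D refinement that separates a pedestrian from touching neighbours, and per-part detectors whose scores are fused over the visible parts only. Every function name, height band, "typical width", and threshold below is invented for illustration; none of them comes from the paper.

from collections import deque
import numpy as np

def coarse_3d_segmentation(points, cell=0.25, min_pts=20):
    """Coarse stage (3D): project points onto a ground-plane grid and group
    occupied cells into connected components (candidate object segments)."""
    keys = [tuple(k) for k in np.floor(points[:, :2] / cell).astype(int)]
    occupied = {}
    for idx, key in enumerate(keys):
        occupied.setdefault(key, []).append(idx)
    seen, segments = set(), []
    for start in occupied:
        if start in seen:
            continue
        seen.add(start)
        queue, comp = deque([start]), []
        while queue:
            ci, cj = queue.popleft()
            comp.extend(occupied[(ci, cj)])
            for nb in ((ci + 1, cj), (ci - 1, cj), (ci, cj + 1), (ci, cj - 1)):
                if nb in occupied and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        if len(comp) >= min_pts:
            segments.append(points[comp])
    return segments

def fine_2d_split(segment, gap=0.35):
    """Fine stage (2D): split a coarse segment at large lateral gaps so a
    pedestrian touching a neighbouring object becomes its own segment."""
    order = np.argsort(segment[:, 0])
    xs = segment[order, 0]
    cuts = np.where(np.diff(xs) > gap)[0]
    pieces, start = [], 0
    for c in cuts:
        pieces.append(segment[order[start:c + 1]])
        start = c + 1
    pieces.append(segment[order[start:]])
    return pieces

# Articulated parts modelled as relative height bands, each with its own
# simple feature (a "typical width"). These numbers are assumptions, not
# values from the paper.
PART_BANDS = {"head_shoulder": (0.75, 1.00, 0.45),
              "torso":         (0.40, 0.75, 0.55),
              "legs":          (0.00, 0.40, 0.50)}

def part_scores(segment):
    """Score each part with its own feature; parts with no Lidar returns
    (i.e. occluded parts) contribute no evidence instead of a penalty."""
    z = segment[:, 2]
    z0, span = z.min(), max(np.ptp(z), 1e-6)
    scores = {}
    for name, (lo, hi, typical) in PART_BANDS.items():
        band = segment[(z >= z0 + lo * span) & (z <= z0 + hi * span)]
        if len(band) < 5:
            scores[name] = None          # occluded / missing part
            continue
        width = max(np.ptp(band[:, 0]), np.ptp(band[:, 1]))
        scores[name] = float(np.exp(-abs(width - typical) / 0.2))
    return scores

def detect_pedestrians(points, thresh=0.6):
    """Full pipeline: coarse 3D segmentation -> fine 2D split -> fuse the
    scores of the visible parts only, which tolerates partial occlusion."""
    hits = []
    for coarse in coarse_3d_segmentation(points):
        for seg in fine_2d_split(coarse):
            visible = [s for s in part_scores(seg).values() if s is not None]
            if visible and np.mean(visible) > thresh:
                hits.append(seg)
    return hits

Given an (N, 3) array of x, y, z Lidar returns, detect_pedestrians(cloud) returns the point segments classified as pedestrians. The design point worth noting is the occlusion handling: because missing parts are excluded from the average rather than scored as failures, a half-hidden pedestrian can still be accepted on the strength of the parts that remain visible.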
