Geolocation of vehicles, objects, or people is commonly performed with global navigation satellite system (GNSS) receivers. Such a receiver is either built into the vehicle, or a separate handheld device such as a smartphone is used. Self-localization in this way is simple and accurate to within a few meters.
Environments where no GNSS service is available require other strategies for self-localization. Especially in the military domain, it is necessary to be prepared for such GNSS-denied scenarios. Awareness of one's own position relative to other units is crucial in military operations, especially where joint operations have to be coordinated geographically and temporally. However, even if a common map-like representation of the terrain is available, precise self-localization relative to this map is not necessarily easy.
In this paper, we propose an approach for LiDAR-based localization of a vehicle-mounted sensor platform in an urban environment. Our approach uses 360° scanning LiDAR sensors to generate short-duration point clouds of the local environment. In these point clouds, we detect pole-like 3D features such as traffic sign poles, lampposts, or tree trunks. The relative distances and orientations of these features with respect to each other are rather distinctive, and the matrix of these pairwise distances and orientations can be used to determine the position of the sensor relative to a current map. This map can either be created in advance for the entire area, or a cooperative preceding vehicle with an equivalent sensor setup can generate it. By matching the detected LiDAR-based 3D features with those of the map, not only the position of the sensor platform but also its orientation can be determined. We provide first experimental results of the proposed method, achieved with measurements by Fraunhofer IOSB's sensor-equipped vehicle MODISSA.
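The abstract does not give implementation details of the matching step. As an illustration only, the following Python sketch shows one way such a pole-constellation matching could look: pairwise distances between detected pole positions (assumed here to be 2D ground-plane coordinates) serve as a rotation- and translation-invariant descriptor, and the resulting correspondences yield the sensor pose via a least-squares rigid transform (Kabsch/SVD). The function names, the greedy matching strategy, and the tolerance parameter are our own assumptions, not the authors' method.

```python
import numpy as np

def pairwise_distances(points):
    """Matrix of Euclidean distances between all pole positions (N x 2)."""
    diff = points[:, None, :] - points[None, :, :]
    return np.linalg.norm(diff, axis=-1)

def match_poles(scan_poles, map_poles, tol=0.5):
    """Greedily associate scan poles with map poles by comparing each
    pole's sorted distances to its nearest neighbors, a descriptor that
    is invariant to rotation and translation of the whole constellation.
    Returns index pairs (scan_idx, map_idx). Hypothetical sketch."""
    d_scan = np.sort(pairwise_distances(scan_poles), axis=1)
    d_map = np.sort(pairwise_distances(map_poles), axis=1)
    matches = []
    for i, row in enumerate(d_scan):
        # Compare the k nearest-neighbor distances (entry 0 is the self-distance 0.0).
        k = min(row.shape[0], d_map.shape[1], 4)
        costs = np.abs(d_map[:, 1:k] - row[1:k]).sum(axis=1)
        j = int(np.argmin(costs))
        if costs[j] < tol * (k - 1):
            matches.append((i, j))
    return matches

def estimate_pose(scan_poles, map_poles, matches):
    """Least-squares 2D rigid transform (Kabsch/SVD) from matched pairs:
    returns rotation R and translation t with map ≈ R @ scan + t."""
    src = scan_poles[[i for i, _ in matches]]
    dst = map_poles[[j for _, j in matches]]
    src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Under this convention, the sensor's position in map coordinates is simply t (the scan origin transformed into the map frame), and its heading follows from R.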
In this paper, we extend our existing LiDAR-based approach for the detection and tracking of (low-)flying small objects such as commercial mini/micro UAVs. We show that UAVs can be detected by the proposed methods, as long as the movements of the UAVs match the LiDAR sensor's capabilities in scanning performance, range, and resolution. The trajectory of the tracked object can further be analyzed to support the classification, meaning that UAVs and non-UAV objects can be distinguished by identifying typical movement patterns. Stable tracking of a UAV is achieved by precise prediction of its movement. In addition to this precise prediction of the target's position, the object detection, tracking, and classification have to run in real time.
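The abstract does not specify how the movement prediction is implemented. A constant-velocity Kalman filter is a common baseline for this kind of centroid tracking; the sketch below (class name and the parameter values dt, q, r are our own assumptions) illustrates the predict/update cycle that would gate detections between successive LiDAR scans.

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal 3D constant-velocity Kalman filter for predicting the
    centroid of a tracked object between LiDAR scans.
    State x = [px, py, pz, vx, vy, vz]. Hypothetical sketch."""

    def __init__(self, pos, dt=0.1, q=1.0, r=0.05):
        self.x = np.hstack([pos, np.zeros(3)])
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)        # position += velocity * dt
        self.Q = q * np.eye(6)                 # process noise (maneuvering)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.R = r * np.eye(3)                 # measurement noise

    def predict(self):
        """Predicted position for the next scan; used to gate detections."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:3]

    def update(self, z):
        """Correct the state with a measured centroid z (3-vector)."""
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x += K @ (z - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P
```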
For the algorithm development and a performance analysis, we analyzed LiDAR data that we acquired during a field trial. Several different mini/micro UAVs were observed by a system of four 360° LiDAR sensors mounted on a car. Using this specific sensor system, the results show that UAVs can be detected and tracked by the proposed methods, enabling protection of the car against UAV threats within a radius of up to 35 m.
In the literature, there are several approaches for automated person detection in point clouds. While most techniques show acceptable detection results, their computation time is often a limiting factor, particularly given the amount of data in panoramic 360° point clouds. At the same time, most applications require object detection and classification in real time.
This paper proposes a fast, real-time-capable algorithm for person detection, classification, and tracking in panoramic point clouds.
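The abstract leaves the algorithm itself unspecified. For illustration, a typical fast pipeline projects non-ground points onto a bird's-eye occupancy grid and labels connected cells, which avoids expensive 3D neighborhood searches; the sketch below is such a generic baseline, not the authors' algorithm, and the thresholds and the flat-ground assumption are ours.

```python
import numpy as np
from scipy import ndimage

def detect_person_candidates(points, cell=0.2, z_ground=0.3,
                             min_h=1.2, max_h=2.2, max_footprint=1.5):
    """Grid-based candidate detection in a 360° point cloud (N x 3).
    Projects non-ground points onto a bird's-eye occupancy grid, labels
    connected cells, and keeps clusters of roughly person-like size.
    Assumes flat ground at z ≈ 0. Returns (centroid, height) tuples."""
    pts = points[points[:, 2] > z_ground]      # crude ground removal
    ij = np.floor(pts[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)                       # shift grid indices to >= 0
    grid = np.zeros(ij.max(axis=0) + 1, dtype=bool)
    grid[ij[:, 0], ij[:, 1]] = True
    labels, n = ndimage.label(grid)            # 2D connected components
    candidates = []
    for k in range(1, n + 1):
        cluster = pts[labels[ij[:, 0], ij[:, 1]] == k]
        height = cluster[:, 2].max()
        extent = cluster[:, :2].max(axis=0) - cluster[:, :2].min(axis=0)
        if min_h < height < max_h and extent.max() < max_footprint:
            candidates.append((cluster.mean(axis=0), height))
    return candidates
```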