4D Radar Signal Processing & Pre-Processing
4D Radar Object Detection Network Design
Detection Results Visualization & Performance Evaluation
Radar is relatively robust to external conditions such as lighting and weather, and unlike other sensors such as cameras and LiDAR, it directly measures the distance and velocity of objects. Object detection with radar therefore has the advantage of recognizing static and dynamic objects in a wide range of environments.
We are researching an object detection neural network based on 4D imaging radar that achieves high-resolution detection, overcoming the poor detection performance of conventional 3D radar.
An object detection neural network fed with high-resolution 4D radar information (distance, horizontal angle, vertical angle, and Doppler velocity) can detect fine-grained objects such as vehicles, trees, bicycles, and pedestrians.
Object detection with 4D imaging radar achieves state-of-the-art performance in bad weather, owing to its strong robustness to rain, snow, and fog as well as the high-resolution 4D radar information.
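To make the 4D radar input concrete, below is a minimal PyTorch sketch of how a dense 4D radar tensor (Doppler × elevation × range × azimuth) could be encoded into a bird's-eye-view feature map for a detection head. The module name, tensor shapes, and channel sizes are illustrative assumptions, not the exact K-Radar pipeline.

```python
# Minimal sketch (hypothetical shapes/channels): encode a dense 4D radar tensor
# into a BEV feature map by treating Doppler bins as channels and collapsing elevation.
import torch
import torch.nn as nn

class Radar4DEncoder(nn.Module):
    def __init__(self, num_doppler_bins: int = 64, out_channels: int = 32):
        super().__init__()
        # 3D convolution over (elevation, range, azimuth) with Doppler bins as input channels.
        self.conv3d = nn.Sequential(
            nn.Conv3d(num_doppler_bins, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm3d(out_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, Doppler, elevation, range, azimuth) radar power measurements
        feat = self.conv3d(x)      # (B, C, elevation, range, azimuth)
        bev = feat.mean(dim=2)     # collapse elevation -> (B, C, range, azimuth) BEV map
        return bev

# Example: 64 Doppler, 8 elevation, 128 range, 64 azimuth bins (placeholder sizes).
encoder = Radar4DEncoder()
radar_tensor = torch.rand(1, 64, 8, 128, 64)
print(encoder(radar_tensor).shape)  # torch.Size([1, 32, 128, 64])
```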
K-Radar: 4D Radar Object Detection Dataset and Benchmark for Urban Roads and Highways in Adverse Weather Conditions, Advances in Neural Information Processing Systems (NeurIPS) Datasets and Benchmarks, 2022
Point Cloud Filtering Method and Data Processing Apparatus for a 4D Radar, July 2021 (patent application date)
Lane Projection from LiDAR Point Cloud to Camera Image (2021)
Lane Detection in LiDAR Point Cloud (2021)
To address the limitation that conventional lane detection neural networks detect only a small number of lanes in driving situations, we are researching a lane detection neural network that detects up to six lanes, providing sufficient lane information for autonomous vehicles.
The LiDAR-based lane detection neural network detects up to six lanes even in challenging scenarios such as extreme lighting conditions on urban roads.
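As an illustration of the row-wise formulation referenced in the publications below, the following sketch predicts, for each row of a LiDAR bird's-eye-view feature map, a column distribution and an existence score for each of up to six lane slots. The module name, shapes, and channel counts are assumptions for illustration; this is not the exact K-Lane architecture.

```python
# Minimal sketch of a row-wise lane head on a LiDAR BEV feature map (hypothetical sizes).
import torch
import torch.nn as nn

class RowWiseLaneHead(nn.Module):
    def __init__(self, in_channels: int = 64, num_lanes: int = 6):
        super().__init__()
        # Per-pixel logits per lane slot; softmax over columns gives, for every BEV row,
        # a distribution over which column that lane passes through.
        self.col_logits = nn.Conv2d(in_channels, num_lanes, kernel_size=1)
        # Per-row existence logits deciding whether a lane slot is present in that row.
        self.exist_logits = nn.Conv2d(in_channels, num_lanes, kernel_size=1)

    def forward(self, bev_feat: torch.Tensor):
        # bev_feat: (B, C, rows, cols) features from a LiDAR BEV backbone
        col = self.col_logits(bev_feat).softmax(dim=-1)    # (B, L, rows, cols)
        exist = self.exist_logits(bev_feat).mean(dim=-1)   # (B, L, rows), pooled over columns
        return col, exist

head = RowWiseLaneHead()
col_probs, exist_logits = head(torch.rand(2, 64, 144, 144))
print(col_probs.shape, exist_logits.shape)  # (2, 6, 144, 144) (2, 6, 144)
```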
K-Lane: Lidar Lane Dataset and Benchmark for Urban Roads and Highways, IEEE/CVF Computer Vision and Pattern Recognition (CVPR) Workshop on Autonomous Driving, 2022
Row-wise Lidar Lane Detection Network with Lane Correlation Refinement, IEEE International Conference on Intelligent Transportation Systems (ITSC), 2022
A Lidar based Lane Detection Network using Anchor Line, 3rd Korea Artificial Intelligence Conference, 2022
Robust Lane Detection Network Based on Lidar-Camera Sensor Fusion in Various Light Conditions, 3rd Korea Artificial Intelligence Conference, 2022
Lane detection dataset by AVE lab
KITTI open dataset
Lane and Object detection network
Lane detection result
Object detection result
The main drawback of perception neural networks is their high computational cost, which makes it difficult for an autonomous vehicle to run multiple algorithms simultaneously in real time.
We are researching an integrated neural network for lane and object detection with LiDAR: a lightweight, real-time network that reduces computational costs such as GPU memory usage, training time, and inference time.
The integrated LiDAR network simultaneously detects on-road objects and up to six lanes, and can be deployed on autonomous vehicles in real time.
The computational complexity of training the network is reduced by integrating (sharing) layers between the object detection and lane detection branches.
The network achieves a high detection accuracy (F1-score of 90%), comparable to the current state of the art (SOTA).
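A minimal sketch of the layer-sharing idea follows: a single LiDAR BEV backbone is computed once and feeds both a lane head and an object head, so the two tasks do not duplicate backbone computation. Module names, channel sizes, and head designs are hypothetical and only illustrate the concept, not the lab's actual integrated network.

```python
# Minimal sketch of a shared BEV backbone with separate lane and object heads (hypothetical).
import torch
import torch.nn as nn

class SharedBackboneDetector(nn.Module):
    def __init__(self, in_channels: int = 32, feat_channels: int = 64,
                 num_lanes: int = 6, num_anchors: int = 2, box_dim: int = 7):
        super().__init__()
        # Backbone features are computed once and reused by both task heads.
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, feat_channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_channels, feat_channels, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.lane_head = nn.Conv2d(feat_channels, num_lanes, kernel_size=1)           # per-pixel lane logits
        self.object_head = nn.Conv2d(feat_channels, num_anchors * (1 + box_dim), 1)   # objectness + box regression

    def forward(self, bev: torch.Tensor):
        feat = self.backbone(bev)            # shared features: (B, feat_channels, H, W)
        return self.lane_head(feat), self.object_head(feat)

model = SharedBackboneDetector()
lane_logits, obj_preds = model(torch.rand(1, 32, 200, 200))
print(lane_logits.shape, obj_preds.shape)  # (1, 6, 200, 200) (1, 16, 200, 200)
```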
Lane Detection Datasets Collected by AVE lab in Various Urban Environments (2021)
Lane labeling program for LiDAR Point Cloud (2021)
We have constructed the world's first large-scale multimodal dataset for lane detection (e.g., precisely calibrated front camera images, LiDAR point clouds, and RTK-GPS data), with up to six lanes labeled in both daytime and nighttime scenes.
Using this dataset, we are developing high-performance object and lane detection networks. (The dataset will be released soon.)
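Because the dataset's camera, LiDAR, and RTK-GPS are precisely calibrated, LiDAR lane points can be projected onto the front camera image with a standard pinhole model. The sketch below uses hypothetical intrinsic and extrinsic values purely for illustration; it is not the dataset's actual calibration or tooling.

```python
# Minimal sketch (hypothetical calibration values) of projecting LiDAR lane points
# onto the front camera image: pixel = K * (T_cam_lidar * p), followed by perspective division.
import numpy as np

def project_lidar_to_image(points_xyz: np.ndarray, T_cam_lidar: np.ndarray,
                           K: np.ndarray) -> np.ndarray:
    """Project Nx3 LiDAR points to Nx2 pixel coordinates (points behind the camera dropped)."""
    pts_h = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])  # homogeneous Nx4
    cam = (T_cam_lidar @ pts_h.T).T[:, :3]                              # camera-frame Nx3
    cam = cam[cam[:, 2] > 0]                                            # keep points in front of camera
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]                                       # perspective division

# Placeholder pinhole intrinsics (fx = fy = 1000 px, principal point at 640, 360).
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
# Placeholder extrinsics: rotate LiDAR axes (x fwd, y left, z up) into camera axes
# (x right, y down, z fwd) and apply a small hypothetical translation.
T = np.eye(4)
T[:3, :3] = [[0.0, -1.0, 0.0],
             [0.0, 0.0, -1.0],
             [1.0, 0.0, 0.0]]
T[:3, 3] = [0.0, 0.2, 0.0]
lane_points = np.array([[10.0, 1.5, -1.6], [20.0, 1.6, -1.6], [30.0, 1.7, -1.6]])
print(project_lidar_to_image(lane_points, T, K))
```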
K-Lane: Lidar Lane Dataset and Benchmark for Urban Roads and Highways, IEEE/CVF Computer Vision and Pattern Recognition (CVPR) Workshop on Autonomous Driving, 2022