
Edinburgh Researchers Create New Weather Dataset For Next-Gen Autonomous Vehicles

Driving in adverse weather can be tricky, so why not include such conditions in the training data of autonomous vehicles?

It has been nearly a decade since the first autonomous vehicle (AV) hit the road. Autonomous vehicles are attracting a lot of attention because of the convenience and safety benefits they promise. However, a fully autonomous vehicle has yet to progress beyond the testing stage. One of the biggest hurdles for the technology is not the artificial intelligence algorithms alone but the scarcity of training data covering fog, rain, and snow.

Today, hundreds of companies are testing self-driving cars, trucks, and other vehicles, but most rely on road-condition data captured on clear, sunny days. While the majority of autonomous vehicles have delivered outstanding results on such test data, making an automobile navigate rapidly changing road and weather conditions, especially in heavy snowfall, fog, or rain, remains a tremendous challenge.


Driverless vehicles rely on sensors to view street signs and lane dividers, but inclement weather can make it harder for them to "see" the road and make correct decisions when cruising at high speed. An autonomous car uses three sensor types, camera, radar, and LiDAR, to perceive everything around it. The cameras give it a 360-degree view of its surroundings, recognize objects and people, and determine their distance. Radar aids lane keeping and parking by detecting moving objects and calculating distance and speed in real time. LiDAR uses lasers instead of radio waves to create 3D images of the surroundings and map them, again building a 360-degree view around the car.
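To make that division of labor concrete, here is a minimal Python sketch of how readings from the three sensor types might be merged into a single object list. All names and the fusion rule are illustrative assumptions, not any real AV stack; production systems use probabilistic trackers rather than this toy merge.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object hypothesis reported by a single sensor."""
    label: str         # e.g. "car", "pedestrian"
    distance_m: float  # estimated range to the object
    speed_mps: float   # radial speed (radar only; 0.0 otherwise)
    source: str        # "camera", "radar", or "lidar"

def fuse(detections: list[Detection]) -> list[Detection]:
    """Toy fusion: keep one detection per label, preferring radar.

    Radar keeps working in fog and rain, so its range and speed
    estimates are trusted first here. Real AV stacks use probabilistic
    trackers (e.g. Kalman filters) instead of this naive merge.
    """
    radar_first = sorted(detections, key=lambda d: d.source != "radar")
    merged: dict[str, Detection] = {}
    for det in radar_first:
        merged.setdefault(det.label, det)
    return list(merged.values())

readings = [
    Detection("car", 42.0, -3.1, "radar"),  # radar: range + speed
    Detection("car", 40.5, 0.0, "camera"),  # camera: classification
    Detection("car", 41.8, 0.0, "lidar"),   # lidar: 3D shape/position
]
print(fuse(readings))  # one fused "car", radar reading preferred
```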

However, light showers or snowflakes can cause LiDAR sensor systems to malfunction and lose accuracy. The vehicles also depend heavily on data gathered from optical sensors, which are less reliable in bad weather.

Therefore, leveraging bad-weather data will play a crucial role not only in safety-critical AV choices such as disengagement and operational domain designation, but also in more basic tasks like lane keeping.

Despite the pressing need to include bad-weather data in training datasets, little was publicly available. To fill that gap, the RADIATE (RAdar Dataset In Adverse weaThEr) project, led by Heriot-Watt University, has released a new dataset that will aid the development of autonomous vehicles that can drive safely in adverse conditions. The team drew inspiration from sensors that have already proven themselves in Scotland's rain, snow, and fog. Their goal is to make radar-based research on object identification, tracking, SLAM (Simultaneous Localization and Mapping), and scene comprehension in harsh weather easier.

The dataset comprises three hours of annotated radar images, multi-modal sensor data (radar, camera, 3D LiDAR, and GPS/IMU), and more. Professor Andrew Wallace and Dr. Sen Wang of Heriot-Watt University have been gathering data since 2019. First, they outfitted a van with LiDAR, radar, stereo cameras, and geopositioning devices. Then they deliberately drove the vehicle across Edinburgh and the Scottish Highlands at all hours of the day and night, capturing urban and country road conditions in bad weather.
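As a rough idea of how such a multi-modal recording might be consumed, the sketch below walks a hypothetical sequence folder of timestamped per-sensor files plus an annotation file. The folder layout, file names, and schema are assumptions for illustration; the RADIATE project publishes its own tooling and format, which may differ.

```python
import json
from pathlib import Path

# Hypothetical layout: a sequence folder with one subfolder per sensor
# stream of timestamped files, plus a single annotations JSON. This is
# a common pattern for multi-modal AV datasets, NOT the official
# RADIATE structure.
SEQUENCE = Path("radiate/city_1_0")  # hypothetical sequence path

def list_frames(sensor: str) -> list[Path]:
    """Return the timestamp-sorted files for one sensor stream."""
    return sorted((SEQUENCE / sensor).glob("*"))

radar_frames = list_frames("radar")
camera_frames = list_frames("camera")
lidar_frames = list_frames("lidar")
annotations = json.loads((SEQUENCE / "annotations.json").read_text())

print(f"{len(radar_frames)} radar frames, {len(annotations)} labeled objects")
```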

Wallace explains that such datasets are critical for developing and benchmarking autonomous vehicle perception systems. Though we are still a long way from having driverless cars on the roads, autonomous vehicles are already being tested in controlled environments and piloting zones.

Located on the outskirts of Edinburgh, Heriot-Watt houses the National Robotarium, a £3 million initiative that brings together robotics, cognitive science, and psychology experts with colleagues from Imperial College London and the University of Manchester. Wallace's team is based at Heriot-Watt University's Institute of Sensors, Signals, and Systems, which has previously pioneered conventional and deep learning techniques for interpreting sensory data.

According to Wallace, the duo successfully demonstrated how radar can help autonomous cars navigate, map, and interpret their environment in bad weather, when vision and LiDAR are rendered useless. In addition, the team labeled about 200,000 road actors in the dataset, including bicycles, cars, pedestrians, and traffic signs, which could help researchers and manufacturers develop safe navigation for the autonomous vehicles of the future.
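A natural first step with such labels is simply tallying them per class. The snippet below assumes a hypothetical JSON list of annotation records, each carrying a "class_name" field; the real RADIATE annotation schema may be organized differently.

```python
import json
from collections import Counter
from pathlib import Path

# Hypothetical schema: annotations.json is a list of records, each with
# a "class_name" such as "car" or "pedestrian".
annotations = json.loads(Path("radiate/city_1_0/annotations.json").read_text())

counts = Counter(obj["class_name"] for obj in annotations)
for cls, n in counts.most_common():
    print(f"{cls:>15}: {n:,}")
```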

Dr. Wang says, “When a car pulls out in front of you, you try to predict what it will do – will it swerve, will it take off? That’s what autonomous vehicles will have to do, and now we have a database that can put them on that path, even in bad weather.”

Wallace notes that they still need to improve the resolution of radar, which is naturally fuzzy. Combining high-resolution optical images with radar's superior weather-penetrating capability would help autonomous vehicles see and map better and, ultimately, travel more safely.
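One common way to combine the two modalities is to project radar returns into the camera image so they can be overlaid on high-resolution pixels. The sketch below uses a plain pinhole camera model with made-up intrinsics; a real system would need calibrated camera intrinsics and a radar-to-camera extrinsic transform.

```python
import numpy as np

# Minimal pinhole projection: map radar returns (x, y, z in metres,
# already in camera coordinates) onto image pixels for overlay on the
# optical frame. The intrinsics below are placeholder values.
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])

def project(points_xyz: np.ndarray) -> np.ndarray:
    """Project Nx3 camera-frame points to Nx2 pixel coordinates."""
    uvw = (K @ points_xyz.T).T       # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]  # divide by depth

radar_points = np.array([[ 2.0, 0.5, 30.0],   # a return ~30 m ahead
                         [-1.0, 0.2, 12.0]])  # a closer return
print(project(radar_points))
```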


Preetipadma K
Preeti is an Artificial Intelligence aficionado and a geek at heart. When she is not busy reading about the latest tech stories, she will be binge-watching Netflix or F1 races!
