
Driving in the snow is a team effort for AI sensors

Nobody likes driving in a blizzard, including autonomous vehicles. To make self-driving cars safer on snowy roads, engineers look at the problem from the car’s point of view.

A major challenge for fully autonomous vehicles is navigating bad weather. Snow especially confounds crucial sensor data that helps a vehicle gauge depth, find obstacles and stay on the correct side of the yellow line, assuming the line is visible at all. Averaging more than 200 inches of snow every winter, Michigan’s Keweenaw Peninsula is the perfect place to push autonomous vehicle tech to its limits. In two papers presented at SPIE Defense + Commercial Sensing 2021, researchers from Michigan Technological University discuss solutions for snowy driving scenarios that could help bring self-driving options to snowy cities like Chicago, Detroit, Minneapolis and Toronto.

Just like the weather, autonomy is not a sunny-or-snowy, yes-no designation. Autonomous vehicles cover a spectrum of levels, from cars already on the market with blind-spot warnings or braking assistance, to vehicles that can switch in and out of self-driving modes, to others that can navigate entirely on their own. Major automakers and research universities are still tweaking self-driving technology and algorithms. Occasionally accidents occur, either through a misjudgment by the car’s artificial intelligence (AI) or a human driver’s misuse of self-driving features.

Humans have sensors, too: our scanning eyes, our sense of balance and movement, and the processing power of our brain help us understand our environment. These seemingly basic inputs allow us to drive in virtually every scenario, even one new to us, because human brains are good at generalizing from novel experiences. In autonomous vehicles, two cameras mounted on gimbals scan and perceive depth using stereo vision to mimic human vision, while balance and motion can be gauged using an inertial measurement unit. But computers can only react to scenarios they have encountered before or been programmed to recognize.
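That stereo depth cue comes down to simple triangulation: a nearby object shifts noticeably between the two camera views, while a distant one barely moves. Here is a minimal sketch in Python, with an assumed focal length and camera baseline rather than any real vehicle’s calibration:

    # Depth from stereo disparity: Z = f * B / d, where f is the focal
    # length in pixels, B is the baseline between the two cameras in
    # meters, and d is the disparity (pixel shift between the views).
    # Both constants below are illustrative assumptions.
    FOCAL_LENGTH_PX = 700.0
    BASELINE_M = 0.54

    def depth_from_disparity(disparity_px: float) -> float:
        """Return the distance in meters to a point with this disparity."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

    print(depth_from_disparity(40.0))  # big shift, nearby: 9.45 m
    print(depth_from_disparity(4.0))   # small shift, far away: 94.5 m

Snow breaks this cue in an obvious way: falling flakes give strong but meaningless disparities, and a whiteout leaves little texture to match between the two views.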

Since artificial brains aren’t around yet, task-specific AI algorithms must take the wheel — which means autonomous vehicles must rely on multiple sensors. Fisheye cameras widen the view while other cameras act much like the human eye. Infrared picks up heat signatures. Radar can see through fog and rain. Light detection and ranging (lidar) pierces through the dark and weaves a neon tapestry of laser beam threads.

“Every sensor has limitations, and every sensor covers another one’s back,” said Nathir Rawashdeh, assistant professor of computing in Michigan Tech’s College of Computing and one of the studies’ lead researchers. He works on bringing the sensors’ data together through an AI process called sensor fusion.

“Sensor fusion uses multiple sensors of different modalities to understand a scene,” he said. “You cannot exhaustively program for every detail when the inputs have difficult patterns. That’s why we need AI.”
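To make that concrete, here is a minimal sketch of what a learned fusion network can look like: each sensor’s data is first reduced to a feature vector by its own encoder, the vectors are concatenated, and a small network learns how much weight each modality deserves. The layer sizes and shapes here are hypothetical, not the architecture from the Michigan Tech papers:

    import torch
    import torch.nn as nn

    class FusionHead(nn.Module):
        """Feature-level fusion: concatenate per-sensor features and let
        a small classifier learn how much to trust each modality."""
        def __init__(self, cam_dim=256, lidar_dim=128, radar_dim=64, n_classes=5):
            super().__init__()
            self.classifier = nn.Sequential(
                nn.Linear(cam_dim + lidar_dim + radar_dim, 128),
                nn.ReLU(),
                nn.Linear(128, n_classes),
            )

        def forward(self, cam, lidar, radar):
            # Fuse by concatenation; the learned weights decide which
            # modality dominates for a given scene.
            return self.classifier(torch.cat([cam, lidar, radar], dim=-1))

    # Made-up feature batches standing in for camera, lidar and radar encoders.
    head = FusionHead()
    scores = head(torch.randn(8, 256), torch.randn(8, 128), torch.randn(8, 64))
    print(scores.shape)  # torch.Size([8, 5])

The appeal of learning the fusion, rather than hand-coding it, is exactly Rawashdeh’s point: nobody can exhaustively program the rules for when snow makes the camera untrustworthy but leaves the radar fine.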

Rawashdeh’s Michigan Tech collaborators include Nader Abu-Alrub, his doctoral student in electrical and computer engineering, and Jeremy Bos, assistant professor of electrical and computer engineering, along with master’s degree students and graduates from Bos’ lab: Akhil Kurup, Derek Chopp and Zach Jeffries. Bos explains that lidar, infrared and other sensors on their own are like the hammer in an old adage. “‘To a hammer, everything looks like a nail,’” quoted Bos. “Well, if you have a screwdriver and a rivet gun, then you have more options.”

Most autonomous sensors and self-driving algorithms are being developed in sunny, clear landscapes. Knowing that the rest of the world is not like Arizona or southern California, Bos’ lab began collecting local data in a Michigan Tech autonomous vehicle (safely driven by a human) during heavy snowfall. Rawashdeh’s team, notably Abu-Alrub, pored over more than 1,000 frames of lidar, radar and image data from snowy roads in Germany and Norway to start teaching their AI program what snow looks like and how to see past it.

“All snow is not created equal,” Bos said, pointing out that the variety of snow makes sensor detection a challenge. Rawashdeh added that pre-processing the data and labeling it accurately is an important step for accuracy and safety: “AI is like a chef — if you have good ingredients, there will be an excellent meal,” he said. “Give the AI learning network dirty sensor data and you’ll get a bad result.”

Low-quality data is one problem; actual dirt is another. Much like road grime, snow buildup on the sensors is a solvable but bothersome issue. And even once the view is clear, autonomous vehicle sensors do not always agree about what they detect. Bos recalled a great example from cleaning up locally gathered data: the sensors found a deer but could not agree on it. The lidar said that blob was nothing (a 30% chance of an obstacle), the camera saw it like a sleepy human at the wheel (a 50% chance), and the infrared sensor shouted WHOA (90% sure that is a deer).

Getting the sensors and their risk assessments to talk and learn from each other is like the Indian parable of three blind men who find an elephant: each touches a different part of the elephant — the creature’s ear, trunk and leg — and comes to a different conclusion about what kind of animal it is. Using sensor fusion, Rawashdeh and Bos want autonomous sensors to collectively figure out the answer — be it elephant, deer or snowbank. As Bos puts it, “Rather than strictly voting, by using sensor fusion we will come up with a new estimate.”
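One simple way to get “a new estimate” rather than a vote is to treat each sensor’s confidence as independent evidence and combine it in log-odds space. This is a generic naive-Bayes sketch using the numbers from the deer example, not the fusion method from the papers, and it assumes equal trust in every sensor:

    import math

    def logit(p):
        """Convert a probability into log-odds."""
        return math.log(p / (1.0 - p))

    def fuse(confidences):
        """Sum per-sensor log-odds (independent evidence, uniform prior)
        and map the total back to a probability."""
        total = sum(logit(p) for p in confidences)
        return 1.0 / (1.0 + math.exp(-total))

    # Lidar 30%, camera 50%, infrared 90%: a strict vote is a toss-up
    # (one no, one shrug, one yes), but the fused estimate leans "deer".
    print(round(fuse([0.3, 0.5, 0.9]), 2))  # 0.79

A production stack would also weight each modality by how reliable it is in the current conditions, which is exactly what snow complicates.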

While navigating a Keweenaw blizzard is still a ways off for autonomous vehicles, their sensors are getting better at learning about bad weather and, with advances like sensor fusion, will one day be able to drive safely on snowy roads.



from ScienceBlog.com https://ift.tt/3uuSZcI
