Engineers at the University of California San Diego have developed a low-cost, low-power technology to help robots accurately map their way indoors, even in poor lighting and without recognizable landmarks or features.
The technology consists of sensors that use WiFi signals to help the robot map where it’s going. It’s a new approach to indoor robot navigation, where most systems rely on light-based sensors such as cameras and LiDARs. The so-called “WiFi sensors” instead use radio frequency signals rather than light or visual cues to see, so they can work in conditions where cameras and LiDARs struggle: low light, changing light, and repetitive environments such as long corridors and warehouses.
And by using WiFi, the technology could offer an economical alternative to expensive and power-hungry LiDARs, the researchers noted.
A team of researchers from the Wireless Communication Sensing and Networking Group, led by UC San Diego electrical and computer engineering professor Dinesh Bharadia, will present their work at the 2022 International Conference on Robotics and Automation (ICRA), which will take place from May 23 to 27 in Philadelphia.
“We are surrounded by wireless signals almost everywhere we go. The beauty of this work is that we can use these everyday signals to do indoor localization and mapping with robots,” said Bharadia.
“Using WiFi, we have built a new kind of sensing modality that fills in the gaps left behind by today’s light-based sensors, and it can enable robots to navigate in scenarios where they currently cannot,” added Aditya Arun, who is an electrical and computer engineering Ph.D. student in Bharadia’s lab and the first author of the study.
The researchers built their prototype system using off-the-shelf hardware. The system consists of a robot equipped with the WiFi sensors, which are built from commercially available WiFi transceivers. These devices transmit and receive wireless signals to and from WiFi access points in the environment. What makes these WiFi sensors special is that they use this constant back-and-forth communication with the WiFi access points to map the robot’s location and direction of movement.
“This two-way communication is already happening between mobile devices like your phone and WiFi access points all the time — it’s just not telling you where you are,” said Roshan Ayyalasomayajula, who is also an electrical and computer engineering Ph.D. student in Bharadia’s lab and a co-author on the study. “Our technology piggybacks on that communication to do localization and mapping in an unknown environment.”
Here’s how it works. At the start, the WiFi sensors are unaware of the robot’s location and where any of the WiFi access points are in the environment. Figuring that out is like playing a game of Marco Polo — as the robot moves, the sensors call out to the access points and listen for their replies, using them as landmarks. The key here is that every incoming and outgoing wireless signal carries its own unique physical information — an angle of arrival and direct path length to (or from) an access point — that can be used to figure out where the robot and access points are in relation to each other. Algorithms developed by Bharadia’s team enable the WiFi sensors to extract this information and make these calculations. As the call and response continues, the sensors pick up more information and can accurately locate where the robot is going.
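As a rough, hypothetical illustration of that geometry (and not the team’s actual algorithm, which jointly estimates the robot’s trajectory and the access point locations), the short Python sketch below shows how a single angle-of-arrival and path-length measurement places an access point relative to the robot, and how measurements taken from several robot poses can be combined into one estimate. The access point position, noise levels, and helper function are all made up for illustration.

```python
import numpy as np

# Toy illustration (not the UC San Diego team's actual algorithm):
# each WiFi "call and response" yields a bearing (angle of arrival)
# and a direct-path length from the robot to an access point.
# One such measurement places the access point relative to the robot;
# repeated measurements from a moving robot refine the estimate.

def ap_relative_position(angle_of_arrival_rad, path_length_m):
    """Turn one bearing + range measurement into an (x, y) offset
    from the robot to the access point."""
    return np.array([
        path_length_m * np.cos(angle_of_arrival_rad),
        path_length_m * np.sin(angle_of_arrival_rad),
    ])

# Simulated ground truth: an access point at (4, 3) metres.
true_ap = np.array([4.0, 3.0])

# Robot poses along a short trajectory: (x, y, heading in radians).
poses = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.5, np.pi / 6)]

estimates = []
for x, y, heading in poses:
    offset = true_ap - np.array([x, y])
    rng = np.linalg.norm(offset) + np.random.normal(0, 0.1)        # noisy path length
    aoa = (np.arctan2(offset[1], offset[0]) - heading
           + np.random.normal(0, np.radians(2)))                    # noisy angle of arrival
    # Rotate the robot-frame measurement back into the world frame.
    rel = ap_relative_position(aoa + heading, rng)
    estimates.append(np.array([x, y]) + rel)

print("Per-measurement AP estimates:", np.round(estimates, 2))
print("Averaged AP estimate:        ", np.round(np.mean(estimates, axis=0), 2))
```

In the real system, the same exchange also works in reverse: once the access points have been mapped, they serve as fixed landmarks that help correct the robot’s own position estimate as it moves.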
The researchers tested their technology on one floor of an office building. They placed several access points around the space and equipped a robot with the WiFi sensors, as well as a camera and a LiDAR for comparison measurements. The team drove the robot around the floor several times, turning corners, traveling down long, narrow corridors, and passing through both bright and dimly lit spaces.
In these tests, the accuracy of localization and mapping provided by the WiFi sensors was on par with that of the commercial camera and LiDAR sensors.
“We can use WiFi signals, which are essentially free, to do robust and reliable sensing in visually challenging environments,” said Arun. “WiFi sensing could potentially replace expensive LiDARs and complement other low cost sensors such as cameras in these scenarios.”
That’s what the team is now exploring. The researchers will be combining WiFi sensors (which provide accuracy and reliability) with cameras (which provide visual and contextual information about the environment) to develop a more complete, yet inexpensive, mapping technology.
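As a simple, hypothetical sketch of that kind of fusion (not the team’s published design), two independent position estimates, one from the WiFi sensors and one from a camera, could be combined by inverse-variance weighting so that whichever sensor is more certain at the moment dominates the result. All values below are illustrative.

```python
import numpy as np

# Hypothetical example: fuse a WiFi-derived position with a camera-derived
# one by inverse-variance weighting, so the more certain sensor dominates.
# The estimates and variances below are made up for illustration.
def fuse(est_a, var_a, est_b, var_b):
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

wifi_xy, wifi_var = np.array([2.10, 3.05]), 0.04   # steady even in poor lighting
cam_xy, cam_var = np.array([2.30, 2.90]), 0.25     # degraded in a dark corridor

position, variance = fuse(wifi_xy, wifi_var, cam_xy, cam_var)
print(np.round(position, 2), round(variance, 3))
```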