Researchers from the University of Zurich (UZH) in Switzerland, along with Swiss research consortium NCCR Robotics, have come up with a way to teach unmanned aerial vehicles (UAVs) how to operate using an “eye-inspired” camera that can see in the dark more effectively and fly at faster speeds while capturing clear imagery.
According to a press release from NCCR, the research could be especially beneficial for search-and-rescue missions that take place at dusk or dawn, as well as in emergency response, when time is of the essence.
Conventional cameras function properly only when there is enough light; in addition, a camera-equipped drone’s speed must be limited so that the imagery is not blurred by motion. Sensors can help solve the problem, but many are “elaborate, expensive and bulky,” says NCCR.
However, the researchers claim their technology enables UAVs to fly in a wider range of conditions.
“This research is the first of its kind in the fields of artificial intelligence and robotics and will soon enable drones to fly autonomously and faster than ever, including in low-light environments,” says Professor Davide Scaramuzza, director of the Robotics and Perception Group at UZH.
The “event cameras,” invented at UZH in collaboration with ETH Zurich, do not need full illumination across their “bio-inspired retina” to produce a clear picture. Unlike their conventional counterparts, they report only changes in brightness at each pixel, ensuring sharp vision even during fast motion or in low-light environments, the press release explains.
The UZH researchers have also designed new software that efficiently processes the cameras’ output, enabling autonomous flight at higher speeds and in lower light.
A research paper on the work states:
“Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. These cameras do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motions or in scenes characterized by high dynamic range.
“However, event cameras output only little information when the amount of motion is limited, such as in the case of almost still motion. Conversely, standard cameras provide instant and rich information about the environment most of the time (in low-speed and good lighting scenarios), but they fail severely in case of fast motions or difficult lighting such as high dynamic range or low-light scenes.”
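The per-pixel thresholding the paper describes can be illustrated with a minimal sketch. This is a simplified, frame-differencing model (a real event camera fires asynchronously per pixel in hardware), and the function name, threshold value, and test data are hypothetical, but the core idea is the same: an event is emitted only where the log-brightness change at a pixel exceeds a contrast threshold.

```python
import numpy as np

def frames_to_events(prev_frame, curr_frame, threshold=0.2, eps=1e-6):
    """Emit (y, x, polarity) events where the per-pixel log-brightness
    change exceeds a contrast threshold.

    Simplified model of an event camera: unchanged pixels produce no
    output at all, so a static scene generates no data.
    """
    log_prev = np.log(prev_frame.astype(np.float64) + eps)
    log_curr = np.log(curr_frame.astype(np.float64) + eps)
    diff = log_curr - log_prev
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    polarities = np.sign(diff[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return list(zip(ys.tolist(), xs.tolist(), polarities.tolist()))

# A static scene produces no events; only the changed pixel fires.
prev = np.full((4, 4), 100, dtype=np.uint8)
curr = prev.copy()
curr[2, 3] = 200  # one pixel doubles in brightness
events = frames_to_events(prev, curr)
```

This also illustrates the paper's caveat: with little or no motion, almost nothing changes between frames, so the sensor outputs very little information.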
PhD student Henri Rebecq says there is “still a lot of work to be done before these drones can be deployed in the real world,” as the event camera used in the research is still an “early prototype.” Rebecq notes that the software has not yet been proven to work reliably outdoors.
Scaramuzza adds, “We think this is achievable, however, and our recent work has already demonstrated that combining a standard camera with an event-based camera improves the accuracy and reliability of the system.”
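The complementary nature of the two sensors that Scaramuzza alludes to can be sketched in a few lines. This is a hypothetical simplification, not the authors' actual pipeline: a standard-camera frame anchors a log-brightness estimate, and each event nudges its pixel by one contrast-threshold step until the next frame arrives.

```python
import numpy as np

def fuse(log_frame, events, contrast=0.2):
    """Update a log-brightness estimate with event-camera output.

    log_frame : log-intensity image from the standard camera (the anchor).
    events    : list of (y, x, polarity) tuples from the event camera.
    Each event shifts its pixel by `contrast` in the polarity's direction,
    keeping the estimate current between the slower standard frames.
    """
    est = log_frame.copy()
    for y, x, polarity in events:
        est[y, x] += polarity * contrast  # one threshold step per event
    return est

# Two positive events at (1, 1) and one negative event at (0, 2).
frame = np.zeros((3, 3))
updated = fuse(frame, [(1, 1, +1), (1, 1, +1), (0, 2, -1)])
```

The design intuition matches the quote: the frame supplies rich absolute intensity when the scene is slow and well lit, while the events supply fast, high-dynamic-range corrections when it is not.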