Immervision is a leading provider of “Deep Seeing Technology”: wide-angle optics, image processing, and sensor fusion for next-generation devices. Here, Immervision AVP Ludimila Centeno takes a deep dive into the sensor options available for safe, low-light drone operations. Read on to understand the pros and cons of low-light cameras vs. LiDAR sensors, what actually qualifies as a low-light camera, and what to look for when choosing a sensor.
The following is a guest post by Ludimila Centeno, Associate Vice President of Technology and Support, Immervision. DRONELIFE neither accepts nor makes payment for guest posts.
It isn’t always possible to fly drones in full daylight and in wide open spaces. There are many applications for which the ability to operate drones in low-light environments is a necessity. Oftentimes, the problem is exacerbated by the need to work in confined spaces (e.g., mines, sewers, waterways in hydroelectric dams) or spaces with obstructions (e.g., factory buildings, warehouses, woods).
A few low-light application examples include filmmaking, surveilling persons and objects of interest, inspecting infrastructure like the undersides of bridges and the insides of railway tunnels, delivering medications to rural areas and isolated locations, and life-and-death situations like search and rescue operations that need to run day and night because every second counts.
New opportunities are opening up for commercial drone operators as the FAA grants Beyond Visual Line-of-Sight (BVLOS) waivers. In addition to flying over greater ranges and at higher altitudes, these waivers cover flying in low-light conditions and at night.
Unfortunately, these opportunities remain out of reach in the absence of an efficient and effective solution for operating drones safely under less-than-ideal lighting situations.
Alternative Low-Light Sensor Options
By default, drones are not designed to operate in low-light conditions or at night. One option is to augment the drones with specialist sensor technologies.
Ultrasonic sensors are small, light, and function in all light conditions, and they can be useful for certain narrow applications, such as measuring the drone’s altitude while landing. However, these sensors have limited range, limited accuracy, inflexible scanning methods, and extremely limited resolution that provides only “something is there” or “nothing is there” information.
Radar sensors suitable for use on drones also work in all light conditions, are tolerant of bad weather (fog, rain, snow), and have a reasonable range. Once again, however, these sensors provide limited resolution, have a narrow Field of View (FoV), and are of limited interest for most low-light applications.
There are two main LiDAR technologies—Time-of-Flight (ToF) and Frequency Modulated Continuous Wave (FMCW)—both with their own advantages and disadvantages. Their main advantage in the case of low-light operations is that they use a laser to “illuminate” the object, which means they are unaffected by the absence of natural light. Although LiDAR can offer significantly higher resolution than radar, this resolution is only a fraction of that offered by camera technologies. Also, LiDAR data is not usually colorized, making its interpretation and analysis uncertain. Furthermore, the Size, Weight, and Power (SWaP) characteristics of LiDAR sensors limit their use in all but the largest drones.
All the sensors discussed above are active in nature, which means they emit energy and measure the reflected or scattered signal. By comparison—assuming they aren’t augmented by an additional light source to illuminate their surroundings—cameras are passive in nature, which means they detect the natural light reflected or emitted by objects in the surrounding environment. This passive capability may be mandatory in certain applications. Cameras also convey many other advantages, including low cost, low weight, and low power consumption coupled with high resolution and—when equipped with an appropriate lens subsystem—a 180-degree or a 360-degree FoV.
What Actually Qualifies as a Low-Light Camera?
There are many cameras available that claim to offer low-light capabilities. However, there is no good definition as to what actually qualifies as a low-light camera. Humans can subjectively appreciate the quality of an image, but how does one objectively quantify the performance of a low-light system?
At Immervision, we are often asked questions like “How dark can it be while your camera can still see?” This is a tricky question because these things are so subjective. In many respects, the answer depends on what there is to be seen. In the context of computer vision for object detection, for example, the type of object, its shape, color, and size all impact how easily it can be detected. This means that “How dark can it be while your camera can still see?” is the wrong question to ask if one wishes to determine whether a camera is good for low-light conditions… or not.
Fortunately, there are options available that offer a more deterministic and quantitative approach. The Harris corner detector, for example, finds transitions in an image (e.g., corners and edges). The density and strength of these detections can be used to quantify the image quality a camera produces for machine vision applications. Likewise, running artificial intelligence (AI) models for object detection and recognition against a camera’s output provides a practical way to measure its performance and compare different options.
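The Harris-based approach can be sketched in a few lines of Python using only NumPy. The synthetic checkerboard scene, noise level, and corner threshold below are illustrative assumptions for demonstration, not an Immervision benchmark; the idea is simply that counting strong corner responses on the same scene under bright and dim conditions yields a repeatable, quantitative low-light metric:

```python
import numpy as np

def harris_response(img, k=0.05):
    """Harris corner response map: R = det(M) - k * trace(M)^2,
    where M is the local structure tensor (3x3 box-smoothed)."""
    Iy, Ix = np.gradient(img.astype(float))  # image gradients
    def smooth(a):  # 3x3 box filter with edge padding
        p = np.pad(a, 1, mode="edge")
        h, w = a.shape
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    Sxx, Syy, Sxy = smooth(Ix * Ix), smooth(Iy * Iy), smooth(Ix * Iy)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2

# Synthetic test scene: a checkerboard captured "bright" and "dim"
rng = np.random.default_rng(0)
board = np.kron((np.indices((8, 8)).sum(0) % 2) * 255.0, np.ones((8, 8)))
noise = rng.normal(0.0, 2.0, board.shape)  # fixed sensor read noise
bright = board + noise
dim = 0.05 * board + noise                 # same scene at ~5% of the light

threshold = 1000.0                         # illustrative corner-strength cutoff
strong_bright = int((harris_response(bright) > threshold).sum())
strong_dim = int((harris_response(dim) > threshold).sum())
print(strong_bright, strong_dim)           # far fewer strong corners survive in the dim capture
```

In practice, the same measurement would be run on real captures of a standard test chart, and the ratio of strong-corner counts between the reference and low-light captures gives one objective figure of merit for comparing cameras.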
Creating a Great Low-Light Camera
There are three main elements that impact a camera’s low-light sensitivity and capabilities: the lens assembly, the sensor, and the image signal processor.
- The Lens Assembly: Many wide-angle lenses cause the resulting image to be “squished” at the edges. To counteract this, the multiple sub-lenses forming the assembly need to be crafted in such a way as to result in more “useful pixels” throughout the entire image. Additionally, with respect to low-light operation, the lens assembly must maximize the concentration of light-per-pixel on the image sensor. This is achieved by increasing the aperture (i.e., the opening of the lens, measured as the F# or “F number”) to admit more light. The lower the F#, the better the low-light performance. However, lowering the F# comes at a cost because it increases the complexity of the design and—if not implemented correctly—may impact the quality of the image. A good low-light lens assembly must also provide a crisp image, whose sharpness can be measured by the Modulation Transfer Function (MTF) of the lens.
- The Image Sensor: This is the component that converts the light from the lens assembly into a digital equivalent that will be processed by the rest of the system. A good low-light camera must use a sensor with high sensitivity and quantum efficiency. Such sensors typically have a large pixel size, which contributes to the light sensitivity of the camera module by capturing more light-per-pixel.
- The Image Signal Processor: The digital data generated by the image sensor is typically relayed to an Image Signal Processor (ISP). This component (or function in a larger integrated circuit) is tasked with obtaining the best image possible according to the application requirements. The ISP controls the parameters associated with the image sensor, such as the exposure, and also applies its own processing, such as noise reduction and tone mapping. The calibration of an ISP is called Image Quality tuning (IQ tuning). This is a complex science that has been mastered by few companies, of which Immervision is one.
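As a back-of-the-envelope illustration of the first two points, the light collected per pixel scales roughly with (pixel pitch / F-number) squared. This is a first-order approximation that ignores lens transmission, quantum efficiency, and microlens effects, and the two camera modules compared below are hypothetical:

```python
def relative_light_per_pixel(pixel_pitch_um, f_number):
    """First-order figure of merit: light gathered per pixel scales
    with the square of (pixel pitch / F-number)."""
    return (pixel_pitch_um / f_number) ** 2

# Two hypothetical camera modules (illustrative numbers only)
a = relative_light_per_pixel(pixel_pitch_um=1.4, f_number=2.8)  # small pixels, slow lens
b = relative_light_per_pixel(pixel_pitch_um=3.0, f_number=1.6)  # large pixels, fast lens

print(b / a)  # module B gathers roughly 14x more light per pixel
```

This is why a low-light design pairs a fast (low-F#) lens with a large-pixel, high-quantum-efficiency sensor: the two factors multiply, and either one alone leaves significant sensitivity on the table.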
What’s Available Now
New advancements in low-light cameras and vision systems are helping expand the scope of applications (e.g., location mapping, visual odometry, and obstacle avoidance) and improve operational capabilities by empowering drones to take off, navigate, and land efficiently in challenging lighting conditions and adverse weather scenarios.
At Immervision, we’re developing advanced vision systems combining optics, image processing, and sensor fusion technology. One example comes from our participation in Blue UAS, a holistic and continuous approach to rapidly prototyping and scaling commercial UAS technology for the DoD. As part of the Blue UAS program, the Immervision InnovationLab team developed a wide-angle navigation camera called IMVISIO-ML that can operate in extreme low-light environments under 1 lux.
Along with the camera module, an advanced image processing library is available with features such as dewarping, sensor fusion, camera stitching, image stabilization, and more. We also provide IQ tuning services to optimize the performance of the system for the target application.
The IMVISIO-ML low-light navigation camera system is now broadly available to drone and robotics manufacturers. Integrated with the Qualcomm RB5 and ModalAI VOXL2 platforms, this camera module is already being adopted by drone manufacturers such as Teal Drones, which is a leading drone provider from the Blue UAS Cleared list. As reported here on DroneLife, the newest model of Teal’s Golden Eagle drone will be equipped with two Immervision low-light camera modules, which will improve navigation in low-light conditions and provide stereoscopic vision to Teal’s autonomous pilot system.
Ludimila Centeno has over 15 years of experience in the telecommunications industry, spanning wireless communications and semiconductors. Having contributed to presales, customer support, and operations, she joined Immervision as Associate Vice President, Technology Offer & Support. She holds a Master’s degree in Electrical Engineering, with research in cognitive radio and spectrum sensing techniques.
Miriam McNabb is the Editor-in-Chief of DRONELIFE and CEO of JobForDrones, a professional drone services marketplace, and a fascinated observer of the emerging drone industry and the regulatory environment for drones. Miriam has penned over 3,000 articles focused on the commercial drone space and is an international speaker and recognized figure in the industry. Miriam has a degree from the University of Chicago and over 20 years of experience in high tech sales and marketing for new technologies.