The Art and Science of Low-Light Drone Operations: LiDAR or Camera? A DRONELIFE Exclusive from Immervision

November 20, 2023 by Miriam McNabb

Immervision is a leading provider of “Deep Seeing Technology”: wide-angle optics, image processing, and sensor fusion for next-generation technologies. Here, Immervision AVP Ludimila Centeno gives a deep dive into the sensor options available for safe, low-light drone operations. Read on to understand the pros and cons of low-light cameras vs. LiDAR sensors, what actually qualifies as a low-light camera, and what to look for when choosing a sensor.

The following is a guest post by Ludimila Centeno, Associate Vice President of Technology and Support, Immervision.  DRONELIFE neither accepts nor makes payment for guest posts.

It isn’t always possible to fly drones in full daylight and in wide open spaces. There are many applications for which the ability to operate drones in low-light environments is a necessity. Oftentimes, the problem is exacerbated by the need to work in confined spaces (e.g., mines, sewers, waterways in hydroelectric dams) or spaces with obstructions (e.g., factory buildings, warehouses, woods).

A few low-light application examples include filmmaking, surveilling persons and objects of interest, inspecting infrastructure like the undersides of bridges and the insides of railway tunnels, delivering medications to rural areas and isolated locations, and life-and-death situations like search and rescue operations that need to run day and night because every second counts.

New opportunities are opening up for commercial drone operators as the FAA grants Beyond Visual Line of Sight (BVLOS) waivers (more information here). In addition to flying over greater ranges and at higher altitudes, these waivers can cover flying in low-light conditions and at night.

Unfortunately, these opportunities remain out of reach in the absence of an efficient and effective solution for operating drones safely under less-than-ideal lighting situations.

Alternative Low-Light Sensor Options

By default, drones are not designed to operate in low-light conditions or at night. One option is to augment the drones with specialist sensor technologies.

Ultrasonic sensors are small, light, and function in all light conditions, and they may be of interest for certain limited applications, such as detecting a drone’s altitude during landing. However, these sensors have limited range, limited accuracy, inflexible scanning methods, and extremely limited resolution that provides only “something is there” or “nothing is there” information.

Radar sensors suitable for use on drones also work in all light conditions, are tolerant of bad weather (fog, rain, snow), and have a reasonable range. Once again, however, these sensors provide limited resolution, have a narrow Field of View (FoV), and are of limited interest for most low-light applications.

There are two main LiDAR technologies—Time-of-Flight (ToF) and Frequency Modulated Continuous Wave (FMCW)—both with their own advantages and disadvantages. Their main advantage in the case of low-light operations is that they use a laser to “illuminate” the object, which means they are unaffected by the absence of natural light. Although LiDAR can offer significantly higher resolution than radar, this resolution is only a fraction of that offered by camera technologies. Also, LiDAR data is not usually colorized, making its interpretation and analysis more difficult. Furthermore, the Size, Weight, and Power (SWaP) characteristics of LiDAR sensors limit their use in all but the largest drones.
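
For readers who want a feel for the underlying principle, a ToF sensor estimates range from the round-trip travel time of its laser pulse: distance = (speed of light × round-trip time) / 2. The short sketch below works through that arithmetic; the pulse timings are illustrative values only, not figures from any particular sensor.

```python
# Minimal sketch: range from a time-of-flight measurement.
# The round-trip times below are hypothetical illustration values.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_range_m(round_trip_seconds: float) -> float:
    """Distance to the target: the pulse travels out and back, so divide by 2."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

for t_ns in (10, 100, 667):  # round-trip times in nanoseconds
    print(f"{t_ns:>4} ns round trip -> {tof_range_m(t_ns * 1e-9):6.2f} m")
```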

All the sensors discussed above are active in nature, which means they emit energy and measure the reflected or scattered signal. By comparison—assuming they aren’t augmented by an additional light source to illuminate their surroundings—cameras are passive in nature, which means they detect the natural light reflected or emitted by objects in the surrounding environment. This passive capability may be mandatory in certain applications. Cameras also offer many other advantages, including low cost, low weight, and low power consumption coupled with high resolution and, when equipped with an appropriate lens subsystem, a 180-degree or 360-degree FoV.

What Actually Qualifies as a Low-Light Camera?

There are many cameras available that claim to offer low-light capabilities. However, there is no good definition as to what actually qualifies as a low-light camera. Humans can subjectively appreciate the quality of an image, but how does one objectively quantify the performance of a low-light system?

At Immervision, we are often asked questions like “How dark can it be while your camera can still see?” This is a tricky question because these things are so subjective. In many respects, the answer depends on what there is to be seen. In the context of computer vision for object detection, for example, the type of object, its shape, color, and size all impact how easily it can be detected. This means that “How dark can it be while your camera can still see?” is the wrong question to ask if one wishes to determine whether a camera is good for low-light conditions… or not.

Fortunately, there are options available that offer a more deterministic and quantitative approach. The Harris detector model, for example, detects transitions in an image (e.g., corners and edges). This model can be used to quantify the image quality produced by a camera for use in machine vision applications. Using artificial intelligence (AI) models for object detection and recognition is another good way to measure the performance of a camera and to compare different options.
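
To make that concrete, here is a minimal sketch of one such quantitative comparison. It uses the Harris corner detector as implemented in OpenCV to count strong corner responses in two low-light captures of the same scene; more surviving corners generally means the camera preserved more usable structure. The file names and threshold are illustrative assumptions, not part of any standard benchmark.

```python
# Minimal sketch: compare two low-light captures of the same scene by
# counting strong Harris corner responses (a rough proxy for how much
# usable structure each camera preserved). File names are hypothetical.
import cv2
import numpy as np

def harris_feature_count(path: str, threshold: float = 0.01) -> int:
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    gray = np.float32(gray)  # cornerHarris expects a float32 single-channel image
    # blockSize=2, ksize=3, k=0.04 are the commonly used OpenCV parameters
    response = cv2.cornerHarris(gray, 2, 3, 0.04)
    # Count pixels whose corner response exceeds a fraction of the maximum
    return int((response > threshold * response.max()).sum())

count_a = harris_feature_count("camera_a_low_light.png")
count_b = harris_feature_count("camera_b_low_light.png")
print(f"Camera A: {count_a} corner responses, Camera B: {count_b}")
```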

Creating a Great Low-Light Camera

There are three main elements that impact a camera’s low-light sensitivity and capabilities: the lens assembly, the sensor, and the image signal processor.

  • The Lens Assembly: Many wide-angle lenses cause the resulting image to be “squished” at the edges. To counteract this, the multiple sub-lenses forming the assembly need to be crafted in such a way as to produce more “useful pixels” throughout the entire image. Additionally, with respect to low-light operation, the lens assembly must maximize the concentration of light-per-pixel on the image sensor. This is achieved by increasing the aperture (i.e., the opening of the lens, measured as the F# or “F number”) to admit more light. The lower the F#, the better the low-light performance (see the sketch after this list). However, lowering the F# comes at a cost because it increases the complexity of the design and, if not implemented correctly, may impact the quality of the image. A good low-light lens assembly must also provide a crisp image, whose sharpness can be measured as the Modulation Transfer Function (MTF) of the lens.
  • The Image Sensor: This is the component that converts the light from the lens assembly into a digital equivalent that will be processed by the rest of the system. A good low-light camera must use a sensor with high sensitivity and quantum efficiency. Such sensors typically have a large pixel size, which contributes to the light sensitivity of the camera module by capturing more light-per-pixel.
  • The Image Signal Processor: The digital data generated by the image sensor is typically relayed to an Image Signal Processor (ISP). This component (or function in a larger integrated circuit) is tasked with obtaining the best image possible according to the application requirements. The ISP controls the parameters associated with the image sensor, such as the exposure, and also applies its own. The calibration of an ISP is called Image Quality tuning (IQ tuning). This is a complex science that has been mastered by few companies, of which Immervision is one.
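
To put rough numbers on the F# and pixel-size trade-offs described above, the sketch below uses the standard first-order approximation that the light reaching the sensor scales with 1/F#² and that the signal available per pixel scales with pixel area. The specific F-numbers and pixel pitches are illustrative assumptions only, not figures from Immervision.

```python
# Minimal sketch of relative low-light signal, assuming the usual
# first-order model: sensor illuminance ~ 1/F#^2, and signal per pixel
# ~ illuminance * pixel area. All values below are illustrative only.

def relative_signal(f_number: float, pixel_pitch_um: float) -> float:
    illuminance = 1.0 / (f_number ** 2)   # relative light reaching the sensor
    pixel_area = pixel_pitch_um ** 2      # relative light gathered per pixel
    return illuminance * pixel_area

baseline = relative_signal(f_number=2.8, pixel_pitch_um=1.4)
for f_num, pitch in [(2.8, 1.4), (2.0, 1.4), (1.6, 2.0), (1.2, 3.0)]:
    gain = relative_signal(f_num, pitch) / baseline
    print(f"F/{f_num}, {pitch} um pixels -> {gain:4.1f}x the baseline signal")
```

Under this simple model, halving the F# alone roughly quadruples the light per pixel, and a larger pixel pitch multiplies that gain further, which is why the lens aperture and sensor pixel size are considered together when designing for low light.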

What’s Available Now

New advancements in low-light cameras and vision systems are helping expand the scope of applications (e.g., location mapping, visual odometry, and obstacle avoidance) and improve operational capabilities by empowering drones to take off, navigate, and land efficiently in challenging lighting conditions and adverse weather scenarios.

At Immervision, we’re developing advanced vision systems combining optics, image processing, and sensor fusion technology. Blue UAS is the Department of Defense’s (DoD’s) holistic and continuous approach to rapidly prototyping and scaling commercial UAS technology. As part of the Blue UAS program, the Immervision InnovationLab team developed a wide-angle navigation camera called IMVISIO-ML that can operate in extreme low-light environments under 1 lux.

Along with the camera module, an advanced image processing library is available with features such as dewarping, sensor fusion, camera stitching, image stabilization, and more. We also provide IQ tuning services to optimize the performance of the system based on the target application.
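
As a rough illustration of what dewarping involves (using OpenCV’s generic fisheye camera model rather than Immervision’s proprietary library), the sketch below remaps a wide-angle frame to a rectilinear view. The camera matrix, distortion coefficients, and file names are hypothetical placeholders; real values would come from calibrating the actual lens and sensor.

```python
# Generic dewarping sketch using OpenCV's fisheye camera model -- NOT
# Immervision's library. K (camera matrix) and D (distortion coefficients)
# are hypothetical placeholders that would normally come from calibration.
import cv2
import numpy as np

K = np.array([[600.0,   0.0, 960.0],
              [  0.0, 600.0, 540.0],
              [  0.0,   0.0,   1.0]])               # focal lengths, principal point
D = np.array([0.1, -0.05, 0.01, 0.0]).reshape(4, 1)  # fisheye distortion terms

frame = cv2.imread("wide_angle_frame.png")            # hypothetical input frame
undistorted = cv2.fisheye.undistortImage(frame, K, D, Knew=K)
cv2.imwrite("dewarped_frame.png", undistorted)
```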

The IMVISIO-ML low-light navigation camera system is now broadly available to drone and robotics manufacturers. Integrated with the Qualcomm RB5 and ModalAI VOXL2 platforms, this camera module is already being adopted by drone manufacturers such as Teal Drones, which is a leading drone provider from the Blue UAS Cleared list.  As reported here on DroneLife, the newest model of Teal’s Golden Eagle drone will be equipped with two Immervision low-light camera modules, which will improve navigation in low-light conditions and provide stereoscopic vision to Teal’s autonomous pilot system.

Ludimila Centeno has over 15 years of experience in the telecommunications industry, spanning wireless communications and semiconductors. Having contributed to presales, customer support, and operations, she joined Immervision as Associate Vice President, Technology Offer & Support. She holds a Master’s degree in Electrical Engineering, with research in the areas of cognitive radio and spectrum sensing techniques.

Miriam McNabb

Miriam McNabb is the Editor-in-Chief of DRONELIFE and CEO of JobForDrones, a professional drone services marketplace, and a fascinated observer of the emerging drone industry and the regulatory environment for drones. Miriam has penned over 3,000 articles focused on the commercial drone space and is an international speaker and recognized figure in the industry.  Miriam has a degree from the University of Chicago and over 20 years of experience in high tech sales and marketing for new technologies.
For drone industry consulting or writing, Email Miriam.

Twitter: @spaldingbarker

Subscribe to DroneLife here.

