SPIE is working with SAE International to develop lidar measurement standards for active safety systems. This multi-year effort aims to develop standard tests to measure the performance of low-cost lidar sensors developed for autonomous vehicles or advanced driver assistance systems, commonly referred to as automotive lidars. SPIE is sponsoring three years of testing to support this goal. We discuss the second-year test results. In year two, we tested nine models of automotive-grade lidars, using child-size targets at short ranges and larger targets at longer ranges. We also tested the effect of high-reflectivity signs near the targets, laser safety, and atmospheric effects. We observed large point-density and noise dependencies for different types of automotive lidars based on their scanning patterns and fields of view. In addition to measuring point density at a given range, we have begun to evaluate point density in the presence of measurement impediments, such as atmospheric absorption or scattering and highly reflective corner cubes. We saw dynamic range effects in which bright objects, such as road signs with corner cubes embedded in the paint, make it difficult to detect nearby low-reflectivity targets. Furthermore, preliminary testing showed that atmospheric extinction in a water-glycol fog chamber is comparable to natural fog conditions at ranges that are meaningful for automotive lidar, but additional characterization is required before determining general applicability. This testing also showed that laser propagation through water-glycol fog results in appreciable backscatter, which is often ignored in automotive lidar modeling. In year two, we began to measure the effect of impediments on 3D point cloud density; these measurements will be expanded in year three to include interference with other lidars.
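As a rough illustration of the point-density metric discussed above (not the SAE/SPIE test procedure itself), the following Python sketch counts returns on an axis-aligned planar target and normalizes by target area; the target dimensions and depth tolerance are hypothetical.

import numpy as np

def point_density(points, center, half_width, half_height, range_tol=0.5):
    # points: (N, 3) lidar returns (x forward, y left, z up), in meters
    # center: (3,) target center; the target is assumed planar and facing the sensor
    d = points - np.asarray(center, dtype=float)
    on_target = (
        (np.abs(d[:, 0]) <= range_tol)      # within depth tolerance of the target plane
        & (np.abs(d[:, 1]) <= half_width)   # within the target width
        & (np.abs(d[:, 2]) <= half_height)  # within the target height
    )
    area = (2.0 * half_width) * (2.0 * half_height)  # square meters
    return on_target.sum() / area                    # points per square meter

# Example: a child-size target (0.4 m wide, 1.0 m tall) centered 30 m ahead.
rng = np.random.default_rng(0)
cloud = rng.uniform([-1, -5, -1], [60, 5, 3], size=(200000, 3))
print(point_density(cloud, center=[30.0, 0.0, 0.5], half_width=0.2, half_height=0.5))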
Automotive lidar is rapidly becoming a mainstream enabling technology for object detection and localization in advanced driver assistance systems and automated vehicles like robotaxis. In response to this demand, lidar characterization standards and specifications are being developed, with DIN SAE specification 91471 as one of the first published efforts. A core purpose of this presentation is to compare the recommendations in specification 91471 to what automotive lidar manufacturers are publishing and to discuss the differences. We will also make a case for employing component specifications like these in the context of vehicle and perception system level goals.
Snapshot polarimetry relies on the micro-polarizer array (MPA)—a spatial multiplexing of pixel-sized wiregrid analyzers. The limitation of MPA-based polarimetric imaging is the loss of spatial resolution and light received by each pixel. Reconstructing the degree and the angle of linear polarization (DoLP and AoLP, respectively) from MPA accurately requires the joint application of demosaicking (to demodulate the spatial modulation of the wiregrid analyzers) and denoising (to account for photon and thermal noise). We propose a wavelet-based Bayesian estimation technique for jointly demosaicking and denoising 2 × 2 MPA-sampled sensor data.
In 2014, we laid out the theory for how a 2×4 microgrid polarimetric array provides superior spatial resolution when compared to a conventional 2×2 array. In this paper, we provide experimental evidence to support our claims via a prototype 2×4-patterned infrared microbolometer camera developed by Polaris Sensors Technologies. The benefits of the 2×4 array are obtained through a combination of the physical arrangement of the pattern itself and through the application of a log-based framework for reconstructing degree and angle of linear polarization directly, without calculating Stokes parameters or interpolating intensity channels as intermediate steps.
In polarimetric imaging, degree and angle of linear polarization (DoLP and AoLP, respectively) are computed from ratios of Stokes parameters. In snapshot imagers, however, DoLP and AoLP are degraded by inherent mismatches between the spatial bandwidth supports of S0, S1, and S2 parameters reconstructed by demosaicking from microgrid polarizer array (MPA)-sampled data. To overcome this shortcoming, we rigorously show that log-MPA-sampled data approximately decouples DoLP and AoLP from the intensity component (S0) in the spatial Fourier domain. Based on this analysis, we propose an alternative demosaicking strategy aimed at estimating DoLP and AoLP directly from MPA-sampled data. Our method bypasses Stokes parameter estimation, alleviating the spatial bandwidth mismatch problems altogether and reducing the computational complexity. We experimentally verify the superior DoLP and AoLP reconstructions of the proposed log-MPA demosaicking compared to the conventional Stokes parameter demosaicking approach in simulation. We simulated the conventional 2 × 2 MPA patterns as well as the more recently introduced 2 × 4 MPA patterns, and report quantitative (mean squared error, structural similarity index, and polarization angular error) and qualitative results. We also provide a closed-form approximation error analysis on the log-MPA-sampled data to demonstrate that the approximation error is negligible for real practical applications.
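For reference, the conventional baseline that the log-MPA approach is compared against can be sketched as follows: 2 × 2 superpixel Stokes reconstruction followed by the usual DoLP/AoLP ratios. The analyzer layout below is an assumption, and a real demosaicker interpolates each channel rather than binning superpixels.

import numpy as np

def stokes_from_mpa(raw, pattern=((0, 45), (135, 90))):
    # raw: (H, W) MPA-sampled image with even H, W; pattern gives the analyzer
    # angles (degrees) of the repeating 2x2 unit cell (assumed layout).
    chan = {pattern[i][j]: raw[i::2, j::2].astype(float)
            for i in range(2) for j in range(2)}
    s0 = 0.5 * (chan[0] + chan[45] + chan[90] + chan[135])  # total intensity
    s1 = chan[0] - chan[90]
    s2 = chan[45] - chan[135]
    return s0, s1, s2

def dolp_aolp(s0, s1, s2, eps=1e-12):
    # Ratios of Stokes parameters; this is where the bandwidth mismatch noted
    # above degrades DoLP and AoLP in the conventional pipeline.
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, eps)
    aolp = 0.5 * np.arctan2(s2, s1)  # radians
    return dolp, aolp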
Random, time-varying blur due to atmospheric turbulence is known to significantly limit image quality. In division-of-time polarimeters, turbulence also randomly distorts the images that are used to infer the polarization content of the scene, leading to substantial Stokes image estimation errors. This research proposes a method to jointly estimate the full polarization properties of a scene along with the point spread functions (PSFs) for each imaging channel in the presence of isoplanatic turbulence. In particular, this research significantly expands on an existing algorithm, to include circular polarization in the framework of a generalized expectation maximization approach. The effectiveness of the approach is demonstrated on laboratory data using a surrogate phase screen placed near the entrance pupil of an imaging system.
In the framework of NATO task group SET 226 on turbulence mitigation techniques for OA systems, a trial was conducted on the premises of RDDC-Valcartier, using indoor and outdoor facilities, in September 2016. Image data sets were collected under various turbulence conditions, both controllable (indoor) and natural (outdoor). The imagery from this trial was used in the Grand Challenge, in which different experts were asked to process identical input data with state-of-the-art algorithms. The trial also provided a database to validate theoretical and numerical models. The paper gives an overview of the experimental set-up (targets, sensors, turbulence screen generators…) and presents some preliminary results obtained with the collected data in terms of the effectiveness of image processing techniques, new methods for turbulence characterisation, and modelling of laser beam propagation.
The Python Based Sensor Model (pyBSM) provides open source functions for modeling electro-optical and infrared imaging systems. In this paper, we validate pyBSM predictions against laboratory measurements. Compared quantities include modulation transfer function, photoelectron count, and signal-to-noise ratio. Experiments are explained and code is provided with the details required to recreate this study for additional camera and lens combinations.
Computational efficiency and accuracy of wave-optics-based Monte Carlo and brightness function numerical simulation techniques for incoherent imaging of extended objects through atmospheric turbulence are evaluated. Simulation results are compared with theoretical estimates based on known analytical solutions for the modulation transfer function of an imaging system and the long-exposure image of a Gaussian-shaped incoherent light source. It is shown that the accuracy of the two techniques is comparable over a wide range of path lengths and atmospheric turbulence conditions, whereas the brightness function technique is advantageous in terms of computational speed.
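One analytical reference such simulations are commonly checked against is the long-exposure atmospheric MTF attributed to Fried. A minimal sketch of that standard formula (not the authors' simulation code) is:

import numpy as np

def long_exposure_atm_mtf(xi, wavelength, focal_length, r0):
    # xi: spatial frequency at the focal plane (cycles/m); r0: Fried parameter (m).
    # Long-exposure turbulence MTF: exp(-3.44 * (lambda * f * xi / r0)**(5/3)).
    arg = wavelength * focal_length * np.asarray(xi, dtype=float) / r0
    return np.exp(-3.44 * arg ** (5.0 / 3.0))

# Example: 1 um wavelength, 1 m focal length, r0 = 5 cm, 10 cycles/mm.
print(long_exposure_atm_mtf(10e3, 1e-6, 1.0, 0.05))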
There are components that are common to all electro-optical and infrared imaging system performance models. The purpose of the Python Based Sensor Model (pyBSM) is to provide open source access to these functions for other researchers to build upon. Specifically, pyBSM implements much of the capability found in the ERIM Image Based Sensor Model (IBSM) V2.0 along with some improvements. The paper also includes two use-case examples. First, performance of an airborne imaging system is modeled using the General Image Quality Equation (GIQE). The results are then decomposed into factors affecting noise and resolution. Second, pyBSM is paired with OpenCV to evaluate performance of an algorithm used to detect objects in an image.
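As one concrete example of the shared components such models implement, the diffraction-limited MTF of a circular aperture follows directly from the pupil autocorrelation. The sketch below uses that standard closed form; it is not the pyBSM API itself.

import numpy as np

def circular_aperture_mtf(xi, wavelength, aperture_diam, focal_length):
    # xi: spatial frequency at the focal plane (cycles/m).
    # Cutoff frequency of a circular pupil: D / (lambda * f).
    cutoff = aperture_diam / (wavelength * focal_length)
    rho = np.clip(np.asarray(xi, dtype=float) / cutoff, 0.0, 1.0)
    mtf = (2.0 / np.pi) * (np.arccos(rho) - rho * np.sqrt(1.0 - rho**2))
    return np.where(np.asarray(xi, dtype=float) <= cutoff, mtf, 0.0)

# Example: 0.25 m aperture, 1 m focal length, 4 um wavelength, 20 cycles/mm.
print(circular_aperture_mtf(20e3, 4e-6, 0.25, 1.0))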
We describe a numerical wave propagation method for simulating long-range imaging of an extended scene under anisoplanatic conditions. Our approach computes an array of point spread functions (PSFs) for a 2D grid on the object plane. The PSFs are then used in a spatially varying weighted sum operation, with an ideal image, to produce a simulated image with realistic optical turbulence degradation. To validate the simulation, we compare simulated outputs with the theoretical anisoplanatic tilt correlation and differential tilt variance, in addition to comparing the long- and short-exposure PSFs and the isoplanatic angle. Our validation analysis shows an excellent match between the simulation statistics and the theoretical predictions. The simulation tool is also used here to quantitatively evaluate a recently proposed block-matching and Wiener filtering (BMWF) method for turbulence mitigation. In this method, a block-matching registration algorithm is used to provide geometric correction for each of the individual input frames. The registered frames are then averaged and processed with a Wiener filter for restoration. A novel aspect of the proposed BMWF method is that the PSF model used for restoration takes into account the level of geometric correction achieved during image registration. This way, the Wiener filter is able to fully exploit the reduced blurring achieved by registration. The BMWF method is computationally simple and yet has excellent performance in comparison to state-of-the-art benchmark methods.
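To make the restoration step concrete, here is a minimal frequency-domain Wiener filter of the registered-and-averaged frame, with a hypothetical PSF and scalar noise-to-signal ratio; the BMWF paper's PSF model, which accounts for the level of geometric correction achieved, is not reproduced here.

import numpy as np

def wiener_restore(avg_frame, psf, nsr=0.01):
    # avg_frame: (H, W) average of the registered frames; psf: (h, w) assumed blur kernel.
    H, W = avg_frame.shape
    psf_pad = np.zeros((H, W))
    psf_pad[:psf.shape[0], :psf.shape[1]] = psf / psf.sum()
    # Center the kernel at the origin so the restoration is not shifted.
    psf_pad = np.roll(psf_pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    otf = np.fft.fft2(psf_pad)
    wiener = np.conj(otf) / (np.abs(otf) ** 2 + nsr)  # classical Wiener filter
    return np.real(np.fft.ifft2(wiener * np.fft.fft2(avg_frame)))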
We present a numerical wave propagation method for simulating imaging of an extended scene under anisoplanatic conditions. While isoplanatic simulation is relatively common, few tools are specifically designed for simulating the imaging of extended scenes under anisoplanatic conditions. We provide a complete description of the proposed simulation tool, including the wave propagation method used. Our approach computes an array of point spread functions (PSFs) for a two-dimensional grid on the object plane. The PSFs are then used in a spatially varying weighted sum operation, with an ideal image, to produce a simulated image with realistic optical turbulence degradation. The degradation includes spatially varying warping and blurring. To produce the PSF array, we generate a series of extended phase screens. Simulated point sources are numerically propagated from an array of positions on the object plane, through the phase screens, and ultimately to the focal plane of the simulated camera. Note that the optical path for each PSF is different and thus passes through a different portion of the extended phase screens. These different paths give rise to a spatially varying PSF that produces anisoplanatic effects. We use a method for defining the individual phase screen statistics that we have not seen used in previous anisoplanatic simulations. We also present a validation analysis. In particular, we compare simulated outputs with the theoretical anisoplanatic tilt correlation and a derived differential tilt variance statistic. This is in addition to comparing the long- and short-exposure PSFs and the isoplanatic angle. We believe this analysis represents the most thorough validation of an anisoplanatic simulation to date. The current work is also unique in that we simulate and validate both constant and varying Cn^2(z) profiles. Furthermore, we simulate sequences with both temporally independent and temporally correlated turbulence effects. Temporal correlation is introduced by generating even larger extended phase screens and translating this block of screens in front of the propagation area. Our validation analysis shows an excellent match between the simulation statistics and the theoretical predictions. Thus, we think this tool can be used effectively to study anisoplanatic optical turbulence and to aid in the development of image restoration methods.
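The spatially varying weighted-sum step can be sketched as follows; the Gaussian blending weights and grid spacing below are illustrative choices, not necessarily the interpolation used in the paper.

import numpy as np
from scipy.signal import fftconvolve

def apply_psf_grid(ideal, psfs, grid_yx, sigma=64.0):
    # ideal: (H, W) pristine image; psfs: list of (h, w) kernels for grid points grid_yx.
    H, W = ideal.shape
    rows, cols = np.mgrid[0:H, 0:W]
    out = np.zeros((H, W))
    wsum = np.zeros((H, W))
    for psf, (gy, gx) in zip(psfs, grid_yx):
        blurred = fftconvolve(ideal, psf / psf.sum(), mode="same")
        w = np.exp(-((rows - gy) ** 2 + (cols - gx) ** 2) / (2.0 * sigma**2))
        out += w * blurred   # each region is dominated by its nearest grid PSF
        wsum += w
    return out / np.maximum(wsum, 1e-12)

Convolving the full image once per grid PSF is wasteful but keeps the sketch short; a practical implementation would operate on local blocks around each grid point.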
Computational efficiency and accuracy of wave-optics-based Monte Carlo and brightness function numerical simulation techniques for incoherent imaging through atmospheric turbulence are evaluated. Simulation results are compared with theoretical estimates based on known analytical solutions for the modulation transfer function of an imaging system and the long-exposure image of a Gaussian-shaped incoherent light source.
Differential tilt variance is a useful metric for interpreting the distorting effects of turbulence in incoherent imaging systems. In this paper, we compare the theoretical model of differential tilt variance to simulations. Simulation is based on a Monte Carlo wave optics approach with split step propagation. Results show that the simulation closely matches theory. The results also show that care must be taken when selecting a method to estimate tilts.
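As an example of one common tilt estimator, the intensity centroid of a short-exposure spot, here is a minimal sketch; the paper's observation that the choice of estimator matters applies directly to this step.

import numpy as np

def centroid_tilt(spot, pixel_pitch, focal_length):
    # spot: (H, W) short-exposure image of a point source.
    # Returns (tilt_y, tilt_x) in radians from the centroid displacement.
    spot = np.asarray(spot, dtype=float)
    rows, cols = np.mgrid[0:spot.shape[0], 0:spot.shape[1]]
    total = spot.sum()
    cy = (rows * spot).sum() / total
    cx = (cols * spot).sum() / total
    dy = (cy - (spot.shape[0] - 1) / 2.0) * pixel_pitch  # meters on the focal plane
    dx = (cx - (spot.shape[1] - 1) / 2.0) * pixel_pitch
    return dy / focal_length, dx / focal_length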
Uncorrected or poorly corrected bad pixels reduce the effectiveness of polarimetric clutter suppression. In conventional microgrid processing, bad pixel correction is accomplished as a separate step from Stokes image reconstruction. Here, these two steps are combined to speed processing and provide better estimates of the entire image, including missing samples. A variation on the bilateral filter enables both edge preservation in the Stokes imagery and bad pixel suppression. Understanding the newly presented filter requires two key insights. First, the adaptive nature of the bilateral filter is extended to correct for bad pixels by simply incorporating a bad pixel mask. Second, the bilateral filter for Stokes estimation is the sum of the normalized bilateral filters for estimating each analyzer channel individually. This paper describes the new approach and compares it to our legacy method using simulated imagery.
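A minimal sketch of the masked-bilateral idea, applied to a single analyzer channel with brute-force loops, is shown below; per the paper, the Stokes-domain filter is the sum of such normalized per-channel filters. The fallback used for a bad center pixel is an assumption of this sketch.

import numpy as np

def masked_bilateral(img, good, sigma_s=1.5, sigma_r=0.1, radius=3):
    # img: (H, W) analyzer-channel image; good: boolean mask, False at bad pixels.
    H, W = img.shape
    pad = radius
    imgp = np.pad(img.astype(float), pad, mode="reflect")
    goodp = np.pad(good, pad, mode="reflect")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2.0 * sigma_s**2))
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            win = imgp[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
            ok = goodp[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
            if good[i, j]:
                center = win[pad, pad]
            else:
                # Assumed fallback: use the median of valid neighbors as the range reference.
                center = np.median(win[ok]) if ok.any() else 0.0
            w = spatial * np.exp(-(win - center) ** 2 / (2.0 * sigma_r**2)) * ok
            out[i, j] = (w * win).sum() / max(w.sum(), 1e-12)
    return out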
The reflective bands in modern imaging, i.e., the visible through the short wave infrared (SWIR), have become very attractive for use in both daytime and low-light target acquisition and surveillance. In addition, the nature of the target in modern conflict again includes the human body as a principal target. The spectral natures of the reflectivities of humans, their clothing, what they may be carrying, and the environments in which they are immersed, along with the spectral nature and strength of the light sources that illuminate them, have been the essential components of the contrasts in the signatures that are used in models that predict probabilities of target acquisition and discrimination. What has been missing is the impact that polarization in these signatures can have on image contrast. This paper documents a preliminary investigation into the contrast in active and passive polarimetric signatures of humans holding two-handed objects in the SWIR.
Microgrid polarimetric imagers sacrifice spatial resolution for sensitivity to states of linear polarization. We have recently shown that a 2 × 4 microgrid analyzer pattern sacrifices less spatial resolution than the conventional 2 × 2 case without compromising polarization sensitivity. In this paper, we discuss the design strategy that uncovered the spatial resolution benefits of the 2 × 4 array.
Nighttime active SWIR imaging has resolution, size, weight, and power consumption advantages over passive MWIR and LWIR imagers for applications involving target identification. We propose that the target discrimination capability of active SWIR systems can be extended further by exerting polarization control over the illumination source and imager, i.e. through active polarization imaging. In this work, we construct a partial Mueller matrix imager and use laboratory derived signatures to uniquely identify target materials in outdoor scenes. This paper includes a description of the camera and laser systems as well as discussion of the reduction and analysis techniques used for material identification.
An aerosol modulation transfer function (MTF) model is developed to assess the impact of aerosol scattering on passive long-range imaging sensors. The methodology extends from previous work to explicitly address imaging scenarios with a nonuniform distribution of scattering characteristics over the propagation path and incorporates the moderate resolution transfer code database of aerosol cross-section and phase function characteristics in order to provide an empirical foundation for realistic quantitative MTF assessments. The resulting model is compared with both predictions from a Monte-Carlo scattering simulation and a scene-derived MTF estimate from an empirical image, with reasonable agreement in both cases. Application to long-range imaging situations at both visible and infrared wavelengths indicates that the magnitude and functional form of the aerosol MTF differ significantly from other contributors to the composite system MTF. Furthermore, the image-quality impact is largely radiometric in the sense that the contrast reduction is approximately independent of spatial frequency, and image blur is practically negligible.
Pixel-to-pixel response nonuniformity is a common problem that affects nearly all focal plane array sensors. This results in a frame-to-frame fixed pattern noise (FPN) that causes an overall degradation in collected data. FPN is often compensated for through the use of blackbody calibration procedures; however, FPN is a particularly challenging problem because the detector responsivities drift relative to one another in time, requiring that the sensor be recalibrated periodically. The calibration process is obstructive to sensor operation and is therefore only performed at discrete intervals in time. Thus, any drift that occurs between calibrations (along with error in the calibration sources themselves) causes varying levels of residual calibration error to be present in the data at all times. Polarimetric microgrid sensors are particularly sensitive to FPN due to the spatial differencing involved in estimating the Stokes vector images. While many techniques exist in the literature to estimate FPN for conventional video sensors, few have been proposed to address the problem in microgrid imaging sensors. Here we present a scene-based nonuniformity correction technique for microgrid sensors that is able to reduce residual fixed pattern noise while preserving radiometry under a wide range of conditions. The algorithm requires a low number of temporal data samples to estimate the spatial nonuniformity and is computationally efficient. We demonstrate the algorithm's performance using real data from the AFRL PIRATE and University of Arizona LWIR microgrid sensors.
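For context, a classic scene-based baseline (the constant-statistics correction, not the microgrid-specific algorithm presented in the paper) can be sketched as follows.

import numpy as np

def constant_statistics_nuc(frames):
    # frames: (T, H, W) sequence with sufficient scene motion, so that each pixel
    # sees similar scene statistics over time.
    # Returns per-pixel gain and offset such that corrected = gain * raw + offset.
    mu = frames.mean(axis=0)              # per-pixel temporal mean
    sigma = frames.std(axis=0) + 1e-12    # per-pixel temporal standard deviation
    gain = sigma.mean() / sigma           # equalize responsivity
    offset = mu.mean() - gain * mu        # equalize bias
    return gain, offset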
Image quality in high-altitude long-range imaging systems can be severely limited by atmospheric absorption, scattering, and turbulence. Atmospheric aerosols contribute to this problem by scattering target signal out of the optical path and by scattering in unwanted light from the surroundings. Target signal scattering may also lead to image blurring, though in conventional modeling this effect is ignored. The validity of this choice is tested in this paper by developing an aerosol modulation transfer function (MTF) model for an inhomogeneous atmosphere and then applying it to real-world scenarios using MODTRAN-derived scattering parameters. The resulting calculations show that aerosol blurring can be effectively ignored.
The LWIR microgrid Polarized InfraRed Advanced Tactical Experiment (PIRATE) sensor was used to image several types of RC model aircraft at varying ranges and speeds under different background conditions. The data were calibrated and preprocessed using recently developed microgrid processing algorithms prior to estimation of the thermal (s0) and polarimetric (s1 and s2) Stokes vector images. The data were then analyzed to assess the utility of polarimetric information when the thermal s0 data is augmented with s1 and s2 information for several model aircraft detection and tracking scenarios. Multi-variate analysis tools were applied in conjunction with multi-hypothesis detection schemes to assess detection performance of the aircraft under different background clutter conditions. We find that polarization is able to improve detection performance when compared with the corresponding thermal data in nearly all cases. A tracking algorithm was applied to a sequence of s0 and corresponding degree of linear polarization (DoLP) images. An initial assessment was performed to determine whether polarization information can provide additional utility in these tracking scenarios.
We present a comparative study involving five distinctly different polarimetric imaging platforms that are designed to record calibrated Stokes images (and associated polarimetric products) in either the MidIR or LWIR spectral regions. The data set used in this study was recorded during April 14-18, 2008, at the Russell Tower Measurement Facility, Redstone Arsenal, Huntsville, AL. Four of the five camera systems were designed to operate in the LWIR (approx. 8-12 μm) and used either cooled mercury cadmium telluride (MCT) focal-plane arrays (FPA) or a near-room-temperature microbolometer. The lone MidIR polarimetric sensor was based on a liquid nitrogen (LN2) cooled indium antimonide (InSb) FPA, resulting in an approximate wavelength response of 3-5 μm. The selection of cameras comprised the following optical designs: a LWIR "super-pixel," or division-of-focal-plane (DoFP), sensor; two LWIR spinning-achromatic-retarder (SAR) based sensors; one LWIR division-of-amplitude (DoAM) sensor; and one MidIR division-of-aperture (DoA) sensor. Cross-sensor comparisons were conducted by examining calibrated Stokes images (e.g., S0, S1, S2, and degree-of-linear polarization (DOLP)) recorded by each sensor for a given target at approximately the same test periods to ensure that data sets were recorded under similar atmospheric conditions. Target detections are applied to the image set for each polarimetric sensor for further comparison, i.e., conventional receiver operating characteristic (ROC) curve analysis and an effective contrast ratio are considered.
Polarimetric sensors are valued for their capability to distinguish man-made objects from surrounding clutter. The SPITFIRE (Spectral Polarimetric Imaging Test Field InstRumEnt) polarimetric camera is designed to function in multiple bands in the Short Wave Infrared (SWIR) and Mid-Wave Infrared (MWIR) regions. SPITFIRE is a Stokes micro-grid polarimetric system with a 4-band spectral filter wheel. The focal plane array (FPA) as well as the filter wheel are located in a Dewar which is cooled via liquid nitrogen. By cooling the band-pass filter to the same temperature as the FPA, self-emission noise is decreased. In this paper we discuss the design and fabrication of the polarimetric camera (optics, Dewar, filter wheel and FPA), the data capture and processing system, initial characterization of the camera's performance, and future plans for the camera.