Filter-Based Instrument

../_images/superpixel_filter_scheme.png

Filter-based instruments are imaging systems that place a wavelength filter in the optical path. The filter may be a regular optical filter or a tuneable filter such as an AOTF. Pixelated areas on the back-end detection plane have a one-to-one correspondence with incoming directions at the front end. An incoming signal from direction \(\Omega\) at the front end contributes to every pixelated area that overlaps its spatial point spread function on the detection plane. All wavelengths that pass through the wavelength filter contribute to the pixelated area.

The spatial point spread function of an incoming direction, \(\Omega\), is determined by the optical properties of the system and is often a slowly varying function of wavelength and incoming direction.

../_images/superpixel_psf.png

Fig. 6 All incoming directions, \(\Omega\), whose point spread function overlaps with the target pixel will contribute to that pixel. All wavelengths within the SRF contribute to the pixel signal.

Background Theory

We outline the theory used to calculate photon signals incident upon finite-sized, pixelated areas in the back-end detection plane. First we work out the signal contribution at a given wavelength to an infinitesimal section of the pixelated area from the PSF integrated over all directions. The signal is then integrated across all infinitesimal areas in the pixelated area and finally integrated across all wavelengths.

PSF

The PSF describes how a signal at wavelength \(\lambda\), incident from direction \(\Omega\), is distributed across the detection plane. The PSF is typically Gaussian-like. The PSF can be simplified if it is symmetric, as it then becomes a function of only the radial distance, \(r\), from the PSF centre on the detection plane.

\[\mathrm{PSF}(\lambda, \Omega, y, z) \rightarrow \mathrm{PSF}(\lambda, \Omega, r)\]

The area under the PSF is normalized to unity to guarantee that photons incident at the front-end optics are distributed somewhere across the detection plane. This is refined in the integrals below by adding a vignetting term, \(V(\lambda,\Omega,y,z)\), which describes loss terms that may occur within the optical train.

\[\int_y\int_z \mathrm{PSF}(\lambda, \Omega, y, z)\: \mathrm{d}y\,\mathrm{d}z = 1\]

We note for convenience the following definite integral which is useful when using radial Gaussian point spread functions,

\[\int_0^\infty r \mathrm{e}^{-r^2}\,\mathrm{d}r = \frac{1}{2}\]
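As a sketch, both the definite integral above and the unit normalization of a radially symmetric Gaussian PSF can be checked numerically. The PSF below and its width are illustrative assumptions, not values from the text; the plane integral is evaluated as \(2\pi\int_0^{r_{max}} \mathrm{PSF}(r)\,r\,\mathrm{d}r\) with a midpoint rule.

```python
import math

# Hypothetical radially symmetric Gaussian PSF of width sigma (cm),
# scaled so that its integral over the detection plane is unity.
def psf(r, sigma=0.1):
    return math.exp(-r * r / (2.0 * sigma * sigma)) / (2.0 * math.pi * sigma * sigma)

def radial_integral(f, r_max, n=100_000):
    """Approximate 2*pi * int_0^r_max f(r) r dr with the midpoint rule."""
    dr = r_max / n
    total = 0.0
    for i in range(n):
        r = (i + 0.5) * dr
        total += f(r) * r * dr
    return 2.0 * math.pi * total

# The stated definite integral: int_0^inf r exp(-r^2) dr -> 1/2
half = radial_integral(lambda r: math.exp(-r * r), 10.0) / (2.0 * math.pi)

# PSF normalization over the plane (r_max chosen >> sigma) -> 1
area = radial_integral(psf, 2.0)
```

With `r_max` many times larger than `sigma`, the truncated integral is indistinguishable from the infinite one, which is why the normalization check converges to 1.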

SRF

The spectral response function describes the sensitivity of the optical system to different wavelengths. It is usually a function of wavelength, \(\lambda\), and direction, \(\Omega\). The SRF should be chosen so that it lies between 0 and 1.

\[0.0 \leq \mathrm{SRF}(\lambda,\Omega) \leq 1.0\]
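As an illustration, a Gaussian band-pass SRF with unit peak satisfies this bound by construction. The centre wavelength and width below are example values, not taken from the text.

```python
import math

# Illustrative SRF: Gaussian band-pass centred at lambda0 (nm) with the
# given full width at half maximum (nm). Peak value is 1, so the SRF
# lies between 0 and 1 for every wavelength.
def srf(lam, lambda0=550.0, fwhm=10.0):
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return math.exp(-((lam - lambda0) ** 2) / (2.0 * sigma * sigma))

peak = srf(550.0)   # unity at the band centre
wing = srf(600.0)   # negligible far outside the band
```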

Steps

Step 1): Calculate the PSF contribution to an infinitesimal area at the target location \((y,z)\) and apply vignetting. The signal contribution from all rays at wavelength \(\lambda\) to an infinitesimal area on the detection plane at location \((y,z)\) is given by integrating over all incoming directions, \(\Omega\).

\[e(y,z,\lambda)\Delta\lambda\,\Delta y\,\Delta z = \left[\int_{\Omega} I(\lambda, \Omega)\,V(\lambda,\Omega,y,z)\,\mathrm{PSF}(\lambda,\Omega,y,z)\,\mathrm{d}\Omega\right]\:\Delta\lambda\,\Delta y\,\Delta z\]

where, \(I(\lambda,\Omega)\) is the incident radiance at wavelength \(\lambda\) in direction \(\Omega\) and \(V(\lambda,\Omega,y,z)\) is the vignetting of the signal within the optical train.
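Step 1 can be sketched numerically under simplifying assumptions: incoming directions are parameterised by the ideal image point \((y_c, z_c)\) they map to on the detection plane, the radiance is uniform, vignetting is unity, and the PSF is a radial Gaussian. All numerical values are hypothetical.

```python
import math

SIGMA = 0.05     # assumed PSF width on the detection plane, cm
I_UNIFORM = 1.0  # assumed uniform radiance, photons/cm2/nm/sec/ster

def psf(dy, dz):
    """Radial Gaussian PSF, normalized to unit area over the plane."""
    r2 = dy * dy + dz * dz
    return math.exp(-r2 / (2.0 * SIGMA * SIGMA)) / (2.0 * math.pi * SIGMA * SIGMA)

def e_point(y, z, half_width=0.5, n=200):
    """Differential signal e(y, z, lambda): midpoint-rule integral of
    I * V * PSF over incoming directions, here represented by their
    ideal image points on a square patch of the plane (V = 1)."""
    d = 2.0 * half_width / n
    total = 0.0
    for i in range(n):
        yc = -half_width + (i + 0.5) * d
        for j in range(n):
            zc = -half_width + (j + 0.5) * d
            total += I_UNIFORM * 1.0 * psf(y - yc, z - zc) * d * d
    return total

# Deep inside a uniformly illuminated region the PSF integrates to 1,
# so the differential signal recovers the radiance itself.
centre = e_point(0.0, 0.0)
```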

Step 2): Integrate PSF contribution across all locations in target pixel. The differential signal can be integrated across the entirety of the pixelated area,

\[e(\lambda)\:\Delta\lambda = \left[\int_{(y_0,z_0)}^{(y_1,z_1)}\left[\int_{\Omega}I(\lambda,\Omega)\,V(\lambda,\Omega,y,z)\,\mathrm{PSF}(\lambda, \Omega, y,z)\,\mathrm{d}\Omega\right]\mathrm{d}y\,\mathrm{d}z\right]\:\Delta\lambda\]

Step 3): Apply the SRF contribution to the target pixel. The differential signal can be integrated across all wavelengths to give the irradiance incident upon the pixelated area,

\[E(y,z) = \int_\lambda \mathrm{SRF}(\lambda,\Omega) \left[\int_{(y_0,z_0)}^{(y_1,z_1)}\left[\int_{\Omega}I(\lambda,\Omega)\,V(\lambda,\Omega,y,z)\,\mathrm{PSF}(\lambda,\Omega,y,z)\,\mathrm{d}\Omega\right]\mathrm{d}y\,\mathrm{d}z\right]\:\mathrm{d}\lambda\]

or


\[E(y,z) = \int_\lambda\int_{(y_0,z_0)}^{(y_1,z_1)}\int_{\Omega}I(\lambda,\Omega)\,\mathrm{SRF}(\lambda,\Omega)\,V(\lambda,\Omega,y,z)\,\mathrm{PSF}(\lambda,\Omega,y,z)\,\mathrm{d}\Omega\,\mathrm{d}y\,\mathrm{d}z\:\mathrm{d}\lambda\]

where \(E(y,z)\) is the total irradiance incident upon the pixelated area indexed by \((y,z)\) due to the signal incident upon a unit area of the front aperture. The units of the irradiance are photons/cm2/sec, where the cm2 refers to the unit area at the front aperture and not to the area of the pixel.
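The combined equation can be sketched end-to-end with nested midpoint sums, under the same toy assumptions as before: uniform radiance, no vignetting (\(V = 1\)), a radial Gaussian PSF, and a top-hat SRF over 540 to 560 nm. Every numerical value is illustrative only.

```python
import math

SIGMA = 0.05   # assumed PSF width, cm
I0 = 1.0       # assumed uniform radiance, photons/cm2/nm/sec/ster

def psf(dy, dz):
    r2 = dy * dy + dz * dz
    return math.exp(-r2 / (2.0 * SIGMA * SIGMA)) / (2.0 * math.pi * SIGMA * SIGMA)

def srf(lam):
    """Top-hat spectral response over a 20 nm band (assumed)."""
    return 1.0 if 540.0 <= lam <= 560.0 else 0.0

def pixel_irradiance(y0, y1, z0, z1, lam_grid, n_pix=4, n_dir=50, hw=0.5):
    """E(pixel): integrate SRF * I * V * PSF over incoming directions,
    the pixel area (y0..y1, z0..z1) and wavelength with midpoint sums.
    Directions are again represented by their ideal image points."""
    dlam = lam_grid[1] - lam_grid[0]
    dy = (y1 - y0) / n_pix
    dz = (z1 - z0) / n_pix
    dd = 2.0 * hw / n_dir
    E = 0.0
    for lam in lam_grid:
        w = srf(lam)
        if w == 0.0:
            continue
        for i in range(n_pix):
            y = y0 + (i + 0.5) * dy
            for j in range(n_pix):
                z = z0 + (j + 0.5) * dz
                acc = 0.0                     # inner Omega integral
                for a in range(n_dir):
                    yc = -hw + (a + 0.5) * dd
                    for b in range(n_dir):
                        zc = -hw + (b + 0.5) * dd
                        acc += I0 * psf(y - yc, z - zc) * dd * dd
                E += w * acc * dy * dz * dlam
    return E

# 0.1 cm x 0.1 cm pixel at the centre; wavelength midpoints of the band
lams = [542.5 + 5.0 * k for k in range(4)]
E = pixel_irradiance(-0.05, 0.05, -0.05, 0.05, lams)
```

For this uniform scene the inner direction integral is 1 everywhere inside the pixel, so the result reduces to pixel area times radiance times SRF bandwidth, a useful sanity check on the quadrature.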

Nomenclature

Signal                      Units                      Symbol
--------------------------  -------------------------  ----------------
radiance                    photons/cm2/nm/sec/ster    \(I\)
irradiance                  photons/cm2/sec            \(E\)
photon flux                 photons/sec                \(P\)
vignetting                  fraction                   \(V\)
spectral response function  fraction                   \(\mathrm{SRF}\)
point spread function       fraction per cm2           \(\mathrm{PSF}\)