Okay, if we are derailing the thread, I would like to mention a project I did sometime in the early 1980s. The company I worked for had just acquired an imaging infrared camera (FLIR, or Forward-Looking Infrared) with two liquid-nitrogen-cooled detectors, one "tuned" to 3 to 6 μm and the other to 8 to 12 μm wavelengths. These are the two atmospheric "windows" that more or less easily transmit infrared radiation with little attenuation from the water vapor and aerosols naturally present in air. The camera used miniature galvanometers to scan the focal plane of a germanium lens in two directions across each sensor. Being mechanical (and resonant to boot!), the horizontal scan ran at about half the normal NTSC monochrome line rate. To compensate for this disparity, the camera digitized the sensor output and stored the results in a FIFO (First In, First Out) buffer. It did this for both directions of the horizontal scan (right to left and left to right). Being resonant, the scan was sinusoidal, so without correction the video image would have been quite distorted if samples were taken at uniform intervals in time. IIRC, the A/D samples were controlled by an oscillator that ran at some multiple of the line rate to produce about 512 samples per scan line, but the samples were taken at uniformly spaced amplitude increments of the original sinusoidal drive signal. Voila! Nonlinear sweep, linearly spaced samples!
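The sampling trick above can be sketched in a few lines of Python. This is my own illustration, not the camera's actual circuit; the amplitude, frequency, and sample count are illustrative stand-ins. The idea is that the mirror position follows x(t) = A·sin(2πft), so picking the sample instants where the drive signal crosses uniformly spaced amplitude levels puts the samples at equal spatial increments across the scan:

```python
import numpy as np

A = 1.0      # scan amplitude (half the field width, arbitrary units)
f = 3932.0   # resonant scan frequency in Hz (illustrative value only)
N = 512      # samples per sweep, matching the ~512 samples per line

# Uniformly spaced positions across the scan (stay clear of the exact
# turnaround points, where dx/dt = 0 and the dwell time diverges).
x = np.linspace(-0.95 * A, 0.95 * A, N)

# Invert x = A*sin(2*pi*f*t) to get the sample instants for one sweep:
t = np.arcsin(x / A) / (2 * np.pi * f)

# Positions reconstructed from those instants come out uniformly spaced...
x_check = A * np.sin(2 * np.pi * f * t)
assert np.allclose(np.diff(x_check), np.diff(x_check)[0])

# ...while the time intervals are anything but uniform: they stretch
# near the scan edges, where the resonant mirror slows down.
dt = np.diff(t)
print(f"dt at center: {dt[N // 2]:.3e} s, dt at edge: {dt[0]:.3e} s")
```

The hardware did the equivalent with comparators and a counter rather than an arcsine, but the geometry is the same: uniform amplitude steps on a sinusoid translate to uniform pixel spacing.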
Now all of this was marvelous enough, and we all had a fun time pointing the camera around and taking pictures (using a rack-mounted digital frame grabber about the size of a large computer monitor) and discovering how infrared imaging worked in the "real world". But to see both wavelength bands you had to flip a switch, because the camera would only display one or the other at its NTSC composite video output. The powers-that-be wanted to display both bands simultaneously on a TV monitor and capture the superimposed images on the frame grabber. After examining the circuitry for a while, I realized that the FIFO offered a solution. To accommodate NTSC scan rates, the FIFO was actually read out twice for each right-to-left or left-to-right horizontal scan. All I had to do was take the video, derived with D/As from the FIFO outputs (one from each of the two sensors), and rapidly switch between them at the TV line rate. Fortunately for me, I found an integrated-circuit video switch that would operate at the TV line rate. Along with a couple of TTL "glue" chips, the whole circuit, including BNC connectors, fit inside a small Bud die-cast aluminum box that mounted to the back of the camera electronics box.
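In software terms, the line-rate switching amounts to interleaving alternate scan lines from the two bands into one frame. Here is a minimal sketch of that interleave (my own construction; the function name and the stand-in frame values are mine, not the original circuit's):

```python
import numpy as np

def interleave_bands(band_a, band_b):
    """Alternate scan lines from two equally sized frames, the way a
    video switch toggled at the TV line rate would combine them."""
    assert band_a.shape == band_b.shape
    fused = band_a.copy()
    fused[1::2] = band_b[1::2]   # odd lines come from the second band
    return fused

mwir = np.full((8, 8), 100, dtype=np.uint8)   # stand-in 3-6 um frame
lwir = np.full((8, 8), 200, dtype=np.uint8)   # stand-in 8-12 um frame
frame = interleave_bands(mwir, lwir)
print(frame[:2, 0])   # -> [100 200]
```

Because each FIFO was already being read out twice per mechanical scan, no data was lost by giving alternate line readouts to alternate bands; the monitor's own interlace and the eye's integration did the "fusion".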
When I fired it up, the dual-band imaging worked as advertised and the powers-that-be were suitably impressed. However, I wanted to go further and adapt a CCD visible-light camera to coaxially view the same scene as the infrared camera and display the combined (and suitably scaled) images (visible plus two infrared) on a single monitor. Well, it turned out that this was already being done for remote reconnaissance and heads-up targeting systems, but using very expensive staring infrared arrays instead of the crude mechanically scanned version we had. The technology was called sensor fusion and included visible, infrared, and radar. At that time it was all highly classified, and I didn't yet have the clearances or the need-to-know to access information about that technology. So when I proposed adding the CCD camera to the mix, the powers-that-be (who did have the appropriate "tickets" and knew about sensor fusion) got strangely quiet and then quickly told me, "Never mind about that. This FLIR modification is all we need."
Unfortunately I don't remember the part number for the high-speed video switch, and it has probably long since been replaced by newer devices. Witness the "magic" special video effects available today at your local TV station or Hollywood production studio to see where this eventually wound up in commercial products.

Later that same decade we bought a push-broom airborne hyperspectral scanner from a small company in New York City that hand-built them. It captured 512 images simultaneously, each representing a narrow slice of the spectrum from ultraviolet to near-infrared, and recorded them in real time on 9-track digital magnetic tape. Of course today it would use a flash drive, but back then this was hot sh*t. Our software weenies had a field day extracting data from the humongous image set and displaying it in a form human beings could look at and understand. One example I remember used the spectral "signature" of a certain type of camouflage netting, nominally invisible to the naked eye, to color it bright red in the image displayed on a color monitor. Good luck trying to hide something under that particular camouflage. It's also useful for determining vegetation varieties and the health of same. Today there are many multi-spectral orbiting satellite imagers available for commercial use. Similar eye-in-the-sky hyperspectral imaging systems may have existed in the 1980s (or earlier), but they would have been (and may still be) highly classified. I haven't worked in that area since 1990, but Aviation Week and Space Technology magazine does a pretty good job of keeping me up to date.
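The camouflage-highlighting trick can be sketched with a standard hyperspectral matching metric. I don't know what method our software folks actually used, so this sketch uses the spectral angle between each pixel's spectrum and a known target signature, painting close matches bright red; all names, sizes, and the threshold are my own illustrative choices:

```python
import numpy as np

def spectral_angle(cube, signature):
    """Angle (radians) between each pixel spectrum and a target signature.
    cube: (H, W, B) array of B-band spectra; signature: (B,) array."""
    dot = cube @ signature
    norms = np.linalg.norm(cube, axis=2) * np.linalg.norm(signature)
    return np.arccos(np.clip(dot / norms, -1.0, 1.0))

rng = np.random.default_rng(0)
H, W, B = 4, 4, 16                 # tiny stand-in for a 512-band cube
signature = rng.random(B)          # "camouflage netting" signature
cube = rng.random((H, W, B))       # random background spectra
cube[1, 2] = 0.9 * signature       # plant one matching pixel

angles = spectral_angle(cube, signature)
rgb = np.zeros((H, W, 3))
mask = angles < 0.1                # tight angular-match threshold
rgb[mask] = [1.0, 0.0, 0.0]        # color the matches bright red
print(np.argwhere(mask))
```

Note that the spectral angle ignores overall brightness (the planted pixel is a dimmed copy of the signature but still matches), which is exactly why a signature match can find material that looks unremarkable to the eye.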