GOES, Himawari, Meteosat, FengYun, Elektro-L and other weather satellites don’t “see” in color; they all capture individual greyscale spectral bands, both visible and infrared. Among these are two visible bands, blue and red, named for their locations on the visible portion of the electromagnetic spectrum.
While these visible images would appear in blue and red hues in their natural state, the colors are desaturated so the images always appear in grayscale. Since these are “visible” channels, the images appear black at night.
There is no “green” channel. That matters because all three colors, red, green, and blue, are needed to produce a true-color image. Most software systems use at least three bands (two visible and one IR) to create a false-color image, but those of us who receive HRIT do not have that luxury: we get only one visible band (02). Instead, we combine it with an IR band (13) and apply a color look-up table (LUT), which gives us a fairly decent false-color image.
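As a rough sketch of how LUT colorization works (the LUT below is a toy stand-in, not my actual GOES LUT): the visible and IR brightness values of each pixel jointly index into a 256×256 table of colors.

```python
import numpy as np

def apply_false_color(vis, ir, lut):
    """Colorize two greyscale bands with a 2D look-up table.

    vis, ir: uint8 greyscale arrays (e.g. band 02 and band 13).
    lut:     256x256x3 uint8 array; row = visible level, column = IR level.
    """
    return lut[vis, ir]

# Toy LUT for illustration: blue follows the visible band, red follows the IR band.
lut = np.zeros((256, 256, 3), dtype=np.uint8)
lut[..., 2] = np.arange(256, dtype=np.uint8)[:, None]   # blue from visible
lut[..., 0] = np.arange(256, dtype=np.uint8)[None, :]   # red from IR

vis = np.array([[0, 128], [255, 64]], dtype=np.uint8)
ir  = np.array([[255, 0], [128, 32]], dtype=np.uint8)
rgb = apply_false_color(vis, ir, lut)
```

A real LUT is hand-built so that each (visible, IR) pair maps to a plausible surface or cloud color; the indexing mechanics are the same.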
This has its issues, though. Since the look-up table maps observed brightness temperatures to colors, it always turns the upper and lower latitudes bluer in their winters. More saturation can be applied at the cost of some visibility at those latitudes in winter, but that tends to ‘blow out’ the cloud tops. I have built my own LUT for the GOES satellites which, I think, balances out the seasonal changes.
While it works quite well, some of those receiving imagery get only one IR band (typically a relayed image), usually band 13. So what can be done? Use a method that EUMETSAT, NOAA, the NWS, and other agencies have used and continue to use: apply an underlay created from polar-orbiting satellite imagery in the best clarity and color possible.
One fantastic tool for this is Sanchez, by Matt Painter. Think of it as a Swiss Army knife and a building block for other tasks. Sanchez lets the end user apply an underlay to colorize a previously greyscale image and, if desired, reproject it, animate it, create different views, and more.
Using Sanchez as just one part of a series of processes allows me to create some really stellar views of Earth, coming close to true-color imagery from nothing but HRIT greyscale images. In the animation below, you can see the results of this multi-layered process, which is described next.
This animation uses many different subroutines, utilizing software such as ImageMagick, FFmpeg, Sanchez, and Xplanet. An Xplanet subroutine generates a dynamic background synchronized to UTC date and time, producing a matched day/night underlay. The underlay is composed of two global images of Earth: the daylight view comes from the NASA Visible Earth project, using the clearest and most detailed imagery I can find, and the nighttime view comes from the Suomi NPP polar orbiter’s Earth-at-night imagery. This is then combined with two bands of satellite imagery, visible band 02 and IR band 13. I use both bands because if I used only the visible band, the night side would be black and show no cloud layers.
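The day/night split in an underlay like this comes down to knowing where the subsolar point is at a given UTC time. Below is a minimal sketch of the idea (a crude model with no equation-of-time correction; `day_weight` is a hypothetical helper of my own, not part of Xplanet):

```python
import math
from datetime import datetime, timezone

def day_weight(lat, lon, when):
    """Approximate daylight blend weight (0 = night, 1 = full day) for a
    point on Earth, from a crude subsolar-point model."""
    doy = when.timetuple().tm_yday
    # Approximate solar declination (radians); equinox near day 81
    decl = math.radians(23.44) * math.sin(2 * math.pi * (doy - 81) / 365.0)
    hours = when.hour + when.minute / 60 + when.second / 3600
    # Sun is over longitude 0 at roughly 12:00 UTC
    subsolar_lon = math.radians(180.0 - 15.0 * hours)
    lat, lon = math.radians(lat), math.radians(lon)
    # Cosine of the solar zenith angle at (lat, lon)
    cosz = (math.sin(lat) * math.sin(decl)
            + math.cos(lat) * math.cos(decl) * math.cos(lon - subsolar_lon))
    return max(0.0, min(1.0, cosz))

noon = datetime(2024, 3, 20, 12, 0, tzinfo=timezone.utc)
w_day = day_weight(0.0, 0.0, noon)      # equator at local solar noon
w_night = day_weight(0.0, 180.0, noon)  # antipode, local midnight
# per-pixel blend: w * day_underlay + (1 - w) * night_underlay
```

Blending each pixel by this weight is what makes the terminator sweep smoothly across the underlay as the frames advance.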
ELEVATION DATA – The underlay also makes use of a few more features. I use elevation data for Earth’s landmasses to build a “bump map,” which adds a little more 3D detail to the planet and allows accurate shadows to fall across mountain ranges and valleys (visible in the animation, especially along the Andes).
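A bump map works by shading each pixel from the local slope of the terrain. Here is a minimal hillshade sketch (my own illustration, not Xplanet’s implementation) that computes Lambertian brightness from an elevation grid:

```python
import numpy as np

def hillshade(elev, sun_az_deg=315.0, sun_alt_deg=45.0, z_scale=1.0):
    """Lambertian hillshade of a heightmap: brightness = max(0, n . l)."""
    # Terrain slope: gradients along rows (y) and columns (x)
    dzdy, dzdx = np.gradient(elev.astype(float) * z_scale)
    # Un-normalized surface normals: (-dz/dx, -dz/dy, 1)
    nx, ny, nz = -dzdx, -dzdy, np.ones_like(elev, dtype=float)
    norm = np.sqrt(nx**2 + ny**2 + nz**2)
    az, alt = np.radians(sun_az_deg), np.radians(sun_alt_deg)
    # Light direction from the sun's azimuth and altitude
    lx = np.cos(alt) * np.sin(az)
    ly = np.cos(alt) * np.cos(az)
    lz = np.sin(alt)
    shade = (nx * lx + ny * ly + nz * lz) / norm
    return np.clip(shade, 0.0, 1.0)

# Flat terrain shades uniformly to sin(altitude); slopes facing the sun brighten
flat = hillshade(np.zeros((4, 4)))
```

Slopes facing the sun come out brighter than 0.71 and slopes facing away come out darker, which is exactly the extra relief the bump map adds to the mountain ranges.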
SPECULAR LIGHTING – Specular lighting is used to reproduce sunglint, also called a specular reflection: a glint of sunlight reflected off all water surfaces (lakes, rivers, and oceans), generated from yet another ‘map’ known as a specular map. The location of this mirror-like reflection, known as the specular point, is computed from the date and time together with known Keplerian planetary and solar data, so the reflection tracks the position of the Sun as closely as possible as Earth moves through its orbit.
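The glint behaves like a Phong-style specular highlight: it peaks where the mirror reflection of the sun direction lines up with the viewer. A minimal sketch of that geometry (a hypothetical helper of mine, not Xplanet’s code):

```python
import math

def glint_intensity(normal, to_sun, to_viewer, shininess=200.0):
    """Phong-style specular term: bright only near the specular point,
    where the mirrored sun direction aligns with the viewing direction."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    def unit(v):
        m = math.sqrt(dot(v, v))
        return tuple(x / m for x in v)
    n, s, v = unit(normal), unit(to_sun), unit(to_viewer)
    # Mirror the sun direction about the surface normal: r = 2(n.s)n - s
    d = dot(n, s)
    r = tuple(2 * d * ni - si for ni, si in zip(n, s))
    # High exponent -> tight, mirror-like highlight on water
    return max(0.0, dot(r, v)) ** shininess

# Viewer exactly at the mirror angle -> full glint
peak = glint_intensity((0, 0, 1), (1, 0, 1), (-1, 0, 1))
# Viewer well off the mirror angle -> essentially no glint
off = glint_intensity((0, 0, 1), (1, 0, 1), (1, 0, 1))
```

Evaluating this only where the specular map marks water is what keeps the glint off the landmasses.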
The last process applied to the imagery is a subtle Rayleigh light scattering effect. Rayleigh scattering refers to the scattering of light off the molecules of the air, and extends to scattering from particles up to about a tenth of the wavelength of the light. It is Rayleigh scattering off air molecules that gives us the blue sky we see. The software creates this effect by first generating ‘scattering’ tables.
Tables are calculated for fixed phase angles (the Sun – ground point – satellite angle). For lines of sight intersecting the disk, the tables are a 2D array of scattering intensities tabulated as a function of the incidence angle (the zenith angle of the Sun as seen from the ground point) and the emission angle (the zenith angle of the observer as seen from the ground point).
Tables are created at each degree between 0 and 180 degrees for the limb and between 0 and 90 degrees for the disk, using the incidence, emission, and tangent height values specified.
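The per-degree tabulation for the disk can be sketched as follows; note that the intensity expression here is a toy placeholder standing in for the real single-scattering model:

```python
import math

def build_disk_table(step_deg=1):
    """Tabulate a toy scattering intensity over incidence x emission angles
    (0-90 degrees), as a nested list indexed by whole degrees."""
    angles = range(0, 91, step_deg)
    table = []
    for inc in angles:
        mu0 = math.cos(math.radians(inc))   # cosine of incidence angle
        row = []
        for emi in angles:
            mu = math.cos(math.radians(emi))  # cosine of emission angle
            # Placeholder: real tables evaluate the scattering integral here
            row.append(mu0 * mu)
        table.append(row)
    return table

table = build_disk_table()
```

At render time, each pixel’s incidence and emission angles simply index into the precomputed table, which is far cheaper than evaluating the scattering model per pixel per frame.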
For my projections, I use these values:
Planetary radius: 6378.140 km
Atmospheric scale height: 8000 meters*
Index of refraction of air: 1.000293
Density of air at the reference level, in molecules/cubic meter

*(changing this can result in the atmosphere looking like watery milk!)
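To illustrate why this effect tints the limb blue, here is a small sketch of the two relationships behind the tables: the classic Rayleigh wavelength dependence, intensity ∝ (1 + cos²θ)/λ⁴, and the exponential density falloff set by the scale height above. The function names are my own, for illustration only:

```python
import math

# Constants from the list above
SCALE_HEIGHT_M = 8000.0   # atmospheric scale height
N_REFRACTION = 1.000293   # enters the full cross-section formula, omitted here

def rayleigh_intensity(wavelength_nm, scatter_angle_deg):
    """Relative Rayleigh scattering intensity: (1 + cos^2 theta) / lambda^4."""
    theta = math.radians(scatter_angle_deg)
    return (1 + math.cos(theta) ** 2) / (wavelength_nm ** 4)

def relative_density(height_m):
    """Air density relative to the reference level: exp(-h / H)."""
    return math.exp(-height_m / SCALE_HEIGHT_M)

blue = rayleigh_intensity(450, 90)
red = rayleigh_intensity(650, 90)
ratio = blue / red   # blue light scatters several times more strongly than red
```

The λ⁻⁴ term is why the rendered atmosphere is blue, and the density falloff is why raising the reference density too far floods the limb with scattered light, hence the “watery milk” warning.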
Below is the process chart that outlines the steps in the creation of these types of images and animations:
Below is a reduced static image created with the above process; clicking on it will open it in a new window where you can zoom in for more detail.