How is Color Derived?

6 Posts · 3 Users · 2 Likes · 502 Views
(@acatalano)
Molecular Cloud
Joined: 3 years ago
Posts: 3
Topic starter  

I am a newbie to AP and APP and am using a Nikon D5100 on a C8 to create images. I have Bortle 4-5 skies and generally avoid using any filters. APP is great because it has allowed me to get very good results with only a basic understanding, since it structures the workflow. So a BIG Thank You!

While I understand the concept of stretching an image with a very narrow luminance histogram, I do not understand how color is achieved in a processed image. The broadband pixel filters on a Nikon allow individual color channels to be processed, but is it just the differences in how each channel is stretched that create the colors? I assume that the "actual" colors of a celestial object such as a galaxy are really just different color temperatures of "white" rather than narrowband colors (except for nebulae, e.g. H-alpha). So are all apparent colors in a processed image in effect artifacts of processing, or is there more to it?

Just trying to get the general idea here, not necessarily the specifics of how APP does it. I have been using the slider controls in APP on the final image and they give fine results. 😘 


   
(@wvreeven)
Quasar
Joined: 6 years ago
Posts: 2133
 

@acatalano Hi Anthony,

The sensor of your camera has, as you also indicate, color-sensitive pixels. During processing of the images, the CFA or Bayer pattern is used to decode the sensor data into colors. There are several algorithms to do that, and the Wikipedia page on demosaicing gives a lot of info about them. When the final stretched image is made, the stretch is applied equally to all three color channels, and then saturation is applied.
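To make the Bayer-pattern step concrete, here is a minimal Python sketch (not APP's actual algorithm) of the simplest possible demosaic, assuming an RGGB layout: each 2x2 cell of the raw mosaic collapses into one RGB pixel. Real demosaicing algorithms (bilinear, VNG, AHD, ...) interpolate to full resolution instead; the function name and the fake raw frame are made up for illustration.

```python
import numpy as np

def demosaic_superpixel(raw, pattern="RGGB"):
    """Very simple 'superpixel' demosaic: each 2x2 Bayer cell becomes one
    RGB pixel (R, mean of the two G samples, B)."""
    assert pattern == "RGGB", "this sketch only handles the RGGB layout"
    r  = raw[0::2, 0::2]            # top-left sample of every 2x2 cell
    g1 = raw[0::2, 1::2]            # top-right sample
    g2 = raw[1::2, 0::2]            # bottom-left sample
    b  = raw[1::2, 1::2]            # bottom-right sample
    g  = (g1 + g2) / 2.0
    return np.dstack([r, g, b])     # (H/2, W/2, 3) color image

# Example: a fake 4x4 raw frame with an RGGB mosaic
raw = np.arange(16, dtype=float).reshape(4, 4)
rgb = demosaic_superpixel(raw)
print(rgb.shape)                    # (2, 2, 3)
```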

By the way, the histograms of your images before stretching do not contain a luminance channel but three color channels: one each for R, G and B.
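A tiny sketch of that, assuming the linear image from the demosaic step is available as a NumPy array (the variable names below are hypothetical): the "histogram" before stretching is really three separate per-channel histograms.

```python
import numpy as np

# rgb: the linear (unstretched) image, shape (H, W, 3)
rgb = np.random.rand(100, 100, 3) ** 4      # stand-in for dark, narrow astro data

for i, name in enumerate("RGB"):
    counts, edges = np.histogram(rgb[..., i], bins=256, range=(0.0, 1.0))
    peak = edges[np.argmax(counts)]
    print(f"{name} channel: histogram peaks near {peak:.3f}")
```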

 

Wouter


   
(@acatalano)
Molecular Cloud
Joined: 3 years ago
Posts: 3
Topic starter  

Wouter

Thanks very much for the reply. I'm sure my question seemed hopelessly naive. Simply put, is there any relationship between the actual color temperature (e.g. 5500 K for the sun) of astronomical objects, particularly galaxies, and the heavily processed images that result from APP or other image processing applications? I assume there is not, since photos are often too colorful, so the colors would be the result of the processing algorithm's math. Also, looking at photos of a galaxy such as M33 on the internet, the photos are all over the map in terms of coloration. (My own efforts with M33 using APP are here: https://www.boulderwx.com/astro.php ) Nebulae are an exception, since the red coloration results from narrowband H-alpha.

One could imagine an algorithm that maps color temperature into more vivid output (I'm thinking of the colors created by IR cameras here, where the light in question is invisible to the human eye). Since one does not, for the most part, resolve individual stars in a photo of a galaxy, I suppose that is just impossible to do. The colorful nature of processed images of galaxies has me wondering what the color actually means, if anything. Dust lanes in galaxy photos, for example, are an identifiable structure. I'm just wondering if there is any other real meaning in the coloration.


   
(@wvreeven)
Quasar
Joined: 6 years ago
Posts: 2133
 

@acatalano There IS an astrophysical relationship between the color of a black body radiator (like a star) and its temperature: Wien's displacement law. In short: cooler objects are redder and hotter objects are bluer. In the case of emission-line objects (such as emission nebulae like the Orion Nebula) it is more complicated, because the color depends on which components are in the nebula (hydrogen, oxygen, sulfur and other elements), how much of each element is present, and the temperature of the gas. Hydrogen emits visible light at wavelengths that are known as the Balmer lines.
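A quick numeric illustration of Wien's displacement law, lambda_max = b / T with b ≈ 2.898e-3 m·K (the temperatures below are just example values, not from this thread):

```python
# Wien's displacement law: lambda_max = b / T
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

for temp_k in (3000, 5800, 10000):       # cool red star, Sun-like star, hot blue star
    lam_nm = WIEN_B / temp_k * 1e9       # peak wavelength in nanometres
    print(f"T = {temp_k:5d} K  ->  spectrum peaks near {lam_nm:4.0f} nm")

# 3000 K peaks near 966 nm (deep red/IR), 5800 K near 500 nm (the broad
# spectrum looks white), 10000 K near 290 nm (UV, so the star looks blue).
```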

However, if you take a picture with your camera then a stretch will NOT change that color. In fact, apart from the quantum efficiency of the sensor (basically the sensitivity of the sensor for each wavelength of light), your camera registers the colors and, apart from the way the Bayer pattern is interpreted, APP outputs those colors exactly as they are. This is what I meant when I wrote that each color gets stretched in the same way.
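A minimal sketch of what "stretched in the same way" means in practice, assuming an arcsinh curve as the example stretch (APP's actual stretch functions and parameters may well differ):

```python
import numpy as np

def stretch(x, a=500.0):
    """Example nonlinear stretch (arcsinh), normalised so 1.0 maps to 1.0."""
    return np.arcsinh(a * x) / np.arcsinh(a)

rgb = np.random.rand(10, 10, 3) * 0.01   # stand-in for faint linear data

# The same curve with the same parameters is applied to every channel,
# so no color cast is introduced the way per-channel stretching would.
stretched = stretch(rgb)
```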

After that you can increase the saturation to make the colors more vivid and even play with the color balance in many ways to make the result more appealing (which is a matter of taste) but your original question was about stretching only 🙂
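For completeness, one simple way such a saturation boost can be implemented (a conceptual sketch, not necessarily how APP does it): push each pixel away from its own gray value.

```python
import numpy as np

def boost_saturation(rgb, factor=1.5):
    """factor > 1 increases saturation, factor < 1 mutes it,
    factor = 0 gives a grayscale image."""
    gray = rgb.mean(axis=-1, keepdims=True)   # per-pixel gray value
    return np.clip(gray + factor * (rgb - gray), 0.0, 1.0)

stretched = np.random.rand(10, 10, 3)         # stand-in for the stretched image
vivid = boost_saturation(stretched, factor=1.4)
```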


   
(@acatalano)
Molecular Cloud
Joined: 3 years ago
Posts: 3
Topic starter  

Wouter

Ah, I see now. I thought that perhaps there was a mathematical "stretch" (spectral re-distribution) involved (or that one could be invoked if desired). So color accuracy is maintained. For a black-body radiator such as a star, temperature is everything: it specifies the spectral distribution and perceived color. Thank you, that answers the question.


   
(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

Yes, and the color correction tool also makes use of this black body characteristic. So if you have broadband data, you can use that to get a correction which is scientifically as accurate as possible (not necessarily to taste 😉 ).
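As a rough illustration of the idea (a conceptual sketch only, not APP's actual color correction implementation): one common broadband approach is to scale the channels so that a reference star color, for example the median color of detected stars or a known Sun-like star, comes out neutral.

```python
import numpy as np

def calibrate_to_reference_star(rgb, star_rgb):
    """Scale R, G, B so the measured reference star color becomes neutral gray.
    star_rgb is a length-3 array of linear R, G, B fluxes for that star."""
    gains = star_rgb[1] / star_rgb            # normalise everything to the green channel
    return np.clip(rgb * gains, 0.0, 1.0)

image = np.random.rand(10, 10, 3)             # stand-in for a linear broadband image
measured_star = np.array([0.9, 1.0, 1.2])     # hypothetical star color with a blue cast
balanced = calibrate_to_reference_star(image, measured_star)
```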


   