Color Working Space of APP

(@samueljohnchia)
Brown Dwarf
Joined: 3 years ago
Posts: 4
Topic starter  

Hello everyone, I'm new to APP and have just started using the software. I tried the trial version about a year ago and have just purchased a license. I have two questions for the moment.

1. I was wondering what the native colour working space of APP is when loading and processing raw images. Is it Adobe RGB? Probably not in all cases, since I see that APP asks what type of channel data I am providing when I load files. I am using a mirrorless camera, so it's the same as regular DSLR data, and I selected just 'RGB' as the channel data I want to process. So specifically, what native space is used when processing RGB data? And, for the sake of curiosity, what kind of colour space is used for multi-channel data processing?

2. If I load my own processed TIFFs (I have good reasons to do so at the moment; I'll explain in a bit), and my TIFFs are in the ProPhoto RGB working space, will APP honour this working space and continue to process the data in the document's colour profile, or will it convert it to Adobe RGB or some other working space?

This is no longer a question but rather a feature request, which I made to Mabula over a year ago: I see that I am forced to choose between sRGB and Adobe RGB when I save my processed images out of APP. It would be so great if users could save images in any colour profile of their choice, loaded from the profiles directory of whatever OS they are using. Most raw conversion software supports this. My primary motivation is that in many wide-field and also deep-sky astro images, the subject's dynamic range is huge; there are a lot of stars that are very bright and near or at clipping. The additional gamut headroom of larger colour spaces helps prevent clipping of these colours, which can later be compressed in a pleasing way into a smaller output space, for printing say, or sRGB for the web. But if these colours are hard clipped at an early stage, they are lost forever and may not look as good as one might like. There may be other reasons to use colour spaces other than sRGB or Adobe RGB: DCI-P3 if you are on macOS (Instagram now supports it from iOS devices, I gather), or Rec.2020 if one is creating HDR images for HDR displays/TVs, etc.
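
A rough sketch of what I mean by clipping, with a made-up star colour and the standard XYZ-to-linear-sRGB matrix (numpy only; this is not something APP does, just an illustration):

    import numpy as np

    # Standard XYZ -> linear sRGB (D65) matrix.
    XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                            [-0.9689,  1.8758,  0.0415],
                            [ 0.0557, -0.2040,  1.0570]])

    # Hypothetical bright, saturated blue star colour expressed in XYZ.
    star_xyz = np.array([0.18, 0.10, 0.95])

    srgb_lin = XYZ_TO_SRGB @ star_xyz
    print(srgb_lin)               # the red channel goes negative: the colour is outside sRGB

    clipped = np.clip(srgb_lin, 0.0, 1.0)
    print(clipped)                # after hard clipping, hue and saturation are changed for good

A working space with wider primaries would keep all three channels in range, so the colour could still be compressed gracefully into the output space later.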

At this point, I still supply APP my own processed TIFFs (I use RawTherapee for my conversions) because even the latest Adaptive Airy Disc demosaicing is significantly worse than the available options in RawTherapee, especially the newer hybrid demosaicers AMaZE+VNG4, DCB+VNG4 and RCD+VNG4, the last of which is usually the best for astro-type imagery. AAD shows significantly more chroma noise and pattern noise (pixel-chain-like clumping artifacts, a common problem since there is no one perfect interpolation; if you aim for more resolution, there will be more noise in the result, hence the hybrid demosaicers in RawTherapee), lower resolution, significantly more false colour artifacts, and the interpolation often produces a gross error where faint stars 'bloom' into a 3x3 colour blob that is supersaturated. There also seems to be something wrong with the default black level setting for raw conversion (too high), but since I'm not yet familiar with the software, that could be down to user error. My conclusions about the demosaicing are not, however. Perhaps I should open another thread for this separate topic, though I raised all of this with Mabula over a year ago and provided raw files, conversion details and my own TIFF conversions to demonstrate what I was referring to. Please do not take this as bashing APP. I think it is wonderful in many ways. The Lanczos interpolation with overshoot prevention is a godsend, and many features seem indispensable to astro-imagers. I would love to see APP become even better than it is, and hopefully one day I can feed it raw data directly and get better results. Unfortunately, carefully pre-processed non-linear TIFFs actually give me better quality output at the moment, despite all advice against it. APP is still invaluable to me for registering and stacking, and I will continue to use it for that. Thanks to Mabula for creating this wonderful software!


   
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

Hi Samuel @samueljohnchia ,

1. I was wondering what the native colour working space of APP is when loading and processing raw images. Is it Adobe RGB? Probably not in all cases, since I see that APP asks what type of channel data I am providing when I load files. I am using a mirrorless camera, so it's the same as regular DSLR data, and I selected just 'RGB' as the channel data I want to process. So specifically, what native space is used when processing RGB data? And, for the sake of curiosity, what kind of colour space is used for multi-channel data processing?

The color space is linear RGB, because in Astro Pixel Processor we assume that you load linear & raw data into APP for optimized data calibration and subsequent processing. For instance, removing light pollution and performing star color calibration need the data to be linear to make sense scientifically and mathematically. If your data is already stretched before data calibration, calibration can never work reliably. If the data is already stretched, star analysis will suffer, and thus registration/alignment precision will suffer as a consequence. If the data is already stretched, background subtraction, or creating a smooth background while removing light pollution, will completely destroy the color ratios, because a subtraction is done on data that is no longer linear. Mathematically, this does not make sense. The resulting data can never be used reliably for the purpose of good colors across the entire field of view if you use pre-stretched data. This is not an opinion, but plain mathematics, I think.
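
A tiny sketch of that subtraction argument, with hypothetical numbers (numpy only): on linear data, removing the light-pollution pedestal restores the original channel ratios exactly; on stretched data it does not.

    import numpy as np

    star_rgb  = np.array([4000.0, 2000.0, 1000.0])   # linear star signal, R:G:B = 4:2:1
    pollution = np.array([500.0, 500.0, 500.0])      # light-pollution pedestal
    pixel = star_rgb + pollution                     # what the sensor records

    # Linear workflow: subtract first, colour ratios are recovered exactly.
    lin = pixel - pollution
    print(lin / lin[2])                               # -> [4. 2. 1.]

    # Pre-stretched workflow: a non-linear stretch (a simple gamma here) is applied first.
    stretch = lambda x: (x / 16383.0) ** (1 / 2.2)
    nonlin = stretch(pixel) - stretch(pollution)
    print(nonlin / nonlin[2])                         # ratios are no longer 4:2:1, colour is skewed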

Adobe/sRGB and other profiles are non-linear colorspaces and thus only enter the workflow when you save stretched data with the preview filter on the right 😉 In the next APP version, the image saver module will be improved in a big and important way: you will be able to use many more ICC Color Space profiles when saving stretched data:

  • sRGB v2 legacy
  • sRGB v2 2014
  • sRGB v4 Preference
  • sRGB v4 Appearance
  • ColorMatch RGB
  • Adobe RGB 1998
  • Apple RGB
  • Wide-Gamut RGB
  • ProPhoto RGB
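
Purely as an illustration of what such an export amounts to (this is not APP's code; a minimal Pillow sketch, and the file and profile names are hypothetical): the stretched pixels are saved with the chosen ICC profile embedded so that other colour-managed applications interpret them correctly.

    from PIL import Image

    # Hypothetical stretched result, already converted to the target colour space.
    img = Image.open("stretched_result.tif")

    # Embed the matching ICC profile bytes; the profile path is an assumption.
    with open("AdobeRGB1998.icc", "rb") as f:
        icc_bytes = f.read()

    img.save("stretched_result_adobergb.tif", icc_profile=icc_bytes)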

I will answer the other questions in separate posts 😉


   
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

@samueljohnchia

2. If I load my own processed TIFFs (I have good reasons to do so at the moment; I'll explain in a bit), and my TIFFs are in the ProPhoto RGB working space, will APP honour this working space and continue to process the data in the document's colour profile, or will it convert it to Adobe RGB or some other working space?

If the TIFFs include metadata for the ProPhoto RGB ICC profile, the colorspace should be honoured because the TIFF loader in APP will apply the ICC profile to the data contained in the TIFF file as it is supposed to do for the purpose of color management.

Now, APP assumes linear data, and your TIFFs will obviously no longer be linear. (If they are still linear, then the use of ProPhoto makes no sense in terms of color management, technically.) APP will NOT make a conversion from ProPhoto to the linear RGB space behind the scenes, so the colorspace of the data loaded into APP should be preserved.
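
A small sketch of what honouring an embedded profile looks like on the loading side (not APP's internal code; Pillow/LittleCMS, assuming a hypothetical 8-bit RGB TIFF for simplicity):

    import io
    from PIL import Image, ImageCms

    img = Image.open("prophoto_master.tif")          # hypothetical file name
    icc_bytes = img.info.get("icc_profile")          # embedded profile, if any

    if icc_bytes:
        src = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
        dst = ImageCms.createProfile("sRGB")         # or whatever working space is targeted
        img_converted = ImageCms.profileToProfile(img, src, dst)
    else:
        print("No embedded profile: the pixel values can only be interpreted by assumption.")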

I do need to warn you that data calibration will really never work reliably if you start with non-linear TIFFs.

And perhaps a clarification is in order, for you but also for others who are reading this:

Most computer/laptop monitors will only show 100% of sRGB; some don't even manage that. Only expensive laptops and monitors will show more colors than the sRGB space, and those will be able to show Adobe RGB 1998 or DCI-P3 to a high degree. Monitors that show the so-called wide-gamut spaces like ProPhoto RGB have, I think, not even been realized technically... if they exist, they will be very expensive, I would think.

I understand that working in a large color space has its benefits in not clipping colors, but you need to realize that you are not fully seeing what you are doing if your monitor can only show sRGB. (From here on, assume we work on a monitor that can show 100% of sRGB and nothing more; this assumption is reasonable for the majority of astrophotographers worldwide.)

Now, if you save the data with a ProPhoto ICC profile, it might look great on the sRGB monitor on which you did the processing. If a friend with an Adobe RGB 1998 monitor looks at your image, it will look different than it does on your screen, since his screen can show more colors than yours and will show the colors outside of the sRGB colorspace. (And this is actually not how color management should be used; the purpose of color management is that you can present your image the way you want/need it to look 😉 on other computer screens.)

So my recommendation is to use sRGB: most users will get reliable results and will be able to control the colors. If you start playing with Adobe RGB 1998 and even bigger colorspaces, things can get messy and complicated quickly. I would only recommend this to users who have expensive computer screens able to display the larger gamuts/color spaces and who have a good understanding of color spaces in general.


   
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

@samueljohnchia

This is no longer a question but rather a feature request, which I made to Mabula over a year ago: I see that I am forced to choose between sRGB and Adobe RGB when I save my processed images out of APP. It would be so great if users could save images in any colour profile of their choice, loaded from the profiles directory of whatever OS they are using. Most raw conversion software supports this. My primary motivation is that in many wide-field and also deep-sky astro images, the subject's dynamic range is huge; there are a lot of stars that are very bright and near or at clipping. The additional gamut headroom of larger colour spaces helps prevent clipping of these colours, which can later be compressed in a pleasing way into a smaller output space, for printing say, or sRGB for the web. But if these colours are hard clipped at an early stage, they are lost forever and may not look as good as one might like. There may be other reasons to use colour spaces other than sRGB or Adobe RGB: DCI-P3 if you are on macOS (Instagram now supports it from iOS devices, I gather), or Rec.2020 if one is creating HDR images for HDR displays/TVs, etc.

From my first answer:

Adobe/sRGB and other profiles are non-linear colorspaces and thus only enter the workflow when you save stretched data with the preview filter on the right. In the next APP version, the image saver module will be improved in a big and important way: you will be able to use many more ICC Color Space profiles when saving stretched data:

  • sRGB v2 legacy
  • sRGB v2 2014
  • sRGB v4 Preference
  • sRGB v4 Appearance
  • ColorMatch RGB
  • Adobe RGB 1998
  • Apple RGB
  • Wide-Gamut RGB
  • ProPhoto RGB

   
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

@samueljohnchia

At this point, I still supply APP my own processed TIFFs (I use RawTherapee for my conversions) because even the latest Adaptive Airy Disc demosaicing is significantly worse than the available options in RawTherapee, especially the newer hybrid demosaicers AMaZE+VNG4, DCB+VNG4 and RCD+VNG4, the last of which is usually the best for astro-type imagery. AAD shows significantly more chroma noise and pattern noise (pixel-chain-like clumping artifacts, a common problem since there is no one perfect interpolation; if you aim for more resolution, there will be more noise in the result, hence the hybrid demosaicers in RawTherapee), lower resolution, significantly more false colour artifacts, and the interpolation often produces a gross error where faint stars 'bloom' into a 3x3 colour blob that is supersaturated. There also seems to be something wrong with the default black level setting for raw conversion (too high), but since I'm not yet familiar with the software, that could be down to user error. My conclusions about the demosaicing are not, however. Perhaps I should open another thread for this separate topic, though I raised all of this with Mabula over a year ago and provided raw files, conversion details and my own TIFF conversions to demonstrate what I was referring to. Please do not take this as bashing APP. I think it is wonderful in many ways. The Lanczos interpolation with overshoot prevention is a godsend, and many features seem indispensable to astro-imagers. I would love to see APP become even better than it is, and hopefully one day I can feed it raw data directly and get better results. Unfortunately, carefully pre-processed non-linear TIFFs actually give me better quality output at the moment, despite all advice against it. APP is still invaluable to me for registering and stacking, and I will continue to use it for that. Thanks to Mabula for creating this wonderful software!

First of all, thank you very much Samuel 🙂 That is highly appreciated.

I understand that you see issues with demosaicing your data. I can only say that AAD demosaicing works very well in my opinion, and it was actually improved about two APP versions ago to better maintain color ratios, and thus color in general. The old AAD algorithm was not good in that regard, that is true.

Normally when you process data, you combine several images into a stack/integration. Any demosaicing artefacts should essentially be gone, especially if you dither slightly. The point, in my opinion, is that you need raw linear data. If you use the processed TIFFs, these will be stretched with s-curves/log conversion for the appropriate color space. Visually that data will look different, yes, when compared to the linear data. So if you compare APP's linear AAD result to RawTherapee's non-linear results, you will see differences for sure, because you are comparing very different things... the non-linear RawTherapee data will have the following applied:

  • s-curve/log conversion for the appropriate color space
  • camera matrix: data goes from linear sensor colors to the neutral XYZ color space to the appropriate non-linear color space
  • white balance (which only really makes sense once both the log conversion and the camera matrix are applied)

No wonder it looks different from the raw linear sensor data.
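
To make the difference concrete, here is a compressed sketch of those three steps on a single pixel. The white-balance gains and the camera matrix below are made up for illustration; real values come from the raw file metadata and the camera profile.

    import numpy as np

    # Hypothetical white balance multipliers and a made-up camera -> sRGB matrix.
    wb_gains    = np.array([2.0, 1.0, 1.5])                  # R, G, B multipliers
    cam_to_srgb = np.array([[ 1.6, -0.5, -0.1],
                            [-0.2,  1.4, -0.2],
                            [ 0.0, -0.4,  1.4]])

    def srgb_encode(x):
        """sRGB transfer curve: the non-linear 'stretch' a raw converter applies on output."""
        x = np.clip(x, 0.0, 1.0)
        return np.where(x <= 0.0031308, 12.92 * x, 1.055 * x ** (1 / 2.4) - 0.055)

    camera_rgb_linear = np.array([0.10, 0.20, 0.08])         # demosaiced, linear, camera colours

    balanced  = camera_rgb_linear * wb_gains                 # white balance
    srgb_lin  = cam_to_srgb @ balanced                       # camera matrix -> linear sRGB
    displayed = srgb_encode(srgb_lin)                        # tone curve -> what a converter exports

    print(camera_rgb_linear, "->", displayed)                # same pixel, very different numbers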

The default black level of a RAW file is not subtracted in APP when processing linear data, for good reason: if you removed it without supplying calibration data, chances are that black clipping would occur. The camera white balance is applied properly, though, if you enable it in 0) RAW/FITS. If you use bias or dark frames, then the black level is removed automatically, of course. Please realize that APP now uses LibRaw for this, and LibRaw knows the black level for sure. So in my opinion there is no issue here at all.
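
A tiny illustration of the black-clipping risk, with made-up 14-bit numbers (not APP's code):

    import numpy as np

    # Hypothetical sensor values around the black level in a dark background region.
    black_level = 512
    pixels = np.array([505, 510, 512, 515, 530], dtype=np.int32)

    # Subtracting the nominal black level without a bias/dark frame pushes the read-noise
    # floor below zero; clamping then clips ("black clipping") and biases the background.
    subtracted = np.clip(pixels - black_level, 0, None)
    print(subtracted)        # [0 0 0 3 18] -> the negative noise excursions are lost

    # With a master bias/dark, the pedestal is removed per pixel, so this happens automatically.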

Besides the debate regarding demosaicing, have you considered Bayer/X-Trans Drizzle to process your raw linear frames? Please try it, perhaps it will surprise you 😉 Then demosaicing is not done at all; only the CFA pixels are used to process the data. Whether data is correctly demosaiced or not can be judged by comparing against Bayer Drizzle results, and there I clearly see that AAD is doing a very fine job in terms of noise, sharpness and color.
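
For reference, the core idea (only the idea, not APP's actual drizzle implementation) is to keep the recorded CFA samples separate per colour instead of interpolating, roughly like this for an RGGB pattern:

    import numpy as np

    def split_rggb(cfa):
        """Split an RGGB Bayer mosaic into per-colour sample planes, with no interpolation."""
        r  = cfa[0::2, 0::2]          # red photosites
        g1 = cfa[0::2, 1::2]          # green photosites on the red rows
        g2 = cfa[1::2, 0::2]          # green photosites on the blue rows
        b  = cfa[1::2, 1::2]          # blue photosites
        return r, g1, g2, b

    mosaic = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 RGGB frame
    r, g1, g2, b = split_rggb(mosaic)
    print(r.shape, b.shape)           # (2, 2) (2, 2): only recorded samples, no invented pixels

The drizzle step then maps these real samples from many dithered, registered frames onto the output grid, so no colour is ever interpolated.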

Kind regards,

Mabula


   
(@samueljohnchia)
Brown Dwarf
Joined: 3 years ago
Posts: 4
Topic starter  

@mabula-admin Thank you Mabula for all your detailed replies!

The color space is linear RGB, because in Astro Pixel Processor we assume that you load linear & raw data into APP for optimized data calibration and subsequent processing. For instance, removing light pollution and performing star color calibration need the data to be linear to make sense scientifically and mathematically. If your data is already stretched before data calibration, calibration can never work reliably. If the data is already stretched, star analysis will suffer, and thus registration/alignment precision will suffer as a consequence. If the data is already stretched, background subtraction, or creating a smooth background while removing light pollution, will completely destroy the color ratios, because a subtraction is done on data that is no longer linear. Mathematically, this does not make sense. The resulting data can never be used reliably for the purpose of good colors across the entire field of view if you use pre-stretched data. This is not an opinion, but plain mathematics, I think.

Would you be so kind as to comment on Roger Clark's articles, which purport that linear data is not necessary for astro processing?

https://clarkvision.com/articles/astrophotography.image.processing/    (see especially the appendix of this article)

And also this: https://clarkvision.com/articles/astrophotography-color-and-critics/

In them, Roger shows that the traditional processing technique leads to worse results for him. I do not agree with some of the things Roger does, but when I was independently processing my images and trying both the traditional method and his recommended approach, I got noticeably better results with the latter too. Of course my words have no credibility, but Roger is a NASA scientist, among his many luminary achievements, and has published a lot about this, so I don't think it is my place to question him, because I do not know enough. But if you feel some of the things he says are wrong, I would like to know and learn. Much appreciated 😊

In the next APP version, the image saver module will be improved in a big and important way: you will be able to use many more ICC Color Space profiles when saving stretched data:

That is wonderful, thank you! Perhaps it would be great to also include the Rec.2020 and DCI-P3 spaces; please consider it.

If the TIFFs include metadata for the ProPhoto RGB ICC profile, the colorspace should be honoured because the TIFF loader in APP will apply the ICC profile to the data contained in the TIFF file as it is supposed to do for the purpose of color management.

Now, APP assumes linear data, and your TIFFs will obviously no longer be linear. (If they are still linear, then the use of ProPhoto makes no sense in terms of color management, technically.) APP will NOT make a conversion from ProPhoto to the linear RGB space behind the scenes, so the colorspace of the data loaded into APP should be preserved.

Unfortunately, when I load TIFFs in the ProPhoto RGB space and output the processed stack in Adobe RGB, the tonality is massively different. I neglected to turn off normalisation, so I cannot yet tell whether it is simply a matter of re-assigning ProPhoto RGB to restore the original colour appearance (I had better run the experiment again). Nonetheless, the colour management is not working as I would expect: I would expect the colour appearance to be preserved.

When I load Adobe RGB TIFFs and output in Adobe RGB, everything works as expected.

My understanding is that you can be in a linear space, but you still have to define the colour coordinates (primaries) of the linear working space to be able to map colours from the source file to the display and so on, so the user can see something meaningful.

(If they are still linear, then the use of ProPhoto makes no sense in terms of color management technically)

However, the colour data is still bounded by some colour coordinates in order to be displayed on our monitors. Lightroom, for example, uses the ProPhoto RGB coordinates but in linear space, so we cannot call that 'ProPhoto RGB'; it is rather known as 'Melissa RGB'. RawTherapee allows the user to use any colour coordinates during internal processing, before outputting the rendered result to a TIFF or JPEG etc. Is APP bounded by certain colour coordinates during internal processing?
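
For what it's worth, the 'coordinates' of a linear working space come down to its primaries and white point. A quick sketch of deriving the linear RGB-to-XYZ matrix from the published ProPhoto chromaticities (numpy only; pairing these primaries with a gamma 1.0 curve is the kind of linear internal space described above):

    import numpy as np

    def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_white):
        """Build the linear RGB -> XYZ matrix of a working space from its chromaticities."""
        def xyz(xy):
            x, y = xy
            return np.array([x / y, 1.0, (1 - x - y) / y])
        prim  = np.column_stack([xyz(xy_r), xyz(xy_g), xyz(xy_b)])
        scale = np.linalg.solve(prim, xyz(xy_white))   # scale primaries to hit the white point
        return prim * scale

    # ProPhoto RGB primaries with a D50 white point (standard published values).
    m = rgb_to_xyz_matrix((0.7347, 0.2653), (0.1596, 0.8404), (0.0366, 0.0001), (0.3457, 0.3585))
    print(m @ np.array([1.0, 1.0, 1.0]))               # R=G=B=1 maps to the D50 white XYZ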

Most computer/laptop monitors will only show 100% of sRGB; some don't even manage that. Only expensive laptops and monitors will show more colors than the sRGB space, and those will be able to show Adobe RGB 1998 or DCI-P3 to a high degree. Monitors that show the so-called wide-gamut spaces like ProPhoto RGB have, I think, not even been realized technically... if they exist, they will be very expensive, I would think.

Because I know a little something about this, I must add a comment here. About 10% of ProPhoto consists of imaginary colours, so it is totally impossible for any technology now or in the future to display the entire gamut of that space.

However, there is little truth to the idea that we should only use working spaces our displays can show; the premise is futile in and of itself. If an advanced user is fluent in handling colour, using a larger working space than your display can show is not suddenly going to turn, say, blue or red into snazzy rainbow effects. There is nothing to fear. Also, there is nothing special about Adobe RGB, and I'm glad to see that the movie industry, which is leading colour processing in many ways, is not adopting it. It was merely a typo in the coordinate values by an Adobe employee many years ago, and somehow it has become a standard for graphics-arts displays, a big tragedy. I think Adobe RGB is poorly designed (it wasn't even designed, it was a mistake!) and does not include a lot of useful real-world colours in its gamut. DCI-P3 and Rec.2020 are better in that regard, while avoiding imaginary colours. Joseph Holmes' working spaces are also very well designed. They are input-centric, as opposed to output-centric, which I believe is the correct way to choose working colour spaces for our images. I highly recommend that anyone interested look at them. Do not be limited by what your display can show; you may be throwing away a lot of useful colour information, greatly limiting yourself, and perhaps requiring time-consuming re-processing when output technology gets better. This thinking is no different from how many approach astro image processing: preserve as much data as possible moving forward, and do as little damage as possible.

So if you compare APP's linear AAD result to RawTherapee's non-linear results, you will see differences for sure, because you are comparing very different things... the non-linear RawTherapee data will have the following applied: [...]

Perhaps I should clarify certain details. RawTherapee, like many other good raw conversion programs, handles the data linearly in 32-bit floating point space until output, and I can output linear raw sensor data. In making this demosaicing comparison, the RawTherapee output is identical in tonality and colour to the APP output, so I am able to make an apples-to-apples comparison of demosaicing, with no other processing apart from a slightly different camera matrix, which does not invalidate the comparison. The output TIFFs are essentially identical except for demosaicing; in fact, it is difficult for an ordinary person to see any differences until zooming in to 100% or more.
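
For anyone who wants to reproduce this kind of check, here is a rough sketch of how the two conversions can be compared numerically (the file names are hypothetical, and tifffile is just one convenient reader):

    import numpy as np
    import tifffile   # any TIFF reader that returns arrays will do

    app = tifffile.imread("app_aad_linear.tif").astype(np.float64)
    rt  = tifffile.imread("rawtherapee_rcd_vng4_linear.tif").astype(np.float64)

    # Global tonality check: per-channel medians should agree if only demosaicing differs.
    print(np.median(app, axis=(0, 1)), np.median(rt, axis=(0, 1)))

    # Local differences (chroma noise, star blooming, false colour) show up in the residual.
    residual = app - rt
    print(residual.std(axis=(0, 1)))   # larger per-channel spread = more pixel-scale disagreement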

Unfortunately, while I did see notable improvements in AAD demosaicing, the problems I describe are with the latest algorithm, and the issues, while greatly minimised after integration and averaging (as you pointed out), are still visible and noticeably inferior to RCD+VNG4 hybrid demosaicing in RawTherapee, which is open-source software, so you can have a look at the code if you wish.

Here is a crop demonstrating what I mean. As you can see, there is no difference in tone curve or overall colour, just visible demosaicing differences. No other processing was done to either image, just demosaic and output, with of course the necessary conversion with the camera matrix, to Adobe RGB in the case of APP; I can do the same for RawTherapee, but for now it is ProPhoto (which does not affect the comparison). Both use camera WB, with no noise reduction, sharpening, stretching, etc. These flaws are greatly minimised after stacking but not totally eliminated. I get lower noise and cleaner results with RCD+VNG4.

[Image: RT vs APP demosaicing comparison crop]

Also, I have another problem: if I load a single raw file into APP and save it out, I do not get a similar result in terms of image brightness as when I load 11 raw light frames and align and integrate them. The integrated stack is extremely dark compared to the single raw image. No stretching or normalisation was done at all in my testing. Is there a way to get the integrated stack output with the same tone curve as when I process a single raw image? The single raw result is exactly as I expect. I know processing a single raw image does not make sense for astro work, but I was trying to evaluate the demosaicing quality independently of other processing like integration, which can hide flaws in the demosaicing.

Please realize that APP now uses LibRaw for this, and LibRaw knows the black level for sure. So in my opinion there is no issue here at all.

My bad, the black clipping is fine; it was user error, as mentioned in the original post. Sorry!

I do need to warn you that data calibration will really never work reliably if you start with non-linear TIFFs.

Removing light pollution and performing star color calibration need the data to be linear to make sense scientifically and mathematically. If your data is already stretched before data calibration, calibration can never work reliably. If the data is already stretched, star analysis will suffer, and thus registration/alignment precision will suffer as a consequence. If the data is already stretched, background subtraction, or creating a smooth background while removing light pollution, will completely destroy the color ratios, because a subtraction is done on data that is no longer linear. Mathematically, this does not make sense.

At this point, I never use or need this function, since I am not creating scientific images but aesthetic ones. Beautiful tonality, unclipped colours and freedom from demosaicing errors rank higher for me in the pursuit of image quality. Thankfully, APP is able to handle these situations well if I decide to make such images 🙂


   