[Sticky] Not getting enough colour in my final integrated stack, how to increase?

32 Posts
11 Users
13 Likes
21.6 K Views
(@rudypohlgmail-com)
Main Sequence Star
Joined: 7 years ago
Posts: 23
Topic starter  

Hi Mabula:

My normal workflow for my astro images with my Nikon cameras for the last year has been to convert the raw NEF files to 16-bit TIFF files in Photoshop and then stack them in Deep Sky Stacker using a carefully selected set of parameters. The results have always been pretty good, especially with respect to getting good, rich colour.

Unfortunately, I am not able to reproduce these colour results in APP, and I was wondering why my images always end up fairly dull with little colour. The two things I have not yet tried are the "force CFA" parameter and the "White Balance" parameter found in the RAW/FITS panel.

Should I be trying these? What do they do? Should I use one or the other or both?

Thanks,
Rudy


   
ReplyQuote
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

Hi Rudy,

The Force CFA option is not needed for your NEF files; this setting is needed if APP doesn't automatically recognize that the data is monochrome CFA data that needs debayering. In the case of your NEF files, APP knows that these need debayering 😉 and it also automatically knows the CFA pattern (RGGB, GBRG, etc.).

You can try to use the White Balance option to check if that helps you. This setting will use the camera white balance stored in the metadata of your NEF frames. Technically, this setting only makes sense if the NEF is processed like a normal photograph though, so this is directly related to your other question:

https://www.astropixelprocessor.com/community/main-forum/does-astro-pixel-processor-apply-the-color-matrix-correction-to-digital-camera-raw-data/
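Conceptually, applying the stored camera white balance boils down to multiplying each channel of the debayered linear data by its metadata gain. A rough numpy sketch (not APP's actual code, and the gain values below are made up for illustration):

```python
import numpy as np

def apply_camera_wb(rgb, wb_multipliers):
    """Apply camera white-balance multipliers (e.g. from NEF metadata)
    to a linear, debayered RGB image.

    rgb            : float array of shape (H, W, 3), linear data in [0, 1]
    wb_multipliers : per-channel gains (R, G, B), typically G-normalized
    """
    gains = np.asarray(wb_multipliers, dtype=np.float64)
    balanced = rgb * gains  # broadcast the per-channel gain over H x W
    return np.clip(balanced, 0.0, 1.0)

# A neutral grey patch shot under warm light: red too strong, blue too weak.
patch = np.full((2, 2, 3), [0.6, 0.5, 0.35])
# Hypothetical G-normalized daylight multipliers that neutralize the patch.
out = apply_camera_wb(patch, (0.5 / 0.6, 1.0, 0.5 / 0.35))
```

After the gains are applied, the grey patch comes out equal in all three channels, which is exactly what "processed like a normal photograph" means here.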

Getting enough color out of my NEF frames never seems to be a problem, but I think this is a somewhat relative statement. It would be most helpful if you could show the differences in colors between your old workflow and what you achieve with APP. Maybe share 2 integrations, one from DSS using your old workflow and one from APP using APP's workflow?

Kind regards,

Mabula


   
ReplyQuote
(@sharkmelley)
White Dwarf
Joined: 7 years ago
Posts: 15
 

I think I know what Rudy means.  Here's a link to a TIFF file of one of my images which you can download and use as an example.  It was taken using a modified Sony A7S on a Tak Epsilon 180.

http://www.markshelley.co.uk/webdisk/NGC7000_Linear_Small.tif

It is linear unstretched data and has already been background subtracted and white balanced:

NGC7000 original

If I perform a colour preserving stretch on it, which is a single quick operation, this is the result:

NGC7000 stage1

Contrast and saturation can then be adjusted to improve the appearance:

NGC7000 stage2

In APP, without adjusting Black or White levels or Gamma, if I let APP perform an auto-stretch this is the result:

NGC7000 APP

The stretch is perfect but the colours are bleached.  Now it becomes a huge job to attempt to get the colour back into the image using the saturation controls but I can never get it to look like the colour preserving original.  I'm probably approaching the problem in the wrong way 🙂

It's not intended as a criticism of APP because all the packages I know of behave in a similar way when given a linear stacked image.
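The effect is easy to reproduce: a nonlinear stretch applied independently to each channel compresses the R:G:B ratios, while a single per-pixel multiplier derived from luminance leaves them intact. A small numpy illustration (not code from any of these packages; clipping omitted for clarity):

```python
import numpy as np

# A bright, strongly red star pixel in linear data: R/G ratio of 4.
pixel = np.array([0.8, 0.2, 0.1])

# Per-channel gamma stretch, applied independently to R, G and B
# (roughly what a generic auto-stretch does).
per_channel = pixel ** 0.25

# Colour-preserving stretch: compute the boost from luminance only,
# then scale all three channels by that same factor.
lum = pixel.mean()
preserving = pixel * (lum ** 0.25 / lum)

print(pixel[0] / pixel[1])              # 4.0   original R/G ratio
print(per_channel[0] / per_channel[1])  # ~1.41  ratio compressed -> bleached
print(preserving[0] / preserving[1])    # 4.0   ratio kept
```

The per-channel stretch squashes the 4:1 red-to-green ratio down to 4^0.25 ≈ 1.41, which is precisely the washed-out look in the auto-stretched version.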

You're welcome to use the TIFF file any way you wish as long as you credit me with the original data.

Mark


   
Mabula-Admin reacted
ReplyQuote
(@rudypohlgmail-com)
Main Sequence Star
Joined: 7 years ago
Posts: 23
Topic starter  

Hi Mabula;

I'm sorry that I am not very available today because of family commitments, but tomorrow I will search through the dozens of test images (NEFs and TIFFs) that I have made recently; images integrated in DSS, APP and PixInsight and processed with and without Roger Clark's rnc-color-stretch algorithm and finished in Photoshop. My hard drive is a bit of a shambles right now and I have to sort it out first so I can be sure that I'm posting the correct images for you.

However, in the meantime Mark has posted some images above and described my experience quite well, and I would only be able to say essentially the same thing: once I have integrated my subs in APP, the resultant stacked image is quite bland and lifeless in colour, with very little colour data present to work with. Take heart though, because I found exactly the same result with PixInsight, and also with DSS when I stacked the raw NEF files instead of stacking TIFFs after a raw conversion in Adobe Camera Raw.

I really don't understand why my results are what they are, because I have seen beautiful colour images produced by APP; one fellow in our forum recently posted some wonderfully colourful astro images he did last week in APP, mind you, he did use an astro-modded Nikon D5300a camera... anyway, thanks for taking the time to investigate, I'm sure we'll get to the bottom of this.

More later...
Best regards,

Rudy

 


   
Mabula-Admin reacted
ReplyQuote
(@sharkmelley)
White Dwarf
Joined: 7 years ago
Posts: 15
 

I have another test example.  It is a totally synthetic test image that I generated a few years ago in order to test various processing workflows for their colour fidelity.  You can download it here:

http://www.markshelley.co.uk/webdisk/star_colour_test_image_uncompressed.tif

It has the following features:

  • A light pollution background with added noise.
  • Sets of "stars" with constant RGB ratios, except where the "sensor" begins to saturate
  • White balance that needs correcting (using white reference "stars" in the top row)

When the image is opened it will look like this:

star colour test before

After processing it should ideally look something like this:

star colour test after

The challenge is to stretch the image so that all the "stars" can be seen and so that the RGB ratios are correct in each star except for where "sensor" saturation occurred.  It's deliberately quite a demanding challenge which can reveal potential issues in a processing workflow.
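Since the "stars" are built with constant RGB ratios, colour fidelity can be scored mechanically: compare the R/G and B/G ratios of the unsaturated star pixels before and after processing. A possible numpy sketch of such a check (the function and its names are mine, not part of the test image):

```python
import numpy as np

def ratio_error(before, after, mask):
    """Colour-fidelity metric for the synthetic star test.

    Compares R/G and B/G ratios of star pixels before and after a
    processing workflow.  A perfectly colour-preserving workflow
    returns (0, 0).

    before, after : (H, W, 3) float arrays, strictly positive where masked
    mask          : (H, W) bool array selecting unsaturated star pixels
    """
    def ratios(img):
        r, g, b = img[..., 0][mask], img[..., 1][mask], img[..., 2][mask]
        return r / g, b / g

    rg0, bg0 = ratios(before)
    rg1, bg1 = ratios(after)
    return np.max(np.abs(rg1 - rg0)), np.max(np.abs(bg1 - bg0))
```

Any stretch that multiplies all three channels of a pixel by the same factor scores zero on this metric; per-channel stretches do not.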

Mark


   
Mabula-Admin reacted
ReplyQuote
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 
Posted by: Mark Shelley

The stretch is perfect but the colours are bleached. Now it becomes a huge job to attempt to get the colour back into the image using the saturation controls but I can never get it to look like the colour preserving original.

Excellent Mark, thank you for the TIFF image. I tried it myself and I agree: it's not trivial to get the same results by boosting saturation in APP. The difference is quite remarkable, to say the least.

I read the Lupton article yesterday. The color-preserving aspect of stretching is well formulated. I do wonder why the arcsinh function specifically is used; would another type of stretch work just as well, or not? Have you tested this? Maybe I missed something in the article about the arcsinh function having a special mathematical property somehow, but I don't think it has a special property in this regard.

It's not intended as a criticism of APP because all the packages I know of behave in a similar way when given a linear stacked image.

None taken ;-), I do know how other applications perform in this sense. I clearly see a big difference with your color-preserving stretch, so I will be happy to work on color-preserving stretching in APP soon as well 😉

Mabula


   
ReplyQuote
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 
Posted by: Rudy Pohl

...once I have integrated my subs in APP, the resultant stacked image is quite bland and lifeless in colour with very little colour data present to work with.

 

No problem at all Rudy, indeed, Mark has presented us with a very good example of the issue at hand 😉 and I am convinced that I need to work on this in APP.

APP users have made fantastic images using DSLR data, with very good and plenty of color indeed. But I think the main reason for this is that with more integration time in your stack, the colors become better and more saturated. This color-preserving stretch does seem to give better results for the same integration time, which really is a huge gain 😉

Mabula

 

 


   
ReplyQuote
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 
Posted by: Mark Shelley

I have another test example. It is a totally synthetic test image that I generated a few years ago in order to test various processing workflows for their colour fidelity. [...] The challenge is to stretch the image so that all the "stars" can be seen and so that the RGB ratios are correct in each star except for where "sensor" saturation occurred.

Excellent, thank you once more Mark 😉

I have downloaded this tif image as well, and I will probably use it for testing too (I won't forget your help in this regard 😉).

Mabula


   
ReplyQuote
(@sharkmelley)
White Dwarf
Joined: 7 years ago
Posts: 15
 
Posted by: Mabula Haverkamp

Excellent Mark, thank you for the TIFF image. I tried it myself and I agree: it's not trivial to get the same results by boosting saturation in APP. The difference is quite remarkable, to say the least.

I read the Lupton article yesterday. The color-preserving aspect of stretching is well formulated. I do wonder why the arcsinh function specifically is used; would another type of stretch work just as well, or not? Have you tested this? Maybe I missed something in the article about the arcsinh function having a special [...]

There is nothing special about the arcsinh function, but it's a particularly useful kind of transformation because it's linear at the bottom end and logarithmic at the top end. The Digital Development Processing (DDP) function you have already implemented has properties very similar to arcsinh, so keep what you already have. You simply need the option to make it colour preserving. That ought to be straightforward: for any pixel it is just a matter of using the same multiplier on all 3 channels, R, G & B.
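To make the idea concrete, here is a minimal numpy sketch of a colour-preserving stretch along these lines: compute the boost once per pixel from luminance, then apply that one multiplier to all three channels. The function and its `scale` parameter are illustrative, not APP's implementation, and any monotonic curve (arcsinh, DDP, log) can be slotted in via `stretch`, which is also why arcsinh has no special status here:

```python
import numpy as np

def color_preserving_stretch(rgb, stretch=np.arcsinh, scale=500.0):
    """Stretch a linear (H, W, 3) image while keeping R:G:B ratios.

    The boost is computed per pixel from luminance only, then the SAME
    multiplier is applied to R, G and B.  arcsinh is the default curve
    because it is linear near zero and logarithmic for large values,
    but any monotonic stretch works.
    """
    lum = rgb.mean(axis=-1, keepdims=True)            # per-pixel luminance
    boosted = stretch(scale * lum) / stretch(scale)   # stretched luminance in [0, 1]
    factor = np.where(lum > 0, boosted / lum, 0.0)    # one multiplier per pixel
    return np.clip(rgb * factor, 0.0, 1.0)            # same factor on all 3 channels

# A faint red star pixel: R/G = 4 before the stretch...
faint = np.array([[[0.02, 0.005, 0.005]]])
out = color_preserving_stretch(faint)
# ...and R/G is still 4 afterwards, just much brighter.
```

Note that the final clip at 1.0 can still break the ratio for the very brightest pixels, which ties in with the next point about desaturating the brightest stars.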

It is possible you may also want the ability to reduce saturation - especially for the brightest stars.  Too much colour sometimes appears a bit unnatural in the brightest objects.

Mark


   
ReplyQuote
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

Hi Mark,

The Digital Development Processing (DDP) function you have already implemented has properties very similar to arcsinh, so keep what you already have.

Indeed, my thoughts exactly. I will need to adjust my stretching algorithms to include the color-preserving aspect. Thanks again.

It is possible you may also want the ability to reduce saturation - especially for the brightest stars.  Too much colour sometimes appears a bit unnatural in the brightest objects.

Yes, I already have protection for high saturation in the saturation algorithms, but I'll check whether it needs adjusting after implementing the color-preserving aspect of stretching 😉

Mabula

 


   
ReplyQuote
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

Upgraded to a Sticky 😉


   
ReplyQuote
(@gregwrca)
Black Hole
Joined: 7 years ago
Posts: 227
 

Good stuff


   
Mabula-Admin reacted
ReplyQuote
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

😉 It will soon be implemented! Probably in 1.057.

Tomorrow I'll probably release 1.056...


   
ReplyQuote
(@singding)
Neutron Star
Joined: 6 years ago
Posts: 72
 

Hey, thanks! I feel I am able to get decent results with RGB combine only, but when I do star calibration I never get good results. I've tried making the image look decent in RGB and then taking it to star calibration, with no good results. Then I've tried simply doing the RGB combine and leaving it all at 1x weights, still without great results. Just looking for some help, Mabula, and thank you so much!


   
ReplyQuote
(@sterrenwacht-altair)
Brown Dwarf
Joined: 7 years ago
Posts: 5
 

Hi Mabula

Newbie here, Beatrice Heinze. I experience the same problem as Mark Shelley and Rudy with my final integrated stack. I stacked and pre-processed earlier in DSS, and post-processed in APP 1.067 before yesterday. 

My DSS-stacked image of the Western Veil Nebula went from an image with vivid colours to a green one. The final integrated stack turned out bleached. The problem is, I like the background of the green one, because it's darker, but I want the vivid colours back from my DSS image.

I tried the solution you posted in this link -> https://www.astropixelprocessor.com/community/main-forum/color-issues-images-look-unbalanced-and-green/ ... but without success.

Could you please help me with this, Mabula?

 

(I wanted to add my pictures here to show you as an example, but I don't know how.)

Btw: APP is a great piece of software, as far as I can tell as a newbie :-).

And thanks for solving the problem with "analyzing stars". I had problems with it too.

 

Best regards

Beatrice

 


   
ReplyQuote
(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

Hi Beatrice and welcome to APP!

So you mention using DSS to calibrate the data and then using that result in APP. I highly suggest doing the entire process in APP. DSS is known not to be the greatest when it comes to proper calibration, and that might be why APP looks weird, as it tries to run its normal statistics on what it expects to be properly calibrated data. Might be worth a shot.


   
ReplyQuote
(@sterrenwacht-altair)
Brown Dwarf
Joined: 7 years ago
Posts: 5
 

Hi Vincent!

Thank you, and thanks for your answer! OK, I will do the whole process again in APP. I'm sure my image will look much better after stacking & processing in APP. But I thought that I could correct my image in one way or another.

 

 

 


   
ReplyQuote
(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

Did you try things like a star color calibration and playing around a bit with the color tools? If that doesn't help at all, it might still be the data itself.


   
ReplyQuote
(@b4silio)
Main Sequence Star
Joined: 6 years ago
Posts: 27
 

Since this is a sticky from a couple of years back by now, I was just wondering if the color-preserving stretching was indeed integrated into APP?


   
ReplyQuote
(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

Since then I haven't seen any complaints regarding this so I'm assuming it became better. 🙂


   
ReplyQuote
(@chrisl)
Main Sequence Star
Joined: 4 years ago
Posts: 12
 

Hello,

I experienced better colour data in the stack when NOT using LNC, when unticking "local normalization rejection" in (6), and when NOT ticking neutralize background in (5).

To me it seems that these three options reduce/flatten the colours in your stack. I guess any software has a hard time distinguishing between wanted signal and unwanted noise/light pollution.

And: I do not stretch my data in APP, but in Photoshop (I only use APP's light pollution correction, background calibration and star calibration).

Chris


   
Kullumm reacted
ReplyQuote
(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

LNC normalizes the background very nicely for all stacks, resulting in a much more even frame. Neutralize background in (5) can actually influence nebulosity, but you can fully restore that when calibrating the background in the (9) tools menu. On the right-hand side there are presets for stretching and, lower down, for colour saturation, where you can saturate while protecting the background.

A more detailed explanation of the right hand side can be seen when clicking the red question mark.


   
Kullumm reacted
ReplyQuote
(@rfmarshall)
Brown Dwarf
Joined: 4 years ago
Posts: 6
 

Here is a link to a post I made on FB; I need help with the same issue.


   
ReplyQuote
(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

I can't access that group as it's private, can you post the screenshot here?


   
ReplyQuote
(@rfmarshall)
Brown Dwarf
Joined: 4 years ago
Posts: 6
 

@vincent-mod


 


   
ReplyQuote
(@rfmarshall)
Brown Dwarf
Joined: 4 years ago
Posts: 6
 

I am new to APP and it is probably something I am doing wrong, but I thought I would post here to see. The image was taken with a Canon 60Da.


   
ReplyQuote
(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

So you left everything at the default settings, and the data should be from a colour camera, I assume? If so, and you do see an RGB spectrum (top right: red, green and blue graph), you may need to add saturation. On the right side of APP you have a switch to turn on saturation. Under that there are two sliders called "Sat." and "Sat. Th.". Sat. is the amount of saturation (moving it to the right adds more); Sat. Th. is the lower threshold, protecting the background noise from being saturated.
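For reference, the effect of those two sliders can be sketched roughly like this. This is a guess at the underlying idea, not APP's actual algorithm; only the slider names come from the UI:

```python
import numpy as np

def boost_saturation(rgb, amount=1.5, threshold=0.05):
    """Rough sketch of a saturation boost with a lower threshold, in the
    spirit of APP's "Sat." (amount) and "Sat. Th." (threshold) sliders.

    Pixels whose luminance falls below `threshold` are left untouched,
    so background noise is not saturated along with the signal.
    """
    lum = rgb.mean(axis=-1, keepdims=True)
    # Push each channel away from the per-pixel luminance to add colour.
    saturated = lum + amount * (rgb - lum)
    protect = lum >= threshold          # only saturate above the threshold
    out = np.where(protect, saturated, rgb)
    return np.clip(out, 0.0, 1.0)

# One bright reddish pixel (gets more saturated) and one dim
# background pixel (left alone by the threshold).
rgb = np.array([[[0.4, 0.2, 0.2], [0.01, 0.012, 0.011]]])
out = boost_saturation(rgb)
```

The bright pixel's red channel is pushed further from its luminance while the faint background pixel passes through unchanged, which is the behaviour the Sat. Th. slider is there to give you.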


   
ReplyQuote
(@rfmarshall)
Brown Dwarf
Joined: 4 years ago
Posts: 6
 

@vincent-mod, I really appreciate the help. Yes, it is from a modified DSLR, a Canon 60Da. I did check the saturation box and played with the sliders. I will give it another try this evening, double-check everything, and let you know how it goes. Thanks.


   
ReplyQuote
(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

No problem. Otherwise I'll have a look at processing that data as well, to maybe point you to a few things.


   
Mabula-Admin reacted
ReplyQuote
Page 1 / 2