My normal workflow for astro images from my Nikon cameras over the last year has been to convert the raw NEF files to 16-bit TIFF files in Photoshop and then stack them in Deep Sky Stacker using a carefully selected set of parameters. The results have always been pretty good, especially with respect to getting good, rich colour.
Unfortunately, I am not able to reproduce these colour results in APP, and I was wondering why my images always end up fairly dull with little colour. The two things I have not yet tried are the "Force CFA" parameter and the "White Balance" parameter found in the RAW/FITS panel.
Should I be trying these? What do they do? Should I use one or the other or both?
The Force CFA option is not needed for your NEF files; this setting is only needed if APP doesn't automatically recognize that the data is monochrome CFA data that needs debayering. In the case of your NEF files, APP knows that these need debayering 😉 and it also automatically knows the CFA pattern (RGGB, GBRG, etc.).
You can try the White Balance option to check if that helps you. This setting will use the camera white balance stored in the metadata of your NEF frames. Technically, though, this setting only makes sense if the NEF is processed like a normal photograph, so this is directly related to your other question:
Getting enough color out of my NEF frames never seems to be a problem, but I think this is a somewhat relative statement. It would be most helpful if you could show the differences in colors between your old workflow and what you achieve with APP. Maybe share two integrations, one from DSS using your old workflow and one from APP using APP's workflow?
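Coming back to the White Balance setting for a moment: conceptually, applying the stored camera white balance just means scaling the channels by the multipliers from the metadata. Here is a minimal sketch of the idea (the function name and the example multipliers are purely illustrative, not APP's code):

```python
import numpy as np

def apply_white_balance(img, r_mul, b_mul, g_mul=1.0):
    """Scale each channel of a linear, debayered RGB image by its
    white-balance multiplier; green is usually the reference channel."""
    wb = np.array([r_mul, g_mul, b_mul])
    return img * wb  # broadcasts over the last (channel) axis

# Example with made-up daylight-ish multipliers for a DSLR:
# balanced = apply_white_balance(linear_rgb, r_mul=2.1, b_mul=1.6)
```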
I think I know what Rudy means. Here's a link to a TIFF file of one of my images which you can download and use as an example. It was taken using a modified Sony A7S on a Tak Epsilon 180.
It is linear unstretched data and has already been background subtracted and white balanced:
If I perform a colour preserving stretch on it, which is a single quick operation, this is the result:
Contrast and saturation can then be adjusted to improve the appearance:
In APP, without adjusting Black or White levels or Gamma, if I let APP perform an auto-stretch this is the result:
The stretch is perfect but the colours are bleached. Now it becomes a huge job to attempt to get the colour back into the image using the saturation controls, but I can never get it to look like the colour preserving original. I'm probably approaching the problem in the wrong way 🙂
It's not intended as a criticism of APP because all the packages I know of behave in a similar way when given a linear stacked image.
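To show concretely why this happens, here's a toy calculation (my own illustration, not code from any of these packages): stretching each channel with its own curve compresses the R:G:B ratios towards 1, i.e. towards white, whereas applying one shared multiplier leaves the ratios untouched.

```python
import numpy as np

pixel = np.array([0.40, 0.10, 0.05])           # a reddish linear R, G, B pixel

def stretch(x, beta=100.0):
    """A simple arcsinh-style stretch curve mapping [0, 1] -> [0, 1]."""
    return np.arcsinh(beta * x) / np.arcsinh(beta)

per_channel = stretch(pixel)                   # each channel stretched independently
scale = stretch(pixel.mean()) / pixel.mean()   # one multiplier from the mean level
common = pixel * scale                         # same factor on R, G and B

print(pixel / pixel[1])              # original ratios:     [4.0, 1.0, 0.5]
print(per_channel / per_channel[1])  # roughly [1.46, 1.0, 0.77] -> washed out
print(common / common[1])            # [4.0, 1.0, 0.5] -> colour preserved
# (a real implementation would also clip `common` back into [0, 1])
```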
You're welcome to use the TIFF file any way you wish as long as you credit me with the original data.
I'm sorry that I am not very available today because of family commitments, but tomorrow I will search through the dozens of test images (NEFs and TIFFs) that I have made recently: images integrated in DSS, APP and PixInsight, processed with and without Roger Clark's rnc-color-stretch algorithm, and finished in Photoshop. My hard drive is a bit of a shambles right now and I have to sort it out first so I can be sure that I'm posting the correct images for you.
However, in the meantime Mark has posted some images above and has described my experiences quite well; I would only be able to say essentially the same thing, which is: once I have integrated my subs in APP, the resultant stacked image is quite bland and lifeless in colour, with very little colour data present to work with. Take heart though, because I found exactly the same result with PixInsight, and also with DSS when I stacked the raw NEF files instead of stacking TIFFs after a raw conversion in Adobe Camera Raw.
I really don't understand why my results are what they are, because I have seen beautiful colour images produced by APP; one fellow in our forum recently posted some wonderfully colourful astro images he did last week in APP, mind you, he did use an astro-modded Nikon D5300a camera... Anyway, thanks for taking the time to investigate, I'm sure we'll get to the bottom of this.
More later... Best regards,
Rudy
I have another test example. It is a totally synthetic test image that I generated a few years ago in order to test various processing workflows for their colour fidelity. You can download it here:
Sets of "stars" with constant RGB ratios, except where the "sensor" begins to saturate
White balance that needs correcting (using white reference "stars" in the top row)
When the image is opened it will look like this:
After processing it should ideally look something like this:
The challenge is to stretch the image so that all the "stars" can be seen and so that the RGB ratios are correct in each star except for where "sensor" saturation occurred. It's deliberately quite a demanding challenge which can reveal potential issues in a processing workflow.
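If anyone wants to score a workflow against it numerically, one possible check (illustrative Python of my own; the mask and ratio arguments are not part of the test file itself) is to measure the R/G and B/G ratios inside each star and compare them with the known input ratios:

```python
import numpy as np

def rgb_ratio_error(processed, star_mask, true_rg, true_bg):
    """Relative error of the R/G and B/G ratios inside one star's mask.

    processed : float RGB image after the workflow under test
    star_mask : boolean array selecting the pixels of one "star"
    true_rg, true_bg : the known input R/G and B/G ratios for that star
    """
    r, g, b = (processed[..., c][star_mask].mean() for c in range(3))
    return abs(r / g - true_rg) / true_rg, abs(b / g - true_bg) / true_bg
```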
Excellent Mark, thank you for the TIFF image. I tried it myself and I agree, it's not trivial to get the same results by boosting saturation in APP. The difference is quite remarkable, to say the least.
I read the Lupton article yesterday. The color preserving aspect of stretching is well formulated. I do wonder why specifically the arcsinh function is used; another type of stretch would work as well, or not? Have you tested this? Maybe I missed something in the article about the arcsinh function having a special mathematical property somehow, but I don't think it has a special property in this regard?
It's not intended as a criticism of APP because all the packages I know of behave in a similar way when given a linear stacked image.
None taken ;-), I do know how other applications perform in this sense. I clearly see a big difference with your color preserving stretch, so I will be happy to work on color preserving stretching in APP soon as well 😉
No problem at all Rudy, indeed, Mark has presented us with a very good example of the issue at hand 😉 and I am convinced that I need to work on this in APP.
APP users have made fantastic images using DSLR data, with lots of very good color indeed. But I think the main reason for this is that the more integration time you have in your stack, the better and more saturated the colors become. This color preserving stretch does seem to give better results for the same integration time, which really is a huge gain 😉
Excellent, thank you once more Mark 😉
I have downloaded this TIFF image too and I will probably use it for testing (I won't forget your help in this regard 😉).
I do wonder why specifically the arcsinh function is used; another type of stretch would work as well, or not?
There is nothing special about the arcsinh function, but it's a particularly useful kind of transformation because it's linear at the bottom end and logarithmic at the top end. The Digital Development Processing (DDP) function you have already implemented has properties very similar to the arcsinh, so keep what you already have. You simply need the option to make it colour preserving. That ought to be straightforward: for any pixel it is just a matter of using the same multiplier on all three channels, R, G and B.
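To spell the idea out, here's a minimal sketch in Python (my own illustration of the principle; the function name and the beta parameter are placeholders, not anything from APP or DDP):

```python
import numpy as np

def color_preserving_stretch(img, beta=50.0):
    """Arcsinh-stretch a linear RGB image while preserving R:G:B ratios.

    img  : float array of shape (H, W, 3), linear data scaled to [0, 1]
    beta : stretch strength; larger values lift the faint end more
    """
    ref = np.clip(img.mean(axis=2), 1e-6, None)   # luminance-like reference
    factor = np.arcsinh(beta * ref) / (np.arcsinh(beta) * ref)
    # The SAME multiplier goes on all three channels, so the ratios survive
    return np.clip(img * factor[..., None], 0.0, 1.0)
```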
You may also want the ability to reduce saturation, especially for the brightest stars. Too much colour sometimes appears a bit unnatural in the brightest objects.
The Digital Development Processing (DDP) function you have already implemented has properties very similar to the arcsinh, so keep what you already have.
Indeed, my thoughts exactly, I will need to adjust my stretching algorithms to include the color preserving aspect. Thanks again.
You may also want the ability to reduce saturation, especially for the brightest stars. Too much colour sometimes appears a bit unnatural in the brightest objects.
Yes, I already have protection for high saturation in the saturation algorithms, but I'll check if it needs adjusting after having implemented the color preserving aspect of stretching 😉
Hey, thanks! I feel I am able to get decent results with RGB Combine only, but when I do star calibration I never get good results. I've tried making the image look decent in RGB Combine and then taking it to star calibration, with no good results. Then I've tried simply RGB Combine, leaving all the weights at 1x, still without great results. Just looking for some help, Mabula, and thank you so much!
Newbie here, Beatrice Heinze. I experience the same problem as Mark Shelley and Rudy with my final integrated stack. Until yesterday I stacked and pre-processed in DSS and post-processed in APP 1.067.
My image of the Western Veil Nebula went from the vivid colours of my DSS stack to a green one in APP, and the final integrated stack turned out bleached. The problem is, I like the background of the green one because it's darker, but I want the vivid colours back from my DSS image.
So you mention using DSS to calibrate the data and then using that result in APP. I highly suggest doing the entire process in APP. DSS is known not to be the greatest when it comes to proper calibration, and that might be why APP looks weird, as it tries to run its normal statistics on what it expects to be properly calibrated data. Might be worth a shot.
Thank you and thanks for your answer! OK, I will do the whole process again in APP. I'm sure my image will look much better after stacking & processing in APP. But I thought that I could correct my image one way or another.
Did you try things like a star color calibration and playing around a bit with the color tools? If that doesn't help at all, it might still be the data itself.
I experienced better colour data in the stack when NOT using LNC, when unticking "local normalization rejection" in (6), and when NOT ticking "neutralize background" in (5).
To me it seems that these three options reduce/flatten the colours in your stack. I guess any software has a hard time distinguishing between wanted signal and unwanted noise/light pollution.
And: I do not stretch my data in APP but in Photoshop (I only use the light pollution removal, background calibration and star calibration tools of APP).
LNC normalizes the background very nicely across all stacks, resulting in a much more even frame. Neutralize background in (5) can actually influence nebulosity, but you can fully restore that when calibrating the background in the (9) Tools menu. On the right-hand side there are presets for stretching and, lower down, for color saturation, where you can saturate while protecting the background.
A more detailed explanation of the right-hand side can be seen by clicking the red question mark.
So you left everything at the defaults, and the data should be from a color camera I assume? If so, and you do see an RGB histogram (top right: red, green and blue graphs), you may need to add saturation. On the right side of APP there is a switch to turn on saturation. Under that there are two sliders, "Sat." and "Sat. Th.": "Sat." sets the amount of saturation (moving it to the right adds more), while "Sat. Th." is the lower threshold that protects the background noise from being saturated.
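Just to illustrate the general idea behind a threshold-protected saturation boost (a conceptual sketch only, with made-up names and numbers, not APP's actual algorithm):

```python
import numpy as np

def boost_saturation(img, amount=1.3, threshold=0.05):
    """Push colours away from grey, but leave faint background pixels alone.

    img       : float RGB array in [0, 1]
    amount    : saturation multiplier (>1 adds colour, like "Sat.")
    threshold : luminance below which no boost is applied (like "Sat. Th.")
    """
    lum = img.mean(axis=2, keepdims=True)
    # Ramp the boost in smoothly above the threshold to avoid a hard edge
    weight = np.clip((lum - threshold) / max(threshold, 1e-6), 0.0, 1.0)
    effective = 1.0 + (amount - 1.0) * weight
    return np.clip(lum + (img - lum) * effective, 0.0, 1.0)
```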
@vincent-mod, I really appreciate the help. Yes, it is from a modified Canon 60Da DSLR. I did check the saturation box and played with the sliders. I will give it another try this evening, double-check everything and let you know how it goes, thanks.