Light Pollution Removal Tool Question
My location in UK suffers from significant light pollution, frequently exacerbated by mist in the atmosphere from some nearby lakes. Consequently I get complex gradients in my images, particularly when using my OSC camera (ASI 294 Pro).
Lately, while correcting LP, I have attempted to use the Correction Model display as a guide to getting the best result - as a general rule, should the Correction Model ideally show an overall white background, or is that an over-simplification of the function of the tool?
From my experience solely as a user of APP, I think the answer to your question is a definite no. I think the LPC Correction Model is a representation of the mask that APP's LPC tool has calculated needs to be applied to (added to or subtracted from?) the linear image values in order to produce an LP-neutralised image when the underlying image file is displayed to the user in the APP image viewer.
I think the Correction Model would only show as a uniform white if it were generated before any LPC had actually been applied to an astro image. And possibly also if LPC were attempted on a lights file from which all light was excluded at the capture stage, and which was thus by definition (ignoring potential sensor artifacts) perfectly free of LP!
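To make that concrete, here is a toy numpy sketch (my own illustration of how an additive background model behaves in general, not APP's actual code; all values are made up). A perfectly uniform, "all white" model subtracts the same constant everywhere, so it only shifts the pedestal, whereas a model that tracks the gradient actually flattens the background:

```python
import numpy as np

# Toy linear frame with a left-to-right light-pollution gradient.
# (Purely illustrative values; nothing here is APP's real algorithm.)
light = np.array([[10.0, 11.0, 12.0],
                  [10.0, 11.0, 12.0]])

# A uniform ("all white") correction model: subtracting it only shifts
# the pedestal - the gradient survives untouched.
uniform_model = np.full_like(light, 5.0)
still_tilted = light - uniform_model

# A model that tracks the gradient flattens the background completely.
gradient_model = light - light.min()
flattened = light - gradient_model
```

So a uniform-white model means "no spatially varying correction was needed (or found)", which is why you would only expect to see it on data that has no gradient in the first place.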
I would be interested in other users' / Mabula's views, but I have not found the Correction Model to be of real practical value during LPC; I am probably overlooking something, though.
I hope I am offering a reasonably well founded opinion.
Best of luck
PS: I image in LRGB and find it easier though more time consuming to do LPC on the layers individually rather than post RGB Combine.
Many thanks for your thoughts, which pretty much concur with my findings. I believe that some of the gradient issues with my ASI294 are due to the purple/green colour artefacts that the sensor of that camera generates (well documented on other forums), hence they cannot be classed as 'normal' light pollution. I find that I can quickly remove most LP from an image with 15-20 regularly spaced boxes, then can spend upwards of an hour trying to remove the remainder. To illustrate my issue, see below a couple of screenshots - first the uncorrected image, then the result after a first pass of the LPC tool, which reveals the gradients that prove so difficult to remove. All Light Frames are calibrated correctly (I have tried Flats at 15k, 20k, 25k and 30k ADU with 60+ Darks and Dark Flats). The views shown are heavily stretched and saturated in APP, as recommended:
One thing I have found, but cannot explain, is that if I integrate with Bayer Drizzle and a small droplet size (0.5), the odd gradients are easier to remove (?). I am also going to try generating artificial flats to see if this makes the green/purple colours easier to remove, however I think that is a rather heavy-handed approach to the problem. I also have an ASI1600MMPro and while I still get plenty of LP in LRGB images, it is much easier to remove than with my OSC images.
I can, of course just darken the background until the colours are gone, but this clips a lot of dark detail and blackens the sky a bit too much 🙂
Perhaps I misunderstand, but if the sensor leaves purple/green colour artefacts then shouldn't flats correct that?
Clear skies, Wouter
Strangely, they don't. I have tried making flats at the same exposure as the lights (i.e. reducing illumination to get correct Flat density at 1 or 2 minute exposures) and the flats show no colour aberrations at all.
One of the proposed theories regarding the sensor is that the cooling is uneven across the chip, however my experiments in this regard have been inconclusive (cooling the camera only to ambient temperature, to reduce delta T between the sensor and surroundings, etc). There are a number of discussions on the Cloudy Nights Forum and of course the ZWO Forum for reference.
It is not an insurmountable problem, but extremely time-consuming to correct, which is a great shame because the camera is extremely sensitive and an otherwise excellent tool. If nothing else, it excels as an EAA device.
I don't use a ZWO camera so am not familiar with its quirks.
Looking at the placement of your select boxes compared with what I think I normally do when I use the LPC tool, I observe that:
a) I generally use rather more boxes, e.g. 30-50 smaller ones, than you have used - I agree these can be rather a pain to draw;
b) Some of your boxes straddle the transitions between, e.g., your red and green LP areas. I would place my boxes on either side of the transition, following its contour line. So with reference to your example image, I would have a ring of boxes following the green area of the image and then boxes on either side of it in the red/purple areas.
c) I think it helps to place an initial set of boxes in areas where the light pollution looks the strongest and where you expect the background to be reasonably neutral. This seems to give an initial coarse correction that can then be refined by the addition of further small boxes in areas where further refinement is deemed necessary.
I haven't a clue about the mathematics behind light pollution correction, but my experience is that placement of the boxes is more art than science.
I hope these observations prove of some value.
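For what it's worth, the reasoning behind box placement can be sketched generically (this is a plain least-squares plane fit on a synthetic frame, my own illustration and not APP's actual model): boxes that bracket the object on all sides, placed on clean background, let the fit recover the gradient exactly, while the object itself is never sampled.

```python
import numpy as np

# Synthetic frame: a linear LP gradient plus a bright "cluster" region.
h, w = 100, 100
yy, xx = np.mgrid[0:h, 0:w].astype(float)
light = 0.10 + 0.002 * xx + 0.001 * yy
light[40:60, 40:60] += 0.5  # the object we must not sample

# Background sample boxes (y0, y1, x0, x1), bracketing the object on all
# sides rather than straddling any bright/faint transition.
boxes = [(0, 10, 0, 10), (0, 10, 90, 100), (90, 100, 0, 10),
         (90, 100, 90, 100), (45, 55, 0, 10), (45, 55, 90, 100),
         (0, 10, 45, 55), (90, 100, 45, 55)]

cx, cy, vals = [], [], []
for y0, y1, x0, x1 in boxes:
    cy.append((y0 + y1) / 2.0)
    cx.append((x0 + x1) / 2.0)
    vals.append(np.median(light[y0:y1, x0:x1]))  # median rejects outliers

# Least-squares plane fit through the box medians, then subtract.
A = np.column_stack([np.ones(len(cx)), cx, cy])
coef, *_ = np.linalg.lstsq(A, np.array(vals), rcond=None)
model = coef[0] + coef[1] * xx + coef[2] * yy
corrected = light - model
```

In this toy case the background comes out perfectly flat and the object keeps its full brightness above it. It also shows why one badly placed box hurts so much: a single sample sitting on nebulosity or a star halo pulls the whole fitted surface.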
Thanks for sharing that - yes, I usually start with large boxes as in my example above, then add smaller ones in the unwanted coloured areas, removing the red boxes as I go. I typically end up with over 100 boxes to get an optimal result, but find that one poorly placed sample can totally change the overall image for no obvious reason, hence my question about using the model display as a guide. I find that if I select a small area enclosing a bright star, it usually results in a very dark area around the star. In the M35 image above, I can make the background areas around the two main clusters go almost completely black, but can never replicate that dark background across the entire image. I basically aim for an overall even colour - if it is a bit purple or a bit green, I can remove that with HSL, or do it subsequently in Photoshop.
I will certainly try more accurate 'bracketing' of the problem areas - that makes a lot of sense and is great advice, thanks.
I am glad that I'm not alone in thinking that box placement is more art than science (I am sure that Mabula will disagree :)) and will continue to try to improve my technique for correcting OSC images. At the end of the day, I get much better quality from my mono camera, so probably shouldn't expect too much from the OSC. The latter is, however, great for short imaging sessions (typically limited by weather here) as it can grab a great deal of data in a couple of hours or so of imaging:
This shot is a total of 134 minutes captured over 3 nights, so is a good example of the sensitivity of the ASI294 sensor.
Many thanks again for your good advice and help Mike!
Yes, this is not the regular light pollution for which the tool is optimized. I think it expects a gradient of some sort to work optimally, and the signal here isn't a gradient. I haven't seen it like this before, I have to say, but data like that is always difficult to correct. @mabula-admin knows the details for sure.
No, it is certainly a strange phenomenon, which cannot be reproduced in calibration frames. It can be minimised sufficiently to yield decent images (as M45 above shows), but typically requires in excess of 200 selection boxes to achieve.
I think that artificial flats are the way to go to reduce this effectively, so I am experimenting with star removal and frame subtraction in other programs.
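For anyone curious what the artificial-flat route looks like in outline, here is a minimal numpy sketch. Everything is synthetic and simplified: the "starless" frame and the blotch pattern are invented for illustration, and a crude block-median downsample stands in for the smooth large-kernel blur a real workflow would use after star removal.

```python
import numpy as np

# Synthetic star-removed integration with a smooth multiplicative blotch,
# standing in for the green/purple pattern (all values are made up).
h, w = 128, 128
yy, xx = np.mgrid[0:h, 0:w].astype(float)
blotch = 1.0 + 0.2 * np.sin(xx / 40.0) * np.cos(yy / 40.0)
starless = 100.0 * blotch

# Artificial flat: block-median the starless frame so only the
# large-scale pattern survives, expand back to full size, and
# normalise to unity gain.
bs = 16
coarse = np.median(starless.reshape(h // bs, bs, w // bs, bs),
                   axis=(1, 3))
flat = np.kron(coarse, np.ones((bs, bs)))
flat /= flat.mean()

corrected = starless / flat  # divide, as with a normal flat frame
```

The division is the same operation as ordinary flat-field calibration; the only difference is that the "flat" is derived from the data itself, which is exactly why it can catch sensor patterns that real flats, strangely, do not show.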