New Camera, New Issues
I recently purchased a new dedicated Mono Astro camera, QHY174M and I am struggling to fathom what is going wrong when I am collecting H-alpha data & processing through APP.
OK, last night I set up on target NGC6823, a star cluster with some nebulosity.
I am imaging using my Celestron CPC925 via a f/6.3 Focal Reducer with SX OAG & SX USB Filter Wheel.
Using a Baader H-alpha filter 2".
Opening up SharpCap Pro and with my camera cooled to -20°C I chose the Histogram > Brain and started a measurement to give me the Exposure Time, Gain & Offset figures for Max Dynamic Range.
The results were 474s at Gain 80 & Offset 25 which I then used to image the target through Astro Photography Tool.
I did not Dither as for some reason lately, APT seems to get stuck occasionally on Dither Settle.
In the end I captured 20 Light subs, 20 Dark (same settings, exposure length & temperature as Lights), 25 Flats (same settings as Lights but exposure length of 3.25s to get mid histogram) and 25 Dark Flats (matching settings and exposure length of Flats).
*** Looks Like An Issue With This Forum MAX File Size For Attachments Is 30B ***
*** Links Below To DropBox Files ***
Straight away you can see that Calibration has all but destroyed any detail that was in the linear frame. Granted, my subs are not great, with amp-glow etc., but the Master Dark does show that this should be subtracted out.
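For anyone following along, the textbook calibration arithmetic (which APP implements, with further refinements of its own) is: subtract a matched master dark, then divide by the master flat normalized to its mean. A minimal NumPy sketch with made-up toy numbers, not APP's actual code:

```python
import numpy as np

# Toy 8x8 frames with invented values; real data would come from FITS files.
h, w = 8, 8
vignette = np.ones((h, w))
vignette[:, :4] = 0.9                       # fake vignetting on one half
true_signal = np.full((h, w), 1000.0)       # what we want to recover (ADU)
dark_level = 200.0                          # dark current + offset

light = true_signal * vignette + dark_level     # what the sensor records
master_dark = np.full((h, w), dark_level)       # matched exposure/gain/temp
master_flat = 30000.0 * vignette                # i.e. flat minus dark flat

# Subtract the dark, divide by the flat normalized to its mean.
calibrated = (light - master_dark) / (master_flat / master_flat.mean())

# The vignetting is gone: every pixel ends up at the same value.
print(calibrated.std())
```

If the master dark does not actually match the lights (wrong USB speed, different amp-glow), the subtraction step leaves a residual pattern behind, which is the failure mode being discussed here.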
I just seem to be losing a huge percentage of the detail from my subs, and the final Integration is totally awful: stars look terrible and there is way too much noise.
Any ideas folks?
While I don't have enough experience to just look at the images and say "there's your problem," I can suggest a technique developers use when debugging: begin with a minimal set of inputs, then add one at a time to see how each affects the outcome. So start with a fresh APP and give it just your lights with no calibration frames, use default settings, and take a screenshot of the final integration. Close APP, start it fresh, and this time add your darks alongside your lights, again with default settings, and take a screenshot. Each new calibration input you add *should* improve the final image.
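To make the procedure concrete, here is a trivial sketch (plain Python, nothing APP-specific) that just enumerates the fresh-session runs described above:

```python
# Each run is one fresh APP session; add one calibration input per run
# and compare the resulting integrations side by side.
calibration_inputs = ["darks", "flats", "dark flats"]

runs = [["lights"]]
for extra in calibration_inputs:
    runs.append(runs[-1] + [extra])

for i, inputs in enumerate(runs, start=1):
    print(f"run {i}: load {' + '.join(inputs)}, integrate, screenshot")
```

The first run whose integration gets *worse* than the previous one points at the calibration input that is at fault.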
I tried the “debugging” procedure as detailed above, thanks for the suggestion.
It would seem the issue is in my Darks.
I did have an issue last night with APT just before capturing my Darks, where it froze & I had to shut it down via Task Manager.
On restarting APT the images were taking an age to download from my camera & I kept on stopping the plan, altering the USB Speed setting and trying again.
This happened a few times until either a USB Speed setting I chose worked or APT simply started to play ball, and my Darks were captured.
After some searching I found information that varying USB Speeds can alter the amount of amp-glow in Dark frames.
I may have some clear sky tonight to try to see if indeed this could be the case.
I’ll post back any findings.
Well, the weather did not play ball last night, so instead I have spent a couple of hours playing around with various settings inside APP.
I had already come to the conclusion that my Darks were at fault (see previous post), although I thought it may have been down to a change in the USB Speed setting between capturing the Lights and the Darks.
Through trial and error I found that all I needed to do was tick the box next to "adaptative pedestal/reduce Amp-Glow" and hey presto a half decent result:
again, this is only 20 x 474s Integration with no post processing done and yes my focus is a little off.
I had skipped past this setting as my QHY camera boasts some built-in anti amp-glow technology 😉, which probably works better on shorter exposures. My Darks did also show the amp-glow, which I assumed would be all that was needed to remove it from the Lights during Calibration.
I think it's more to do with the Black Point: prior to ticking this option, on my first data run I was left with a Black Point of zero.
So an interesting lesson learnt, once again APP saved the day as I was about to ditch this data and start over.
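A small sketch of why a pedestal rescues a clipped black point. If the matched dark is as bright as the faint H-alpha background, naive subtraction pushes roughly half the background pixels to (or below) zero, and clipping them destroys the noise statistics; adding a constant pedestal first keeps the whole distribution intact. Toy numbers only, and a plain constant offset rather than APP's adaptive algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)

# Faint sub: background sits right at the dark level, plus noise (ADU).
light = 500 + rng.normal(0, 20, 100_000)
master_dark = 500.0

# Naive subtraction then clipping at zero, as unsigned image data would.
clipped = np.clip(light - master_dark, 0, None)

# The same subtraction with a constant pedestal added first.
pedestal = 100.0
with_pedestal = light - master_dark + pedestal

# Roughly half the background pixels are destroyed without the pedestal.
frac_clipped = (clipped == 0).mean()
```

With the pedestal in place, the background noise distribution survives intact and the black point sits safely above zero.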
Question, is it safe to check this option as a matter of course for my particular camera, i.e. will it cause detrimental effects to images with a lot less amp-glow?
It shouldn't; as stated in the tooltip, it doesn't affect subs with no glow.
Well my happiness was short-lived.
I had lovely clear skies last night and decided to capture more NGC6823 H-alpha data.
I made only two changes: I lengthened the exposure time from the previous 474s to 480s to make calculation easier, and I rotated my camera through 90° for better framing.
New Darks, Flats & Dark Flats were taken to match this new data.
I loaded everything in to APP, made sure to tick the "adaptative pedestal/reduce Amp-Glow" box and hit Calibrate.
This is a single Linear RAW frame:
Now this same frame but Calibrated:
I am at a total loss as to why the simple fix to my earlier data does not have any effect on this new data. I have also tried Calibrating this data without the "adaptative pedestal/reduce Amp-Glow" box ticked; this is the result for that same frame when Calibrated:
You can see that the image is still very dark, like the last image attached to this post, but it does not show any top or bottom amp-glow.
Could it be as simple as the camera being at 90° causing some issue with the 'adaptative pedestal' function? It seems unlikely, but aside from increasing my exposure by 6s, that is the only difference between this and the last set of data, and as I have stated, I took all fresh Calibration frames to match the new exposure time and camera orientation.
As a second point, I did try to Integrate this data with my previous data using the 'Multi-session' option, and made sure that the session 1 data was correctly loaded along with the session 2 data. At the Calibration step, APP gave me a warning that the session 2 Flats could not be Calibrated because there was no matching MasterDarkFlat or MasterBias present:
Individually I can Calibrate both sets of data without getting this error, as all matching Lights & Calibration frames are present.
Here is a screenshot of the APP log from the time of the error message (09:30:29) up to the time I hit the OK button, in case this helps:
I have just been having a much closer look at my data in order to shed more light on the matter and noticed something quite odd.
When I was taking Flats for the first set of data I used APT's Flats Aid tool and got an exposure length of 3.25s at my chosen Gain & Offset & I use a Flat Panel in order to get consistency.
With my second set of data, I just used the same exposure length as I was using the same Gain & Offset as previously and this was saved as a plan in APT.
Looking at the MasterFlat for my first data, the histogram in APP is almost dead centre but the histogram for the MasterFlat of my second data set is only 1/3 of the way from the left.
Now, I don't think that ordinarily this should matter too much, but it is leading me down the path of there being something different in the way the camera, capture software or APP is interacting with the data when the camera is in this new orientation.
Tonight I may experiment by taking some fresh Flats & Dark Flats, but this time using the Flats Aid tool, to see if there is a marked increase in the exposure time given and how these new F's & DF's affect my data.
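One quick, reproducible way to compare the two master flats is to express each one's median as a fraction of the ADU range, rather than eyeballing the histogram. A sketch with invented values standing in for the two sessions (assuming 16-bit output files; adjust `max_adu` to suit):

```python
import numpy as np

def histogram_position(frame, max_adu=65535):
    """Median signal as a fraction of the full ADU range: 0.5 = dead centre."""
    return float(np.median(frame)) / max_adu

# Invented stand-ins for the two master flats described above.
flat_session1 = np.full((4, 4), 32000.0)   # near mid-histogram
flat_session2 = np.full((4, 4), 21000.0)   # about 1/3 from the left
```

If the real frames show a similar gap at identical exposure, gain and offset, something upstream (panel brightness, filter position, or capture settings) genuinely changed between sessions.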
I am still working through this whenever the weather allows 😉
I did come across this thread: https://www.astropixelprocessor.com/community/main-forum/adding-pedestal-fro-lights-calibration/#post-3524
where the OP is having a similar issue to me and in the thread it is suggested to Add Data in the Batch Modify section.
I have tested this on my data starting by adding 100 and then increments of 100 up to 2400 where I stopped as I could not see any further improvement.
This did sort of clean things up a bit but I think that I have just over-cooked my Lights for the Gain setting used.
My next step is to reduce exposure time at this Gain setting to a point where amp-glow can be completely cleaned up but that the target is visible in the captured Lights, failing this I'll be dialing back my Gain and tweaking exposure length to achieve the same.
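The manual add-100-at-a-time search described above can also be scripted. A toy sketch (synthetic background values, not real frames) that sweeps pedestals in steps of 100 and stops at the first one leaving essentially no clipped pixels:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic calibrated background that dips well below zero (ADU).
residual = rng.normal(-150, 60, 100_000)

# Sweep pedestal values in steps of 100, as in the post, and keep the
# smallest one that leaves fewer than 0.1% of pixels clipped at zero.
for pedestal in range(0, 2500, 100):
    clipped_frac = ((residual + pedestal) <= 0).mean()
    if clipped_frac < 0.001:
        break

print(pedestal, clipped_frac)
```

The same idea applied to a real calibrated sub would show whether 2400 was genuinely needed or whether a much smaller pedestal already stops the clipping.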
Looking at the data frames, they do seem to have a lot of noise. Trying to get that dialled down by playing with the gain might help. Are you using gain values that are known to be typical for this type of camera?
There were no typical Gain settings set up within the camera driver, and there was nothing much I could find online.
For the moment I have been using the Histogram Brain feature in SharpCap Pro which has given me a starting point.
OK, yes, maybe doing a test run in which you take 3 gain values relatively far apart might give you a good idea as well, and then honing in on the value that gives the best processing. It will probably be a balance between signal and noise.
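For that test run, the classic CCD/CMOS SNR equation gives a way to compare gain settings on paper before burning clear-sky time. All the numbers below are invented placeholders; on most CMOS cameras higher gain lowers read noise (at the cost of full-well depth), which this toy comparison reflects:

```python
import math

def snr(signal_e, sky_e, dark_e, read_noise_e):
    """Classic SNR formula, all quantities in electrons per sub."""
    return signal_e / math.sqrt(signal_e + sky_e + dark_e + read_noise_e ** 2)

# Invented read-noise values for three gains, relatively far apart.
gains = {0: 7.0, 100: 4.0, 200: 2.5}   # gain setting -> read noise (e-)

for gain, read_noise in gains.items():
    print(gain, round(snr(1000, 200, 50, read_noise), 2))
```

For faint narrowband targets the signal term is small, so the read-noise term matters more and higher gain tends to win, which is consistent with honing in experimentally as suggested.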