Overcorrecting flats? Help me please 🙂

17 Posts
4 Users
2 Reactions
4,925 Views
(@lenart)
White Dwarf
Joined: 5 years ago
Posts: 11
Topic starter  

Hi,

I think I have a case of flats overcorrecting the lights. I am not sure what I am doing wrong. I have been using APP for ~2y but this camera is new to me. I have some frames that I prepared and can upload if that helps. Some info:

  • APP is 1.083
  • Camera is QHY 268m, Mode 1, Gain 56, Offset 30 for all frames, at the same temperature
  • Telescope is Esprit 100 with the 1x SW field flattener
  • Flats are 3 seconds long
  • The light frames are atrocious due to fog and low altitude. Please ignore that.
  • The problem also occurs with OIII/SII, but is easiest to show with the Ha
  • All images shown below are stretched with the same settings (max preset in APP)

Integration result (processed 4 lights, 4 flats, 4 darkflats, 4 darks); the center is darker than the sky background anywhere else, which seems wrong:

flatdemo Ha St

A single light:

rosette 2022 03 20 Light Ha 002 St

Master flat:

MF IG 56.0 E 3.0s QHY268M 020204 6252x4176  Ha 1 St

A single light processed through only vignette correction (without actual flat); no dark patch in the middle:

rosette 2022 03 20 Light Ha 002 vc St

 

Any ideas or suggestions would be welcome. Thank you in advance.



   
(@Anonymous 174)
Joined: 9 years ago
Posts: 5702
 

Mm, it might be that the vignetting is corrected by the flat, but that the sky conditions are so bad that you still end up with a lot of gradients? The flat does seem to show uneven vignetting, as if the light path is not in the center; could that be the case?



   
(@lenart)
White Dwarf
Joined: 5 years ago
Posts: 11
Topic starter  

Dear @vincent-mod ,

Thanks for your prompt reply.

On the sky gradients. I think the gradient that is still visible after a simple VC (last image) is mild and simple. Over an extended period I too have seen that many light frames can create all sorts of interesting patterns due to LP. But this is just 4 frames in 20 minutes, and as I said I am not too concerned by the gradient (which appears a mild first degree to me, consistent with a low altitude). So I don't think that is the source of the problem. The black patch appears after flat correction almost perfectly in the middle, circular, so I think it must be a mistake with the flat (either mine, or the camera is playing tricks on me, or something I do not even expect). However, I will try to make some pictures that do not suffer from sky gradient, just to verify.

About the light path not being in the center. That is indeed concerning. I think, despite my best effort, the filter is not exactly in the middle, and 36mm (at f5.5) is just enough to cover the sensor, so even a tiny mistake is enough to get some vignetting. However, I'd expect that a flat should be able to handle this. When I create an artificial flat with the VC tool (see below) that is also not completely centered, but it corrects the images fine (apart from the dust mote of course). And this brings me back to the flat frame being incorrect somehow.

I am not at all saying this is somehow APP's fault, I am just hoping that maybe someone has seen this problem with this QHY camera and has an idea.

Artificial flat that corrects fine:

flatdemo current vignetting model St


   
(@Anonymous 174)
Joined: 9 years ago
Posts: 5702
 

No problem, if it were an APP fault, we'd love to know. 😉 But since the artificial flat works, it's likely a slight data issue. You're also right that a flat should be able to correct that.

So do your bias or darkflats look good, and do they have the same settings as the flats?



   
(@lenart)
White Dwarf
Joined: 5 years ago
Posts: 11
Topic starter  

I have not used bias frames. I have tried taking very short frames (say, under 0.1 s) before and found them to be unstable, with various lovely bands. I think this is a known problem.

The darkflats look good to me and were shot at the same settings. I can upload the sample data if you'd be kind enough to look at it? :-]

About the data issue, and perhaps I should have opened with this: I do get a warning from APP. I think it is due to the optically inactive region on the left side of the frames. I find it interesting, though, that APP does not pick up on that fact. I used to shoot with a DSLR that had a similar area in its frames, and those were detected without a hitch.

WARNING: you might have a serious data issue!

This is only a warning that you might have a serious data problem.
You can safely continue processing your data and this warning is only shown once.

Despite having enabled an adaptive pedestal on your data,
to prevent the clipping of pixels to a value of 0,
14116 pixels are still clipping to 0 while using the adaptive pedestal.
Without the pedestal, 14448 pixels were clipping to 0.

This might be a serious data issue which you want to solve,
because it could lead to a problem with your integrated data.

It is caused by either:

1) your data is underexposed. If your setup allows it, try to expose longer.

2) your bias/MasterBias/dark/MasterDark frame(s) are not compatible
with your lights due to having a different sensor offset/pedestal.

3) you have a light leakage in your optical train so your bias/dark frames are not good.
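The arithmetic behind this warning can be sketched with a toy example. Below is a minimal numpy illustration (all values are made up, not taken from the actual frames) of how dark subtraction clips underexposed pixels to 0, and how adding a pedestal before clipping reduces the count:

```python
import numpy as np

# Toy illustration of the clipping this warning counts (all numbers here are
# made up). When a matching dark is subtracted from an underexposed light,
# pixels whose signal sits below the dark level go negative and are clipped to 0.
rng = np.random.default_rng(0)
light = rng.normal(loc=520.0, scale=15.0, size=100_000)  # underexposed light, ADU
dark = rng.normal(loc=500.0, scale=15.0, size=100_000)   # matching dark, ADU

calibrated = light - dark
clipped_without_pedestal = int(np.sum(calibrated < 0))

# An adaptive pedestal adds a constant before clipping, so far fewer pixels hit 0.
pedestal = 100.0
clipped_with_pedestal = int(np.sum(calibrated + pedestal < 0))

print(clipped_without_pedestal, clipped_with_pedestal)
```

With this little signal above the dark level, a large fraction of pixels clips without the pedestal and almost none with it, which is the pattern the warning reports.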



   
(@Anonymous 174)
Joined: 9 years ago
Posts: 5702
 

Ah, that's a good warning to show, as it does indicate that data is clipping. This means your exposure can likely be set higher, and then this may work much better or perhaps even be solved.

Bias is indeed tricky, but when you take it at around 0.2 s per frame or use darkflats, that should be fine.

No problem uploading data, I can have a look as well (please allow for a day). Please upload around 10 lights and your master calibration frames.



   
(@lenart)
White Dwarf
Joined: 5 years ago
Posts: 11
Topic starter  

@vincent-mod , I've uploaded the files into lenart-flatCalIssue-rosette. I only have 4 lights (this was meant as a test for the rig), I've uploaded those.

I've also uploaded the artificial flat that calibrates fine.

Thank you in advance.



   
(@kijja)
Black Hole
Joined: 8 years ago
Posts: 149
 

@lenart

I used to have this issue when I did not check ‘remove overscan area’ in the camera's ASCOM driver. My new data is free from flat overcorrection, but nothing can be done with the old data.



   
(@lenart)
White Dwarf
Joined: 5 years ago
Posts: 11
Topic starter  

Thank you @Kijja, this is a new angle I haven't considered!

I'm using Linux, so no ASCOM drivers for me, but I can adjust the ROI in capture, or cut the area from the already-shot frames and see if that changes anything. I will report back my findings.

 

As I was researching the overscan and the optically black area for this sensor, I've found some instructions from a QHY rep on a competitor's forum (so I won't link the source, but will quote):

Calibrated image = (L-D)/(F-B)

L=light frame
D= dark frame
F=flat frame
B=bias frame

And it is better to use this instead of the above

Calibrated image = (L-D)/(F-DF)
DF = dark flat frame. It is the same exposure time as the flat, but with no light coming in.

The on-chip calibration part of the CMOS sensor may cause a drift of the whole image when the image is bright. By using the overscan area, you can correct for this drift as follows:

Calibrated image = (L-D+drift of light frame)/(F-DF+drift of flat frame)

The drift of the light frame will have little impact, but the drift of the flat frame, appearing in the denominator, can cause over- or under-calibration.

The best way to handle this is to do an overscan calibration. The method is:

(1) Keep the overscan area (the manual shows this part).
(2) Stack all flat frames to get a master flat frame, and get the overscan area average value, for example 1000.
(3) Stack all dark flat frames to get a master dark flat frame, and get the overscan area average value, for example 1500.

We see a 500 difference attributable to bias drift. Add a constant 500 to the master flat frame. After you add the 500, you will see the master flat frame overscan area is 1500, matching the dark flat frame.

(4) Do the (F-DF) calculation and you will get the correctly calibrated flat frame.

Normally speaking, there is such a function in the stacking software to handle this.

Considering this, perhaps this topic can be turned into a feature request for APP? Such functionality would be useful not just for this sensor but for others as well, and implemented well it would be a novel feature. I've found a manual for scientific cameras that talks about these areas in more detail, and also about how to precalibrate frames using them (calibrating individual frames before mastering, including the darks and flats):

http://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/stis/documentation/_documents/stis_dhb.pdf

Section "3.4.4 BLEVCORR: Large Scale Bias & Overscan Subtraction", page 95 in the pdf.
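The quoted overscan-matching steps (2)-(4) can be sketched in a few lines of numpy. This is only an illustration, not APP's implementation; the 24-column overscan width (reported later in the thread for this sensor mode) and the synthetic pixel values mirroring the 1000/1500 ADU example are assumptions:

```python
import numpy as np

# Sketch of the quoted overscan-matching method. Assumptions: a 24-column
# overscan strip on the left, synthetic values mirroring the 1000/1500 example.
OVERSCAN_COLS = 24

def overscan_mean(frame: np.ndarray) -> float:
    """Average value of the optically black (overscan) columns."""
    return float(frame[:, :OVERSCAN_COLS].mean())

def bias_drift_corrected_flat(master_flat: np.ndarray,
                              master_darkflat: np.ndarray) -> np.ndarray:
    """Shift the master flat so its overscan level matches the master dark
    flat, subtract (F-DF), and drop the overscan strip."""
    drift = overscan_mean(master_darkflat) - overscan_mean(master_flat)  # e.g. 500
    corrected = (master_flat + drift) - master_darkflat                  # F - DF
    return corrected[:, OVERSCAN_COLS:]

# Synthetic demo: flat overscan at 1000 ADU, dark flat level at 1500 ADU.
h, w = 8, 64
master_flat = np.full((h, w), 21000.0)
master_flat[:, :OVERSCAN_COLS] = 1000.0
master_darkflat = np.full((h, w), 1500.0)

flat_cal = bias_drift_corrected_flat(master_flat, master_darkflat)
print(flat_cal.mean())  # 21000 + 500 - 1500 = 20000
```

Without the +500 shift, the subtraction would remove 500 ADU too much from the flat, which is exactly the bias-drift over/under-calibration the rep describes.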

WDYT Vincent, would Mabula be interested in chiming in on this?



   
(@Anonymous 174)
Joined: 9 years ago
Posts: 5702
 

Mm, I don't have knowledge about this specific approach, but I'll let Mabula know of course. 🙂



   
(@Anonymous 174)
Joined: 9 years ago
Posts: 5702
 

I'm looking at your data now and I notice that the flats may have some underexposure going on. I see the main peak of the unstretched histogram close to the left of the range, and a bit of data that is clipping on the left. This can actually cause an issue when calibrating that flat, which subtracts the background noise. So I think this ties in well with the warning you also get on the lights, and exposing for longer will likely solve your issues.

If I zoom in on the dark region in a light, you can see that you have a lack of data; this can cause a flat correction to not work correctly.

image


   
(@lenart)
White Dwarf
Joined: 5 years ago
Posts: 11
Topic starter  

@Kijja, @vincent-mod, just reporting back on the overscan area. So on my images the first 24 columns are overscan/optically black. I've used the batch crop tool to remove those columns. That does get rid of the warning about clipping pixels. Unfortunately it does not solve the overcorrection (same result).

 

@vincent-mod, I've also taken more exposed flats on the night. I've just uploaded those, and darkflats as well. They are 15s exposures (but not 5x the light compared to the 3s ones, as there was an extra sheet of paper for these). I've actually tried calibrating with these as well... the result is the same. Are these still underexposed? The histograms seem to be roughly in the middle.



   
(@Anonymous 174)
Joined: 9 years ago
Posts: 5702
 

Yes, I'll have a look at those as well. 15s shouldn't be necessary though; I think you then need to try and make the flat light source brighter. Staying within 5s is usually a good idea. The issue then remains in the lights: if the pixels are already clipping there, APP still doesn't have a very good noise estimation and it will be different from the flat (the longer one will likely have a proper signal now).



   
(@mabula-admin)
Universe Admin
Joined: 9 years ago
Posts: 5056
 
Posted by: @lenart

As I was researching the overscan and the optically black area for this sensor, I've found some instructions from a QHY rep on a competitor's forum […] WDYT Vincent, would Mabula be interested in chiming in on this?

Hi @lenart,

This issue is new to me, but I can say that these instructions are technically quite bad... You should never subtract a MasterDarkFlat from a MasterFlat.

The MasterDarkFlat should be subtracted from all individual flats, and those individual masterdarkflat-subtracted flats need to be normalized and then integrated into a MasterFlat. The reason is that normalization is needed to correct all flats for illumination differences. If the flats are not normalized, you have little chance of making proper masterflats anyway...

Now, if you can find the overscan average value in the individual flats, then compensate the flats first with that value relative to the masterdarkflat average overscan value. You can simply use the Batch Add/Multiply tool on all flats to do that adjustment.

Then load the masterdarkflat and the adjusted flats and make the masterflat.
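The per-flat workflow described here can be sketched as follows. This is a minimal numpy illustration, not APP's implementation; the 24-column overscan width and all pixel values are assumptions for the demo, `adjust_flat`/`make_master_flat` are hypothetical helpers, and a plain median stands in for the actual integration method:

```python
import numpy as np

# Sketch of the per-flat workflow (all values assumed, helpers hypothetical).
OVERSCAN_COLS = 24

def adjust_flat(flat: np.ndarray, master_darkflat: np.ndarray) -> np.ndarray:
    """Compensate one flat so its overscan level matches the masterdarkflat
    (the Batch Add/Multiply step), then subtract the masterdarkflat."""
    drift = master_darkflat[:, :OVERSCAN_COLS].mean() - flat[:, :OVERSCAN_COLS].mean()
    return (flat + drift) - master_darkflat

def make_master_flat(flats, master_darkflat):
    """Subtract the masterdarkflat from each compensated flat, normalize each
    result for illumination differences, then integrate into a master flat."""
    calibrated = [adjust_flat(f, master_darkflat) for f in flats]
    normalized = [c / np.median(c[:, OVERSCAN_COLS:]) for c in calibrated]
    return np.median(np.stack(normalized), axis=0)[:, OVERSCAN_COLS:]

# Demo: two flats with different illumination and slightly different overscan
# levels; after normalization both agree, so the master flat comes out uniform.
h, w = 6, 48
master_darkflat = np.full((h, w), 1500.0)

def synth_flat(signal: float, overscan_level: float) -> np.ndarray:
    f = np.full((h, w), signal)
    f[:, :OVERSCAN_COLS] = overscan_level
    return f

flats = [synth_flat(21000.0, 1000.0), synth_flat(31000.0, 1100.0)]
master = make_master_flat(flats, master_darkflat)
print(float(master.mean()))  # 1.0
```

The normalization step is what lets flats with different illumination levels be combined; without it the brighter flat would dominate the integration.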

Mabula

 

 



   
Kijja reacted
(@Anonymous 174)
Joined: 9 years ago
Posts: 5702
 

I think it would be a very good idea to check if you can use the official ASCOM driver (not a fan of ASCOM myself either) to see if that gives you the regular output that APP expects. Then, in addition to that, expose for longer, and I think you might solve the issue, at least for now.



   
(@lenart)
White Dwarf
Joined: 5 years ago
Posts: 11
Topic starter  

Dear @mabula-admin, @vincent-mod and @kijja,

Thank you for your help with this one. Using the hints from all of you, I think I managed to tackle this issue. I will need to do further tests with better quality data to be sure, but I can already see improvement. For posterity:

- The overscan/optically dark area messes with the calibration. If you are using ASCOM, you can set it to crop it. This will not resolve the issue completely, but it can help. Otherwise it can be cut with the batch crop tool (in this sensor/mode, the left 24 columns).

- The individual flat frames needed to be adjusted with an offset calculated from the overscan areas, as the difference between the average of the dark flats and that of the flats. In this case I've added 10 to the individual flat frames. Note that this alone also does not resolve the issue; you do need to get rid of this area before integration.

- As @vincent-mod said, there were also some gradients. Easily done away with the LP tool. (Which is a fantastic tool, period.)

- The correct exposure for the flat also helped.

As I wrote in the comments above, neither of these four on their own fixes the problem, but all used together do (I hope, at least).
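The two mechanical fixes can be sketched in a few lines (numpy assumed; `fix_flat` is a hypothetical helper). The 24-column overscan width and the +10 ADU offset are the values reported in this thread for this data set:

```python
import numpy as np

# Minimal sketch of the two mechanical fixes (all values from this thread,
# helper name hypothetical).
def fix_flat(flat: np.ndarray) -> np.ndarray:
    flat = flat + 10.0    # offset: average darkflat overscan minus flat overscan
    return flat[:, 24:]   # crop the optically black columns before integration

frame = np.full((8, 6252), 20000.0)  # frame width matches this sensor mode
fixed = fix_flat(frame)
print(fixed.shape)  # (8, 6228)
```

Note that the offset must be derived from the overscan columns before they are cropped away.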

 

@mabula-admin, while it is possible to do these steps with the batch tools (apart from the average calculation), it would be really nice to have crop & offset built into the 0) RAW/FITS tab; it seems a logical place for it. The calculation part is another question, but most of the pain comes from the batch crop/add tools (it is a lot harder to select the correct frames, for example, plus the extra space on the drive, the need to clean up, etc.). Also, for some reason the batch tools mess up the FILTER tag in the FITS headers, so all layers must be marked manually. Again, doable, but error prone. Thank you for your consideration.



   
Kijja reacted
(@kijja)
Black Hole
Joined: 8 years ago
Posts: 149
 

@lenart 

I'm glad that this problem has been investigated thoroughly. Previously, I've tried short exposures and not taking images on full-moon nights, but these are just 'symptomatic' treatments at the cost of poor and limited data quality. It was not until this new APP version reported on my data quality that I realized the overscan area not only causes problems with mosaic processing, but also makes the calibration incorrect.

I also use WBPP in PixInsight for calibration (which never complains about data quality; your poor data will always be processed no matter how bad it is). At default settings on data with an overscan area, WBPP gives me a smooth correction, but dust doughnuts are not removed, just smoothed out.

Here are my integrated OIII and its ugly master flat, taken with a QHY268M at gain 60, offset 50, mode 1, processed with APP 1.083.3:

Rosette OIII session 1 session 2 St
MF IG 60.0 E 8.33s QHYCCD Cameras Capture 6252x4176  OIII session 2 St

Single frame uncalibrated and after calibration

ngc2244 600sec 1x1 O 0018 St
calibrated


   