
Error messages


(@mabula-admin)
Quasar Admin
 
Posted by: @heno

My NB lights have a median DN of at least 1300 (16-bit), and the lowest pixel value read in each frame is usually between 600 and 1000; these are 6-minute frames. For LRGB all values are much higher. This is as read by NINA. I find it hard to believe they are underexposed.
Flats/dark flats are taken using the NINA flats wizard and have the exact same value and offset. I always use offset 10 on my camera, but I will certainly check whether that could have accidentally changed during any of the calibration frame captures.

@heno, well... a median ADU value of 1300 in the 16-bit range of 0-65535 is very low, but not uncommon when you shoot narrowband, of course. It means the sky background in your images is only at about 2% of the data range... I expose to have the sky background at least at 10% of the data range, and I go much deeper into the faint details as a consequence. I understand that there are practical limits to exposing long. I make narrowband exposures of at least 15 minutes or longer because it gives much better results and my mount can handle that.
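For illustration, a minimal Python sketch of that percentage check; the 1300 ADU median and the 16-bit range come from the posts above, and the helper function name is made up:

```python
# Rough check of how much of the 16-bit data range the sky background uses.
# Illustrative only -- values come from the thread above, not from APP.

def background_fraction(median_adu: float, bit_depth: int = 16) -> float:
    """Return the sky-background median as a fraction of the full data range."""
    full_range = 2 ** bit_depth - 1  # 65535 for 16-bit data
    return median_adu / full_range

print(f"{background_fraction(1300):.1%}")   # ~2.0%: the narrowband lights discussed here
print(f"{0.10 * (2 ** 16 - 1):.0f} ADU")    # ~6554 ADU would put the background at 10%
```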

Now, an offset of 10 is also rather low in my experience. I use an offset of 20, I believe, with my 12-bit ASI1600MM, which means that the offset in 16 bits is already at 20 * 16 = 320.

I use the median value of my lights during capturing to determine if I am exposing long enough. I have calculated a value which, when reached, means the sky fog/light pollution swamps the read noise by a factor of 20.

Okay, but that is only part of what you should look at when determining exposure time, because read noise is not the complete story once you take data calibration into account.

If data calibration is involved (and it always is), then besides read noise you need to look at the median value of your MasterDark/MasterBias and its standard deviation (sigma), and relate that to the median value of your light frames. The median value of your light frames needs to be well above the median of your MasterBias/MasterDark + 24 * the master's standard deviation... Even with 24 * the standard deviation, pixels can clip, because the dark current signal can be partly non-Gaussian due to causes like amp glow. In my experience, 24 * standard deviation is reasonable on many datasets, especially with CMOS sensors where read noise differs per pixel. In CCDs, read noise is the same for the whole sensor (or quadrants of it); not so with CMOS!
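For illustration, a minimal Python sketch of that rule of thumb. It assumes plain single-HDU FITS files readable with astropy; the file names are made up, and this is not how APP implements its warning internally:

```python
# Check median(light) against median(master) + k * sigma(master), with k = 24
# as described above. Assumes plain single-HDU FITS files.
import numpy as np
from astropy.io import fits

def exposure_headroom(light_path: str, master_dark_path: str, k: float = 24.0):
    """Return the threshold the light-frame median should exceed, and the current margin."""
    light = fits.getdata(light_path).astype(np.float64)
    master = fits.getdata(master_dark_path).astype(np.float64)

    threshold = np.median(master) + k * np.std(master)
    margin = np.median(light) - threshold
    return threshold, margin

# Hypothetical file names, just for illustration:
threshold, margin = exposure_headroom("light_0001.fits", "master_dark.fits")
print(f"light median should exceed {threshold:.1f} ADU; current margin: {margin:.1f} ADU")
```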

If you make masters from more frames, the standard deviation in the master gets lower, and thus the exposure time can be reduced as a consequence. The sensor offset can also be lower when you shoot more calibration subs.
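As a rough illustration of why more calibration subs help, here is a small sketch under the usual assumption of uncorrelated per-frame noise, so the master's sigma falls roughly as 1/sqrt(N); the 12 ADU figure is a made-up example:

```python
# How the noise in an averaged master shrinks with the number of subs:
# sigma_master ~ sigma_single / sqrt(N), assuming uncorrelated per-frame noise.
import math

sigma_single = 12.0  # per-frame standard deviation in ADU (example value, not from the thread)
for n_subs in (10, 25, 50, 100):
    sigma_master = sigma_single / math.sqrt(n_subs)
    print(f"{n_subs:3d} subs: sigma(master) ~ {sigma_master:4.1f} ADU, "
          f"24*sigma threshold ~ {24 * sigma_master:5.1f} ADU above the master median")
```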

So if you don't take the MasterDark/MasterBias median and standard deviation into account, you run the risk that some pixels will clip and consequently carry no meaningful information. The adaptive pedestal that APP uses helps you prevent this, but the warning we implemented is triggered when pixels still clip even with the artificial pedestal applied, which is already 24 * standard deviation... so if your pixels still clip in significant numbers then, you really want to know this...


(@mabula-admin)
Quasar Admin
 
Posted by: @heno

So I think I cracked it. I have made a completely new darks library to replace the old one. Yesterday I reran two of the integrations that gave me the data issue warning. No warning this time, so I assume the problem is gone. Why my old darks were clipping pixels to black, I have no idea. I have tried to recreate it using various camera settings, but I cannot.

If I, with my rather limited knowledge of this topic, am able to identify this fault, then so should APP. In fact, this problem could already have been identified when the master darks were created, if such a check had been implemented. I am sure that it would not even pose a challenge to @mabula-admin to create and implement relevant checks on the calibration files.

Helge

@heno, I think we are doing just that, aren't we? APP noticed something was fishy when the lights were calibrated, right? By looking only at a master, we cannot say much. Even the header's gain and offset values are sometimes not what we think they are, because of camera control issues in capture software packages.


(@mabula-admin)
Quasar Admin
 
Posted by: @heno

@wvreeven

I'm thinking that in an MD there should not be any more clipped pixels than would show in the BPM. But I could be totally wrong here?

Helge 

Clipping pixels to zero has to do with the pixel distribution of the bias + dark current signals, not so much with the bad/hot pixels that the BPM indicates 😉


(@mabula-admin)
Quasar Admin
 

Hi @heno,

Thank you for the suggestions. I have added the following to my ToDo List:

  • add offset, temperature and binning info to masters if present
  • show offset and temperature, if present, per image in the frame list as well
  • make it possible to generate Console Output logs per module, 2) to 6)
  • keep track of the SWCREATE tag per master and compare it internally to the lights to see if that can explain possible issues

Now, checking whether flats are really flats, or darks really darks, is fraught with danger... Some users have flats with no vignetting at all... it is very hard to discriminate then... So I don't think that is very viable or robust. And APP can perfectly show you problems with calibration to start with, using the l-calibrated image viewer mode, which is there for that purpose.

Continuing processing when not all data passes registration or star analysis is also difficult. An old version did that, and then users started asking that APP should stop because there is an issue that needs to be solved first, which I agree with as a workflow and thus implemented... But... the next APP version will be completely about save/load of settings and projects, and we might as well add an option in the general settings to choose this behaviour with a selectbox, "continue processing with problems", that you can enable/disable. That would make everyone happy then, I think 🙂?

Mabula


 Heno
(@heno)
Red Giant Customer
Topic starter  
Posted by: @mabula-admin

Now, an offset of 10 is also rather low in my experience. I use an offset of 20, I believe, with my 12-bit ASI1600MM, which means that the offset in 16 bits is already at 20 * 16 = 320.

I have had the same camera. I used an offset of 13.
Dr. Robin Glover (or maybe it was Craig Stark) said in a lecture that the easy way to determine your proper offset is to take 0.5-second dark exposures and increase the offset until the median is in the 500-1000 range. And don't sweat it. 🙂 Not sure if you agree with this, but that is what he said, as I remember it. And that is what I do: I read 628 (just checked), and the SD value was 8.15 (NINA figures). A higher offset will decrease the dynamic range, but you knew that already.
My camera is the ASI294MM.
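For illustration, a minimal Python sketch of that offset check on a short dark. It assumes a plain single-HDU FITS file readable with astropy; the file name is made up, and the 500-1000 ADU target is just the rule of thumb quoted above:

```python
# Check whether a 0.5 s dark lands in the 500-1000 ADU median range quoted above.
import numpy as np
from astropy.io import fits

# Hypothetical file name for a 0.5 s dark frame:
dark = fits.getdata("dark_0p5s.fits").astype(np.float64)
median_adu = float(np.median(dark))
sd_adu = float(np.std(dark))

print(f"median = {median_adu:.0f} ADU, SD = {sd_adu:.2f} ADU")
if 500 <= median_adu <= 1000:
    print("offset looks reasonable by this rule of thumb")
else:
    print("consider adjusting the camera offset")
```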

Posted by: @mabula-admin

If data calibration is involved (and it always is), then besides read noise you need to look at the median value of your MasterDark/MasterBias and its standard deviation (sigma), and relate that to the median value of your light frames. The median value of your light frames needs to be well above the median of your MasterBias/MasterDark + 24 * the master's standard deviation... Even with 24 * the standard deviation, pixels can clip, because the dark current signal can be partly non-Gaussian due to causes like amp glow. In my experience, 24 * standard deviation is reasonable on many datasets, especially with CMOS sensors where read noise differs per pixel. In CCDs, read noise is the same for the whole sensor (or quadrants of it); not so with CMOS!

I do not understand this 24 * SD. Why 24? Below are the statistics of a 60s MD, as read by the NASA/ESA FITS Liberator. The figures honestly seem a little strange to me.

[Statistics screenshot of the 60s master dark, from FITS Liberator]
Posted by: @mabula-admin

Now, checking whether flats are really flats, or darks really darks, is fraught with danger... Some users have flats with no vignetting at all... it is very hard to discriminate then... So I don't think that is very viable or robust. And APP can perfectly show you problems with calibration to start with, using the l-calibrated image viewer mode, which is there for that purpose.

OK, I accept that that could pose a problem. It was just a suggestion anyway.

Posted by: @mabula-admin

Continuing processing when not all data passes registration or star analysis is also difficult. An old version did that, and then users started asking that APP should stop because there is an issue that needs to be solved first, which I agree with as a workflow and thus implemented... But... the next APP version will be completely about save/load of settings and projects, and we might as well add an option in the general settings to choose this behaviour with a selectbox, "continue processing with problems", that you can enable/disable. That would make everyone happy then, I think?

I appreciate that not everybody would agree with me on this, but if you could make this a setting, as you say, everybody should be pleased. 👍 😊

Posted by: @mabula-admin

@heno, I think we are doing just that, aren't we? APP noticed something was fishy when the lights were calibrated, right? By looking only at a master, we cannot say much. Even the header's gain and offset values are sometimes not what we think they are, because of camera control issues in capture software packages.

I never said or meant that the warning was incorrect, but running the same process a second time did not produce the warning. That certainly made me wonder, but you have explained why. What frustrated me was that the root cause of the problem was not identified, which I think it could and should have been. The timing of the warning, in the middle of a process, indicated to me that the problem was related to one specific (light) file being handled at that point in time. It wasn't. In this case, clipped pixels in the master dark were the problem. I still think APP could have warned about this problem when the MD was loaded, if such a check had been implemented. (I know, I demand a lot! 😀)

@Mabula-admin I really appreciate you taking so much time to respond. It means a lot.  I just hope that others reading this will also find it useful.

Helge

