Why are all images normalized?

(@mkeller0815)
Main Sequence Star
Joined: 4 years ago
Posts: 24
Topic starter  

Hello,

I'm currently trying to stack my first light images (Skywatcher Evostar 72ED) of M13. I have 100 lights and 30 darks and bias frames. The image quality is not very good because I'm still waiting for my field flattener to arrive. To get something out of my images I only stack 70 of the 100 light frames, but during the normalization process all 100 frames are processed. Why is this done if only 70 of them are used at the end? The quality score of an image is calculated during calibration in step 2, so it is already available when the normalization process (step 5) starts, and at the end (step 6) only 70 of my images are used for integration.
For me this could have saved about 30% of the time the normalization step took, roughly 15 minutes or so.

Maybe somebody could shed some light on this.

With best regards,

Mario.


   
(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

The quality score does change when subsequent steps are processed; as APP learns more about the data from those steps, things can change.
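To illustrate why the selection can't be made before normalization, here is a minimal Python sketch of that order of operations. It is not APP's actual code: the quality metric (a simple noise estimate) and the normalization (a basic background match) are simplified stand-ins. The point is only the ordering: the ranking used to pick the best 70 of 100 is re-evaluated on the normalized data, so every frame has to pass through normalization first.

import numpy as np

# Minimal sketch only: the quality metric and normalization below are
# simplified stand-ins, not what APP actually computes.
def stack(frames, keep_fraction=0.7):
    """frames: list of 2-D numpy arrays (calibrated light frames)."""
    reference = frames[0]

    # Step 5: normalize *every* frame to the reference background level.
    normalized = [f - np.median(f) + np.median(reference) for f in frames]

    # The quality ranking used for selection is re-evaluated on the
    # normalized data, so it is only final after all frames are normalized.
    scores = [-np.std(f) for f in normalized]  # lower noise -> better score

    # Step 6: keep only the best fraction (e.g. 70 of 100) and integrate.
    order = np.argsort(scores)[::-1]
    keep = order[: int(len(frames) * keep_fraction)]
    return np.mean([normalized[i] for i in keep], axis=0)

# Example: 100 synthetic frames of increasing noise, integrate the best 70.
rng = np.random.default_rng(0)
frames = [rng.normal(100.0, 5.0 + 0.1 * i, (64, 64)) for i in range(100)]
result = stack(frames)

In this toy version the noisiest 30 frames are the ones dropped, but the key point is simply: normalize all frames, then score, then select, then integrate.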


   