A theoretical question about stacking

5 Posts
3 Users
1 Likes
993 Views
 Heno
(@heno)
Neutron Star
Joined: 7 years ago
Posts: 131
Topic starter  

One thing that has been puzzling me lately is this:
Suppose you are imaging a target with faint outer regions where a pixel may only receive one or a few photons every three minutes. If you take one-minute frames, that pixel will contain no photons at all in the majority of frames, and one or a few photons in the rest.
How will APP treat the frames in which that pixel does contain photons? Will those photons be treated as outliers, or will they be accumulated even though they do not appear in every frame?
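(A minimal sketch of the statistics behind the question, not APP's actual rejection code: one faint-region pixel is simulated across 300 one-minute subs, with an assumed sky of 50 photons/min, 3 e- read noise, and a source delivering on average one photon every three minutes, then run through a generic 3-sigma clipped mean.)

import numpy as np

rng = np.random.default_rng(42)

n_frames = 300          # one-minute subs
sky_rate = 50.0         # assumed sky photons per pixel per minute
read_noise = 3.0        # assumed read noise, electrons RMS
faint_rate = 1.0 / 3.0  # source photons per pixel per minute (~1 every 3 min)

# Simulated values of one faint-region pixel across all frames
sky = rng.poisson(sky_rate, n_frames)
source = rng.poisson(faint_rate, n_frames)   # mostly 0, occasionally 1
read = rng.normal(0.0, read_noise, n_frames)
frames = sky + source + read

def sigma_clipped_mean(x, sigma=3.0, iters=5):
    # Generic iterative 3-sigma clip, a stand-in for a stacking rejection filter
    mask = np.ones_like(x, dtype=bool)
    for _ in range(iters):
        m, s = x[mask].mean(), x[mask].std()
        mask = np.abs(x - m) < sigma * s
    return x[mask].mean()

print("plain mean        :", frames.mean())
print("sigma-clipped mean:", sigma_clipped_mean(frames))
print("sky-only level    :", sky_rate)

Because the sky shot noise alone makes each frame fluctuate by roughly sqrt(50) ≈ 7 photons, a frame that happens to carry one extra source photon sits far inside the 3-sigma band, so it is averaged in rather than rejected; on average it nudges the stacked value up by about 0.33 counts.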


   
(@wvreeven)
Quasar
Joined: 6 years ago
Posts: 2133
 

@heno Every pixel will contain a value due to read noise and light pollution. Even under the darkest skies possible on Earth, the sky background will be much brighter than those faint outer regions, so you would need a LOT of images to make them visible. Statistically speaking, APP would eventually make them visible, but practically speaking you would never manage it because of the sheer number of images needed.
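(To put a rough number on "a LOT", a shot-noise-only sketch with assumed rates and read noise ignored: stacking N subs grows the signal as N but the noise only as sqrt(N), so the required N scales with the square of the background-to-signal ratio.)

import math

def subs_needed(source_rate, sky_rate, sub_minutes=1.0, target_snr=5.0):
    # Rough number of subs for one faint pixel to reach target SNR.
    # source_rate, sky_rate: assumed photons per pixel per minute.
    # Shot noise only: SNR = S*N / sqrt((S + B) * N), with S, B per sub.
    S = source_rate * sub_minutes
    B = sky_rate * sub_minutes
    return math.ceil(target_snr**2 * (S + B) / S**2)

# e.g. one source photon every 3 minutes against an assumed 50 photon/min sky
print(subs_needed(source_rate=1/3, sky_rate=50))   # on the order of 11,000 subs

With those assumed numbers, reaching even SNR 5 in that pixel takes on the order of ten thousand one-minute subs.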


   
 Heno
(@heno)
Neutron Star
Joined: 7 years ago
Posts: 131
Topic starter  

@wvreeven OK, thank you. Even so, at some point in such a nebula you reach a break point where the wanted signal exceeds the background by only a few photons per pixel per unit time. But I see what you mean. Neither the sensor nor APP can discern between real signal and background/sky fog signal, so in essence all the photons are taken into account. That is what was puzzling me, but now I've got it.
Have a nice evening. 😀


   
(@wvreeven)
Quasar
Joined: 6 years ago
Posts: 2133
 

@heno As a matter of fact, a very faint part of a galaxy will always result in a few more photons arriving at a pixel than the average background. The challenge is that the statistical fluctuations in the background are larger than those few extra photons. That is why you need so very many pictures to make the faint parts become visible. OR you need to take very long exposures so that the signal of the galaxy (which doesn't fluctuate) exceeds the (fluctuating) background. Both are challenging, if not impossible.
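(A small sketch of that trade-off, shot noise only and with assumed rates: in the sky-limited regime the two routes amount to the same thing, collecting enough total photons, since an hour of one-minute subs and a single 60-minute exposure give the same per-pixel SNR once read noise is negligible.)

import math

def snr(source_rate, sky_rate, sub_minutes, n_subs, read_noise=0.0):
    # Per-pixel SNR after stacking n_subs subs (assumed rates, photons/min)
    S = source_rate * sub_minutes
    B = sky_rate * sub_minutes
    return S * n_subs / math.sqrt(n_subs * (S + B + read_noise**2))

# Same total hour of data, split differently (1 source photon / 3 min, 50/min sky)
print(snr(1/3, 50, sub_minutes=1,  n_subs=60))   # sixty short subs
print(snr(1/3, 50, sub_minutes=60, n_subs=1))    # one long exposure

In practice read noise is paid once per sub, which is what tips the balance toward longer subs (or a low-read-noise camera) for very faint work, on top of lots of total integration time.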


   
Heno reacted
(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

@heno Indeed, that's the crux of signal versus background variance in imaging. I can see where the trickiness lies for a given optical system trying to pull signal from small, faint objects: they may always remain either swamped by the background signal, or never stand out enough against the background variance to avoid being picked off by rejection algorithms. I've tried to pick out smaller companion galaxies before by adding more and more integration time and just about managed it, but with horrendous resolution on a medium-field system with OK pixel scale. It gets easier on larger scopes with longer focal lengths, where the faint object takes up more real estate on a given chip at a pixel scale the seeing still allows, so its fixed signal can build up much more readily... Folks with setups that go after faint fuzzies and small PNs might have that option, not so much for most of us 🙁
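(For reference, the plate-scale arithmetic behind that point, with hypothetical numbers, 3.76 µm pixels and a 30-arcsec companion galaxy: pixel scale ≈ 206.265 × pixel size [µm] / focal length [mm] arcsec per pixel, so a longer focal length spreads the same object over more pixels.)

def pixel_scale_arcsec(pixel_um, focal_mm):
    # Plate scale in arcsec per pixel
    return 206.265 * pixel_um / focal_mm

def object_span_pixels(object_arcsec, pixel_um, focal_mm):
    # How many pixels a small object of the given angular size covers (1-D)
    return object_arcsec / pixel_scale_arcsec(pixel_um, focal_mm)

# Hypothetical 3.76 µm pixels, 30-arcsec faint companion galaxy
for f in (400, 1000, 2000):   # focal lengths in mm
    print(f, "mm:", round(pixel_scale_arcsec(3.76, f), 2), "arcsec/px,",
          round(object_span_pixels(30, 3.76, f), 1), "px across")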


   