Tried APP for the first time last night...  


(@tony-liu-photography)
White Dwarf Customer
Joined: 10 months  ago
Posts: 11
January 21, 2018 02:51  

...and my jaw dropped at how awesome this program is. for someone who wants to try PI for what it's capable of but is scared off by its complexity, this is just brilliant

got a few general questions if I may, apologies in advance if they've already been answered elsewhere

1) with the creation of the BPM, does it matter what ISO is used? that is, I'd imagine the flats and darks should have the same ISO when creating the BPM, but does it need to be the same ISO as the light frames?

2) if my understanding is correct, the BPM is purely for removing bad pixels on the sensor, so it would still be advisable to take the usual dark and flat frames etc. for other imaging sessions, would that be correct?

3) in regards to flat and bias frames in general, I've read that the ISO should be the same as the lights, then I read someone else say it doesn't matter; which one is true? (thinking in terms of creating master flats and biases, whether or not I need to create them for different ISOs)

4) is there any way to perform star shrink/reduction in APP? if not would it be something that might be added to the program in the future?

5) any suggestions on how to deal with halo around brighter stars?

thanks very much!


(@mabula-admin)
Quasar Admin
Joined: 2 years  ago
Posts: 1797
January 22, 2018 14:46  
Posted by: tony.liu.photography

...and my jaw dropped at how awesome this program is. for someone who wants to try PI for what it's capable of but is scared off by its complexity, this is just brilliant

Hi Tony Liu,

Thank you for your nice compliments and welcome to the APP forum  😉

In answer to your questions:

 

got a few general questions if I may, apologies in advance if they've already been answered elsewhere

1) with the creation of the BPM, does it matter what ISO is used? that is, I'd imagine the flats and darks should have the same ISO when creating the BPM, but does it need to be the same ISO as the light frames?

No, ISO (or gain) does not matter; you can make a perfectly fine BPM with darks shot at ISO 400 and flats shot at ISO 100. No problem. And the ISO value of the darks and flats used does not need to match the ISO value of the light frames either.

2) if my understanding is correct, the BPM is purely for removing bad pixels on the sensor, so it would still be advisable to take the usual dark and flat frames etc. for other imaging sessions, would that be correct?

Yes indeed, your understanding is correct ;-). The Bad Pixel Map will only correct the pixels that don't behave like they should; they are bad and need to be corrected! The BPM will not correct issues like amp glow or fixed pattern noise in the dark current of your camera's sensor; you will still need darks for those, besides the BPM. And the BPM will not correct dust spots or vignetting, so regular flat-field correction with flat frames is needed as well 😉
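As a rough illustration of the idea (this is a hypothetical sketch, not APP's actual detection logic), a bad pixel map could be built by flagging pixels that deviate strongly from the frame median: hot pixels stand out in a masterdark, dead or cold pixels in a masterflat.

```python
import numpy as np

def build_bpm(master_dark, master_flat, kappa=5.0):
    """Hypothetical BPM sketch: flag pixels deviating more than kappa
    robust standard deviations (1.4826 * MAD) from the frame median.
    APP's real detection is certainly more refined than this."""
    def outliers(img):
        med = np.median(img)
        mad = np.median(np.abs(img - med)) + 1e-9  # avoid divide-by-zero
        return np.abs(img - med) > kappa * 1.4826 * mad
    return outliers(master_dark) | outliers(master_flat)

rng = np.random.default_rng(0)
dark = rng.normal(100.0, 5.0, size=(16, 16))      # made-up masterdark
flat = rng.normal(20000.0, 50.0, size=(16, 16))   # made-up masterflat
dark[3, 3] = 60000.0   # a hot pixel
flat[7, 7] = 500.0     # a dead pixel
bpm = build_bpm(dark, flat)   # boolean map, True marks a bad pixel
```

Note that because only deviations from the frame median matter, the ISO level of the darks and flats drops out, which is exactly why any ISO works for the BPM.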

The BPM will have the effect that your calibrated light frames will be nearly free of hot/cold pixels, and a benefit is that you can relax the outlier rejection filter settings, leaving you more SNR in your final integrations.

If outlier rejection is turned on in integration to remove bad pixels without having used a BPM, it usually needs to be turned on so strong that the bad pixels are removed completely, but so is part of your good data... Outlier rejection turned on too strong can have a severely degrading effect on your data, so I would recommend making one good BPM that works efficiently and using it on all your data for calibration. Once you have a good working BPM, you will be able to use it for months to come (I have been using one to good effect for over 3 years now).
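The trade-off can be seen in a minimal numpy sketch of kappa-sigma clipping (a generic rejection filter, not APP's exact implementation): a gentle kappa rejects the hot samples, while an aggressive kappa also throws away good signal.

```python
import numpy as np

def kappa_sigma_combine(stack, kappa):
    """Mean-combine a (frames, H, W) stack; per output pixel, reject
    samples more than kappa robust sigmas (1.4826 * MAD) from the median."""
    med = np.median(stack, axis=0)
    sigma = 1.4826 * np.median(np.abs(stack - med), axis=0) + 1e-9
    keep = np.abs(stack - med) <= kappa * sigma
    combined = np.sum(np.where(keep, stack, 0.0), axis=0) / np.maximum(keep.sum(axis=0), 1)
    return combined, keep

rng = np.random.default_rng(1)
stack = rng.normal(1000.0, 10.0, size=(20, 8, 8))  # 20 registered, dithered subs
stack[:3, 4, 4] += 5000.0  # a hot pixel landed on (4, 4) in 3 of the frames

gentle, keep3 = kappa_sigma_combine(stack, kappa=3.0)  # rejects the hot samples
harsh,  keep1 = kappa_sigma_combine(stack, kappa=1.0)  # also rejects good data
```

At kappa=3 only the hot samples are discarded; at kappa=1 a sizable fraction of perfectly good samples goes with them, which is the SNR loss described above.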

 

3) in regards to flat and bias frames in general, I've read that the ISO should be the same as the lights, then I read someone else say it doesn't matter; which one is true? (thinking in terms of creating master flats and biases, whether or not I need to create them for different ISOs)

Very good question and there is indeed some confusion about what would be the right path to take.

Flats can have a different ISO value than the light frames (and in certain cases I would even recommend it). No problem. But! The flat frames need to be corrected with either a masterbias or a masterdark (flat darks) of the same ISO value as the flats. That will ensure that the flats themselves are properly calibrated and that you will have a correct masterflat. If you were to use ISO 1600 bias frames on ISO 100 flat frames, then obviously the bias signal is not properly calibrated in the flats, which can become apparent if you start to integrate lots of data.

The function of a masterflat is flat-field correction: removing dust spots and correcting vignetting. This can be done with any ISO or gain value; it's a matter of capturing the correct illumination profile when you shoot your flats. A difference in ISO/gain will only have a multiplicative effect on the illumination profile, and since this profile is normalized anyway in the flat-field calibration, the ISO/gain value of the masterflat has no influence.
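That cancellation is easy to verify numerically; a minimal sketch with made-up numbers:

```python
import numpy as np

vignette = np.linspace(1.0, 0.6, 16)     # illumination profile across the sensor
light = 2000.0 * vignette                # a bias/dark-corrected light frame

flat_low_gain  = 10000.0 * vignette      # same profile, e.g. ISO 100 level
flat_high_gain = 80000.0 * vignette      # same profile, e.g. ISO 1600 level

def flat_correct(light, master_flat):
    # normalize the flat by its mean, then divide it out
    return light / (master_flat / master_flat.mean())

a = flat_correct(light, flat_low_gain)
b = flat_correct(light, flat_high_gain)
# a and b are identical: the gain level is a pure multiplier and cancels
```

The corrected frame is also perfectly even, since the flat captured the same illumination profile as the light.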

Personally, I shoot my flats at ISO 100 with a DSLR and at gain 0 with my asi1600mm-c camera. It gives me much better flats because it's easier to increase the exposure time of the flats (to several seconds per flat), and the flats will therefore be of better quality. Flats created at short exposure times can give all sorts of problems, so I like to prevent that.

A calibration workflow would be:

shoot flats at ISO 100 and create a masterbias at ISO 100, or a masterflatdark at ISO 100 with the same exposure time as the flats (this is what I normally do). Then create a masterflat by loading the flats and the bias or flatdarks (load your BPM as well; it will do BPM correction in the flats) and click on calibrate. Now you have a properly created masterflat at ISO 100.

For your lights, we have the BPM and the masterflat. You only need to create either a masterbias or a masterdark of the same ISO value as the light frames and, for darks, of the same exposure time as the light frames. Once you have created the masterbias or the masterdark for the lights:

load your light frames, the BPM, the masterflat and the masterbias or masterdark for your lights (so don't load the masterbias/masterflatdark for the flats; it's not needed anymore, since it's already applied in the masterflat) and then you should get very good calibration of your data 😉
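Put together, the light-frame side of this workflow looks roughly like the following sketch (the array names are hypothetical, and the neighbour-average bad-pixel fill is a crude stand-in for APP's actual interpolation):

```python
import numpy as np

def calibrate_light(light, master_dark, master_flat, bpm):
    """Dark-subtract, flat-field and BPM-correct one light frame.
    master_flat is assumed to be already bias/flat-dark calibrated;
    bpm is boolean, with True marking a bad pixel."""
    cal = (light - master_dark) / (master_flat / master_flat.mean())
    rows, cols = np.where(bpm)
    left  = cal[rows, np.maximum(cols - 1, 0)]
    right = cal[rows, np.minimum(cols + 1, cal.shape[1] - 1)]
    cal[rows, cols] = 0.5 * (left + right)  # crude neighbour interpolation
    return cal

light = np.full((4, 4), 1100.0)
light[2, 2] = 60000.0                     # a hot pixel
master_dark = np.full((4, 4), 100.0)
master_flat = np.full((4, 4), 5000.0)     # here a perfectly even flat field
bpm = np.zeros((4, 4), dtype=bool)
bpm[2, 2] = True

cal = calibrate_light(light, master_dark, master_flat, bpm)
```

After calibration the hot pixel is gone and the frame sits at a uniform signal level, which is the clean starting point the integration step wants.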

All current calibration rules are here:

https://www.astropixelprocessor.com/community/tutorials-workflows/astronomical-data-calibration-priciples-must-read/

4) is there any way to perform star shrink/reduction in APP? if not would it be something that might be added to the program in the future?

Currently, there is only one option: you can control the size of the brightest stars with the HL slider on the right, which is part of the preview filter. This will reduce the stretch in the highlights.

Yes, a range of such features will come; the first will be deconvolution.

5) any suggestions on how to deal with halo around brighter stars?

Currently, there is no function in APP to deal with that. How would you normally fix it? I guess with some Photoshop technique? I would be happy to add a feature for this to APP as well; I will add it to my ToDo list.

thanks very much!

It's my pleasure 😉

Kind regards,

Mabula

Main developer of Astro Pixel Processor and owner of Aries Productions


(@tony-liu-photography)
White Dwarf Customer
Joined: 10 months  ago
Posts: 11
January 22, 2018 22:55  

Hi Mabula, thanks so much for the detailed response!

While I am asking questions, do you mind helping me with a few more...

1) on one of my completed stacks, the stars in the middle and on the left side of the frame look tight and round, but as I move towards the right side of the frame the stars look progressively elongated. is that something that can be fixed with settings in APP, or is it just a matter of me needing to acquire better subs?

2) in terms of applying drizzle, when I was using Deep Sky Stacker it was simply a matter of ticking a 2x box (for example); what would be a comparable setting to use in APP?

Thanks again for your help!

Edit: thought I'd add a couple more questions while I thought of them...

3) what reason is there to choose bayer drizzle over normal drizzle? is one better than the other?

4) for a high DR object such as Orion, what I've been doing is separately exposing for the brighter core and then merging the exposures in photoshop, is there a way to do it in APP?


(@gregwrca)
Neutron Star Customer
Joined: 1 year  ago
Posts: 217
January 22, 2018 23:51  

If you have Photoshop too, or CC, there are ways to reduce the Blue Halo , post or pre-process.

GW


(@tony-liu-photography)
White Dwarf Customer
Joined: 10 months  ago
Posts: 11
January 23, 2018 08:23  
Posted by: gdwats@comcast.net

If you have Photoshop too, or CC, there are ways to reduce the Blue Halo , post or pre-process.

yea i stumbled on a youtube video on how to do it in PS, going to give it a crack soon

interested in how you'd do it pre-processing tho!


(@mabula-admin)
Quasar Admin
Joined: 2 years  ago
Posts: 1797
January 26, 2018 13:17  
Posted by: tony.liu.photography

Hi Mabula, thanks so much for the detailed response!

You're most welcome Tony 😉

While I am asking questions, do you mind helping me with a few more...

1) on one of my completed stacks, the stars in the middle and on the left side of the frame look tight and round, but as I move towards the right side of the frame the stars look progressively elongated. is that something that can be fixed with settings in APP, or is it just a matter of me needing to acquire better subs?

Have you tried enabling dynamic distortion correction in 4) Register? And how many stars are detected in 3) Analyse Stars, on average, in your frames?

2) in terms of applying drizzle, when I was using Deep Sky Stacker it was simply a matter of ticking a 2x box (for example); what would be a comparable setting to use in APP?

APP uses a complete drizzle implementation following the NASA technique and is thus much more advanced than DSS's 2x and 3x implementation. Drizzle 2x in APP would be:

  • set the integration scale to 2x
  • enable drizzle
  • set drizzle droplets to 0.50-0.70 (Hubble's 2x drizzle used scale 2x and droplets at 0.65, if I recall correctly 😉)

Use the tophat drizzle kernel for usually the best results in terms of smoothness and sharpness.

The smaller the droplets, the sharper and noisier the result will be.

The difference between the kernels is that the point kernel is simply the sharpest and noisiest; the droplet size has no influence then. For the other kernels, the differences are minor. If you integrate a lot of frames (100+), then the gauss kernel is probably what you want to use: it will be smooth and sharp.

Realize that drizzle is always a choice between sharpness and noise in the settings that you use.
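For intuition, here is a toy 1-D version of the drizzle idea (my own simplification of the NASA algorithm, not APP's code): each input pixel is shrunk to the droplet fraction of its width, shifted by the frame's dither offset, and its flux dropped onto a grid `scale` times finer.

```python
import numpy as np

def drizzle_1d(frames, offsets, scale=2, droplet=0.65):
    """Toy 1-D drizzle: shrink each input pixel to `droplet` of its width,
    shift it by the frame's sub-pixel dither offset (in input-pixel units),
    and accumulate its flux onto an output grid `scale` times finer."""
    n_out = frames.shape[1] * scale
    flux = np.zeros(n_out)
    weight = np.zeros(n_out)
    for frame, off in zip(frames, offsets):
        for i, value in enumerate(frame):
            center = (i + 0.5 + off) * scale        # in output-pixel units
            lo, hi = center - droplet * scale / 2, center + droplet * scale / 2
            for j in range(max(int(lo), 0), min(int(np.ceil(hi)), n_out)):
                overlap = min(hi, j + 1) - max(lo, j)
                if overlap > 0:
                    flux[j] += value * overlap
                    weight[j] += overlap
    return np.where(weight > 0, flux / weight, 0.0)

frames = np.ones((4, 8))                  # four flat, featureless frames
offsets = [0.0, 0.25, 0.5, 0.75]          # well-dithered sub-pixel offsets
result = drizzle_1d(frames, offsets)      # a flat field stays flat
```

With fewer frames or smaller droplets, some output pixels receive little or no weight, which is exactly the extra noise (and need for many dithered frames) described above.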

Thanks again for your help!

Edit: thought I'd add a couple more questions while I thought of them...

3) what reason is there to choose bayer drizzle over normal drizzle? is one better than the other?

Firstly, Bayer drizzle is only to be used for Bayer CFA data. And should be considered an alternative to the regular debayering of the CFA data. So a bayer drizzle setting would be:

  • leave integration scale at 1.0
  • set integration to bayer drizzle
  • I advise to use the tophat kernel
  • set droplets to 2.0-3.0 pixels.

These settings will be a good alternative to debayering. Again, if you lower the droplet size, the result becomes sharper and noisier.

Of course this can be extended with the regular drizzle principle: increase the scale and lower the droplets. Depending on the number of frames that you have, you need different settings. The more frames, the lower you can set the droplet size and still get a good result.

4) for a high DR object such as Orion, what I've been doing is separately exposing for the brighter core and then merging the exposures in photoshop, is there a way to do it in APP?

Simply integrate all the different exposures together; that in itself will increase the bit depth of the integration and will accomplish directly what you would get in photoshop. Use star shape weights to further prevent burning out the core of Orion in this case.

A real HDR implementation will come in the future and will be part of the RGB combine tool probably.

Kind regards,

Mabula

Main developer of Astro Pixel Processor and owner of Aries Productions


(@tony-liu-photography)
White Dwarf Customer
Joined: 10 months  ago
Posts: 11
January 26, 2018 23:10  

thanks so much again Mabula for your detailed response, will have to give those things a try

so to confirm my understanding is correct, images taken with a dslr would be suited to bayer drizzle then? i was searching google and found a pixinsight comparison showing that the bayer drizzle technique was superior to regular drizzle!

keep up the good work!


(@mabula-admin)
Quasar Admin
Joined: 2 years  ago
Posts: 1797
January 29, 2018 18:23  

Hi Tony,

Yes, DSLR images are suited to bayer drizzle and should give nice results with the settings I laid out in my previous post.

But!... PixInsight doesn't have the Adaptive Airy Disc (AAD) debayer algorithm, which is much better than the VNG or AHD algorithms that PixInsight does have. In a lot of tests that I ran myself, AAD gave the sharpest results with lower noise than any bayer drizzle setting that I tried. The settings for bayer drizzle that I proposed do come close, though.

I only recommend to use Bayer Drizzle (or regular drizzle for that matter) if you meet the following criteria:

  • you have lots of data ! (100s of frames)
  • all light frames are dithered well. (this one is crucial)
  • the exposures are under-sampled (this one is crucial)

If the exposures aren't dithered and aren't under-sampled, drizzling will not accomplish anything that direct upscaling (setting the scale in integration higher than 1.0) won't, except with the downside of higher noise in the end result. Drizzle is a big noise injector; therefore you need a lot of data to be able to get a good result with more sharpness than a regular integration.

All technical arguments aside, you can of course always test: run a bayer drizzle and a normal integration with AAD debayering, and simply compare the data with similar stretches.

Let me know your findings 😉

Mabula

Main developer of Astro Pixel Processor and owner of Aries Productions


(@tony-liu-photography)
White Dwarf Customer
Joined: 10 months  ago
Posts: 11
January 30, 2018 07:42  
Posted by: Mabula Haverkamp - Admin

Yes, DSLR images are suited to bayer drizzle and should give nice results with the settings I laid out in my previous post. [...]

thanks again Mabula! now i must ask... what do you mean by light frames being dithered well (and by extension, what exactly is dithering), and by exposures being undersampled?

sorry for so many questions haha


(@mabula-admin)
Quasar Admin
Joined: 2 years  ago
Posts: 1797
February 2, 2018 14:49  

Hi Tony,

Okay, both dithering and under-sampled images are required if you want to benefit from drizzle integration. Otherwise don't use it! The results will be worse for noise 😉 and not sharper compared to normal debayering with APP's AAD algorithm.

To dither your images, there is a nice explanation by Jerry Lodriguss on the Sky & Telescope website:

Why and How to Dither Your Astro Images

Under sampled:

this has to do with the pixel size of your camera, the resolution of your optics and the resolution of the incoming data from outer space 😉

If you google with search term:

astrophotography under sampled images

you will find all the information you need. Simply said, in under-sampled images the stars are very tiny, so the image pixels are too big compared to the resolution of the incoming data. This happens if you shoot data with a very short focal length and good optics.

Or: if you have atmospheric seeing of about 2" and you image at an image scale of 1"/pixel, then you aren't undersampling.

But if you have atmospheric seeing of about 2" and you image at an image scale of 4"/pixel, then you are undersampling.

So under-sampling will not happen if you use long focal length optics (unless you can image from outer space 😉), so drizzling data shot at such a long focal length is pointless if seeing limits the image scale that can be achieved. Drizzling will not accomplish more than normal upscaling of the data in that case and should really be avoided, because it's terrible for the amount of noise in the result.
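The image-scale arithmetic behind these examples is a one-liner (206,265 arcseconds per radian; the pixel size and focal lengths below are just example numbers):

```python
def image_scale(pixel_um, focal_mm):
    """Image scale in arcsec/pixel from pixel size (µm) and focal length (mm)."""
    return 206.265 * pixel_um / focal_mm

def undersampled(pixel_um, focal_mm, seeing_arcsec):
    """An image scale coarser than the seeing itself is clearly under-sampled
    (for critical sampling, a common rule of thumb aims for ~seeing/2 per pixel)."""
    return image_scale(pixel_um, focal_mm) > seeing_arcsec

# 3.8 µm pixels behind a 135 mm lens: ~5.8"/pixel, under-sampled in 2" seeing.
# The same pixels behind 1000 mm of focal length: ~0.78"/pixel, not under-sampled.
```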

Cheers,

Mabula

Main developer of Astro Pixel Processor and owner of Aries Productions


(@tony-liu-photography)
White Dwarf Customer
Joined: 10 months  ago
Posts: 11
February 3, 2018 11:21  
Posted by: Mabula Haverkamp - Admin

Okay, both dithering and under sampled images are required if you want to benefit from drizzle integration. [...]

thank you Mabula! learning so much here. hopefully one last question for a while... I noticed that under the integration tab at the top there is an option to select the % of lights to stack. I am assuming that if you don't choose 100%, APP will automatically determine the best ones to stack, would that be correct?


(@mabula-admin)
Quasar Admin
Joined: 2 years  ago
Posts: 1797
February 9, 2018 23:42  
Posted by: tony.liu.photography

thank you Mabula! learning so much here. hopefully one last question for a while... I noticed that under the integration tab at the top there is an option to select the % of lights to stack. I am assuming that if you don't choose 100%, APP will automatically determine the best ones to stack, would that be correct?

@tony-liu-photography,

😉

Yes, if you adjust the % of lights, then the worst frames are disabled for integration. The parameter that is used is the quality of each frame.

And the quality of each frame is a combination of the following parameters:

  • star density 
  • star size and roundness (shown in the FWHM min, max values)
  • noise

These parameters are combined in a formula to get the quality metric.
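APP's exact formula is not published; purely as a hypothetical sketch, the three parameters could be combined into a single score like this, with the % slider then keeping the top fraction:

```python
import numpy as np

def quality_scores(star_density, fwhm, noise):
    """Hypothetical quality metric (not APP's actual formula): more stars
    is better, smaller FWHM and lower noise are better. One value per frame."""
    return (star_density / star_density.max()) * (fwhm.min() / fwhm) * (noise.min() / noise)

def select_best(scores, percent):
    """Indices of the best `percent`% of frames, best first."""
    k = max(1, round(len(scores) * percent / 100))
    return np.argsort(scores)[::-1][:k]

density = np.array([100.0, 90.0, 80.0, 50.0])   # stars detected per frame
fwhm    = np.array([2.0, 2.5, 3.0, 4.0])        # star size per frame
noise   = np.array([5.0, 5.0, 6.0, 9.0])        # noise estimate per frame
best = select_best(quality_scores(density, fwhm, noise), 50)  # → frames 0 and 1
```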

Check what happens: as you adjust the %, the STACK keyword in the bottom frame list panel is added to or removed from the frames concerned 😉

Cheers,

Mabula

Main developer of Astro Pixel Processor and owner of Aries Productions


(@tony-liu-photography)
White Dwarf Customer
Joined: 10 months  ago
Posts: 11
February 12, 2018 10:58  
Posted by: Mabula Haverkamp - Admin
Yes, if you adjust the % of lights, then the worst frames are disabled for integration. [...]

thanks so much Mabula!

