

 

Stacking images from different sources?

28 Posts
8 Users
6 Likes
4,935 Views
 Gary
(@garyrmck)
Main Sequence Star
Joined: 7 years ago
Posts: 15
Topic starter  

Hi Mabula,

I've been trying out APP with the limited amount of data I have available to me and getting excellent results, but I have a question - primarily because I don't have the data to try it!

Can APP process/stack data from different scope and camera combinations? Let's say tonight I use my 8" f4 newt to get some data, and next week I use my C11 at f7, one using a one-shot color camera, the other a mono. Can these be stacked in APP?

I'm looking at this to get fast, low-res color data combined with high-detail, long focal length data. removed link

cheers

Gary


   
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

Hi Gary,

Thank you for your question.

Can APP process/stack data from different scope and camera combinations? Let's say tonight I use my 8" f4 newt to get some data, and next week I use my C11 at f7, one using a one-shot color camera, the other a mono. Can these be stacked in APP?

Yes, APP is actually very good at this. The registration engine of APP has no problem at all combining data of different image scales, focal lengths and/or fields of view. I have created a long video tutorial to demonstrate this using data from 5 different optical configurations of 5 different photographers:

https://www.astropixelprocessor.com/registration-normalization-integration-using-ddc-lnc-mbb/

In the video, you can clearly see that there is a big difference in image scale between the different datasets of the different photographers.

Regarding combining RGB and monochrome data: in 2) Calibrate, you can save your calibrated frames split per channel. Then you can register and integrate them together with the monochrome channels. And with the "RGB combine" tool, you can put everything together 😉
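As an aside for OSC users, "split per channel" can be illustrated with a toy Python sketch. This is purely illustrative and hypothetical - APP's actual implementation is internal - but it shows how the R, G and B sub-images come out of a Bayer mosaic (in practice you would use numpy arrays rather than nested lists):

```python
def split_cfa(frame, pattern="RGGB"):
    """Split a Bayer CFA mosaic (list of pixel rows) into R, G, B sub-images.

    Toy illustration of what "save calibrated frames split per channel"
    means; APP's real implementation is internal and certainly differs.
    Only one of the two green sites per 2x2 cell is kept, for brevity.
    """
    # (row, col) offset of each filter inside the repeating 2x2 Bayer cell
    offsets = {
        "RGGB": {"R": (0, 0), "G": (0, 1), "B": (1, 1)},
        "BGGR": {"R": (1, 1), "G": (0, 1), "B": (0, 0)},
    }[pattern]
    # every second pixel, starting at the filter's offset in the cell
    return {ch: [row[c::2] for row in frame[r::2]]
            for ch, (r, c) in offsets.items()}

# A 4x4 mosaic: each extracted channel is a half-resolution 2x2 image
mosaic = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]
channels = split_cfa(mosaic, "RGGB")
```

Note how the same mosaic yields different channel images under RGGB versus BGGR, which is why a mislabelled pattern wrecks the colors.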

Let me know if you have more questions.

Cheers,

Mabula

 


 Gary
(@garyrmck)
Main Sequence Star
Joined: 7 years ago
Posts: 15
Topic starter  

Thanks Mabula,

APP looks better and better....

cheers

Gary


   
(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

You're welcome Gary,

Thanks 😉

Mabula


 Ron
(@rojoyinc)
Main Sequence Star
Joined: 7 years ago
Posts: 24
 

Old topic, but I'm new to APP. Moved over from PI (which is way too many steps and clicks, but surely very capable). I've enjoyed the automation of APP.

So - I shot some 6" APO shots of the Ghost Nebula (200+ frames). I later changed to the 11" RASA and shot some more. I'm trying to integrate it all, as APP says it can. I watched a nicely done video showing 5 photographers' images and it seemed to be fairly fast.

My two-scope dataset (same camera) has been churning now for over 24 hours and sits at 57%. I have a 4-core CPU (APP says it's using 3).
Is this typical speed? There are only 2 different image scales - once it figured out the match, shouldn't it try that first on all the other images?

Oh, I'm doing the registration. Here's what I see after 24 hours. (My fear is that I have something set wrong and after 24+ hours it doesn't work :( )
On a good note, like in the video, it shows the RMS at least lowering slowly each time.

[attached screenshot: app]


(@jochen-scharmann)
Neutron Star
Joined: 5 years ago
Posts: 82
 

Hi Ron, 

Hold on! If you selected multiple iterations, this process may last very long. With some 200 subs it took 6-8 hours on my 8-core i7. Usually 1 iteration will do, but with different focal lengths like yours, it might need 3 iterations and lots of patience. It will pay off in the end.

clear skies,

jochen 


 Ron
(@rojoyinc)
Main Sequence Star
Joined: 7 years ago
Posts: 24
 

OK, thanks for the fast reply. I did abort it before; tried again and it's still long... I didn't pay attention to iterations - I assumed the default. I'll watch it next time and let it cook overnight again tonight. If they all stack it will be worth 48 hours. It's in the background, so as long as it doesn't crash it will be fine. Thanks!

 


 Ron
(@rojoyinc)
Main Sequence Star
Joined: 7 years ago
Posts: 24
 

Re-reading, you said IF? I didn't change it.

Looking - I checked what Mabula checked in the video:

Quad
1
5
FLIP, since one is an APO and the other a RASA, assuming half are flipped
Use dynamic distortion
SAME CAMERA OPTICS unchecked (different optics, same camera)

mode: normal
registration model: projective

rectilinear projection
1.0
Lanczos-3
no over/under
I don't see iterations?


(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

Hi Ron, so 3 cores isn't a lot, and having a lot of data from different scopes will take a lot of time in that case. It might be better to split up the data from the scopes and keep the number of frames per run to about 150-200.


(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

Reviving this thread instead of starting a new one.

When stacking multiple sessions using two different cameras but the same optics, how do I deal with two different Bayer patterns?

One has BGGR (ToupTek) and the other RGGB (ASI2600MC).

I would like to process the whole lot together.

Thanks


(@wvreeven)
Quasar
Joined: 6 years ago
Posts: 2133
 

@cwm2col Hi Colm, it depends.

Is the Bayer info available in the FITS header? If yes, then you can load the images as two different sessions and integrate them. In tab 6 you can choose to create one integration per session or one integration for all sessions together.

If the Bayer info is not available in the FITS headers (for instance when you use APP) then you'll need to integrate the two sessions separately in two different integration runs. Then you can load the two integration results as lights and combine them.
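Checking whether that Bayer info is actually present can be sketched with a small stdlib-only Python helper. This is hypothetical and for illustration only - in practice astropy.io.fits is the usual tool for this - but it shows what "Bayer info in the FITS header" means at the byte level:

```python
def read_fits_keyword(path, keyword):
    """Read a string keyword (e.g. BAYERPAT) from a FITS primary header.

    Minimal stdlib-only sketch: a FITS header is a sequence of 80-byte
    ASCII "cards" terminated by an END card. The helper and its name
    are made up for illustration; astropy.io.fits does this properly.
    """
    with open(path, "rb") as f:
        while True:
            card = f.read(80).decode("ascii", errors="replace")
            if not card or card.startswith("END"):
                return None  # keyword absent, or header ended
            if card.startswith(keyword.ljust(8) + "="):
                # value is everything after '=', before any '/' comment
                value = card.split("=", 1)[1].split("/", 1)[0].strip()
                return value.strip("'").strip()

# e.g. read_fits_keyword("light_0001.fits", "BAYERPAT")
```

If this returns None for your lights, the capture software wrote no CFA tag and you are in the "integrate separately" case.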

HTH, Wouter


(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

Of course, I knew this but forgot 😆 Thanks for the reminder and the help, Wouter. I'll check if the ToupTek stores the info in the FITS file; I know the ZWO does.


(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

My ToupTek does not store the FITS info correctly and labels it GRBG instead of BGGR (an older driver problem).

So it might be time to test the FITS header editor feature in the latest APP release to make it work with the 'supported' debayering/CFA option for multiple-session integration.
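As an illustration of what such a header fix amounts to, here is a hypothetical stdlib-only sketch - not APP's batch tool, just the same idea. The in-place byte replace is only safe because GRBG and BGGR have the same length, so the 80-byte card layout required by FITS is preserved:

```python
def fix_bayerpat(path, wrong="GRBG", correct="BGGR"):
    """Rewrite a mislabelled BAYERPAT card in a FITS primary header, in place.

    Hypothetical stand-in for APP's batch FITS header modifier, for
    illustration only. Returns True when a card was fixed, False when
    no matching BAYERPAT card was found before the END card.
    """
    with open(path, "r+b") as f:
        pos = 0
        while True:
            card = f.read(80)
            if not card or card.startswith(b"END"):
                return False  # no mislabelled BAYERPAT card found
            if card.startswith(b"BAYERPAT=") and wrong.encode() in card:
                f.seek(pos)  # rewind to the start of this card
                f.write(card.replace(wrong.encode(), correct.encode(), 1))
                return True
            pos += 80

# batch use, e.g.:
# for name in glob.glob("touptek/*.fits"): fix_bayerpat(name)
```

Always work on copies of your raw frames when experimenting with header rewrites like this.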


(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

That might work (didn't try it myself yet 🙂 ), or you can still process them as separate sessions. But let us know how it goes!


(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

@vincent-mod

 

Just getting time now to try this.

But a persistent problem is that the Bayer pattern under 'supported' in tab 2 (read from the FITS header) says GBRG for all files from both cameras, and it has done so in every version of APP I have used.

Can that be fixed? BAYERPAT and COLORTYP say RGGB for the ASI2600 (correct) and BGGR for the ToupTek (correct). APP only says GBRG when they are all loaded, under the CFA column in the file list at the bottom of the APP display. Has anyone noticed this before?

 

I will try rewriting the ToupTek ones, but it should not be necessary to rewrite any of them if the Bayer pattern is identified in the FITS header (assuming the reading direction, such as bottom-up, is the same for both). Maybe I am missing something?

[attached screenshot: Clipboard01]

 


(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

FYI, the batch CFA modifier tool works perfectly. But to get it to work, the images must first be loaded with the correct forced Bayer pattern and then batch modified, so that they show the correct color on screen.

Then all files from cameras with different Bayer patterns can be loaded under 'supported' and 'force CFA' in tab 2 for subsequent processing.


(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

OK, thanks for that. So does that mean you got it working, or do you still have the issue from the comment above?


(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

No, still the same problem. No matter how many lights I load from either camera, irrespective of whether I choose 'supported' or a particular Bayer pattern in tab 2 and force CFA, the default pattern in the file list at the bottom of the screen is always GRBG.

Forcing the CFA to the known pattern allows it to show with correct colors on screen and in subsequent processing, but I would like to see this problem fixed. Right now it does not read the FITS header correctly and ignores the Bayer pattern of all cameras. The pattern has to be known and forced, making it impossible to use data from two different cameras with different Bayer patterns via the 'supported' option. That would be very useful, especially for different systems, optics and cameras in collaborative mosaics, for instance.

They need to be processed separately at the moment, which in some instances is not so good for rejection and weighting of a large multi-session stack.

As for the CFA FITS header modification: that does work, but it is not so intuitive. An image needs to be loaded first with the force CFA option so that the color on screen is correct (again, the 'supported' option does not work initially). Then the batch FITS header modifier works fine. I think the nice tooltips for the new changes would be helpful; these are not yet fully implemented in the newer version. Maybe we will see them soon.

 

Thanks again, Colm


(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

Alright, I'll pass this on to Mabula as I'm also not sure why the pattern would change in your case. Thanks for letting us know!


(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

Hi Colm @cwm2col, @vincent-mod,

I really need to see this myself, I think... but:

First of all, if the files have the BAYERPAT tag in the header, then all should work. APP also supports the tag CFAPAT and our own tag CFAIMAGE.

Then, the FITS tag COLORTYP is not standard for a CFA pattern - it is not in the FITS specification either - and we don't support it (yet); I just checked. We can support it, however, if you can upload some frames with that tag.

The frames with the COLORTYP tag can be corrected in 1.083-beta2 with the Batch modify FITS tool.

If you leave the Bayer pattern in 0) RAW/FITS set to 'supported', it will use a supported tag from the header when present. The COLORTYP frames will then not work, because that tag is unsupported for now.

If you set the Bayer pattern to anything other than 'supported', it will always overrule the header tag, as it should.

Finally, if you set FORCE BAYER CFA, then the COLORTYP frames will be demosaiced with the set pattern. If the pattern was set to 'supported', it falls back to RGGB as the default, since most sensors have that pattern.
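The selection rules above can be paraphrased as a tiny decision function. This is a sketch under one reading of the post - the names are invented and APP's internals are not public:

```python
def effective_pattern(header, ui_setting="supported", force=False):
    """Hypothetical paraphrase of APP's Bayer-pattern selection rules.

    `header` stands in for the FITS header as a dict of keyword ->
    string value; `ui_setting` mimics the tab-0 Bayer pattern choice
    and `force` mimics FORCE BAYER CFA. Illustration only.
    """
    if ui_setting != "supported":
        return ui_setting  # an explicit pattern always overrules the header
    for tag in ("BAYERPAT", "CFAPAT", "CFAIMAGE"):  # tags APP recognises
        if tag in header:
            return header[tag].strip()
    # COLORTYP (or any other tag) is ignored; with force-demosaic on,
    # fall back to RGGB, the most common sensor pattern
    return "RGGB" if force else None
```

This would explain the reported symptom: frames carrying only COLORTYP never match a supported tag, so they either fail or get the RGGB fallback regardless of the camera's true pattern.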

I would think that all your problems are caused by this unsupported COLORTYP tag.

To make sure that we solve this, can you upload frames from both cameras so I can have a good look?

Upload can be done here:

https://upload.astropixelprocessor.com/

username and password: upload5

Please make a folder with your name and let us know when it's uploaded 😉

Mabula

 


(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

@mabula-admin

Thanks Mabula,

 

I'll upload 5 from each camera, one BGGR and the other RGGB. Again, what I see is that they report something else in the frame listing at the bottom of the screen, but they process fine once the CFA pattern is selected and forced. The 'supported' option does not pick up the CFA and render the color correctly automatically in all cases.

These 10 files are ones that have not had their FITS headers batch rewritten.


(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

@mabula-admin

 

files have been uploaded in two folders inside my folder


(@mabula-admin)
Universe Admin
Joined: 7 years ago
Posts: 4366
 

Thanks Colm, @cwm2col, I will check your files and report back 😉

Mabula


(@jzholloway)
Molecular Cloud
Joined: 3 years ago
Posts: 3
 

I am having another problem: I am stacking data from two different optics and ending up with what looks like a fisheye effect. I tried using three images from each data set and it worked fine, but when I try all of them I get the effect.


(@jzholloway)
Molecular Cloud
Joined: 3 years ago
Posts: 3
 
[attached image: loop2]

Here is the result


(@col)
Neutron Star
Joined: 3 years ago
Posts: 127
 

@jzholloway

That looks zany.

There is something way off with the registration between images. What image scales are used in the various image sets?

If calibration frames are used, are they correctly assigned to each set of images?

After registration, can you check which image set is used as reference, assuming one is much more widefield than the other? It may be best to use the widefield set as reference, even if the longer focal length system (I am assuming they are very different) gives a better overall score as reference candidate.

Honestly, I've never seen this before; maybe someone else has and knows a simple workaround.


(@jzholloway)
Molecular Cloud
Joined: 3 years ago
Posts: 3
 

@cwm2col

Thanks for the reply! I am using a Radian Raptor 61 with a Canon EOS Ra and my buddy is using a RedCat with a ZWO ASI2600MC - so mine has the wider field, and one of my images is used as the registration reference. Note: we have combined data before (the Sadr region at the wider field and the Crescent Nebula at the longer focal length - 480 and 500) with no issues.

The only calibration frames used are in my three sessions (again, we did this the last time we tried), and they are all assigned to the sessions correctly. As a test, I took 3 images from each of my sessions (plus calibration frames) and 3 from his session, and it worked with no issue.

[attached image: loopworked]

(@vincent-mod)
Universe Admin
Joined: 7 years ago
Posts: 5707
 

Switch off distortion correction; if the data doesn't need it, the algorithm may get into a state like this.

