Running out of RAM
I'm running out of RAM when integrating large data sets. I have an F/2 imaging system so my exposures are short. I have 16GB of RAM but can't integrate more than 300 or so frames from my ASI183MM. I have several data sets with 600-1000 frames.
Is there anything I can do to integrate this data? Would I lose anything by doing 250 frames at a time and then integrating the resulting integrations based on exposure length?
Could APP warn me before I spend 3 hours integrating data just for it to die at the end? Could swap space on my SSD be used instead of the process simply dying?
I had the same issue processing a large data set of 1000+ images. I have 32 GB of RAM and had to max out the allocation in APP to get it to go through. I actually made several attempts, increasing the allocation a little each time, until I got tired of waiting 5 hours for processing only to find out there was not enough RAM.
When integrating so many frames, it's better to divide the frames into sub-stacks of, say, 200 frames each and combine those later.
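To illustrate why exposure-weighted combination of sub-stacks is reasonable: if each sub-stack is a mean of its frames, a weighted mean of the sub-stacks (weights proportional to total exposure, i.e. frame count times exposure length) reproduces the mean of all frames together. This is a minimal sketch of that idea, not how APP itself does it; the function name and array setup are hypothetical:

```python
import numpy as np

def combine_substacks(stacks, total_exposures):
    """Weighted mean of sub-stack images, weighted by each
    sub-stack's total exposure (frame count x exposure length).
    Assumes each sub-stack is already a mean of its frames."""
    stacks = np.asarray(stacks, dtype=float)
    weights = np.asarray(total_exposures, dtype=float)
    weights /= weights.sum()
    # Contract the weights against the stack axis -> one master image
    return np.tensordot(weights, stacks, axes=1)

# Hypothetical example: two sub-stacks of the same target,
# 250 and 150 frames of equal exposure length
a = np.full((4, 4), 10.0)   # mean of 250 frames
b = np.full((4, 4), 12.0)   # mean of 150 frames
master = combine_substacks([a, b], [250, 150])
# Each pixel: (250*10 + 150*12) / 400 = 10.75
```

Note this only recovers a plain weighted average; any outlier rejection (sigma clipping, etc.) applied per sub-stack is not equivalent to rejection across the full frame set, so some rejection quality may be lost by splitting.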
P.S. Mabula is also aware of this issue; it seems to be a problem when using LNC. Did you have that enabled?