Mar 28, 2026: APP 2.0.0-beta40 will be released in 7 days.
It took a long time to finish this work, and it brings a major performance boost of 30-50% over 2.0.0-beta39, from calibration to integration. We extensively optimized many critical parts of APP, and everything has been tested to verify that the optimizations are correct. Drizzle and image resampling, for instance, are much faster; those modules have been completely rewritten. Memory usage is much lower. LNC 2.0 will also be released, which works much better and faster than LNC in its current state. And more; everything will be added to the release notes in the coming weeks...
Update on the 2.0.0 release & the full manual
We are getting close to the 2.0.0 stable release and the full manual. The manual will soon become available on the website and also in PDF format. Both versions will be identical and, once released, will start to follow the APP release cycle and thus will stay up to date with the latest APP version.
Once 2.0.0 is released, the price for APP will increase. Owner's license holders will not need to pay an upgrade fee to use 2.0.0, nor will Renter's license holders.
Hi everyone,
One limitation I ran into while working with Astro Pixel Processor (APP) is that while it shows a very useful graph of subframe quality, it doesn’t allow for automatically removing images below a given quality threshold — for example, removing all subs with a quality score below 75% of the best.
To address this, I developed a script with the help of ChatGPT, tailored specifically for APP’s exported TableData.csv. It automates the cleanup process based on the same principle the graph uses.
✅ What It Does
This script:
Automatically rejects light frames whose APP quality score is below a defined percentage of the best frame
Moves those rejected .ARW files into a Rejected/ folder, preserving their original directory structure
Rejects any subframe with a failed star analysis or malformed quality entry
How to Use
After analyzing stars in APP, save the analytical data into any folder, then run:
```bash
python3 clean_subframes.py TableData.csv 75
```
This will:
Read APP’s TableData.csv file
Keep all frames with a quality score ≥ 75% of the best subframe
Move the rest to Rejected/ folders
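For anyone curious how such a script can work, here is a minimal sketch of the idea. Note that the column names `"File"` and `"Quality"` are assumptions for illustration; check the headers in your own TableData.csv export and adjust them, since APP's actual column names may differ. Paths in the CSV are assumed to be relative to the working directory.

```python
#!/usr/bin/env python3
"""Sketch of a threshold-based subframe cleaner for an APP TableData.csv.

Assumptions (adjust to your export): the CSV has a column holding the
frame's file path (here called "File") and a numeric quality column
(here called "Quality"); frames with a failed star analysis have a
non-numeric quality entry.
"""
import csv
import shutil
import sys
from pathlib import Path


def clean_subframes(csv_path, threshold_pct, rejected_root="Rejected",
                    file_col="File", quality_col="Quality"):
    """Move frames scoring below threshold_pct% of the best frame.

    Returns the list of rejected file paths.
    """
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    def score(row):
        # Non-numeric or missing quality -> failed analysis -> reject.
        try:
            return float(row[quality_col])
        except (KeyError, TypeError, ValueError):
            return None

    scores = [s for s in (score(r) for r in rows) if s is not None]
    if not scores:
        return []
    cutoff = max(scores) * threshold_pct / 100.0

    rejected = []
    for row in rows:
        s = score(row)
        if s is None or s < cutoff:
            src = Path(row[file_col])
            # Mirror the original directory structure under Rejected/.
            dest = Path(rejected_root) / src
            dest.parent.mkdir(parents=True, exist_ok=True)
            if src.exists():
                shutil.move(str(src), str(dest))
            rejected.append(str(src))
    return rejected


if __name__ == "__main__" and len(sys.argv) >= 3:
    moved = clean_subframes(sys.argv[1], float(sys.argv[2]))
    print(f"Rejected {len(moved)} frame(s)")
```

With a threshold of 75, a frame scoring 70 when the best frame scores 100 would be moved to `Rejected/`, while a frame scoring 80 would stay put.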
Why It Matters
APP gives us the tools to measure frame quality, but frame rejection is still manual. This script bridges that gap — it behaves like APP’s graph and automates the process based on the normalized threshold.
Whether you want to clean up bad seeing, guiding errors, or outliers in long imaging runs, this script will save time and help standardize your rejection criteria.
Let me know if you'd like the script — I’m happy to share it with anyone who finds it useful.
You may download the script here.
Hi Pere @pereguerra,
Thank you very much for your suggestion. I think that you have missed the % slider at the top of menu 6) Integrate. That will do exactly what you are proposing. Set it to 80% and the best 80% will be stacked without the bad 20%.
Does this solve it?
Mabula
@mabula-admin thank you for your prompt reply, I did not miss the slider and I understand what it does, but once I’ve decided to exclude a certain percentage of subs from the integration process I would want a way to batch delete those subs from the hard drive to save valuable space.
I normally store 40GB of subs per night as I much prefer stacking shorter exposures in my light polluted skies, this script saves me about 5GB per night.
I would be very careful with automatically removing subs. I do it manually: I evaluate each parameter separately and look for outliers. For example, a high S/N ratio in one sub doesn't mean it actually is better than the average or the worst. It can also mean cloud coverage.
/Stefan
@mabula-admin Hi Mabula.
The % slider in 6) has the semantics of using the best 80% (for example) of the frames.
--> So regardless of the absolute score values, the worst 20% are rejected.
--> Example: 20% of the frames have a score of 499, 80% have a score of 500 ... you ignore all the frames scoring 499.
--> Not what @pereguerra proposes. (And I think not meaningful, so I never use this slider.)
Pere Guerra's proposal is different: ignore all frames which score below 80% of the best frame.
--> This condition is data dependent: the number of rejected frames can be 0, or (theoretically) "all - 1".
--> In the example above, all frames would be used for integration, with 0 rejected frames.
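The difference between the two rules can be sketched in a few lines of Python (the scores are made-up illustrative values, not real APP output):

```python
# Illustrative comparison of the two selection rules discussed above.
scores = [499] * 20 + [500] * 80  # 20 frames at 499, 80 frames at 500

# Rule 1: "% slider" semantics -- keep the best 80%, regardless of values.
keep_best_pct = sorted(scores, reverse=True)[: int(len(scores) * 0.80)]
# The 20 frames scoring 499 are dropped even though they are nearly as good.

# Rule 2: proposed threshold -- keep frames scoring at least 80% of the best.
cutoff = 0.80 * max(scores)  # 400.0
keep_threshold = [s for s in scores if s >= cutoff]
# All 100 frames pass the cutoff; 0 frames are rejected.
```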
Best regards.
Michael
You're right about the threshold, you described a nice example, and it would be a nice addition!
You're wrong about what Pere wanted, though: he asked for a tool to automatically remove (delete) data!
I think a tool like this, to delete data, shouldn't be implemented, or if it is, it should not be easy to access. Moreover, in most cases it is essential to inspect "bad" quality data manually, because there could be bad data (bad guiding) or good data (airplane/satellite) which actually gets a bad score. I also inspect the best-scoring images; sometimes on cloudy days or at dusk/dawn, the best images are shaded by clouds or somehow influenced by light pollution, resulting in exceptionally good FWHM and small stars, and therefore a quality score about 20% above the best/darker images without clouds...
Sebastian
Hi Sebastian. @xyfus
... Moreover, in most cases it is essential to inspect "bad" quality data manually, because there could be bad data (bad guiding) or good data (airplane/satellite) which actually gets a bad score. I also inspect the best-scoring images; sometimes on cloudy days or at dusk/dawn, the best images are shaded by clouds or somehow influenced by light pollution, resulting in exceptionally good FWHM and small stars, and therefore a quality score about 20% above the best/darker images without clouds...
This is valuable - so I learned from this thread
a. to not automatically deselect bad-score-images ... might be good data, and
b. to inspect carefully also the good-score-images ... might be bad data.
Thank you !
Michael
You're welcome! 🙂
In most cases the APP quality score is a valid criterion! But it depends on the data. If I know my session didn't have the best conditions, I inspect the top/bottom 10% manually before integration.
That said, I mostly do long exposures, so there are 50-100 images per session. If someone is going to inspect 1000+ with short ones, maybe Pere has a valid point for auto-deleting some of the raw data... 😉
But you can always sort them in the APP browser's inspect diagram and delete all images below a given threshold by shift-selecting them and right-clicking delete (or remove, if you don't want to delete the data).
Indeed, duly noted 😉