It did take a long time to finish this work, but it brings a major performance boost of 30-50% over 2.0.0-beta39, from calibration to integration. We extensively optimized many critical parts of APP, and everything has been tested to guarantee the optimizations are correct. Drizzle and image resampling are much faster, for instance; those modules have been completely rewritten, and memory usage is much lower. LNC 2.0 will also be released, which works much better and faster than LNC in its current state. And more - everything will be added to the release notes in the coming weeks...
Update on the 2.0.0 release & the full manual
We are getting close to the 2.0.0 stable release and the full manual. The manual will soon become available on the website and also in PDF format. Both versions will be identical and, once released, will start to follow the APP release cycle, so they will stay up to date with the latest APP version.
Once 2.0.0 is released, the price for APP will increase. Owner's license holders will not need to pay an upgrade fee to use 2.0.0, and neither will Renter's license holders.
I always go through all images visually to delete the obviously bad subs. Then I sort by star shape and delete the outliers, and do the same with star size. After that, I check the signal-to-noise ratio: if a couple of subs have a much stronger S/N ratio while the rest sit around 70% of that, I delete those, since they are usually caused by clouds or other streaks of light that shouldn't be there. After that I check the background and delete the obvious subs shot during twilight. Sometimes I keep them, but the obvious ones are deleted. Last, and least important, is the quality check. After that I stack 100% of the subs.
If you leave the integrate box set to automatic, APP will weight the frames using its own quality measure, and low-quality frames will receive a lower weighting. In many cases - perhaps most cases - this means you don't have to worry about some frames being a bit sub-optimal, and you can stack 100% unless you have reason to think some frames are particularly bad. I often rely on this, but if I suspect some frames are affected by clouds I may look for them and delete them, or I might reduce the lights to stack to 80% or so.
You can put a fair bit of effort into identifying sub-optimal frames, but this will often not noticeably improve your final image unless there are some really horrible frames.
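APP's quality measure is internal and not public, but the weighting idea described above is, in general, just a weighted per-pixel mean. A minimal illustrative sketch (frames as flat lists of pixel values; the weights standing in for quality scores - this is not APP's actual algorithm):

```python
# Illustration only: weighted integration as a weighted per-pixel mean.
# A low-quality frame gets a low weight, so it contributes little to the stack.

def weighted_stack(frames, weights):
    """Weighted mean of equally-sized frames (lists of pixel values)."""
    total = sum(weights)
    return [sum(w * f[i] for f, w in zip(frames, weights)) / total
            for i in range(len(frames[0]))]

frames  = [[10.0, 12.0], [11.0, 13.0], [30.0, 40.0]]  # third frame is poor
weights = [1.0, 1.0, 0.1]                              # low quality -> low weight
print(weighted_stack(frames, weights))
```

With the bad frame down-weighted to 0.1, the stacked pixel values stay close to those of the two good frames, which is why stacking 100% with automatic weighting often works fine.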
JC
@connor231 Just noticed I have always had mine on automatic (quality is greyed out), but when I started getting "star registration failed" errors I started using best 90%, 75%, or less.
Registration failure indicates there may be something seriously wrong with some frames. In this situation you should identify the problem frames and probably delete them. Stefan has given some good advice above on how to do this, although these days I mostly rely on the analytical graph (at least to start). I usually run the graph after 5) Normalize, but if registration fails it can be run after 3) Analyse Stars.
Typically registration fails because of clouds or tracking problems, but there are other possibilities.
On the face of it you have two separate problems - a registration problem and a calibration problem. I guess it might be possible that the registration problem is caused by the calibration problem (although that seems unlikely to me). It would be easy to test - just run APP on your lights, with no flats, no darks, no darkflats, no BPM. If the frames register, you can move on to the calibration issue. Issues with calibration frames are important and need to be addressed.
It's difficult to troubleshoot without detailed knowledge of your equipment and workflows, but these are the basics (forgive me if you already know all this):
gain and offset must match between lights, flats, darkflats and darks
exposure time of darks (and ideally temperature) must match the lights
exposure time of flats must match darkflats
exposure time of flats should be long enough to achieve roughly 50% illumination (check the histogram - be careful about trusting the histogram of DSLRs)
exposure time of flats should be long enough to avoid non-linear behaviour of some sensors (at least 2 seconds is a good time to aim for)
light source for flats must be even and clean - I've always avoided twilight sky flats, although they used to be used a lot
no light leaks in the darks or darkflats
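APP reads these values from the FITS headers and does the matching for you, but as an illustration of the rules above, here is a minimal Python sketch. Plain dicts stand in for the headers, and the keyword names GAIN, OFFSET, and EXPTIME are common FITS conventions - your camera driver may use different ones:

```python
# Minimal sketch of the calibration matching rules. Header values are given
# as plain dicts; in practice they come from the FITS headers.

def check_calibration(lights, darks, flats, darkflats):
    """Return a list of human-readable rule violations (empty = all good)."""
    problems = []

    # gain and offset must match between lights, flats, darkflats and darks
    for key in ("GAIN", "OFFSET"):
        values = {name: hdr[key] for name, hdr in
                  [("lights", lights), ("darks", darks),
                   ("flats", flats), ("darkflats", darkflats)]}
        if len(set(values.values())) > 1:
            problems.append(f"{key} mismatch: {values}")

    # exposure time of darks must match the lights
    if darks["EXPTIME"] != lights["EXPTIME"]:
        problems.append(f"darks EXPTIME {darks['EXPTIME']}s "
                        f"!= lights {lights['EXPTIME']}s")

    # exposure time of dark flats must match the flats
    if darkflats["EXPTIME"] != flats["EXPTIME"]:
        problems.append(f"darkflats EXPTIME {darkflats['EXPTIME']}s "
                        f"!= flats {flats['EXPTIME']}s")

    # flats should be long enough to avoid sensor non-linearity (~2 s or more)
    if flats["EXPTIME"] < 2.0:
        problems.append(f"flats EXPTIME {flats['EXPTIME']}s is under 2 s")

    return problems


# Example: darks taken at the wrong exposure are flagged.
lights    = {"GAIN": 100, "OFFSET": 50, "EXPTIME": 120.0}
darks     = {"GAIN": 100, "OFFSET": 50, "EXPTIME": 60.0}
flats     = {"GAIN": 100, "OFFSET": 50, "EXPTIME": 3.0}
darkflats = {"GAIN": 100, "OFFSET": 50, "EXPTIME": 3.0}
print(check_calibration(lights, darks, flats, darkflats))
```

A quick check like this over your session's headers can catch the human-error cases before you ever load the frames into APP.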
Even knowing all this it is still possible to just make a mistake. I've certainly done that - human error happens all the time. 😉
If your optical train has not been altered these frames could all be taken again after the event.
That is not an error but an important warning message: it warns you about a problem with your data calibration workflow. It means that the masterflat will be created without subtracting bias or dark flats from the flats, so you really want to fix that as well. When you loaded your flats, did you load compatible bias or dark flats so that a MasterDarkFlat or MasterBias is created that can be subtracted from the flats?
Stefan and John, thank you very much for helping Trevor here.
Trevor, I can confirm the advice from both Stefan and John. And try to fix that calibration warning first; it should not show when the masterflat is created. Sometimes this issue happens because the flats have a different filter tag in the FITS header than the bias or dark flats. So double-check that the filter is assigned correctly in the frame list after loading all data and before starting 2) Calibrate.
You can also disable the multi-filter/channel processing to see if that removes the calibration warning. Then you know for sure that the warning is caused by a mismatched filter tag between the flats and the other calibration data.
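To see why a wrong filter tag breaks the pairing, here is a hypothetical sketch of the grouping step: calibration frames are matched by their FILTER tag (a common FITS keyword), so a flat whose tag has no counterpart among the dark flats simply finds nothing to subtract. The filenames and tags below are made up for illustration; APP's own matching is of course more involved:

```python
# Hypothetical sketch: match flats to dark flats by FILTER tag and flag
# any flats whose tag has no corresponding dark flat.

def find_unmatched_flats(flats, darkflats):
    """flats/darkflats: lists of (filename, filter_tag) pairs.
    Returns filenames of flats with no matching dark flat."""
    darkflat_filters = {tag for _, tag in darkflats}
    return [name for name, tag in flats if tag not in darkflat_filters]


flats     = [("flat_Ha_01.fits", "Ha"), ("flat_OIII_01.fits", "OIII")]
darkflats = [("dflat_01.fits", "Ha")]  # OIII dark flats missing or mistagged
print(find_unmatched_flats(flats, darkflats))  # ['flat_OIII_01.fits']
```

If a dark flat carries the wrong tag (or no tag), it ends up in the wrong group, and the flats in the orphaned group trigger exactly the kind of "no bias/dark flat subtracted" warning discussed above.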