The work on this took a long time to finish, but it brings a major performance boost of 30-50% over 2.0.0-beta39, from calibration all the way to integration. We extensively optimized many critical parts of APP, and everything has been tested to guarantee the optimizations are correct. Drizzle and image resampling, for instance, are much faster now that those modules have been completely rewritten, and memory usage is much lower. LNC 2.0 will also be released, which works much better and faster than LNC in its current state. And more, all of which will be added to the release notes in the coming weeks...
Update on the 2.0.0 release & the full manual
We are getting close to the 2.0.0 stable release and the full manual. The manual will soon become available on the website and also in PDF format. Both versions will be identical and, once released, will start to follow the APP release cycle and thus stay up to date with the latest APP version.
Once 2.0.0 is released, the price for APP will increase. Owner's license holders will not need to pay an upgrade fee to use 2.0.0, and neither will Renter's license holders.
Getting lines in the finished product that I don't see in the subs
Some data from Orion. 100 of my best subs, and I keep getting these lines in the data. I get them regardless of whether I use calibration frames.
My workflow for dropping bad subs was to register them in Deep Sky Stacker (because APP takes way too long to look at each sub), sort them, and just look for anything off, discarding the bad frames... but I can't seem to find bad frames even looking at them individually in APP. Is there a way to sort the data? Is this a rolling-shutter effect gone crazy, or should I just throw everything away? I am not sure what to do with this. I don't have this in any other data.
I know the camera has some years on it, spent in some pretty inhospitable climates, so it has its share of wear and tear, but I have still taken decent shots after this data set.
This is a screenshot of a zoomed image that was zoomed again, so it will look like garbage — a zoom of a zoom of the preview window. Each individual sub has what looks like a rolling-shutter effect, so I was hoping that the stack of them would look better and it would just be calibrated out as noise. Should I be using any form of cosmetic correction?
So the individual subs do have artefacts in the form of bands running across them? If so, that will certainly impact the result, as that is then a data issue. Would you like me to have a look at your data?
They are in different positions, which is why I described it as a rolling effect. I am wondering if it is power related, so I am attempting to install a large capacitor on the camera side of the power supply output for less ripple current and a steadier voltage.
It seems to be that data set, as I tried the Rosette data from around the same time frame and I don't see it. However, every time I use this program I get horrible vignetting like this. The light pollution tool helps, but my stacks with DSS are cleaner.
Fresh flats, bias, darks, dark flats, etc... and I'm still getting this all the time. I have yet to get a clean image right off the bat with APP.
That's over-correction, which could indicate an issue in the noise levels. It can be caused by a problem in the darks/bias or the flats themselves, which we can investigate. But that's a different issue from the rolling-shutter effect.
The Vignette tool does nothing to help, and the data is just so dirty, even with over 200 subs. Six hours of integration time with a Triad filter should give crystal-clear data. There has to be some setting(s) I have wrong.
Yes, if there is an issue in the data, likely in the noise background as I mentioned, that won't be solved with the LP tool. The LP tool is meant for gradients, mainly caused by light pollution. If you leave out the flats, you'll likely see a more normal image. The flats aren't being applied correctly to the data, likely because the flats, lights, or bias/dark flats don't match completely. We can investigate that by looking at your data if you want.
The same exact data in DSS: 10 flats, 10 dark flats, 10 darks, 10 bias, and 227 light frames for 6 hours of exposure time. It can't be my data? It has to be some settings in APP.
Zooming in even shows it's 10 times cleaner than the stack in APP.
After importing it into Photoshop and some curve stretching, I can see similar vignetting (just not as over the top as in APP), which I know APP would make quick work of with the Light Pollution tool. But this data is much cleaner. So I am still no closer.
I have no experience with DSS, but I do know it works very differently from APP; APP is able to process a wide variety of challenging data, while DSS is not. I'm not saying DSS is bad, but it uses much simpler algorithms, which in this case may work for your data. However, that doesn't mean nothing is going on with the data. If you want, I can have a look at your data to find out and improve it.
After importing it into Photoshop and some curve stretching, I can see similar vignetting (just not as over the top as in APP), which I know APP would make quick work of with the Light Pollution tool. But this data is much cleaner. So I am still no closer.
Right, so that does show the same issue then. The over-the-top look you see in APP is simply due to a different stretch.
Yeah, I can't argue with that. I uploaded under Vignetting: 10 of everything. I'm still curious why everything looks so dirty in my stack with APP; it has to be some settings I'm not familiar with.
Now comes something weird. When I attempt the Vignetting tool (I have no idea what and where to select these 8 boxes), I get a much cleaner Rosette.
The gradient is awful, but the fact that the data got much cleaner (even though this fresh stack was still very noisy) is very confusing to me.
Some data from Orion. 100 of my best subs, and I keep getting these lines in the data. I get them regardless of whether I use calibration frames.
My workflow for dropping bad subs was to register them in Deep Sky Stacker (because APP takes way too long to look at each sub), sort them, and just look for anything off, discarding the bad frames... but I can't seem to find bad frames even looking at them individually in APP. Is there a way to sort the data? Is this a rolling-shutter effect gone crazy, or should I just throw everything away? I am not sure what to do with this. I don't have this in any other data.
Can you please check the filesystem of the drive that you use for the Working Directory? I suspect it is FAT32, which would cause these lines. APP needs a more modern filesystem, like exFAT, NTFS, or a Linux/macOS-specific filesystem. FAT32 is really old and only supports files up to 4 GB, which is too small for APP to work properly.
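One quick way to test this without digging through OS dialogs is to probe whether the Working Directory's drive can hold a file past the FAT32 limit. This is just an illustrative sketch (not part of APP); the probe writes a single byte just beyond the 4 GiB mark, which costs almost no disk space on sparse-file-capable filesystems like NTFS, exFAT, ext4, or APFS, but fails on FAT32:

```python
import os
import tempfile

FAT32_MAX = 2**32 - 1  # FAT32 caps a single file at 4 GiB minus one byte


def supports_large_files(directory):
    """Return True if `directory`'s filesystem can hold a file
    larger than the FAT32 4 GiB limit, False otherwise."""
    fd, path = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as f:
            f.seek(FAT32_MAX + 1)  # jump just past the FAT32 cap
            f.write(b"\0")         # FAT32 rejects this write (OSError)
        return True
    except OSError:
        return False
    finally:
        os.remove(path)
```

If this returns False for your Working Directory, reformatting the drive to exFAT or NTFS should resolve the truncated/striped output.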
@vincent-mod wonderful. I'd be happy for you to look at my calibration files (and the subs) and let me know if there is anything I could improve with my existing gear.
Flats were taken with Astrophotography Tool's Flats Aid tool. It takes the flats automatically, and I use two sheets of white plastic with an air gap to help with light dispersion and a flat field. Dark flats were taken at night (all at the same temperature), and for bias I just looked up the minimum exposure of the camera on the specs page (something like 0.0006 seconds).
Ok, looking at the data now and I think I'm seeing a couple of potential issues:
1. If I stretch a dark, it looks to me like there might be some light leakage going on. If so, this will greatly impact the noise levels of a master dark.
This is more pronounced in the master dark:
Using calibration data like this will impact proper calibration of the rest of the data, which may very well be the reason you have an issue with flat calibration. My suggestion would be to ensure absolute darkness when taking these frames — for instance, take them off the telescope, in a completely dark bag or something similar. Of course, keep the same settings as the lights, including cooling if you used that.
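A simple sanity check for this kind of leakage is to compare the background level of a dark against a bias taken at the same settings: dark current adds only a small amount of signal over the bias, so a large excess is suspect. A minimal sketch, where the pixel lists, the helper name, and the threshold factor are all illustrative assumptions rather than anything APP computes:

```python
from statistics import median


def looks_light_leaked(dark_pixels, bias_pixels, factor=3.0):
    """Flag a dark frame whose background sits suspiciously far above
    the bias level, which can indicate light leakage.

    `dark_pixels` / `bias_pixels` are flat lists of ADU values;
    `factor` is an arbitrary threshold chosen for this sketch.
    """
    dark_level = median(dark_pixels)
    bias_level = median(bias_pixels)
    # Allow a modest excess (roughly shot-noise scale) over the bias;
    # anything well beyond that suggests light reaching the sensor.
    allowed_excess = factor * max(1.0, bias_level ** 0.5)
    return (dark_level - bias_level) > allowed_excess
```

Running this on a few darks versus your bias frames should make a leak obvious long before it poisons the master dark.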
2. The flats are not exposed long enough. Looking at the linear histogram (DDP button switched off on the right-hand side of APP), you can see the peaks are to the left and likely below the noise floor of the sensor. Ideally you want these peaks in the middle of the histogram, or even further to the right (without clipping on the right side, of course).
3. I'm seeing that the red channel is very low in signal. Is this because you're using a filter? It's not a big deal, but you do need to be aware that the exposure must also get the red channel exposed properly.
Besides that, I would advise not taking the bias frames at the shortest possible exposure. That is indeed a "truer" bias, but a lot of sensors have trouble behaving linearly at that exposure length. So I would take those at about 0.5s per exposure. This will still contain mainly the bias signal, and it makes sure the data is linear.