Error message when integrating images from different instruments
I wanted to integrate data from two different instruments. Different scopes and different cameras with different formats and resolutions. I followed a tutorial here:
However, when I came to the integration part I had this error message:
Encountered error in module:
java.lang.ArrayIndexOutOfBoundsException: Index 1 out of bounds for length 1
I am running version 2.0.0 beta 6. According to the tutorial this integration should work, but it didn't.
However, the images were also of different types: one was OSC color and the other was mono. Could that be the reason for the crash?
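For what it's worth, that exact "Index 1 out of bounds for length 1" message is what Java produces when code loops over the channels of one frame and indexes into another frame that has fewer channels: a mono frame has one channel, an OSC color frame has three. A hypothetical sketch (not APP's actual code) that reproduces the message:

```java
// Hypothetical illustration of the crash: integrating per channel using
// the channel count of the first frame, while the second frame has fewer
// channels. This is NOT APP's actual code, just a minimal reproduction.
public class ChannelMismatch {
    // a[c][p] and b[c][p]: channel-major pixel data for two frames.
    static double[] integrate(double[][] a, double[][] b) {
        double[] result = new double[a.length];
        for (int c = 0; c < a.length; c++) {
            // If b has fewer channels than a, b[c] throws once c >= b.length.
            result[c] = (mean(a[c]) + mean(b[c])) / 2.0;
        }
        return result;
    }

    static double mean(double[] xs) {
        double s = 0;
        for (double x : xs) s += x;
        return s / xs.length;
    }

    public static void main(String[] args) {
        double[][] osc  = { {1, 2}, {3, 4}, {5, 6} }; // 3 channels (R, G, B)
        double[][] mono = { {7, 8} };                 // 1 channel
        try {
            integrate(osc, mono);
        } catch (ArrayIndexOutOfBoundsException e) {
            // Prints: Index 1 out of bounds for length 1
            System.out.println(e.getMessage());
        }
    }
}
```

So the error is almost certainly the mixed mono/OSC input rather than anything you did wrong in the tutorial steps.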
Could you try first separating the OSC data into mono channels? You can do this by selecting "split channels" in tab 2 (all the way down), then saving the results. Load the split frames in as mono, add the other mono data, and try again.
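Conceptually, the split step just de-interleaves the color frame into one mono plane per channel, so everything downstream sees single-channel data. A minimal sketch of that idea (assumed layout: interleaved RGB pixels; this is not APP's internal code):

```java
// Sketch of what a "split channels" step does conceptually: turn one
// RGB frame into three mono planes. Assumes an interleaved pixel buffer
// [R0, G0, B0, R1, G1, B1, ...]; not APP's actual implementation.
public class SplitChannels {
    static double[][] split(double[] interleaved) {
        int pixels = interleaved.length / 3;
        double[][] planes = new double[3][pixels]; // planes[0]=R, [1]=G, [2]=B
        for (int p = 0; p < pixels; p++) {
            for (int c = 0; c < 3; c++) {
                planes[c][p] = interleaved[3 * p + c];
            }
        }
        return planes;
    }

    public static void main(String[] args) {
        double[] rgb = {10, 20, 30, 11, 21, 31}; // two RGB pixels
        double[][] planes = split(rgb);
        // planes[0] = {10, 11} (R), planes[1] = {20, 21} (G), planes[2] = {30, 31} (B)
        System.out.println(planes[0][1] + " " + planes[1][0] + " " + planes[2][1]);
    }
}
```

Once every frame is a single mono plane like this, the channel counts match and the integration no longer hits the mismatch.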
Hi, thanks. Yes, splitting the color image into channels makes it possible to integrate them separately, as well as the mono narrowband data. But the only way I can see to put it all together again is to use RGB combination. That means I cannot get the advantages of a local normalization correction when integrating data from different instruments with different image sizes and orientations. The tutorial I mentioned shows the great advantages of this, but it only uses color images. Is there a way to get local normalization while working with both color data and narrowband mono data from several different instruments?
And secondly, even if it is not possible to integrate mixed types of data, I think APP should report this in some other way than the "Out of bounds" error.
You can, actually: once you have all mono data, you can load it all in again as lights and process everything together. It is easiest to calibrate each set first with its own calibration data (since the sensors differ).