The Aberration Inspector is now merged into the latest code base, so anyone on bleeding-edge builds will have it in any build started from now on.
To launch it, set a "Mosaic Mask" and Algorithm = Linear 1 Pass. A button to the right of "Auto Focus" called "Inspector" will be enabled. Press it and focus will run as usual, but at the end of the focus run the Aberration Inspector dialog will launch.
NINA's Hocus Focus plugin has been the only reason I've been keeping the NTFS partition on my laptop's SSD since converting to RasPI/Ekos.
Honestly, I've even considered having a go at implementing a similar tool.
Really happy to find your work is in such a mature phase John, and I am very much looking forward to trying it out!
I do have a tilt adjustment device, and here is the record of how much fun I had with it one night (time flows bottom up, the top row represents the final results):
Two concerns I had during that session were:
Unclear margin of error (dependence on conditions, etc.). I should have just re-run it multiple times without any adjustments, but it was time for bed.
(This may be related to the first point.) Sometimes the effects of my adjustments showed up as if they had been made in the opposite direction. I ended up resorting to what you referred to in your video as the "collimation approach", whereby I would just note down adjustments and their effects between iterations and work from there.
Do I get it right that this is part of KStars rather than INDI? I am building the backend and can take in any recent merge, but I'm using pre-cooked images of KStars for Mac, so I guess I will need to wait until the next release.
But very much looking forward to giving it a go!
The Aberration Inspector is built into Ekos' Focus module. There are no changes to INDI for it. It's merged, so if you pull and build the latest "master" you will have it. The easiest way to tell is that on the Focus screen there is a new button next to "Auto Focus" called "Inspector". If you just take official KStars/Ekos releases, it will be in 3.6.8, which is scheduled for the beginning of December.
On your points:
1. Unclear margin of error. I agree it would be good to have some sort of traffic light system: green = good (no point in fiddling any more); amber = quite good but could be better; red = bad, needs adjusting. However, I don't know how this would work on different systems, so I need user feedback to be able to calibrate it.
2. "Direction" of adjustments. This could get very complicated, so I've kind of "ignored" it and left it to the user to figure out. I could probably improve this with feedback. The issue is that lenses / mirrors all invert / flip the image in different ways, so different scopes with different combinations of lenses / mirrors will change the left / right / top / bottom of the image, which in turn will change what the user needs to do with the tilt device. So, for example, moving screw 1 on the tilt device "clockwise" may be the correct adjustment on "scope A" but the wrong adjustment for "scope B". Still, with more user feedback I may be able to build in more help for the user.
Re: feedback, here is a thought... perhaps you could show a value using the traffic light system at each of the corners to visually tell the user which corners need improving (rather than which screws need adjusting). So if the top RHS is showing a red value, I know I need to bring that down, and as I move the screws, if the value changes to amber then I know I am moving in the right direction.
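A rough sketch of how such a per-corner indicator might work (the function name, the thresholds, and the tick values below are all illustrative assumptions, not anything in the actual Ekos code):

```python
# Hypothetical sketch of the per-corner "traffic light" idea: colour each
# corner by how far its best-focus position sits from the best corner, so
# the user sees which corner needs improving rather than which screw to turn.
# Thresholds (in focuser ticks) are made up and would need calibration.

def corner_colours(corners, amber_ticks=15, red_ticks=30):
    """corners: dict like {"TL": ticks, "TR": ..., "BL": ..., "BR": ...}.
    Returns a colour for each corner relative to the best-focused one."""
    best = min(corners.values())
    colours = {}
    for name, ticks in corners.items():
        offset = ticks - best
        if offset <= amber_ticks:
            colours[name] = "green"
        elif offset <= red_ticks:
            colours[name] = "amber"
        else:
            colours[name] = "red"
    return colours

print(corner_colours({"TL": 1000, "TR": 1040, "BL": 1005, "BR": 1020}))
# → {'TL': 'green', 'TR': 'red', 'BL': 'green', 'BR': 'amber'}
```

With something like this, "TR is red" maps directly to "bring that corner down", regardless of which physical screw the particular scope/tilt-device combination requires.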
I will wait for the new KStars release as I haven't quite got around to building it myself.
Regarding the traffic light system idea you described, correct me if I am wrong, but to me it sounds more like a "how close you are to the ideal" kind of indicator.
I suppose it would use the CFZ as its main reference and light up green when the delta between the smallest and the largest solutions (out of the 4 corners and the centre) is within the CFZ.
I was rather talking about the measure of trustworthiness, if you will, of the tool's output, as it relates to current atmospheric conditions.
I do not really see an obvious way to quantify that other than having the user run the autofocus routine several times without any adjustments and then fitting the results to some [normal] distribution.
Then you would be able to show, right next to your output of Delta (ticks), how it compares to the calculated standard deviation.
If my Delta readings turn out to be between -20 and 10, but the standard deviation is 30 (just an example; you could use a derived metric, such as 2 sigma, or whatever else), I would know not to bother any further.
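The baseline idea above could be sketched as follows (everything here is illustrative: the function names, the choice of 2 sigma, and the sample deltas are made up, and the real implementation would have to decide how many re-runs are practical):

```python
import statistics

# Hypothetical sketch of the repeatability check: run autofocus several
# times with no tilt adjustments, then treat the spread of the resulting
# delta readings as the measurement noise floor set by conditions.

def noise_floor(repeat_deltas, k=2):
    """Return a +/- threshold (k sample standard deviations) below which
    a measured tilt delta is indistinguishable from seeing noise."""
    return k * statistics.stdev(repeat_deltas)

def is_significant(measured_delta, repeat_deltas, k=2):
    """True only if the measured delta exceeds the noise floor,
    i.e. adjusting the tilt device is actually worthwhile."""
    return abs(measured_delta) > noise_floor(repeat_deltas, k)

baseline = [-12, 7, 3, -9, 11]       # deltas from unchanged re-runs (ticks)
print(is_significant(15, baseline))  # → False: within the noise, go to bed
print(is_significant(30, baseline))  # → True: worth adjusting
```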
Another approach could be to make use of PHD2's Guiding Assistant output (or some similar routine), where it estimates the high-frequency atmospheric vibrations. But I suppose it could end up quite sophisticated, as you would essentially need to translate those readings (which relate to motion in the camera sensor plane) into a distance of focuser travel perpendicular to the sensor plane.
That, in turn, would probably need to rely on how exactly a star image changes with focuser movement, etc.
Now, regarding where (corners / adjustment screws) the adjustments need to be made, I think you have already basically nailed it in your video @
where you rotated the 3D graph and said "I'm sort of looking at this towards the telescope".
I think all the user needs to know is how the coordinate grid relates to their camera sensor. From the tool's perspective, that means showing:
1. Where the telescope is (which side of the sensor plane on the graphic). Maybe just show an arrow towards the telescope where the "Axis of Telescope" label is?
2. Where the top of the camera is, which is already done by the TL/TR/BL/BR labels.
I made an attempt to use the Inspector on a rather fresh build of KStars on a Raspberry Pi and found it crashing just as the autofocus routine finished and the results were about to be displayed. Nothing in the logs at all. Could it be that the graphical representation of the results relies on some libraries that are not bundled in my build of KStars?
The solution was to run KStars remotely from the laptop, although on my macOS 12.7.1 the latest KStars 3.6.9 failed to start; 3.6.8 did, and luckily it already has the AI.
It was a pleasure to use, but I had to go back to the explanation video on YouTube to remind myself that the view is towards the telescope.
The point where you mention the AI crash happens is when the AI is doing all its work. If the libraries weren't available on the Pi then KStars probably wouldn't build, although it's possible some run-time component is missing. You could try minimising the number of datapoints just to see if it launches. Also, the AI uses QtDataVisualization, which isn't widely used in KStars but is used in the star profile viewer. Can you launch that on the Pi?
Did you have verbose focus logging turned on? It's hard to comment much more without more info.
On the Mac issue, you'll need to upgrade to Sonoma to run 3.6.9.