Language switching in the Mac version of KStars does not seem to work, depending on the OS version.
10.12.6 → language switching works
10.13.6 → language switching does not work
10.15.2 → language switching does not work
I tried the following on the OS versions where language switching fails:
- Checked the application's permission settings
- Repaired access rights with Disk Utility
- Edited the configuration file at /Users/User/Library/Preferences/klanguageoverridesrc
None of these resolved the issue.
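For reference, this is the kind of entry I edited in klanguageoverridesrc (a sketch only; I am assuming the standard KDE [Language] group format here, and the exact key name may differ on macOS):

```ini
[Language]
kstars=ja
```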
Please let me know if there is a workaround.
I want to see a diagram of the workflow for a complete understanding.
I mentioned this on my wishlist before, but I think two fixes would make things much easier.
One is that the INDI server starts up in its initial state and provides the driver list to the client software.
The other is the addition of missing functions in each Ekos module.
Even if driver management is split off on its own, if the Ekos modules' functions are insufficient, you eventually have to go back to the control panel, so nothing improves.
I think it is better to work on this in parallel with extending Ekos.
As you can see, capture with the ASI120MC does not finish after today's update.
Please check the attached log.
I use Ekos, the star chart, and the driver control panel in my workspace, but the Live Video window only appears on the star chart screen.
I would like an independent-window option added, as the other windows have.
The problem has been fixed.
Thank you for always responding promptly.
indi-full cannot be installed on a NanoPi-M4 running Armbian due to dependency issues. (It reports that the indi-aok driver is broken.)
In the past there was an installation problem with the FriendlyDesktop system on ARM64, but this time the problem seems to occur only on Armbian.
sudo apt-get -o Dpkg::Options::="--force-overwrite" -f install
sudo apt-get update && sudo apt-get -y dist-upgrade
Running the above commands did not help.
Please let me know how this can be fixed.
The flow is that of an astronomical camera; there is no internal processing as in a DSLR.
So I think the software side, such as Ekos, will need to be designed so the camera can be handled comfortably.
Live view (INDI live video) would display a low-resolution stream image (for example, 800x600); when zoomed, it would automatically switch to a high-resolution setting and display an ROI image with the same number of pixels (800x600).
This way, the network driver can reduce the amount of data transferred for live view.
Frames can also be extracted from the lightweight stream and processed very quickly over the network.
This flow is suitable for solver analysis, autofocus analysis, and autoguide.
ASIAir already uses this concept, so solver processing and live view are very fast.
I hope you will consider this in a future update.
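The zoom-dependent switching described above can be sketched as a small piece of logic. This is only an illustration of the idea, not INDI or Ekos code; the sensor size, function name, and return format are all hypothetical.

```python
# Sketch of the proposed live-view logic: stream the whole sensor
# downscaled to a fixed low resolution by default, and switch to a
# same-sized native-resolution ROI crop when the user zooms in.
# All names and sizes here are hypothetical, not part of the INDI API.

FULL_W, FULL_H = 4656, 3520   # example full sensor size (assumed)
VIEW_W, VIEW_H = 800, 600     # fixed live-view frame size

def select_stream_mode(zoom, center_x, center_y):
    """Return (mode, x, y, w, h) describing the frame the driver sends.

    zoom <= 1.0 -> whole sensor downscaled to 800x600
    zoom  > 1.0 -> native-resolution 800x600 ROI around (center_x, center_y)
    """
    if zoom <= 1.0:
        return ("downscaled", 0, 0, FULL_W, FULL_H)
    # Clamp the ROI so it stays inside the sensor.
    x = min(max(center_x - VIEW_W // 2, 0), FULL_W - VIEW_W)
    y = min(max(center_y - VIEW_H // 2, 0), FULL_H - VIEW_H)
    return ("roi", x, y, VIEW_W, VIEW_H)
```

Because the transferred frame is always 800x600 in either mode, the network load stays constant no matter how far the user zooms.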
I changed the NanoPi-M4 system to Armbian.
I have exactly the same symptoms as the problem described in post #40409. (The driver crashes.)
The log is attached.
Ekos currently previews, autofocuses, and autoguides at the image resolution specified by the driver.
Therefore, these processes take a very long time on an SBC.
I think processing speed would improve dramatically if this were changed to a flow that extracts and processes images from the stream.
(The frames could be converted to FITS for convenience in post-processing.)
Regarding the live stacking that Mr. Han is currently working on, I think that if a function to extract images from the stream were added, it could be enjoyed at a realistic speed.
The images (video frames) used for display and analysis are lightweight and fast, while captures are saved at the file size set in the driver; assigning a different role to each use makes for comfortable operation.
Ekos can handle both RAW and FITS with a DSLR.
(RAW is fast because there is no conversion step.)
With a CCD camera, RAW images are converted to FITS by the driver.
(I think it would be even faster if the driver skipped the conversion step.)
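The extract-from-stream flow proposed above can be illustrated with a minimal sketch. This is not the Ekos/INDI implementation; the frame generator below is a hypothetical stand-in for an INDI video stream, and the stacking is a plain running average.

```python
# Minimal sketch of the proposed flow: pull lightweight frames out of a
# video stream and average-stack them, instead of requesting a separate
# full-resolution exposure for every analysis step.

def frame_stream(n_frames, width=8, height=6):
    """Hypothetical stand-in for an INDI video stream: yields 2-D frames."""
    for i in range(n_frames):
        # Each frame is a height x width grid of 8-bit pixel values.
        yield [[(x + y + i) % 256 for x in range(width)] for y in range(height)]

def stack_frames(frames):
    """Average-stack a sequence of equally sized frames."""
    acc, count = None, 0
    for frame in frames:
        if acc is None:
            acc = [[0.0] * len(frame[0]) for _ in frame]
        for y, row in enumerate(frame):
            for x, v in enumerate(row):
                acc[y][x] += v
        count += 1
    return [[v / count for v in row] for row in acc]

stacked = stack_frames(frame_stream(4))
```

Because the frames arrive continuously from the stream, the stacker never waits on a download round-trip, which is where the speed gain on an SBC would come from.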
Thank you for the wonderful work.
Many people running EAA with SharpCap enjoy using an automated GoTo mount such as the AZ-GTi.
After centering the target object while watching the live view, they press the stacking button to start stacking.
The stacked image is adjusted manually (color, brightness, etc.) in real time with sliders, and they have fun doing so.
When moving to a new target, they press the stop button to stop the stack before slewing.
Q. Can the following be done with this stacking function?
1. Stacking using the live-view video.
2. Adjusting brightness and color in real time during stacking.
I checked the updated app, but it appears to stack by loading saved still images. (It is similar to AstroToaster's function.)
It would be even better if features 1 and 2 above were also added.
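The two requested features can be sketched together. This assumes nothing about the actual Ekos live-stacking code; the point of the sketch is only the separation of the raw running stack from the display rendering, which is what makes a real-time brightness slider cheap.

```python
# Sketch of the two requested features: (1) fold live-view frames into a
# running stack as they arrive, and (2) adjust brightness for display in
# real time without re-stacking, by keeping the accumulated data separate
# from the displayed image.  Class and method names are hypothetical.

class LiveStacker:
    def __init__(self):
        self.acc = None      # running sum of frames
        self.count = 0

    def add_frame(self, frame):
        """Feature 1: fold one live-view frame into the running stack."""
        if self.acc is None:
            self.acc = [[0.0] * len(frame[0]) for _ in frame]
        for y, row in enumerate(frame):
            for x, v in enumerate(row):
                self.acc[y][x] += v
        self.count += 1

    def display(self, brightness=1.0):
        """Feature 2: render the current stack with a display-only
        brightness factor; the accumulated data is never modified."""
        return [[min(255.0, (v / self.count) * brightness) for v in row]
                for row in self.acc]

stacker = LiveStacker()
stacker.add_frame([[10, 20], [30, 40]])
stacker.add_frame([[30, 40], [50, 60]])
```

Since `display` only scales the running average, dragging a brightness slider re-renders instantly regardless of how many frames have been stacked.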