I don't want to start a flame war, and I suspect this post might trigger one, but it comes from a place of genuine confusion and concern for the non-Windows, open-source astronomy automation world.
To be clear, I have only been in the hobby a few months, but I do have 25 years of global product marketing and management experience in Silicon Valley, so I'm not a total tech noob (I just act like one).
As a long-term UNIX fanboi and Mac OS X and open source lover (I still remember compiling my first gcc compiler on a Sun 3/60 back when workstations were still a thing), I was surprised to learn that ASCOM existed and was so Windows-centric, and delighted to learn about INDI and all the great work that's been done by the community.
Now I learn that despite all this, ASCOM isn't dead: people are still working on a more open, cross-platform ASCOM "bridge" (yuck), and worst of all, INDI seems to have spawned a splinter group called INDIGO. INDIGO appears to have some great ideas for technically improving on INDI, but unless I misunderstood, it has diverted resources away from INDI and added yet another reason for the ASCOM world to keep doing its own thing.
This reminds me so much of the UNIX wars (BSD, System V, HP-UX, Solaris, AIX, Irix, etc.) that I worry we're going down the same path. Should I just give up now and go with Windows and ASCOM? It's a junk operating system, but at least the people running the standards seem to be a little more, I guess, disciplined?
Again, not hating on anyone - the fact that so much time and love has been devoted freely is incredible - I'm just trying to understand where to invest my time, energy and frankly money to ensure I can enjoy my hobby while having a bit of fun tinkering.
It's on my To Do list - work is crazy busy, but once I've got an MVP ready I'll send it out for feedback.
Agree 100%. I was thinking my first task would be to write a script that goes through a file system tree, finds every FITS file, reads its header info, and builds a .esq file for every combination of Darks and Biases I need to generate/maintain, for example.
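A minimal sketch of the scanning half of that script, using only the Python standard library (FITS headers are plain 80-byte ASCII cards in 2880-byte blocks, so no astropy is strictly needed for this part). The keyword names (FRAME, EXPTIME, GAIN, CCD-TEMP) are what INDI drivers typically write, but check your own files - and I'm deliberately not generating the .esq XML here, since I haven't pinned down its schema yet:

```python
import os

def read_fits_header(path):
    """Parse the primary FITS header: 80-byte ASCII cards in 2880-byte blocks."""
    cards = {}
    with open(path, "rb") as f:
        while True:
            block = f.read(2880)
            if len(block) < 2880:
                break  # truncated or odd file; stop rather than crash
            for i in range(0, 2880, 80):
                card = block[i:i + 80].decode("ascii", errors="replace")
                key = card[:8].strip()
                if key == "END":
                    return cards
                if card[8:10] == "= ":
                    # drop any trailing comment, then quotes and padding
                    value = card[10:].split("/")[0].strip().strip("'").strip()
                    cards[key] = value
    return cards

def calibration_combos(root):
    """Walk a tree and collect the unique (frame, exposure, gain, temp) combos."""
    combos = set()
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.lower().endswith((".fits", ".fit", ".fts")):
                h = read_fits_header(os.path.join(dirpath, name))
                combos.add((h.get("FRAME", "?"), h.get("EXPTIME", "?"),
                            h.get("GAIN", "?"), h.get("CCD-TEMP", "?")))
    return combos
```

From the resulting set you would then emit one Dark/Bias sequence job per unique exposure/gain/temperature combination.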
The housekeeping of managing data is probably one of the greatest requirements for "operational control" - otherwise why are we doing all the other things? (We all know it's actually to play with toys, but we need to convince our suffering friends and family it's for the greater good.)
Thanks Wouter - just to add for anyone who is having this problem, if you are WAY off on your initial alignment you may not be able to see the little red "cross hair" that is mentioned in the instructions (the fact that there is also a purple set of cross hairs may even add to your confusion given the bug - as it did for me).
If this is the case, then you should use the purple cross hairs as a proxy for the red one you will see as you get closer to PA. What you need to do is:
1). Do your three alignment photos as per the standard alignment process.
2). When the results are computed you will see a PA error in degrees/minutes/seconds - the SCALAR for this is accurate, the VECTOR is not (i.e. the size of the error is correct, it's just pointing in the wrong direction), so you will need to do the opposite of what the instructions say with regard to the pink line.
3). To help you guide, I recommend positioning the cursor (with its pink target finder/cross hairs) somewhere on screen as though an imaginary line continued from the end of the pink line all the way down as far as your screen allows. For this reason it's usually best to double click somewhere on the results screen to move the end of the pink line, creating as much space as possible for your "virtual line" (it will make more sense with screenshots, I'm sure - PM me if you need help).
IGNORE ANY INSTRUCTIONS ABOUT SELECTING A BRIGHT STAR, you will use your pink target icon to make your own virtual star
4). Once you've done this (moved the pink line), click Next and start refreshing as per the instructions. Now move your mouse pointer over the results screen (where you can see the star field) and place it (BUT DO NOT CLICK) so that the pointer/target icon floats on an imaginary vector joining the tip of the pink line, in one continuous line, to wherever you place the pointer/target icon - this becomes your "virtual star".
5). Now adjust your RA and DEC to move the tip of the pink line towards your virtual star - remember, we are doing this just to get the scope close enough to the SCP/NCP that we can see the little red target Wouter mentions.
6). Then follow Wouter's instructions as above. The hardest part will be remembering to do the exact opposite, and of course lining your star up with the very end of the TINY little pink line, but with a little practice it is possible to get within a few arc seconds.
7). wait for the fix/workaround
Please! Great idea.
Maybe, to reduce interface clutter and respect the workflow, it might be an idea to have a "daytime", "darkroom", or "file management" tab in Ekos which is concerned only with managing your data/files. We could start with image files, but then even add things like config files, a telescope and eyepiece library, etc.
Then whatever code is developed can focus on pre-pre-processing steps like bulk renaming based on FITS variables, adding metadata, maybe a simple light table to sort and remove bad exposures, and, in my dreams, an "add this region to the Ekos scheduler for tomorrow because I just realised I took 18 subs of a cloud" button.
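For the bulk-rename step, the core is just a filename template filled from header values. A sketch assuming you already have the header as a dict (from astropy.io.fits or whatever reader you prefer) - the template string and keyword names here are my own invention, not anything Ekos defines:

```python
import re

def name_from_header(header, index,
                     template="{OBJECT}_{FILTER}_{EXPTIME}s_{index:03d}.fits"):
    """Build a filename from FITS header values, sanitising unsafe characters."""
    safe = {k: re.sub(r"[^A-Za-z0-9._-]", "_", str(v)) for k, v in header.items()}
    return template.format(index=index, **safe)

# Example: name_from_header({"OBJECT": "M 42", "FILTER": "Ha", "EXPTIME": 120}, 7)
# -> "M_42_Ha_120s_007.fits"
```

The sanitising step matters because OBJECT names like "M 42" or "NGC 253/SMC" would otherwise produce spaces or directory separators in filenames.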
Great idea, will try when Melbourne isn’t cloudy at night, expect a reply in 2021
In all seriousness, Friday should be good for testing; if there are other ideas I'll add them to my list.
Now to figure out why KStars on my Mac crashes/loses connectivity when my RPi sleeps.
I too am having the same problem and arrived at the same solution - do the exact opposite of what the tool is telling me!
To your questions:
1. Aligned with the finder scope (upper left of the primary, only roughly aligned) - ZWO290MM mini camera mounted upside down in the refractor (to orient the image the right way visually).
2. My mount rotates in the specified direction - always west (I can try east if that helps); that is, the weights rise to the east and the OTA sinks to the west.
3. No matter what I did with the mount, Ekos always reported "mount is oriented to the top left of the celestial pole" (or similar language) - that strikes me as odd.
4. Please NO!
When I was trying to diagnose the problem, I thought it was strange that the tool indicated the celestial pole was not visible in the FOV, but when I looked at where the mount had solved to in the KStars planetarium, it looked like a). it WAS in the FOV, and b). the orientation of the sensor as indicated didn't seem to match what I saw in the eyepiece (inverted).
So I am thinking a bug in the WCS/solver when close to the pole and/or the FOV crossing the meridian?
I had a quick test with a simple webcam, and while Ekos picked up the camera itself easily, it wasn't so happy streaming for more than about 30 seconds before the system started to behave erratically, and I gave up until I have more time to experiment.
If window management in KStars and Ekos were able to manage tiles tied to particular windows/portals, we could put whatever data we wanted in there, e.g. cameras for situational awareness, weather radars, etc.
I might have to dig out my compiler and stop talking and start coding!
Just to add to this - I've made two changes since the last post, both of which appear to have helped enormously:
1). changed streaming settings to MPEG in the driver
2). hard-wired the RPi4's Ethernet into my D-Link 1665 access point in client mode - effectively still using my WiFi network, but with a high quality signal - an absolutely huge difference
If you're an RPi user I strongly recommend ditching the built-in WiFi and going with an Ethernet umbilical (or getting a better antenna) - especially true of the RPi4, which seems to suffer from signal strength issues, more so if you are using an aluminium case to help with cooling, as I am.
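For anyone wanting to try this: a quick way to check whether the onboard WiFi really is the weak link, and then to disable it outright once the Ethernet umbilical is in place. The dtoverlay lines are standard Raspberry Pi OS device-tree overlays, and the paths assume a stock install:

```shell
# Check current WiFi link quality / signal level (low numbers = suspect radio)
cat /proc/net/wireless

# Once Ethernet is up and working, disable the onboard WiFi (and Bluetooth,
# if unused) at boot by adding overlays to the firmware config, then reboot:
echo "dtoverlay=disable-wifi" | sudo tee -a /boot/config.txt
echo "dtoverlay=disable-bt"   | sudo tee -a /boot/config.txt
sudo reboot
```

Disabling the radio at the firmware level (rather than just bringing the interface down) also means it can't wake up and cause interference mid-session.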