I've been using a roll-your-own RPi setup with Ekos etc. for a while now, and I'd like to start doing some EAA so I can share my hobby better with family (and friends once the pandemic is over). I found getting some of the tools like oaLive / oaCapture working to be a lot of work, and there were conflicts with other software. So I tried out Astroberry, and to my delight, many tools just work. I'd like a setup where I can show images of objects, captured and stacked in real time, on a tablet to my kids. The simpler the setup and interface, the better. Any tips on doing this with Astroberry? I briefly tried oaLive last night and it seemed to work, but I'm not sure what settings would be best for stacking etc. Any hints or tips would be greatly appreciated!
‘The simpler the better’ doesn’t quite sum up my arrangements, but in the absence of other responses I’ll tell you what I do. Firstly, my main interest is in capturing images for subsequent processing, so what is usually described as AP. I have Astroberry on an RPi4 booting from a large SSD and operating as a 5 GHz wifi hotspot. I get everything set up and operating from my laptop using VNC Viewer. The image files are stored on the SSD as they are captured, and by virtue of running Syncthing on the RPi and the PC, all files in the imaging folder are transferred over to a matching folder on the laptop. The laptop also runs SharpCap Pro, which I use to live stack the image files as they arrive in the folder. It’s not instant, since each image capture takes perhaps two or three minutes, but the live-stacked output lets me examine my ongoing results, actively adjusting the histogram without affecting the captured image files. I often move indoors after establishing a capture run and monitor the individual images via VNC, and/or switch to SharpCap and the live-stacked view.
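If Syncthing feels like overkill, the one-way "mirror the imaging folder to the laptop" idea above can be approximated with a few lines of Python. This is only an illustrative sketch, not how Syncthing actually works (Syncthing is configured through its web GUI and does bidirectional block-level sync); the paths and function name here are hypothetical:

```python
import shutil
from pathlib import Path

def mirror_new_captures(src: Path, dst: Path) -> list[str]:
    """Copy any capture files from src that are not yet present in dst.

    A naive one-way sync: run it periodically (e.g. from cron) to pull
    newly saved frames into a local folder for live stacking.
    """
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(src.glob("*")):
        target = dst / f.name
        if f.is_file() and not target.exists():
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(f.name)
    return copied

# Hypothetical folders; in practice src would be a mounted share or
# NFS export of the RPi's imaging folder.
# mirror_new_captures(Path("/mnt/astroberry/captures"), Path.home() / "captures")
```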
For pure EAA I have been trying out connecting the imaging camera directly to SharpCap Pro, using 15-second images at higher gain settings and live stacking, say, 30 images before moving on to another target, roughly every eight minutes. I still run my Astroberry setup, but the guidescope becomes the only scope in Ekos and is used for all polar alignment, target finding and plate solving (and guiding if needed). Before running this arrangement I align the guidescope, primary scope (and finderscope) during the day so that a plate-solved target in the guidescope lands close to the centre of the main scope's imaging sensor.
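The cadence above checks out arithmetically; a quick back-of-envelope calculation (illustrative only, constants taken from the post):

```python
SUBS = 30          # live-stacked frames per target
EXPOSURE_S = 15    # seconds per frame

def integration_minutes(subs: int, exposure_s: float) -> float:
    """Total integration time per target, in minutes."""
    return subs * exposure_s / 60

print(integration_minutes(SUBS, EXPOSURE_S))  # 7.5
```

7.5 minutes of integration, plus slewing and plate solving, is indeed "roughly every eight minutes" per target.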
Currently running KStars/Ekos/Indi on a Raspberry Pi 4B 4GByte under Raspberry Pi OS thanks to Astroberry
Sky-Watcher ED80 Pro with 0.85x FR/Corrector with Canon DSLR and ASI533MC on AZ-EQ5 GT mount
Other Sky-Watcher scopes - SkyMax 150 Pro Maksutov Cassegrain and Explorer 150p Newtonian
1. Run indiserver in command-line mode only (no Ekos/KStars), so it just drives your devices (Astroberry will do this simply via its panels), and run CCDciel/CdC on a remote Windows laptop. The latter will automatically download the files to your laptop and control all the devices, plus it includes plate solving via a number of routes (you have to set up the plate-solving software: ASTAP, ANSVR, etc.). Then use AstroToaster (which uses DSS) on your laptop to create and show real-time stacked images.
2. Do the "normal" KStars/Ekos local setup and monitor via VNC, but place your image folder in the Public area. If you are using Astroberry, this area is automatically shared via Samba and you will be able to see the files from Windows. Then run AstroToaster on your laptop and use its inbuilt Rsync function to sync and monitor the image folder on the RPi. Plate solving would be done on the RPi.
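Since the original poster is on Linux rather than Windows, the same Astroberry Public share can also be mounted there via cifs. A hypothetical `/etc/fstab` line (hostname, mount point and uid are assumptions to adapt to your network):

```
//astroberry.local/Public  /mnt/astroberry  cifs  guest,uid=1000  0  0
```

After `mkdir -p /mnt/astroberry` and `mount /mnt/astroberry`, the RPi's image folder appears as a local directory that any stacking tool can monitor.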
You didn't say what your kit is, so this is a "general" solution assuming no guiding, just an imaging camera and a mount. It also assumes you have a Windows device and are not an Apple person.
Live image stacking, IMHO, just does not work well on an RPi (any model), as it needs resources (CPU etc.) that the poor old RPi cannot provide while also running everything else.
Thanks for the replies! Yes, I didn't give much detail. I normally use my Raspberry Pi / Ekos setup similarly to others here. I run the whole Ekos/KStars/PHD stack on the Pi and the images are saved to an NFS mount on my Linux PC -- I'm a Linux-only type of maniac. I image an object over several days gathering data, and finally stack when I have enough. This is awesome, and I love it, but I want a more active way to share my hobby with my kids, so I'm trying to figure out EAA so that we can cruise around the sky looking at nebulae, galaxies, planets and all those amazing things up there. But without just seeing a grey smudge.
So last night I tried my new Astroberry setup to see if I could do this. It was pretty successful. I dug up my old DSLR because that's the best colour camera I've got, and put it on an EdgeHD 8 with a reducer. I use SkySafari to navigate around, and I ran a VNC session from my desktop where I first plate solved on a couple of stars to get everything working well. I also run VNC on my phone and tablet. I used the tablet to get a steady stream of captures with a Bahtinov mask to get everything nicely focused, and then I was off to the races.

I tried oaLive, which is supposed to do live stacking of sorts, but I had to shut down Ekos for it to be able to connect to my camera. I wasn't able to get any good results from it either, so I just gave up on that. I went back to Ekos and just did 30 s stretched exposures. My polar alignment is always pretty good because I have metal flanges pounded into the ground where I set up my telescope, and that keeps the alignment repeatable. So I was able to get pretty good 30-second images unguided. And since it was just looping taking shots, I could slew around to different objects and get some interesting views. The Dumbbell Nebula looked amazing. You could make out the Veil Nebula as well. This is not optimal, since 30 s may not be enough for some objects, but it's good enough for what I wanted out of it.
I have to say that I'm very happy with the setup, although it could be better. I set the tablet VNC viewer to read-only so that the kids won't mess up the whole thing. I, or the kids, can also connect to the scope through SkySafari, so we'll all be able to explore the sky. I probably could set it up to save the captures to an NFS share on a laptop and use some sort of live stacking software to get better views, but I prefer to keep it simple and only use mobile devices. I could always add a guide scope to get longer exposures if I need that for some objects.
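For anyone curious what the "live stacking software" step would actually do with those saved frames, the core idea is just a running average that folds each new frame into the stack as it arrives. A minimal sketch in Python/NumPy (illustrative only: real tools like SharpCap or AstroToaster also align frames, reject outliers and stretch the result; `live_stack` is a hypothetical name):

```python
import numpy as np

def live_stack(frames):
    """Incremental mean stack: fold each new frame into the running average.

    No alignment, hot-pixel rejection or stretching -- just the
    running-mean update that live-stacking tools build on.
    """
    stack = None
    for n, frame in enumerate(frames, start=1):
        f = np.asarray(frame, dtype=np.float64)
        # Running mean: stack_n = stack_{n-1} + (frame_n - stack_{n-1}) / n
        stack = f.copy() if stack is None else stack + (f - stack) / n
    return stack
```

Because the stack is updated one frame at a time, the displayed image improves on every capture without re-reading the whole folder, which is what makes the "watch it sharpen up live" experience work.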
tl;dr: Success! Unguided but plate-solved, polar-aligned, regular old Ekos/KStars with looping 30 s captures works great. One phone/tablet running SkySafari to navigate, and one running read-only VNC to view the captures!