So I'm getting less than ideal FPS when streaming video from my CCD. Here's the setup:
QHY5L-II mono CCD connected to a Raspberry Pi 3 running a fairly recent INDI build from GitHub (less than a week old).
The RPi3 is configured as a WiFi hotspot using its internal WiFi; iperf indicates about 49 Mbit/s over this link.
On my laptop I run kstars-bleeding and latest INDI from GitHub.
In the INDI Control Panel I've set the following for the QHY:
- USB Speed: 2
- USB Traffic: 0
- Expose Duration(s): 0.001
Here are the FPS figures I'm getting at various resolutions when I enable streaming, plus some vital signs of what's happening on the RPi according to xosview:

1280x960: 7.3 FPS
- CPU load: 59-62%
- RAM usage: 435M
- Network utilization: 5.4-5.6 Mbytes/sec
- No IO waiting

640x480: 7.4 FPS
- CPU load: 57-59%
- RAM usage: 425M
- Network utilization: 2.9-3.3 Mbytes/sec
- No IO waiting

320x240: 7.4 FPS
- CPU load: 39-45%
- RAM usage: 425M
- Network utilization: 0.56-1.1 Mbytes/sec
- No IO waiting
I seem to get pretty much the same FPS no matter what I do, so I'm not sure what the bottleneck is. The only thing that seems to get maxed out (at the same figure iperf was indicating) is the network utilization at the maximum resolution, yet the FPS does not change. In the indiserver log on the RPi3 I get the following line each time I start streaming:
What does this mean, and is there anything I can do about it?
I know my setup is not ideal for streaming, but I'd like to get the most out of what I have. It feels like it should be able to do better; for example, running
on the RPi3 I'm getting the following FPS: 1280x960: 9-11 FPS
Hmm... I just tried the QHY connected directly to my laptop, running the INDI server and Ekos on the same machine, and I'm getting exactly the same 7.3-7.4 FPS as with the RPi3 over WiFi. It feels like it's deliberately capped at this rate?
I'm somewhat tempted to take a closer look at the oaCapture source code and see whether it would be easy to write an INDI CCD driver based on it. I believe it does not depend on the QHY SDK. Does that make any sense? Obviously it would make the driver less generic...
OK, I just asked Thomas if he can add this feature to the INDI driver as well.

Regarding the speed, I'm working on a new websocket-based solution that should help with the frame rates. Now if I can just finish the new SynScan driver...
OK, I feel stupid now, since you are Thomas, the developer behind this driver. Can something like this be done on the INDI driver side as well? Right now, JPEG frames are sent. Is this software zoom or a camera control feature?
Copello, thanks for the suggestion.
Unfortunately, I'm looking for a way to capture a video stream of raw cropped/5x-zoomed frames (like the eos-movrec app does), and the INDIGO ccd gphoto2 driver provides only a feature for capturing cropped/5x-zoomed preview stills, since it doesn't have the Streaming tab/feature.
I was checking the code from eos-movrec and I found some interesting things:
1- They also use gp_camera_capture_preview(...) to capture from the Live View stream, but they employ a mutex to control updates to the live stream buffer (a memcpy() of the data pointer obtained from gp_file_get_data_and_size() into the frame buffer), plus a file-write control to try to stabilize the FPS. It looks like this would have to be implemented on both the INDI::StreamManager side and the driver side;
2- Instead of doing software cropping, like INDI Photo does, they use _gp_set_config_value_string(camera, "eoszoom", str_param, camera_context) and _gp_set_config_value_string(camera, "eoszoomposition", str_param, camera_context) to enable zoom and set the zoom/crop position in the camera's own Live View. According to
, only the 1x and 5x zoom factors work; unfortunately, 10x zoom doesn't. The equivalent command-line call would be something like: gphoto2 --set-config eoszoom=5 --set-config eoszoomposition=640,320 --capture-preview