
ASI178 Camera Performance on Raspberry Pi 4

  • Posts: 28
  • Thank you received: 17
I'm confused.
USB (via libusb) works like this: you send a request for how much data you want to receive, and if the device has data to send, a reply is given.
The amount of data received may be equal to or less than what was requested. In an ideal world, a request should be sent for the length of the entire frame; ideally several such requests, so that there is always one ready for new data.
Of course, at the lower layer the request can be split into smaller ones, but at the application layer you shouldn't need to care about that.
After all, in my library I split the transfers into chunks of at most 1MB. I took this size from what I saw when decompiling the original library.
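
To illustrate the idea (just a sketch, not the actual cameraboost code; the device handle, endpoint, frame size and timeout below are hypothetical placeholders), queuing several asynchronous bulk IN requests with libusb looks roughly like this:

    // Sketch: keep several asynchronous bulk IN transfers queued so the
    // device always has a request it can fill. Handle, endpoint and sizes
    // are placeholders, not values taken from libASICamera2Boost.
    #include <libusb-1.0/libusb.h>
    #include <cstdio>

    static void LIBUSB_CALL on_done(libusb_transfer *xfer)
    {
        if (xfer->status == LIBUSB_TRANSFER_COMPLETED)
            std::printf("got %d bytes\n", xfer->actual_length);
        libusb_submit_transfer(xfer); // resubmit so a request is always pending
    }

    void queue_bulk_reads(libusb_device_handle *handle, unsigned char endpoint,
                          int frame_size, int count)
    {
        for (int i = 0; i < count; ++i)
        {
            libusb_transfer *xfer = libusb_alloc_transfer(0);
            unsigned char *buf = new unsigned char[frame_size];
            libusb_fill_bulk_transfer(xfer, handle, endpoint, buf, frame_size,
                                      on_done, nullptr, 1000 /* timeout, ms */);
            libusb_submit_transfer(xfer);
        }
        // completions are then driven by libusb_handle_events() in a loop
    }

(Build with -lusb-1.0.)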

In the "16-bit" branch I increased this limit to 256MB, and the number of frame queries to 4. Of course, the limit is a limit, if the frame size is, for example, 10MB, a 10MB transfer is initiated (4 times).

You can modify these values in the lib/src/cameraboost.h file:
    enum // constants
    {
        Transfers = 4, // TODO limit to maximum possible transfers
        InitialBuffers = Transfers + 1,
        MaximumTransferChunkSize = 256 * 1024 * 1024 // 256 MB
    };

At the moment, in the "16-bit" branch, I don't see any problems with the 8/16-bit mode.
3 years 4 months ago #64258


  • Posts: 220
  • Thank you received: 27
Confused? I don't really understand. The reason I asked about 16-bit support is for larger/faster systems with lots of memory. In some cases (solar work) I need high frame rates and large dynamic range (16-bit). As I explained earlier, when doing this on Linux the camera starts spitting out status code 11 and drops lots of frames. The behaviour is unpredictable (sometimes it goes OK, sometimes it loses frames by the dozen). On the same hardware with W10, not a single frame is lost (and it happens with every capture software, so it must be in the ZWO library or the Linux USB handling).

I don't really want to do this on an RPi (write speed to disk will probably slow things down as well). It seems logical that you need to keep the buffers smaller to have it run on the smaller RPis.

Anyway, I compiled your 16-bit branch on the RPi; the results for 8-bit and 16-bit are attached.

Regards,

Paul
3 years 4 months ago #64271


  • Posts: 28
  • Thank you received: 17
I meant the confusion about the original ZWO library.
I didn't mean why you need 16-bit mode ;)
Above I wrote my thoughts on USB and the library.
They divide each transfer into 1MB chunks. For the ASI178 camera, 7 bulk transfers are created for 8-bit mode and 14 for 16-bit mode.
This is what I followed when writing the wrapper.
Now I have modified it so that the transfer is not split into parts at the application layer.

My goal is to fix the ZWO library so that it is usable on Linux. Because that's where the main problem lies.

Looking at your attachment, I can see that everything is fine, right?
When receiving the frames, none were lost, and you achieved FPS above the manufacturer's specification.
I do not understand.
You don't want to run the camera on Linux on RPi?
Are you running it on W10 on RPi? How do you use the camera?
How long are your videos? How can I help you?
I don't understand what you're referring to.
The last modification was based on using a smaller number of larger buffers and, in fact, on how their addresses are distributed among the transfers. I'm operating on megabytes here, far from any hardware limitations.

In my project I want to use the RPi for wireless transmission from high-speed cameras. After optimizing the code, I think it is feasible. That's why I started this work.
Writing directly to disk should not be a problem either. Additionally, by introducing simple lossless compression, it is possible to reduce the size by about half.

Could you describe how you currently use the hardware/software and how you would like to use it?

After some tweaks, I think libASICamera2Boost will be ready. I still have to fix the INDI Library so that no frames get lost.
3 years 4 months ago #64276


  • Posts: 220
  • Thank you received: 27
Hi Pawel,

Mainly I am trying to use the RPi for deep-sky (with an ATIK camera). For planetary (not at the moment, the oppositions are done) and solar I still use a W10 system because of the problems with that ASI178.

For planetary/solar I currently use a W10 PC (AMD Ryzen 3 1200, 16GB memory, hard disk + M.2 SSD (PCIe mode), 5G wireless network (800+ Mb/s)). I connect to the PC via a VNC desktop connection.
During a solar session I usually collect >50GB of data (~1000 frames at full resolution per video). For Mars (but that was with an ROI) it was 10000 frames per video, ~3 minutes per filter (RGB and L or IR). Captures are done on the M.2 SSD, using VNC from my desktop. Afterwards I copy the files over the wireless network to my desktop PC for processing. I easily get the listed speeds of the camera without any frame loss.
Capturing is done with FireCapture (I also tried Ubuntu on that machine: dropped frames all over the place).

If I can get an RPi working at reasonable speed, it will become an option for planetary. (Solar I don't really know, but if the AMD can get up to speed, I can drop Windows.)

Hope this clears it up a bit?

Still some questions on how the final result will work: will everything need to be compiled against your solution, or will it be possible to use it as a "drop-in" replacement for the ASI libraries?

Regards,

Paul
Last edit: 3 years 4 months ago by PDB.
3 years 4 months ago #64277


  • Posts: 28
  • Thank you received: 17
Hello Paul!

The library will be prepared so that it can easily replace the original, as described in README.md.
A plain replacement is not the most efficient solution, because the interface of the original library is designed around copying the data.
However, if you can modify the code, I have provided a function to get frames without copying.

In short, you will be able to swap the library - it will work better than the original library.
The INDI Library (I'll add some fixes) will use the additional features implemented in libAsiCamera2Boost - it will work even better.

I am thinking about adding the possibility to configure the library from a configuration file. That would make it easy to adjust the parameters in the case of a simple replacement.
I bought my first telescope, and I'm spending more time in the code than under the sky ;)
When I find out what programs you are using, I will try to suggest improvements to those programs.
The following user(s) said Thank You: Jasem Mutlaq
3 years 4 months ago #64308


  • Posts: 220
  • Thank you received: 27
Thanks Pawel.

Unfortunately there is no way I can modify the FireCapture code (not open source, I can ask the author if he would be willing ....)
Further testing with my ASI178 gets me more and more frustrated (it keeps me from spending money on a new, bigger cooled camera as long as I am not sure it is going to work).
On the RPi 4, connected via USB3, WiFi disabled (to avoid interference), Gbit network cable connected:
I start INDI and a framing session with 10-second exposures.
From time to time (fairly frequently) there is status 3 (instead of 2) in ASIGetExpStatus, so the exposure is retried. Not a problem on 5-second exposures, but a real pain if it happens on 300-second exposures.


Then I tested on my AMD Ryzen.
Here I could not see any retries.
But with video capture, as soon as either high-speed mode or 16-bit is enabled, it starts to spit out error 11
(the same for INDI streaming as for FireCapture).
If I boot W10 on the same PC, there are no problems with video.
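
For reference, the streaming path that spits out the error boils down to a loop like this rough sketch against the plain ASICamera2 SDK (camera ID, ROI, exposure and frame count are placeholder values; if I read the SDK header correctly, error 11 is ASI_ERROR_TIMEOUT, i.e. ASIGetVideoData did not get a complete frame in time):

    // Rough sketch of a high-speed / 16-bit video-capture loop (ASICamera2 SDK).
    // The ROI, exposure and frame count below are placeholder values.
    #include "ASICamera2.h"
    #include <cstdio>
    #include <vector>

    int main()
    {
        int id = 0;                                               // first connected camera
        ASIOpenCamera(id);
        ASIInitCamera(id);
        ASISetROIFormat(id, 3096, 2080, 1, ASI_IMG_RAW16);        // 16-bit mode
        ASISetControlValue(id, ASI_HIGH_SPEED_MODE, 1, ASI_FALSE);
        ASISetControlValue(id, ASI_EXPOSURE, 1000, ASI_FALSE);    // 1 ms

        std::vector<unsigned char> frame(3096L * 2080 * 2);
        ASIStartVideoCapture(id);
        for (int i = 0; i < 1000; ++i)
        {
            ASI_ERROR_CODE rc = ASIGetVideoData(id, frame.data(), frame.size(), 500);
            if (rc != ASI_SUCCESS)
                std::printf("frame %d: error %d\n", i, rc);       // 11 == ASI_ERROR_TIMEOUT
        }
        ASIStopVideoCapture(id);
        ASICloseCamera(id);
    }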

Is it the camera hardware? I doubt it; it looks more like the SDK. I wonder if this is only the 178, or how do others run cameras with even more megapixels on an RPi?

Paul
3 years 4 months ago #64326


  • Posts: 28
  • Thank you received: 17
Hi Paul,

When you write about tests, you mean the original, unmodified ASI library, right?
Once I finish some minor fixes for high-speed recording, I'll move on to fixing long exposures.
Apart from the hiccups when starting high-speed recording, it seems to me that the situation in my library is under control.

I am also planning to buy a bigger camera from ASI and I think I will get the situation under control as well.
What kind of camera would you buy? Maybe I'll buy the same one.

I put the 8/16-bit version on the master branch; it works fine for me. I added the ability to set the buffer sizes via a function call and via a configuration file. Everything is well described in the README and the header file.
My next step is to fix the timeouts and make capturing run smoothly.

Maybe tonight, Polish time, I will manage to find some time and get it over with ;)
After completing the library and Indi, I'll take a look at other programs. FireCapture added to the list.
3 years 4 months ago #64331


  • Posts: 220
  • Thank you received: 27
Hi Pawel
Correct. (I'm writing a small test program now that just loops on exposures and waits for the result. That's easier to adapt and test.)
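
Roughly, the loop I have in mind looks like this (a sketch only, against the stock ASICamera2 SDK; camera ID, ROI, exposure time and iteration count are placeholders; in the SDK enum, status 2 is ASI_EXP_SUCCESS and status 3 is ASI_EXP_FAILED):

    // Sketch of the exposure test loop: start an exposure, poll the status,
    // download the frame on success, count failures (status 3) otherwise.
    #include "ASICamera2.h"
    #include <cstdio>
    #include <vector>
    #include <unistd.h>

    int main()
    {
        int id = 0;                                                    // first connected camera
        ASIOpenCamera(id);
        ASIInitCamera(id);
        ASISetROIFormat(id, 3096, 2080, 1, ASI_IMG_RAW16);
        ASISetControlValue(id, ASI_EXPOSURE, 10 * 1000000, ASI_FALSE); // 10 s in microseconds

        std::vector<unsigned char> frame(3096L * 2080 * 2);
        int failures = 0;
        for (int i = 0; i < 100; ++i)
        {
            ASIStartExposure(id, ASI_FALSE);
            ASI_EXPOSURE_STATUS st = ASI_EXP_WORKING;
            while (st == ASI_EXP_WORKING)
            {
                usleep(10000);                                         // poll every 10 ms
                ASIGetExpStatus(id, &st);
            }
            if (st == ASI_EXP_SUCCESS)                                 // status 2
                ASIGetDataAfterExp(id, frame.data(), frame.size());
            else                                                       // status 3 == ASI_EXP_FAILED
                std::printf("exposure %d failed (%d failures so far)\n", i, ++failures);
        }
        ASICloseCamera(id);
    }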

That's a difficult question... My main C8 is equipped with an Atik (mainly trying light curves and exoplanet transits), but I have a small 80mm achromat which I would like to use for some deep-sky. But since it is an achromat, I am still thinking about what to do (mono cam, or a new refractor + colour cam).
Don't rush... (ha, Polish time, same as my timezone)

Regards,

Paul
3 years 4 months ago #64332


  • Posts: 396
  • Thank you received: 17
I just now found this thread, so I am way behind. I run both an ASI120MC and an ASI533MC camera on my rig. I had tried planetary imaging when Mars was at opposition (back in August), but I was not getting decent frame rates, so I switched to running ASIStudio on a W10 machine: it runs on a 64-bit system and has all the bells and whistles needed for these cameras. I think they have an SDK on their website as well that may be used to develop drivers (I am not a programmer and am out of my league in this topic, other than that I could try to run your tests above with my cameras).

I was going to try it on a 64-bit version of Ubuntu on my RPi 4 and see if I could get ASIStudio to run there, but I found out two things: one, the Astroberry Server will not run under a 64-bit system (yet?), and two, the RPi 4 that I bought last year only had 1 GB RAM (don't know what I was thinking - I guess at the time I just wanted the USB 3.0 connection). Anyway, I have since purchased a new RPi 4 with 8 GB and have Astroberry Server running under standard 32-bit Raspbian.

I have been doing DSO imaging lately, so I have not tried running video capture. I am interested in having both DSO and planetary capability all under the Astroberry Server on the RPi, so all of this seems pertinent to me. I will try running the test with my cameras when I get the opportunity.

thanks for all the work,
Ron
3 years 4 months ago #64335


  • Posts: 37
  • Thank you received: 0
Hi Paweł,
Just my "out of the box" thought. Did you ever tried to contact directly with ZWO about this issue ?
3 years 4 months ago #64409


  • Posts: 2
  • Thank you received: 1
Hello Luzik,

I did. They told me they would fix it. I referenced my own performance tests and also Pawel's code, so hopefully they will have a good reference and fix it properly.

See bbs.astronomy-imaging-camera.com/d/11601...sdk-on-jetson-nano/5 for details.
The following user(s) said Thank You: Jasem Mutlaq
3 years 3 months ago #65033


  • Posts: 2
  • Thank you received: 1
Hello Pawel,

thank you for this library. I was afraid I would have to write it on my own :)

I tested it on Jetson Xavier NX and I can achieve 1936x1096x8@170 at about 40% CPU usage.

When I did some tests with my own ZWO lib patch, I could get the same bitstream (around 2.5Gbit) at about 0.3 to 3% CPU usage, so I think it can still be optimized somehow. But to be honest, I haven't checked your code yet, so I can't really tell.

Currently, on the Xavier, I am using CUDA to debayer the camera data and OpenGL to display the result. For details you can refer to forums.developer.nvidia.com/t/jetson-tel...processing/160782/13

Everything works really great. I can achieve 170 FPS at that bitrate with 50% CPU and 50% GPU usage. I will probably optimize it further, but currently I am satisfied. The initial test results are very good and more than enough for the final gadget we are targeting.

My further plan is to save the video to an NVMe drive in uncompressed AVI format (full FPS) and stream part of the video (15-30 FPS) over WiFi to a tablet or mobile device using the Jetson H.264 hardware encoder. I tested this already with a Logitech C270 webcam and it works great. On the client side I have used this: github.com/matijagaspar/ws-avc-player
Last edit: 3 years 3 months ago by Fis.
3 years 3 months ago #65034

