"Unable to start capturing: Invalid argument" on Raspberry Pi 2B #139
A simple fix for now is to enable the legacy camera stack, assuming you are on Raspbian Bullseye. In `raspi-config`:

3 Interface Options > I1 Legacy Camera > Yes

and voilà, it works (I hope, this is what I had to do). My assumption is that ustreamer relies on the older system, and since the author is busy with PiKVM, the code hasn't been updated for the newer libcamera stack.
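For headless setups, the same menu path above can reportedly be flipped non-interactively; `do_legacy` is the Bullseye-era `raspi-config nonint` hook (0 = enable), but this is an assumption to verify on your own system. The sketch below only prints the commands so they can be reviewed first:

```shell
# Hedged sketch: non-interactive equivalent of the raspi-config menu path.
# Printed rather than executed so you can review it before running.
cmd="sudo raspi-config nonint do_legacy 0"
echo "$cmd"
echo "sudo reboot   # required for the stack switch to take effect"
```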
Thanks @benjaminjacobreji for your suggestion to revert to the legacy stack! If possible, I would prefer keeping libcamera, as I also use
I assume this is related to the recent (late 2021) introduction of the Media Controller API in newer Linux kernels and Raspbian Bullseye. Support for the Media Controller API was enabled for several camera modules, including the OV5647 (Pi Camera v1, end of life) and the IMX219 (Pi Camera v2). It's not enabled for the TC358743, though. On a fresh install of Raspbian Bullseye, you'll likely have the Media Controller API enabled. If you have, say, an IMX219 module, you can disable support for the Media Controller API by loading the appropriate overlay and adding
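As a concrete example (an assumption on my part, based on the overlay parameters documented in `/boot/overlays/README`; the IMX219 overlay name and the `media-controller` parameter should be checked against your module and OS version):

```
# /boot/config.txt -- hypothetical sketch for an IMX219 module:
# disable auto-detection and load the overlay with the
# Media Controller API turned off.
camera_auto_detect=0
dtoverlay=imx219,media-controller=0
```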
I ran into the same issue as you with:
My V4L2 settings after a fresh boot:

```
v4l2-ctl --list-devices
v4l2-ctl --info --device /dev/video0
```

That the Media Controller API is available can be seen in the Device Caps value. Below is the topology for the Image Signal Processor (ISP):

```
media-ctl --print-topology --device /dev/media1
```

```
media-ctl --print-topology --device /dev/media2
```

The topology associated with the camera sensor module:

```
media-ctl --print-topology --device /dev/media3
```
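The diagnostics above can be wrapped in a small loop so every media node is dumped in one go (a sketch; it silently skips tools and device nodes that are missing):

```shell
# Print V4L2 devices plus the topology of every media controller node.
dump_topologies() {
  command -v v4l2-ctl >/dev/null 2>&1 && v4l2-ctl --list-devices
  command -v media-ctl >/dev/null 2>&1 || { echo "media-ctl not installed"; return 0; }
  for m in /dev/media*; do
    [ -e "$m" ] || continue     # glob stays literal when no nodes exist
    echo "== $m =="
    media-ctl --print-topology --device "$m"
  done
  return 0
}
dump_topologies
```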
If just run
In the kernel log (
The first problem can be corrected by updating the size on the V4L2 subdev.
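For instance, the subdev pad format can be set with `media-ctl --set-v4l2`; the media node, entity name, and format string below are hypothetical placeholders, so take the real ones from `media-ctl --print-topology` on your system. The sketch only builds and prints the invocation:

```shell
# Sketch only: assembles (and prints) a media-ctl command that sets the
# sensor pad format to match the requested capture size.
# MEDIA_DEV, ENTITY, and FMT are example values, not real ones.
MEDIA_DEV=/dev/media3
ENTITY='imx219 10-0010'
FMT='SRGGB10_1X10/1920x1080'
CMD="media-ctl --device $MEDIA_DEV --set-v4l2 '\"$ENTITY\":0 [fmt:$FMT]'"
echo "$CMD"   # review, then run it manually on the Pi
```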
Unfortunately, this is not enough, so I tried to disable use of the Media Controller API by updating
After a reboot, I see:
None of these pixel formats are among the supported ones. I attempted to circumvent this problem by adding definitions for these formats to the source code, but all I ended up with after running
It fails the same way on the IMX519 (Arducam 16MP).
Hello, I just tried the latest code from the master branch (ustreamer 5.3).

I tried the examples from the README:
Both of these commands failed with:
With
@mdevaev, do you have any input on what's necessary to use ustreamer for streaming video from MIPI CSI-2 sensors under Raspbian Bullseye with the modern camera stack (unicam)? (Comments in this thread suggest it's possible to work around the problem by enabling the legacy camera stack; I did not try this. Personally, I'd rather spend time on getting things to work with the modern camera stack, to future-proof things and ease setup on a default install.) Thanks for your time, and keep up the good work with PiKVM!
OK. I have been playing with this to use a new

You can see my branch here: https://github.com/ayufan-research/ustreamer. Compile it and then try to run it. I am having trouble with higher resolutions. It seems to work on 5.10 and 5.15. On 5.15 it does work with
```
ustreamer --host :: -r 1920x1080 -m pRAA -w 1 -b 1 --debug --no-log-colors -d /dev/video0
```
Nice work! I just tested this with a Raspberry Pi Zero 2 W and later a Zero W v1.1, both running kernel 5.15.32.

This might be completely incidental, but the Zero 2 W appears to have fried within about 15 seconds of running these commands, before I managed to open the browser to test the stream. I lost SSH access and smelled a very hot PCB. After powering it off, I let it cool for 10 minutes and then tried to power it up again, but it immediately becomes very hot and doesn't light any LEDs. Not sure what happened here -- it had been running idle for a few hours on a clean desk, and I only accessed it over SSH.

Anyway, I moved the SD card and the camera over to a Raspberry Pi Zero W v1.1 and re-ran the commands. If I move my hand in front of the camera, I see a dark grey silhouette of it on a darker/black background on the streaming page. Is that expected? (I also note that if I run

BTW, what Raspberry Pi hardware did you test with?
Same issue.
Sorry for the late answer. I'm using ustreamer with some USB webcams and a CSI-HDMI bridge, so I don't know how to make it work with a regular CSI camera; I don't have any. I'll try to get one, but I can't promise that it will happen soon.
Thanks a lot for looking into this issue, @mdevaev!
I did not have such problems.
The problem is that bcm2835-isp is not configured. The
In general, all those DSLR-type sensors (from Sony and others) return only an 8-bit or 10-bit packed Bayer format (once you hook into this stream, the latency is simply awesome) that resembles exactly how the CMOS sensor of a camera is built. Once you have this data, you need to manually calculate the exposure time and the analogue and digital gains. This can be done by an ISP, and the Raspberry Pi does have one (bcm2835-isp).

Why am I writing about this? The ISP is software controlled. Trying to re-implement

Regarding ustreamer/src/ustreamer/device.c (line 421 in 7ceb8a3):
b. Use v4l2-ctl to manually feed the ISP with a stream of data: `v4l2-ctl --device /dev/video0 --out-device /dev/video13 --stream-mmap --stream-out-dmabuf` (it will copy the RG10 data into the ISP).
c. Run `ustreamer --device /dev/video14` (it will expose YUYV, or whatever you configure it to).
d. This will work, but with pretty terrible latency.

Exactly the above behavior is implemented in https://github.com/ayufan-research/ustreamer: the camera data is fed into the ISP, and that is fed into JPEG. The implementation there is pretty clumsy, since the ustreamer architecture does not make it easy to create multi-stage pipelines for data processing or to hook libcamera. For example, even though https://github.com/ayufan-research/ustreamer hooks the ISP, it does not offer a YUYV stream for H264 support, and there's no easy way around this limitation except maybe rewriting the

The integrated kernel data pipelining (as done with

As an exercise I tried to do it, and thus (sorry for slightly hijacking this thread with my own project) created over the last few days https://github.com/ayufan-research/camera-streamer. After trying the first iteration in the ustreamer codebase (due to its lack of flexible multi-stage pipelines), it felt slightly easier to start from scratch with a project that is v4l2-hardware centric. I wrote the

It appears that I might have taken the wrong approach with https://github.com/ayufan-research/ustreamer by implementing an ISP step instead of rewriting

Ref.:
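Steps (b) and (c) above can be sketched as a small guarded script. The device numbers are assumptions taken from this thread (/dev/video0 = unicam raw sensor node, /dev/video13 = bcm2835-isp input, /dev/video14 = ISP processed output) and will differ per system; the script is a no-op without that hardware:

```shell
# Sketch of the manual ISP pipeline described above; device numbers assumed.
feed_isp() {
  if [ -e /dev/video0 ] && [ -e /dev/video13 ]; then
    # step (b): copy raw Bayer frames from the sensor into the ISP input
    v4l2-ctl --device /dev/video0 --out-device /dev/video13 \
             --stream-mmap --stream-out-dmabuf
  else
    echo "ISP device nodes not present; nothing to do"
  fi
  return 0
}
feed_isp
# step (c), in a second terminal:
#   ustreamer --device /dev/video14
```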
@ayufan, thank you for the research. It seems ustreamer is not the best choice for the IMX219 and similar sensors. ustreamer's main purpose is HDMI capture. When I wrote ustreamer, libcamera did not exist on the RPi, so V4L2 encoders have no flexible support right now. Actually, I don't mind adding support for the ISP and other things if it doesn't spoil latency, etc.
@mdevaev Adding ISP support is doable, but it is not the best idea :) If you want to support cameras, consider making
The latency difference, according to my tests, is due to the way buffers are enqueued by libcamera and how this introduces an additional processing delay. Depending on how you queue (on which slope) and when you enqueue a buffer, you might achieve significantly better latency, as shown in the ISP mode. This is especially visible when you have the FPS set lower than the sensor can provide. Having too many buffers enqueued does increase latency. Doing this very dumb test (https://www.youtube.com/watch?v=e8ZtPSIfWPc):
2328x1748@30fps

```
# libcamera
$ ./camera_stream -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=120 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=63/0, processing_ms=101.1, frame_ms=33.1
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=64/0, processing_ms=99.2, frame_ms=31.9
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=65/0, processing_ms=99.6, frame_ms=34.8

# direct ISP-mode
$ ./camera_stream -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=30 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=32/0, processing_ms=49.7, frame_ms=33.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=33/0, processing_ms=49.7, frame_ms=33.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=34/0, processing_ms=49.7, frame_ms=33.4
```

2328x1748@10fps

```
# libcamera
$ ./camera_stream -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=10 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=585/0, processing_ms=155.3, frame_ms=100.0
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=586/0, processing_ms=155.5, frame_ms=100.2

# direct ISP-mode
$ ./camera_stream -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=10 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=260/0, processing_ms=57.5, frame_ms=99.7
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=261/0, processing_ms=57.6, frame_ms=100.0
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=262/0, processing_ms=58.0, frame_ms=100.4
```

1280x720@120fps for Arducam 16MP

```
# libcamera
$ ./camera_stream -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=120 -camera-width=1280 -camera-height=720 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=139/0, processing_ms=20.1, frame_ms=7.9
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=140/0, processing_ms=20.6, frame_ms=8.8
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=141/0, processing_ms=19.8, frame_ms=8.1

# direct ISP-mode
$ ./camera_stream -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=120 -camera-width=1280 -camera-height=720 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=157/0, processing_ms=18.5, frame_ms=8.4
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=158/0, processing_ms=18.5, frame_ms=8.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=159/0, processing_ms=18.5, frame_ms=8.3
```
I'm using your latest camera-streamer for my Voron (3D printer) with an Arducam IMX477 and a CM4. camera-streamer is amazing; ustreamer didn't work for me either. It works absolutely perfectly with the new media controller. I'm streaming at 1080p@60fps, it's almost completely latency free, takes no CPU usage, and I'm overall extremely happy! Your fork of ustreamer sadly doesn't work for me, with the same error as mainline ustreamer. However, please continue the work on camera-streamer; it's perfect, I love it, and thank you so much for that! Cheers
I've got a camera and experimented with it. Too much work would be needed for native support, but Raspberry offers "libcamerify", which works fine with uStreamer. On Raspberry Pi OS, install
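A minimal sketch of the libcamerify route, with package names taken from later replies in this thread (Raspberry Pi OS Bullseye packaging; verify them on your system). It degrades to a hint when the wrapper is not installed:

```shell
# Run ustreamer through libcamera's V4L2 compatibility wrapper.
run_with_libcamerify() {
  if command -v libcamerify >/dev/null 2>&1; then
    libcamerify ustreamer --host=0.0.0.0 --port=8080
  else
    echo "libcamerify not found; try: sudo apt install libcamera-v4l2 libcamera-tools"
    return 0
  fi
}
run_with_libcamerify
```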
I have Bullseye on a Pi 4 and an RPi Camera 2.

```
bristolfish@Mycodo:~ $ sudo apt install libcamera-tools
```
Install libcamera-v4l2.
Thanks, that made part of the error go away, but I'm left with this one:

```
sudo apt-get install libcamera-v4l2
```
Is it ustreamer from the Debian repo? Show `ustreamer --version`.
```
bristolfish@Mycodo:~ $ ustreamer --version
```
This is a very old release. Uninstall it and build from git if you want to use the m2m encoder.
Thanks. I'll try the Docker version, I think.
It is better to use a native build. The list of dependencies is in the README; the build takes only a few seconds.
@samuk Sup?
I do want to end up running it in Docker; the underlying OS will be https://github.com/balena-os. I'll have a go at native first, though, if that's your recommendation. What's up with Docker?
Docker requires forwarding the port and passing through the video device. There are no problems with it, but I usually recommend the native launch because it's easier.
Is it working?
Thanks for the guidance!! I got a cheap camera on AliExpress (apparently it is a V1) and got "Unable to start capturing: Invalid argument". I then found in this thread the instructions to install libcamera-v4l2. After that, this works:

```
sudo modprobe bcm2835-v4l2
sudo libcamerify ./ustreamer --encoder=m2m-video --host=0.0.0.0 --port=80 -r 1296x972
```
Hi,
I get "Unable to start capturing: Invalid argument" on a Raspberry Pi 2B running Raspberry Pi OS Bullseye.

I installed ustreamer from "deb http://raspbian.raspberrypi.org/raspbian/ bullseye". I tried upgrading to ustreamer v4.11 (source build), but the error persists.

According to strace, the error occurs in `ioctl(8, VIDIOC_STREAMON, [V4L2_BUF_TYPE_VIDEO_CAPTURE])`. libcamera-vid works fine.

Any ideas how to solve this issue? Thank you!
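The strace step above can be narrowed to just the V4L2 ioctls so the failing call stands out. This is a sketch: the ustreamer arguments are placeholders, and the wrapper simply reports when the tools are not installed:

```shell
# Trace only ioctl calls and grep for the streaming start-up call.
trace_streamon() {
  if command -v strace >/dev/null 2>&1 && command -v ustreamer >/dev/null 2>&1; then
    # timeout keeps the capture attempt short; adjust the device/args to yours
    timeout 5 strace -f -e trace=ioctl -o /tmp/ustreamer.ioctl.log \
      ustreamer --device=/dev/video0
    grep VIDIOC_STREAMON /tmp/ustreamer.ioctl.log
  else
    echo "strace or ustreamer not installed"
  fi
  return 0
}
trace_streamon
```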