"Unable to start capturing: Invalid argument" on Raspberry Pi 2B #139

Open
fphammerle opened this issue Jan 27, 2022 · 31 comments
Assignees
Labels
type:bug Something isn't working

Comments

@fphammerle
Contributor

fphammerle commented Jan 27, 2022

Hi,

I get "Unable to start capturing: Invalid argument" on a Raspberry Pi 2B running Raspberry Pi OS Bullseye:

$ sudo -g video ustreamer --resolution 1920x1080
-- INFO  [13606.486      main] -- Using internal blank placeholder
-- INFO  [13606.487      main] -- Listening HTTP on [127.0.0.1]:8080
-- INFO  [13606.488    stream] -- Using V4L2 device: /dev/video0
-- INFO  [13606.488    stream] -- Using desired FPS: 0
-- INFO  [13606.488      http] -- Starting HTTP eventloop ...
================================================================================
-- INFO  [13606.493    stream] -- Device fd=8 opened
-- INFO  [13606.493    stream] -- Using input channel: 0
-- INFO  [13606.493    stream] -- Using resolution: 1920x1080
-- INFO  [13606.493    stream] -- Using pixelformat: YUYV
-- INFO  [13606.493    stream] -- Querying HW FPS changing is not supported
-- INFO  [13606.493    stream] -- Using IO method: MMAP
-- INFO  [13606.523    stream] -- Requested 5 device buffers, got 5
-- ERROR [13606.524    stream] -- Unable to start capturing: Invalid argument
-- INFO  [13606.542    stream] -- Device fd=8 closed
-- INFO  [13606.542    stream] -- Sleeping 1 seconds before new stream init ...
================================================================================
-- INFO  [13607.548    stream] -- Device fd=8 opened
-- INFO  [13607.548    stream] -- Using input channel: 0
-- INFO  [13607.548    stream] -- Using resolution: 1920x1080
-- INFO  [13607.548    stream] -- Using pixelformat: YUYV
-- INFO  [13607.548    stream] -- Querying HW FPS changing is not supported
-- INFO  [13607.548    stream] -- Using IO method: MMAP
-- INFO  [13607.578    stream] -- Requested 5 device buffers, got 5
-- ERROR [13607.580    stream] -- Unable to start capturing: Invalid argument
-- INFO  [13607.597    stream] -- Device fd=8 closed
-- INFO  [13607.597    stream] -- Sleeping 1 seconds before new stream init ...
^C-- INFO  [13608.282      main] -- ===== Stopping by SIGINT =====
-- INFO  [13608.283      http] -- HTTP eventloop stopped
-- INFO  [13608.598      main] -- Bye-bye
$ journalctl --lines=3 -o cat
   someone : TTY=pts/1 ; PWD=/home/someone ; USER=someone ; GROUP=video ; COMMAND=/usr/bin/ustreamer>
unicam 3f801000.csi: Failed to start media pipeline: -22
unicam 3f801000.csi: Failed to start media pipeline: -22
$ cat /proc/cpuinfo | tail -n 4 | grep -v Serial
Hardware	: BCM2835
Revision	: a01041
Model		: Raspberry Pi 2 Model B Rev 1.1
$ cat /etc/issue
Raspbian GNU/Linux 11 \n \l
$ sudo -g video v4l2-ctl --info
Driver Info:
	Driver name      : unicam
	Card type        : unicam
	Bus info         : platform:3f801000.csi
	Driver version   : 5.10.92
	Capabilities     : 0xa5a00001
		Video Capture
		Metadata Capture
		Read/Write
		Streaming
		Extended Pix Format
		Device Capabilities
	Device Caps      : 0x25200001
		Video Capture
		Read/Write
		Streaming
		Extended Pix Format
Media Driver Info:
	Driver name      : unicam
	Model            : unicam
	Serial           : 
	Bus info         : platform:3f801000.csi
	Media version    : 5.10.92
	Hardware revision: 0x00000000 (0)
	Driver version   : 5.10.92
Interface Info:
	ID               : 0x03000005
	Type             : V4L Video
Entity Info:
	ID               : 0x00000003 (3)
	Name             : unicam-image
	Function         : V4L2 I/O
	Flags         : default
	Pad 0x01000004   : 0: Sink
	 Link 0x02000007: from remote pad 0x1000002 of entity 'ov5647 10-0036': Data, Enabled, Immutable
$ sudo -g video v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
	Type: Video Capture

	[0]: 'YUYV' (YUYV 4:2:2)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[1]: 'UYVY' (UYVY 4:2:2)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[2]: 'YVYU' (YVYU 4:2:2)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[3]: 'VYUY' (VYUY 4:2:2)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[4]: 'RGBP' (16-bit RGB 5-6-5)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[5]: 'RGBR' (16-bit RGB 5-6-5 BE)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[6]: 'RGBO' (16-bit A/XRGB 1-5-5-5)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[7]: 'RGBQ' (16-bit A/XRGB 1-5-5-5 BE)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[8]: 'RGB3' (24-bit RGB 8-8-8)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[9]: 'BGR3' (24-bit BGR 8-8-8)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[10]: 'RGB4' (32-bit A/XRGB 8-8-8-8)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[11]: 'BA81' (8-bit Bayer BGBG/GRGR)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[12]: 'GBRG' (8-bit Bayer GBGB/RGRG)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[13]: 'GRBG' (8-bit Bayer GRGR/BGBG)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[14]: 'RGGB' (8-bit Bayer RGRG/GBGB)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[15]: 'pBAA' (10-bit Bayer BGBG/GRGR Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[16]: 'BG10' (10-bit Bayer BGBG/GRGR)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[17]: 'pGAA' (10-bit Bayer GBGB/RGRG Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[18]: 'GB10' (10-bit Bayer GBGB/RGRG)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[19]: 'pgAA' (10-bit Bayer GRGR/BGBG Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[20]: 'BA10' (10-bit Bayer GRGR/BGBG)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[21]: 'pRAA' (10-bit Bayer RGRG/GBGB Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[22]: 'RG10' (10-bit Bayer RGRG/GBGB)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[23]: 'pBCC' (12-bit Bayer BGBG/GRGR Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[24]: 'BG12' (12-bit Bayer BGBG/GRGR)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[25]: 'pGCC' (12-bit Bayer GBGB/RGRG Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[26]: 'GB12' (12-bit Bayer GBGB/RGRG)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[27]: 'pgCC' (12-bit Bayer GRGR/BGBG Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[28]: 'BA12' (12-bit Bayer GRGR/BGBG)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[29]: 'pRCC' (12-bit Bayer RGRG/GBGB Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[30]: 'RG12' (12-bit Bayer RGRG/GBGB)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[31]: 'pBEE' (14-bit Bayer BGBG/GRGR Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[32]: 'BG14' (14-bit Bayer BGBG/GRGR)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[33]: 'pGEE' (14-bit Bayer GBGB/RGRG Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[34]: 'GB14' (14-bit Bayer GBGB/RGRG)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[35]: 'pgEE' (14-bit Bayer GRGR/BGBG Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[36]: 'GR14' (14-bit Bayer GRGR/BGBG)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[37]: 'pREE' (14-bit Bayer RGRG/GBGB Packed)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[38]: 'RG14' (14-bit Bayer RGRG/GBGB)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[39]: 'GREY' (8-bit Greyscale)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[40]: 'Y10P' (10-bit Greyscale (MIPI Packed))
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[41]: 'Y10 ' (10-bit Greyscale)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[42]: 'Y12P' (12-bit Greyscale (MIPI Packed))
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[43]: 'Y12 ' (12-bit Greyscale)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[44]: 'Y14P' (14-bit Greyscale (MIPI Packed))
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
	[45]: 'Y14 ' (14-bit Greyscale)
		Size: Stepwise 16x16 - 16376x16376 with step 1/1
$ sudo -g video libcamera-vid --list-cameras 
[4:13:48.035845093] [4230]  INFO Camera camera_manager.cpp:293 libcamera v0.0.0+3384-44d59841
[4:13:48.083237034] [4231] ERROR CameraSensor camera_sensor.cpp:535 'ov5647 10-0036': Camera sensor does not support test pattern modes.
[4:13:48.166319146] [4231]  INFO RPI raspberrypi.cpp:1313 Registered camera /base/soc/i2c0mux/i2c@1/ov5647@36 to Unicam device /dev/media3 and ISP device /dev/media0
Available cameras
-----------------
0 : ov5647 [2592x1944] (/base/soc/i2c0mux/i2c@1/ov5647@36)
    Modes: 'SGBRG10_CSI2P' : 640x480 1296x972 1920x1080 2592x1944 
           'SGBRG8' : 640x480 

I installed ustreamer from "deb http://raspbian.raspberrypi.org/raspbian/ bullseye":

$ ustreamer --version
3.16
$ apt-cache show ustreamer | grep Version:
Version: 3.16-1

I tried upgrading to ustreamer v4.11 (source build), but the error persists.
According to strace, the error occurs in ioctl(8, VIDIOC_STREAMON, [V4L2_BUF_TYPE_VIDEO_CAPTURE]):

$ ./ustreamer --version
4.11
$ sudo -g video strace -f -e ioctl ./ustreamer --resolution 1920x1080 --format=uyvy
ioctl(2, TCGETS, {B38400 opost isig icanon echo ...}) = 0
-- INFO  [14867.371      main] -- Using internal blank placeholder
-- INFO  [14867.376      main] -- Listening HTTP on [127.0.0.1]:8080
strace: Process 4210 attached
strace: Process 4211 attached
-- INFO  [14867.378    stream] -- Using V4L2 device: /dev/video0
-- INFO  [14867.379    stream] -- Using desired FPS: 0
================================================================================
-- INFO  [14867.393      http] -- Starting HTTP eventloop ...
-- INFO  [14867.396    stream] -- Device fd=8 opened
[pid  4210] ioctl(8, VIDIOC_QUERYCAP, {driver="unicam", card="unicam", bus_info="platform:3f801000.csi", version=KERNEL_VERSION(5, 10, 92), capabilities=V4L2_CAP_VIDEO_CAPTURE|V4L2_CAP_EXT_PIX_FORMAT|V4L2_CAP_META_CAPTURE|V4L2_CAP_READWRITE|V4L2_CAP_STREAMING|V4L2_CAP_DEVICE_CAPS|0x20000000, device_caps=V4L2_CAP_VIDEO_CAPTURE|V4L2_CAP_EXT_PIX_FORMAT|V4L2_CAP_READWRITE|V4L2_CAP_STREAMING|0x20000000}) = 0
-- INFO  [14867.398    stream] -- Using input channel: 0
[pid  4210] ioctl(8, VIDIOC_S_INPUT, [0]) = 0
[pid  4210] ioctl(8, VIDIOC_S_FMT, {type=V4L2_BUF_TYPE_VIDEO_CAPTURE, fmt.pix={width=1920, height=1080, pixelformat=v4l2_fourcc('U', 'Y', 'V', 'Y') /* V4L2_PIX_FMT_UYVY */, field=V4L2_FIELD_ANY, bytesperline=3840, sizeimage=0, colorspace=V4L2_COLORSPACE_DEFAULT}} => {fmt.pix={width=1920, height=1080, pixelformat=v4l2_fourcc('U', 'Y', 'V', 'Y') /* V4L2_PIX_FMT_UYVY */, field=V4L2_FIELD_NONE, bytesperline=3840, sizeimage=4147200, colorspace=V4L2_COLORSPACE_SMPTE170M}}) = 0
-- INFO  [14867.399    stream] -- Using resolution: 1920x1080
-- INFO  [14867.400    stream] -- Using pixelformat: UYVY
[pid  4210] ioctl(8, VIDIOC_G_PARM, {type=V4L2_BUF_TYPE_VIDEO_CAPTURE}) = -1 ENOTTY (Inappropriate ioctl for device)
-- INFO  [14867.401    stream] -- Querying HW FPS changing is not supported
-- INFO  [14867.401    stream] -- Using IO method: MMAP
[pid  4210] ioctl(8, VIDIOC_REQBUFS, {type=V4L2_BUF_TYPE_VIDEO_CAPTURE, memory=V4L2_MEMORY_MMAP, count=5 => 5}) = 0
-- INFO  [14867.425    stream] -- Requested 5 device buffers, got 5
[pid  4210] ioctl(8, VIDIOC_QUERYBUF_TIME32, 0x76a97824) = 0
[pid  4210] ioctl(8, VIDIOC_QUERYBUF_TIME32, 0x76a97824) = 0
[pid  4210] ioctl(8, VIDIOC_QUERYBUF_TIME32, 0x76a97824) = 0
[pid  4210] ioctl(8, VIDIOC_QUERYBUF_TIME32, 0x76a97824) = 0
[pid  4210] ioctl(8, VIDIOC_QUERYBUF_TIME32, 0x76a97824) = 0
[pid  4210] ioctl(8, VIDIOC_QBUF_TIME32, 0x76a98e38) = 0
[pid  4210] ioctl(8, VIDIOC_QBUF_TIME32, 0x76a98e38) = 0
[pid  4210] ioctl(8, VIDIOC_QBUF_TIME32, 0x76a98e38) = 0
[pid  4210] ioctl(8, VIDIOC_QBUF_TIME32, 0x76a98e38) = 0
[pid  4210] ioctl(8, VIDIOC_QBUF_TIME32, 0x76a98e38) = 0
[pid  4210] ioctl(8, VIDIOC_STREAMON, [V4L2_BUF_TYPE_VIDEO_CAPTURE]) = -1 EINVAL (Invalid argument)
-- ERROR [14867.433    stream] -- Unable to start capturing: Invalid argument
-- INFO  [14867.449    stream] -- Device fd=8 closed
-- INFO  [14867.450    stream] -- Sleeping 1 seconds before new stream init ...
^Cstrace: Process 4209 detached
strace: Process 4210 detached
strace: Process 4211 detached
-- INFO  [14867.921      main] -- ===== Stopping by SIGINT =====
-- INFO  [14867.921      http] -- HTTP eventloop stopped

libcamera-vid works fine.

Any ideas how to solve this issue?

Thank you!

@fphammerle
Contributor Author

ustreamer works fine when I connect a TC358743 board instead of the camera.

@benjaminjacobreji

@fphammerle

A simple fix for now is to enable the Legacy Camera option, assuming you are on Raspbian Bullseye:

sudo raspi-config

3 Interface Options > I1 Legacy Camera > Yes

sudo reboot

and voilà, it works (I hope xD; this is what I had to do)
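
If you want to double-check that the legacy stack is actually active after the reboot, vcgencmd should report the camera as supported and detected (output along these lines, assuming the firmware tools are installed):

$ vcgencmd get_camera
supported=1 detected=1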

My assumption is that ustreamer relies on the older camera stack, and since the author is busy with PiKVM, the code hasn't been updated for the newer libcamera stack.
Hopefully someone picks it up and is able to look into it.
I might give it a shot once I learn more about how the new stack is implemented.

@fphammerle
Contributor Author

Thanks @benjaminjacobreji for your suggestion to revert to the legacy stack!

If possible, I would prefer to keep libcamera, as I also use libcamera-vid --post-process-file ... from time to time.
I guess I'll wait for a fix, thanks!

@noahwilliamsson

I assume this is related to the recent (late 2021) introduction of the Media Controller API in newer Linux kernels and Raspbian Bullseye. Support for the Media Controller API was enabled for several camera modules including OV5647 (Pi Camera v1, end of life) and IMX219 (Pi Camera v2). It's not enabled for TC358743 though.
See the forum post "bcm2835-unicam and the Media Controller API - this is now live" for details.

On a fresh install of Raspbian Bullseye, you'll likely have camera_auto_detect=1 in /boot/config.txt. In this case, the appropriate device tree overlay will be loaded automatically for supported camera modules and the Media Controller API will be enabled.

If you have, say, an IMX219 module, you can disable support for the Media Controller API by loading the appropriate overlay and adding media-controller=0:

camera_auto_detect=0
start_x=1
dtoverlay=imx219,media-controller=0
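
For an OV5647 module like the one in the original report, the equivalent would presumably be (untested on my side):

camera_auto_detect=0
start_x=1
dtoverlay=ov5647,media-controller=0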

I ran into the same issue as you with:

  • Raspberry Pi Zero 2 with an IMX219 module
  • Raspbian Bullseye with a 5.15.30 kernel from yesterday, obtained by running sudo rpi-update
  • ustreamer built from the https://github.com/PiKVM/uStreamer/tree/m2m branch (not sure it matters but that's what I tested with)

My V4L2 settings after a fresh boot:

$ v4l2-ctl --list-devices
unicam (platform:3f801000.csi):
	/dev/video0
	/dev/video1
	/dev/media3

bcm2835-codec-decode (platform:bcm2835-codec):
	/dev/video10
	/dev/video11
	/dev/video12
	/dev/video18
	/dev/video31
	/dev/media2

bcm2835-isp (platform:bcm2835-isp):
	/dev/video13
	/dev/video14
	/dev/video15
	/dev/video16
	/dev/video20
	/dev/video21
	/dev/video22
	/dev/video23
	/dev/media0
	/dev/media1
$ v4l2-ctl --info --device /dev/video0
Driver Info:
	Driver name      : unicam
	Card type        : unicam
	Bus info         : platform:3f801000.csi
	Driver version   : 5.15.30
	Capabilities     : 0xa5a00001
		Video Capture
		Metadata Capture
		Read/Write
		Streaming
		Extended Pix Format
		Device Capabilities
	Device Caps      : 0x25200001
		Video Capture
		Read/Write
		Streaming
		Extended Pix Format
Media Driver Info:
	Driver name      : unicam
	Model            : unicam
	Serial           : 
	Bus info         : platform:3f801000.csi
	Media version    : 5.15.30
	Hardware revision: 0x00000000 (0)
	Driver version   : 5.15.30
Interface Info:
	ID               : 0x03000006
	Type             : V4L Video
Entity Info:
	ID               : 0x00000004 (4)
	Name             : unicam-image
	Function         : V4L2 I/O
	Flags         : default
	Pad 0x01000005   : 0: Sink
	  Link 0x02000008: from remote pad 0x1000002 of entity 'imx219 10-0010': Data, Enabled, Immutable

$ v4l2-ctl --get-fmt-video --device /dev/video0
Format Video Capture:
	Width/Height      : 640/480
	Pixel Format      : 'YUYV' (YUYV 4:2:2)
	Field             : None
	Bytes per Line    : 1280
	Size Image        : 614400
	Colorspace        : sRGB
	Transfer Function : sRGB
	YCbCr/HSV Encoding: ITU-R 601
	Quantization      : Limited Range
	Flags             : 

The availability of the Media Controller API can be seen in the Device Caps value 0x25200001, which has the V4L2_CAP_IO_MC (0x20000000) flag set. You also see a /dev/media3 node alongside the /dev/video0 node for the unicam device.
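
As a purely illustrative sanity check, masking the reported caps value against the V4L2_CAP_IO_MC bit in the shell confirms the flag is set:

$ printf '0x%08x\n' $(( 0x25200001 & 0x20000000 ))
0x20000000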

Below is the topology for the Image Signal Processor (ISP).

$ media-ctl --print-topology --device /dev/media1
Media controller API version 5.15.30

Media device information
------------------------
driver          bcm2835-isp
model           bcm2835-isp
serial          
bus info        platform:bcm2835-isp
hw revision     0x0
driver version  5.15.30

Device topology
- entity 1: bcm2835_isp0 (4 pads, 4 links)
            type Node subtype Unknown flags 0
	pad0: Sink
		<- "bcm2835-isp0-output0":0 [ENABLED,IMMUTABLE]
	pad1: Source
		-> "bcm2835-isp0-capture1":0 [ENABLED,IMMUTABLE]
	pad2: Source
		-> "bcm2835-isp0-capture2":0 [ENABLED,IMMUTABLE]
	pad3: Source
		-> "bcm2835-isp0-capture3":0 [ENABLED,IMMUTABLE]

- entity 6: bcm2835-isp0-output0 (1 pad, 1 link)
            type Node subtype V4L flags 0
            device node name /dev/video20
	pad0: Source
		-> "bcm2835_isp0":0 [ENABLED,IMMUTABLE]

- entity 12: bcm2835-isp0-capture1 (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video21
	pad0: Sink
		<- "bcm2835_isp0":1 [ENABLED,IMMUTABLE]

- entity 18: bcm2835-isp0-capture2 (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video22
	pad0: Sink
		<- "bcm2835_isp0":2 [ENABLED,IMMUTABLE]

- entity 24: bcm2835-isp0-capture3 (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video23
	pad0: Sink
		<- "bcm2835_isp0":3 [ENABLED,IMMUTABLE]
$ media-ctl --print-topology --device /dev/media2
Media controller API version 5.15.30

Media device information
------------------------
driver          bcm2835-codec
model           bcm2835-codec
serial          0000
bus info        platform:bcm2835-codec
hw revision     0x1
driver version  5.15.30

Device topology
- entity 1: bcm2835-codec-decode-source (1 pad, 1 link)
            type Node subtype V4L flags 0
            device node name /dev/video10
	pad0: Source
		-> "bcm2835-codec-decode-proc":0 [ENABLED,IMMUTABLE]

- entity 3: bcm2835-codec-decode-proc (2 pads, 2 links)
            type Node subtype Unknown flags 0
	pad0: Sink
		<- "bcm2835-codec-decode-source":0 [ENABLED,IMMUTABLE]
	pad1: Source
		-> "bcm2835-codec-decode-sink":0 [ENABLED,IMMUTABLE]

- entity 6: bcm2835-codec-decode-sink (1 pad, 1 link)
            type Node subtype V4L flags 0
            device node name /dev/video10
	pad0: Sink
		<- "bcm2835-codec-decode-proc":1 [ENABLED,IMMUTABLE]

- entity 15: bcm2835-codec-encode-source (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video11
	pad0: Source
		-> "bcm2835-codec-encode-proc":0 [ENABLED,IMMUTABLE]

- entity 17: bcm2835-codec-encode-proc (2 pads, 2 links)
             type Node subtype Unknown flags 0
	pad0: Sink
		<- "bcm2835-codec-encode-source":0 [ENABLED,IMMUTABLE]
	pad1: Source
		-> "bcm2835-codec-encode-sink":0 [ENABLED,IMMUTABLE]

- entity 20: bcm2835-codec-encode-sink (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video11
	pad0: Sink
		<- "bcm2835-codec-encode-proc":1 [ENABLED,IMMUTABLE]

- entity 29: bcm2835-codec-isp-source (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video12
	pad0: Source
		-> "bcm2835-codec-isp-proc":0 [ENABLED,IMMUTABLE]

- entity 31: bcm2835-codec-isp-proc (2 pads, 2 links)
             type Node subtype Unknown flags 0
	pad0: Sink
		<- "bcm2835-codec-isp-source":0 [ENABLED,IMMUTABLE]
	pad1: Source
		-> "bcm2835-codec-isp-sink":0 [ENABLED,IMMUTABLE]

- entity 34: bcm2835-codec-isp-sink (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video12
	pad0: Sink
		<- "bcm2835-codec-isp-proc":1 [ENABLED,IMMUTABLE]

- entity 43: bcm2835-codec-image_fx-source (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video18
	pad0: Source
		-> "bcm2835-codec-image_fx-proc":0 [ENABLED,IMMUTABLE]

- entity 45: bcm2835-codec-image_fx-proc (2 pads, 2 links)
             type Node subtype Unknown flags 0
	pad0: Sink
		<- "bcm2835-codec-image_fx-source":0 [ENABLED,IMMUTABLE]
	pad1: Source
		-> "bcm2835-codec-image_fx-sink":0 [ENABLED,IMMUTABLE]

- entity 48: bcm2835-codec-image_fx-sink (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video18
	pad0: Sink
		<- "bcm2835-codec-image_fx-proc":1 [ENABLED,IMMUTABLE]

- entity 57: bcm2835-codec-encode_image-sour (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video31
	pad0: Source
		-> "bcm2835-codec-encode_image-proc":0 [ENABLED,IMMUTABLE]

- entity 59: bcm2835-codec-encode_image-proc (2 pads, 2 links)
             type Node subtype Unknown flags 0
	pad0: Sink
		<- "bcm2835-codec-encode_image-sour":0 [ENABLED,IMMUTABLE]
	pad1: Source
		-> "bcm2835-codec-encode_image-sink":0 [ENABLED,IMMUTABLE]

- entity 62: bcm2835-codec-encode_image-sink (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video31
	pad0: Sink
		<- "bcm2835-codec-encode_image-proc":1 [ENABLED,IMMUTABLE]

Below is the topology associated with the camera sensor module.

$ media-ctl --print-topology --device /dev/media3
Media controller API version 5.15.30

Media device information
------------------------
driver          unicam
model           unicam
serial          
bus info        platform:3f801000.csi
hw revision     0x0
driver version  5.15.30

Device topology
- entity 1: imx219 10-0010 (2 pads, 2 links)
            type V4L2 subdev subtype Sensor flags 0
            device node name /dev/v4l-subdev0
	pad0: Source
		[fmt:SRGGB10_1X10/3280x2464 field:none colorspace:raw xfer:none ycbcr:601 quantization:full-range
		 crop.bounds:(8,8)/3280x2464
		 crop:(8,8)/3280x2464]
		-> "unicam-image":0 [ENABLED,IMMUTABLE]
	pad1: Source
		[fmt:unknown/16384x1 field:none
		 crop.bounds:(8,8)/3280x2464
		 crop:(8,8)/3280x2464]
		-> "unicam-embedded":0 [ENABLED,IMMUTABLE]

- entity 4: unicam-image (1 pad, 1 link)
            type Node subtype V4L flags 1
            device node name /dev/video0
	pad0: Sink
		<- "imx219 10-0010":0 [ENABLED,IMMUTABLE]

- entity 10: unicam-embedded (1 pad, 1 link)
             type Node subtype V4L flags 0
             device node name /dev/video1
	pad0: Sink
		<- "imx219 10-0010":1 [ENABLED,IMMUTABLE]

If I just run ./ustreamer in this setup, it reports:

-- ERROR [99.029    stream] -- Can't start capturing: Invalid argument

In the kernel log (journalctl -t kernel --since '3 min ago'), the following lines are logged:

Mar 23 18:30:59 raspberry kernel: unicam 3f801000.csi: Wrong width or height 640x480 (remote pad set to 3280x2464)
Mar 23 18:30:59 raspberry kernel: unicam 3f801000.csi: Failed to start media pipeline: -22

The first problem can be corrected by updating the size on the V4L2 subdev.

# Display details for the sensor V4L2 subdev
$ media-ctl --print-topology --device /dev/media3
...
Device topology
- entity 1: imx219 10-0010 (2 pads, 2 links)
            type V4L2 subdev subtype Sensor flags 0
            device node name /dev/v4l-subdev0
	pad0: Source
		[fmt:SRGGB10_1X10/3280x2464 field:none colorspace:raw xfer:none ycbcr:601 quantization:full-range
		 crop.bounds:(8,8)/3280x2464
		 crop:(8,8)/3280x2464]
		-> "unicam-image":0 [ENABLED,IMMUTABLE]
	pad1: Source
		[fmt:unknown/16384x1 field:none
		 crop.bounds:(8,8)/3280x2464
		 crop:(8,8)/3280x2464]
		-> "unicam-embedded":0 [ENABLED,IMMUTABLE]
...

# See `media-ctl -h` for the syntax (pretty much: `'<entity>:<pad> <v4l2 properties>'`)
$ media-ctl --device 3 --set-v4l2 '"imx219 10-0010":0 [fmt:SRGGB10_1X10/640x480]'

# Verify
$ media-ctl -p -d 3
...
Device topology
- entity 1: imx219 10-0010 (2 pads, 2 links)
            type V4L2 subdev subtype Sensor flags 0
            device node name /dev/v4l-subdev0
	pad0: Source
		[fmt:SRGGB10_1X10/640x480 field:none colorspace:raw xfer:none ycbcr:601 quantization:full-range
		 crop.bounds:(8,8)/3280x2464
		 crop:(1008,760)/1280x960]
		-> "unicam-image":0 [ENABLED,IMMUTABLE]
	pad1: Source
		[fmt:unknown/16384x1 field:none
		 crop.bounds:(8,8)/3280x2464
		 crop:(1008,760)/1280x960]
		-> "unicam-embedded":0 [ENABLED,IMMUTABLE]
...

Unfortunately, this is not enough: ustreamer still fails with Can't start capturing: Invalid argument, and the kernel still logs Failed to start media pipeline: -22 (Invalid argument). That message is not very helpful, but it indicates that one or more things aren't set up properly.

I've tried to disable use of the Media Controller API by updating /boot/config.txt with:

start_x=1
camera_auto_detect=0
dtoverlay=imx219,media-controller=0

After a reboot, I see:

$ v4l2-ctl --list-formats --device /dev/video0
ioctl: VIDIOC_ENUM_FMT
	Type: Video Capture

	[0]: 'pRAA' (10-bit Bayer RGRG/GBGB Packed)
	[1]: 'RG10' (10-bit Bayer RGRG/GBGB)
	[2]: 'RGGB' (8-bit Bayer RGRG/GBGB)

None of these pixel formats are among the ones supported by ustreamer, so it fails after proposing YUYV but getting pRAA from the driver.

I attempted to circumvent this problem by adding definitions for these formats to the source code, but all I ended up with after running ./ustreamer -s :: --log-level 3 -c m2m-video (or -c M2M-MJPEG) was a green/black weave pattern on the video stream.
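
For reference, the format negotiation can be reproduced with plain v4l2-ctl (a sketch; the driver silently substitutes one of its Bayer formats for the requested YUYV):

$ v4l2-ctl --device /dev/video0 --set-fmt-video=width=640,height=480,pixelformat=YUYV
$ v4l2-ctl --device /dev/video0 --get-fmt-video   # reports 'pRAA' rather than the requested YUYV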

@ayufan

ayufan commented Apr 3, 2022

It fails the same way on the imx519 (Arducam 16MP):

-- DEBUG [1226.812    stream] -- Starting device capturing ...
-- ERROR [1226.812    stream] -- Can't start capturing: Invalid argument
-- DEBUG [1226.812    stream] -- Releasing device buffers ...
[pid   879] ioctl(8, VIDIOC_STREAMON, [V4L2_BUF_TYPE_VIDEO_CAPTURE]) = -1 EINVAL (Invalid argument)

@bertvdijk

Hello,

I just tried with the latest code from the master branch (ustreamer 5.3)

  • fresh install of Raspbian Bullseye (2022-01-28)
  • upgraded with apt update && apt dist-upgrade -y && reboot (kernel 5.15.30)
  • brought up to kernel 5.15.32 (as mentioned in the README) with rpi-update
  • Raspberry Pi Camera v2 (IMX219) connected via CSI (camera_auto_detect=1 in /boot/config.txt)

I tried the examples from the README:

# Prevent kernel message "Wrong width or height 640x480 (remote pad set to 3280x2464)"
media-ctl --device 3 --set-v4l2 '"imx219 10-0010":1 [fmt:SRGGB10_1X10/640x480]'

# From the "Usage" section but with a lower resolution
./ustreamer --host :: -m jpeg --device-timeout=5 --buffers=3 -r 640x480

# From the "Raspberry Pi Camera Example" section
./ustreamer  --format=uyvy --encoder=m2m-image --workers=3 --persistent --dv-timings  --drop-same-frames=30

Both of these commands failed with:

-- ERROR [1616.438    stream] -- Can't start capturing: Invalid argument

With dmesg -T I see:

unicam 3f801000.csi: Failed to start media pipeline: -22

@mdevaev, do you have any input on what's necessary to be able to use ustreamer for streaming video from MIPI CSI-2 sensors under Raspbian Bullseye with the modern camera stack (unicam)?

(Comments in this thread suggested it's possible to work around the problem by enabling the legacy camera stack, but I did not try this. Personally, I'd rather spend time getting things to work with the modern camera stack, to future-proof things and ease setups on a default install.)

Thanks for your time and keep up the good work with PiKVM!

@ayufan

ayufan commented Apr 3, 2022

OK. I have been playing with this to use the new bcm2835-isp. In general, libcamera converts the packed Bayer data (pRAA) into YUYV using the ISP and DMA transfers. It is not hard, but ustreamer is not very flexible to extend and inject into. I hacked it together by adding an M2M ISP stage. It kind of works and breaks in a number of places, but it should support the imx219 and imx519.

You can see my branch here: https://github.com/ayufan-research/ustreamer. Compile it and then try to run it. I have trouble with higher resolutions. It seems to work on 5.10 and 5.15; on 5.15 it also works with the M2M-JPEG codec to further accelerate the transfer.

This is not efficient or finished, and should only be taken as a sketch of how it might be done.

ustreamer --host :: -r 1920x1080 -m pRAA -w 1 -b 1 --debug --no-log-colors -d /dev/video0

@noahwilliamsson

Nice work!

I just tested this with a Raspberry Pi Zero 2W and later a Zero W v1.1, both running kernel 5.15.32.

media-ctl --device 3 --set-v4l2 '"imx219 10-0010":0 [fmt:SRGGB10_1X10/640x480]'
./ustreamer --host :: -r 640x480 -m pRAA -w 1 -b 1 --debug --no-log-colors -d /dev/video0

This might be completely incidental, but the Zero 2W appears to have fried within about 15 seconds of running these commands, before I managed to open the browser to test the stream. I lost SSH access and smelled a very hot PCB. After powering it off, I let it cool for 10 minutes and then tried to power it up again, but it immediately becomes very hot and doesn't light any LEDs. Not sure what happened here; it had been running idle for a few hours already on a clean desk, and I only accessed it over SSH.

Anyway, I moved the SD card and the camera over to a Raspberry Pi Zero W v1.1 and re-ran the commands. If I move my hand in front of the camera, I see a dark grey silhouette of it on a darker/black background on the streaming page. Is that expected? (libcamera-still -o still.jpg produces an OK picture though.)

I also note that if I run libcamera-still -o still.jpg, then a subsequent ./ustreamer ... fails with Can't start capturing: Invalid argument and the kernel reports unicam 20801000.csi: Failed to start media pipeline: -22. It seems libcamera-still leaves some settings in a different (unexpected) state than after a fresh boot.
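
Presumably re-running the media-ctl command from above resets the subdev format after libcamera-still has changed it, though I haven't verified that this alone is sufficient:

$ media-ctl --device 3 --set-v4l2 '"imx219 10-0010":0 [fmt:SRGGB10_1X10/640x480]'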

BTW, what Raspberry Pi hardware did you test with?

@AntiHeld889

Same issue here.

@mdevaev
Member

mdevaev commented Apr 11, 2022

Sorry for the late reply.

I'm using ustreamer with some USB webcams and a CSI-HDMI bridge, so I don't know how to make it work with a regular CSI camera; I don't have any. I'll try to get one, but I can't promise that it will happen soon.

@mdevaev mdevaev self-assigned this Apr 11, 2022
@mdevaev mdevaev added the type:bug Something isn't working label Apr 11, 2022
@fphammerle
Contributor Author

Thanks a lot for looking into this issue, @mdevaev !

@ayufan

ayufan commented Apr 11, 2022

This might be completely incidental.. but the Zero 2W appears to have fried within about 15 seconds after running these commands, before I managed to open the browser to test out the stream. I lost SSH access and smelled a very hot PCB. After powering it off, I let it cool for 10 minutes and then tried to power it up again but it immediately becomes very hot and doesn't light any LEDs. Not sure what happened here -- it had been running idle for a few hours already on a clean desk, and I only accessed it over SSH.

I did not have such problems.

Anyway, I moved the SD card and the camera over to a Raspberry Pi Zero W v1.1 and re-ran the commands. If I move my hand in front of the camera, I see a dark grey silhouette of it on a darker/black background on the streaming page. Is that expected? (libcamera-still -o still.jpg produces an OK picture though.)

This is because the bcm2835-isp is not configured. libcamera reads the sensor parameters from /usr/share/libcamera/ipa/raspberrypi/imx519.json (or another file, depending on the sensor used) and configures /dev/video13 (you can see the available controls with v4l2-ctl -d /dev/video13 -L, or the ISP params only with v4l2-ctl -d /dev/video13 -C red_balance -C colour_correction_matrix -C black_level -C green_equalisation -C gamma -C denoise -C sharpen -C defective_pixel_correction -C colour_denoise).
Once you configure this, the branch above (https://github.com/ayufan-research/ustreamer) should work.
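
To spell out the commands from the paragraph above as a block (device node and control names as on my system):

# list all controls exposed by the ISP node
v4l2-ctl -d /dev/video13 -L

# query only the ISP tuning parameters
v4l2-ctl -d /dev/video13 -C red_balance -C colour_correction_matrix -C black_level \
    -C green_equalisation -C gamma -C denoise -C sharpen \
    -C defective_pixel_correction -C colour_denoise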

@ayufan

ayufan commented Apr 11, 2022

In general, all those DSLR-type sensors (from Sony and others) return only an 8-bit or 10-bit packed Bayer format (once you hook into this stream, the latency is simply awesome), which reflects exactly how the CMOS sensors in cameras are built. Once you have this data, you need to manually calculate the exposure time and the analogue and digital gains. This can be done by the ISP. The Raspberry Pi has the bcm2835-isp (present in the system under /dev/video12-15), which can take this raw data, apply all the corrections (colour, gamma, etc.) and output YUYV/YU12/NV12, which can then be used as a source for other encoders (JPEG or H264). The ISP is able to output two resolutions at the same time, so it can effectively perform a single capture and have a high-res and a low-res YUYV stream handled concurrently.

Why am I writing about this? The ISP is software controlled. Trying to re-implement libcamera, which manages the ISP, is a tedious task given all the aspects that need to be covered. Previously we would have a UVC device, or another type of sensor with an integrated ISP returning YUYV, where we would configure brightness, contrast, focus, etc. This means that without libcamera it is fairly hard to implement automated brightness control or autofocus. The https://github.com/ayufan-research/ustreamer branch uses the ISP directly, and this is why you saw a mostly black image: the ISP was not configured. It can be configured as a one-off by running libcamera-still (with whatever parameters you want).

About ustreamer: this can be supported today using a trick (see the consolidated sketch after this list):
a. comment out the VIDIOC_S_INPUT call:

if (xioctl(RUN(fd), VIDIOC_S_INPUT, &input) < 0) {

b. use v4l2-ctl to manually feed the ISP with a stream of data: v4l2-ctl --device /dev/video0 --out-device /dev/video13 --stream-mmap --stream-out-dmabuf (it will copy the RG10 data into the ISP)
c. run ustreamer --device /dev/video14 (it will expose YUYV, or whatever you configure it to)
d. this will work, but with pretty terrible latency
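
Put together, the workaround looks roughly like this (a sketch; it assumes the same device numbering as above and an ustreamer binary built with the VIDIOC_S_INPUT call from step (a) commented out):

# terminal 1: copy the raw RG10 frames from the sensor into the ISP
v4l2-ctl --device /dev/video0 --out-device /dev/video13 --stream-mmap --stream-out-dmabuf

# terminal 2: serve the ISP output (YUYV, or whatever you configured) with ustreamer
./ustreamer --device /dev/video14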

Exactly this behaviour is implemented in https://github.com/ayufan-research/ustreamer: the camera data is fed into the ISP, and the ISP output is fed into the JPEG encoder. The implementation is pretty clumsy, since the ustreamer architecture does not make it easy to create multi-stage pipelines for data processing or to hook libcamera. For example, even though the branch hooks the ISP, it does not offer a YUYV stream for H264 support, and there's no easy way around this limitation except maybe rewriting the device code to be libcamera-centric (libcamera internally uses the ISP).

Integrated kernel data pipelining (like what v4l2-ctl ... does) might land in the Linux kernel at some point: https://lore.kernel.org/linux-arm-kernel/[email protected]/t/. This change uses media-controller routing capabilities to hook into the available V4L2 devices and route data streams between them to get a sink with the desired formats. I'm unsure if this got merged and, if so, when it will filter down to Raspberry Pi OS. However, this will have the same limitation: libcamera needs to control the ISP to provide automatic brightness and autofocus control.

As an exercise I tried to do that, and so (sorry for slightly hijacking this thread with my own project) over the last few days I created https://github.com/ayufan-research/camera-streamer. After the first iteration in the ustreamer codebase (which lacks flexible multi-stage pipelines), it felt easier to start from scratch with a project that is V4L2 hardware centric. I wrote a V4L2-only pipeline, but at some point I figured out the above and saw that not using libcamera is generally not a wise idea. Using libcamera comes with a latency impact (but you get good control of exposure and focus), as I measured in my project compared to the direct ISP mode. Using the direct ISP mode I get far better performance and latency than anything I have tested so far (including my branch of https://github.com/ayufan-research/ustreamer). It got far enough that I now use the linked project daily on my Voron 0 with the Arducam 16MP for MJPEG and H264.

It appears that I might have taken the wrong approach with https://github.com/ayufan-research/ustreamer by implementing an ISP step instead of rewriting the device to be libcamera centric. Using libcamera is not really that hard, and it offers the same access to mmapped and DMA-enabled buffers: https://github.com/ayufan-research/camera-streamer/tree/master/device/libcamera.

Ref.:

@mdevaev
Member

mdevaev commented Apr 11, 2022

@ayufan thank you for the research. It seems ustreamer is not the best choice for the IMX219 and similar sensors. ustreamer's main purpose is HDMI capture, and when I wrote it, libcamera did not exist on the RPi, so the V4L2 encoders have no flexible support right now. Actually, I don't mind adding support for the ISP and other things if it doesn't hurt latency, etc.

@ayufan

ayufan commented Apr 11, 2022

@mdevaev Adding ISP support is doable, but it is not the best idea :) If you want to support cameras, consider making the device support two modes of operation:

  • v4l2 (as today, for minimal latency)
  • libcamera (for maximum compatibility) -> the output of libcamera will be the capture buffer with DMA and mmap, so no later stages in ustreamer will need to be changed

@ayufan

ayufan commented Apr 11, 2022

According to my tests, the latency difference comes from the way buffers are enqueued by libcamera and the additional processing delay this introduces. Depending on how you queue (on which edge) and when you enqueue a buffer, you can achieve significantly better latency, as the ISP mode shows. libcamera can still achieve 120fps, it is just slightly slower :). This does make a difference in how fluid the motion feels.

This is especially visible when the FPS is set lower than the sensor can provide; having too many buffers enqueued increases latency. Doing this very dumb test https://www.youtube.com/watch?v=e8ZtPSIfWPc:

  • 2328x1748@30fps: libcamera and my direct ISP mode are comparable, losing about 9 frames (at 60 Hz)
  • 2328x1748@10fps: libcamera loses about 25 frames, the direct ISP mode loses about 11 frames
  • I did not test it, but I wonder how this compares to ustreamer ;)

2328x1748@30fps

# libcamera
$ ./camera_stream -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=120 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=63/0, processing_ms=101.1, frame_ms=33.1
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=64/0, processing_ms=99.2, frame_ms=31.9
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=65/0, processing_ms=99.6, frame_ms=34.8

# direct ISP-mode
$ ./camera_stream -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=30 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=32/0, processing_ms=49.7, frame_ms=33.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=33/0, processing_ms=49.7, frame_ms=33.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=34/0, processing_ms=49.7, frame_ms=33.4

2328x1748@10fps

# libcamera
$ ./camera_stream -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=10 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=585/0, processing_ms=155.3, frame_ms=100.0
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=586/0, processing_ms=155.5, frame_ms=100.2

# direct ISP-mode
$ ./camera_stream -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=10 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=260/0, processing_ms=57.5, frame_ms=99.7
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=261/0, processing_ms=57.6, frame_ms=100.0
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=262/0, processing_ms=58.0, frame_ms=100.4

1280x720@120fps for Arducam 16MP

# libcamera
$ ./camera_stream -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=120 -camera-width=1280 -camera-height=720 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=139/0, processing_ms=20.1, frame_ms=7.9
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=140/0, processing_ms=20.6, frame_ms=8.8
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=141/0, processing_ms=19.8, frame_ms=8.1

# direct ISP-mode
$ ./camera_stream -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=120 -camera-width=1280 -camera-height=720 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=157/0, processing_ms=18.5, frame_ms=8.4
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=158/0, processing_ms=18.5, frame_ms=8.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=159/0, processing_ms=18.5, frame_ms=8.3

@Ramalama2

(quoting @ayufan's latency comparison from the comment above)

I'm using your latest camera-streamer for my Voron (3D printer) with an Arducam IMX477 and a CM4.
Just wanted to say THANK YOU!!!!!

camera-streamer is amazing; ustreamer didn't work for me either.
I'm using this dt-overlay:
dtoverlay=imx477,media-controller=1

It works absolutely perfectly with the new media-controller.

I'm streaming at 1080p@60fps, it's almost completely latency free, uses barely any CPU, and I'm overall extremely happy!

Your fork of ustreamer sadly doesn't work for me. Same error as mainline ustreamer.

However, please continue the work on camera-streamer: it's perfect, I love it, and thank you so much for it!

Cheers

@mdevaev
Member

mdevaev commented Jun 8, 2023

I've got a camera and experimented with it. Too much work would be needed for native support, but Raspberry Pi offers "libcamerify", which works fine with uStreamer. On Raspberry Pi OS, install libcamera-tools and run as follows: libcamerify ./ustreamer -r 1920x1080 --encoder=m2m-image
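
For completeness, the full sequence on a stock Raspberry Pi OS install would look roughly like this (on newer releases the V4L2 compatibility layer lives in a separate libcamera-v4l2 package, as discussed below):

sudo apt install libcamera-tools libcamera-v4l2
libcamerify ./ustreamer -r 1920x1080 --encoder=m2m-image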

@samuk

samuk commented Feb 3, 2024

I have Bullseye on a Pi 4 and an RPi Camera v2.

bristolfish@Mycodo:~ $ sudo apt install libcamera-tools
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
libcamera-tools is already the newest version (0.1.0+rpt20231122-1).
0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded.
bristolfish@Mycodo:~ $ libcamera-hello
[5:33:33.478314526] [61051] INFO Camera camera_manager.cpp:284 libcamera v0.1.0+147-057299d0-dirty (2024-01-24T08:51:15+00:00)
[5:33:33.556571064] [61057] WARN RPiSdn sdn.cpp:39 Using legacy SDN tuning - please consider moving SDN inside rpi.denoise
[5:33:33.558631749] [61057] WARN RPI vc4.cpp:398 Mismatch between Unicam and CamHelper for embedded data usage!
[5:33:33.559594611] [61057] INFO RPI vc4.cpp:452 Registered camera /base/soc/i2c0mux/i2c@1/imx219@10 to Unicam device /dev/media4 and ISP device /dev/media0
[5:33:33.559697147] [61057] INFO RPI pipeline_base.cpp:1167 Using configuration file '/usr/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml'
[5:33:33.560713675] [61057] WARN V4L2 v4l2_pixelformat.cpp:338 Unsupported V4L2 pixel format H264
Preview window unavailable
Mode selection for 1640:1232:12:P
SRGGB10_CSI2P,640x480/0 - Score: 4504.81
SRGGB10_CSI2P,1640x1232/0 - Score: 1000
SRGGB10_CSI2P,1920x1080/0 - Score: 1541.48
SRGGB10_CSI2P,3280x2464/0 - Score: 1718
SRGGB8,640x480/0 - Score: 5504.81
SRGGB8,1640x1232/0 - Score: 2000
SRGGB8,1920x1080/0 - Score: 2541.48
SRGGB8,3280x2464/0 - Score: 2718
Stream configuration adjusted
[5:33:33.563140599] [61051] INFO Camera camera.cpp:1183 configuring streams: (0) 1640x1232-YUV420 (1) 1640x1232-SBGGR10_CSI2P
[5:33:33.563608039] [61057] INFO RPI vc4.cpp:616 Sensor: /base/soc/i2c0mux/i2c@1/imx219@10 - Selected sensor format: 1640x1232-SBGGR10_1X10 - Selected unicam format: 1640x1232-pBAA
#0 (0.00 fps) exp 33251.00 ag 8.00 dg 1.00
#1 (30.01 fps) exp 33251.00 ag 8.00 dg 1.00
#2 (30.01 fps) exp 33251.00 ag 8.00 dg 1.00
#3 (30.00 fps) exp 33251.00 ag 8.00 dg 1.00
#4 (30.01 fps) exp 33251.00 ag 8.00 dg 1.00
#5 (30.00 fps) exp 33251.00 ag 8.00 dg 1.00
#6 (30.01 fps) exp 33251.00 ag 8.00 dg 1.00
#7 (30.00 fps) exp 33251.00 ag 8.00 dg 1.00
#8 (30.01 fps) exp 33251.00 ag 8.00 dg 1.00
#9 (30.01 fps) exp 33251.00 ag 8.00 dg 1.00
#10 (30.00 fps) exp 33251.00 ag 8.00 dg 1.00
#11 (30.01 fps) exp 33251.00 ag 8.00 dg 1.00
^C
bristolfish@Mycodo:~ $ libcamerify ustreamer -r 1920x1080 --encoder=m2m-image
ERROR: ld.so: object '/usr/lib/aarch64-linux-gnu/libcamera/v4l2-compat.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
Unknown encoder type: m2m-image; available: CPU, HW, NOOP

@mdevaev
Member

mdevaev commented Feb 3, 2024

Install libcamera-v4l2 as well; they split the package.

@samuk

samuk commented Feb 3, 2024

Thanks, that made part of the error go away, but I'm left with this one:

sudo apt-get install libcamera-v4l2
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following NEW packages will be installed:
libcamera-v4l2
0 upgraded, 1 newly installed, 0 to remove and 50 not upgraded.
Need to get 42.4 kB of archives.
After this operation, 199 kB of additional disk space will be used.
Get:1 http://archive.raspberrypi.com/debian bookworm/main arm64 libcamera-v4l2 arm64 0.1.0+rpt20231122-1 [42.4 kB]
Fetched 42.4 kB in 0s (168 kB/s)
Selecting previously unselected package libcamera-v4l2:arm64.
(Reading database ... 165940 files and directories currently installed.)
Preparing to unpack .../libcamera-v4l2_0.1.0+rpt20231122-1_arm64.deb ...
Unpacking libcamera-v4l2:arm64 (0.1.0+rpt20231122-1) ...
Setting up libcamera-v4l2:arm64 (0.1.0+rpt20231122-1) ...
bristolfish@Mycodo:~ $ libcamerify ustreamer -r 1920x1080 --encoder=m2m-image
Unknown encoder type: m2m-image; available: CPU, HW, NOOP

@mdevaev
Member

mdevaev commented Feb 3, 2024

Is it the ustreamer from the Debian repo? Please show the output of ustreamer --version.

@samuk

samuk commented Feb 3, 2024

bristolfish@Mycodo:~ $ ustreamer --version
4.9

@mdevaev
Member

mdevaev commented Feb 3, 2024

That is a very old release. Uninstall it and build from git if you want to use the M2M encoder.

@samuk

samuk commented Feb 3, 2024

Thanks. I'll try the Docker version, I think.

@mdevaev
Member

mdevaev commented Feb 3, 2024

It is better to use a native build. The list of dependencies is in the README, and the build only takes a few seconds.
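
Roughly (check the README for the exact dependency list for your distro):

git clone --depth=1 https://github.com/pikvm/ustreamer
cd ustreamer
make
./ustreamer --version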

@mdevaev
Member

mdevaev commented Feb 5, 2024

@samuk Sup?

@samuk

samuk commented Feb 5, 2024

I do want to end up running it in Docker; the underlying OS will be https://github.com/balena-os. I'll have a go at a native build first, though, if that's your recommendation.

What's up with Docker?

@mdevaev
Member

mdevaev commented Feb 5, 2024

Docker requires forwarding the port and passing through the video device. There are no problems with that, but I usually recommend the native launch because it's easier.
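
For example, something like this (a sketch only; the image name is a placeholder for whatever you build or pull, and the flags depend on your setup):

docker run --rm \
    --device /dev/video0:/dev/video0 \
    -p 8080:8080 \
    <your-ustreamer-image> \
    --device=/dev/video0 --host=0.0.0.0 --port=8080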

@mdevaev
Member

mdevaev commented Feb 8, 2024

Is it working?

@franciscoGar

Thanks for the guidance!!

I got a cheap camera on AliExpress (apparently it is a v1):
https://es.aliexpress.com/item/1005003386791483.html

I found instructions at
https://docs.arducam.com/Raspberry-Pi-Camera/Native-camera/5MP-OV5647/
followed the steps for Bookworm, and installed their tools from
https://docs.arducam.com/Raspberry-Pi-Camera/Native-camera/Libcamera-User-Guide/

I still got "Unable to start capturing: Invalid argument".

Then I found in this thread the instruction to install "libcamera-v4l2", which is mentioned in the docs, but until now I could not figure out how:

sudo apt-get install libcamera-v4l2

After that, both of these work:

sudo modprobe bcm2835-v4l2
sudo libcamerify ./ustreamer --encoder=m2m-image --host=0.0.0.0 --port=80 -r 1296x972

sudo libcamerify ./ustreamer --encoder=m2m-video --host=0.0.0.0 --port=80 -r 1296x972
