Hardware acceleration for the webcam stream on a Raspberry Pi

Camera model
Bus 001 Device 005: ID 0553:0202 STMicroelectronics Imaging Division (VLSI Vision) STV0680 Camera

Driver
stv0680

What is the problem?
The camera only supports RGB3, no MJPEG, no YUYV.

Logs

/var/log/webcamd.log doesn't exist.
However, there is output from mjpg-streamer when started manually:

#./mjpg_streamer -i "input_uvc.so -d /dev/video0 -r 320x242 -f 15 -y" -o "output_http.so -w www"
MJPG Streamer Version: git rev: 501f6362c5afddcfb41055f97ae484252c85c912
 i: Using V4L2 device.: /dev/video0
 i: Desired Resolution: 320 x 242
 i: Frames Per Second.: 15
 i: Format............: YUYV
 i: JPEG Quality......: 80
 i: TV-Norm...........: DEFAULT
 i: Could not obtain the requested pixelformat: YUYV , driver gave us: RGB3
    ... will try to handle this by checking against supported formats. 
Init v4L2 failed !! exit fatal
 i: init_VideoIn failed

Additional information about setup

OctoPi is freshly installed (newest version) on a first-generation Raspberry Pi B+.

I added this line to /etc/modules to load the driver (not sure if this was necessary):

gspca_stv06xx
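
If anyone wants to check the same thing: one way to verify that the module actually got loaded (just a quick sanity check, assuming a standard OctoPi install) is

#lsmod | grep gspca

If gspca_stv06xx and gspca_main show up in the output, the driver is loaded.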

Also, here is the v4l2-ctl output for my camera:

#v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
	Type: Video Capture

	[0]: 'S680' (GSPCA STV0680)
		Size: Discrete 322x242
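
For completeness: the format the driver actually hands out can also be checked like this (assuming the camera is /dev/video0):

#v4l2-ctl -d /dev/video0 --get-fmt-video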

Since mjpg-streamer doesn't seem to support RGB3, I got the idea of using ffmpeg instead.
OctoPi comes preinstalled with ffmpeg version 4.1.4-1+rpt1~deb10u1, which is awesome because since version 3 it has supported hardware-accelerated H.264 encoding on all Raspberry Pis!

This means it should use hardly any CPU resources to provide a high-res real-time webcam stream in H.264!
The encoder is called h264_omx, and acceleration was 3x-4x on a Raspberry Pi (compared to the playback speed of the video). The command should look something like this:

ffmpeg  -i [not sure how to define input] -vcodec h264_omx -f flv "rtmp://example.com:1060"

There are a few questions coming up now:

  • How do I define the webcam input for ffmpeg on the command line?

  • Could the OctoPi web frontend show the new stream?

  • Which output address would I have to set?

The best thing is this: all Raspberry Pis except the Raspberry Pi 4 use the same GPU (Broadcom VideoCore IV), so it should even work on the oldest Pi models. The Pi 4 uses the Broadcom VideoCore VI with a few more MHz.

Since I'm new to OctoPrint, I don't know how to set this up properly, hence this thread.

Nobody? Really?

BTW, I found out how to define the input stream: It's just /dev/video0

So I managed to record something by typing this:
ffmpeg -re -f rawvideo -pix_fmt rgb32 -framerate 15 -video_size 322x242 -i /dev/video0 -vcodec h264_omx -b:v 750k -f flv ./testfile.flv

Unfortunately the resulting video has errors: I see a pattern of small webcam images tiling the picture. I'm sure -pix_fmt rgb32 is wrong.
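
My guess (not verified yet) is that the tiled picture comes from a wrong pixel format/stride: RGB3 is packed 24-bit RGB, so -pix_fmt rgb24 together with the exact 322x242 size should be closer. Even simpler would probably be to let ffmpeg's v4l2 input negotiate the format with the driver instead of treating the device as a raw pipe, roughly like this (just a sketch, same encoder options as above):

ffmpeg -f v4l2 -framerate 15 -video_size 322x242 -i /dev/video0 -vcodec h264_omx -b:v 750k -f flv ./testfile.flv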

However, as long as I don't know how to relay the stream to OctoPrint, I see no point in finding the right option.

So to answer your question in regard to getting that into OctoPrint, it would require some other tweaks. You could potentially specify the output as an MJPG stream, or use ffserver, or export directly to YouTube/Twitch and replace the camera view with the live stream view from those services using WebcamIframe. Streaming to an outside service would introduce delays, because the video has to be transcoded > pushed to YouTube/Twitch > loaded back down to the web browser.
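
To give a rough idea of the outside-service route: pushing the h264_omx output to an RTMP ingest would look something like this (sketch only; the Twitch ingest URL and stream key are placeholders you'd get from your own account, and the v4l2 input options are the ones from your earlier test):

ffmpeg -f v4l2 -framerate 15 -video_size 322x242 -i /dev/video0 -vcodec h264_omx -b:v 750k -f flv "rtmp://live.twitch.tv/app/<your-stream-key>"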

That looks great! Thank you. I will try the tutorial.

Edit:

Please tell me why MJPEG is such a big thing. Is there any reason against using H.264 directly?

The concern is resources on the Pi: the streaming takes up processing power, and depending on how you do it and your Pi model, it could cause issues while printing. The other problem is that if you don't get a pure H.264 stream with the right wrapper, you can't embed it in a web interface very easily. Several people have gotten it to work a couple of different ways, and from other posts it did have an impact on CPU/memory while streaming. With the Pi 4 that might not be that big of an issue, nor if you are running OctoPrint/ffmpeg on a non-Pi device like a Windows or Linux workstation. Another option you could look at, which I did but didn't really dig too deep into, is running MotionEye on the Pi as well; it can hook into the Raspberry Pi cams or USB cams and provides a couple of different output options, which would allow for higher-quality direct access and an MJPG stream for use within OctoPrint.
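
If you go the MotionEye route, hooking its MJPG stream into OctoPrint would basically mean pointing the webcam URLs in OctoPrint's config.yaml at MotionEye instead of mjpg-streamer, roughly along these lines (sketch only; octopi.local, port 8081 and the snapshot path are assumptions based on MotionEye's defaults and will depend on your camera setup and auth settings):

webcam:
  stream: http://octopi.local:8081/
  snapshot: http://octopi.local:8765/picture/1/current/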

HLS support in OctoPrint...