Using ffmpeg for webcam streaming and timelapse support


#1

I'd been wanting to add a webcam to my OctoPrint setup for a little while now, but I'm a professional sysadmin (i.e., really, really lazy), and having to build and install mjpg-streamer manually just did not appeal. So I did a little reading and figured out how to set up a streaming mjpeg server using a tool I already had installed on my Pi: ffmpeg.

This guide explains how I set up ffmpeg as a drop-in replacement for mjpeg-streamer. It expects a basic knowledge of Linux - how to create and edit text files, and how to run commands. If you've successfully installed OctoPrint following https://discourse.octoprint.org/t/setting-up-octoprint-on-a-raspberry-pi-running-raspbian/2337, then you'll be fine.

Notes about performance

This method will likely involve some transcoding, so I expect it to be more CPU-intensive than mjpg-streamer. On my Raspberry Pi 3B+ running a USB webcam (Logitech C920), ffmpeg is consuming ~20% CPU, and pushes the load average up around 0.2-0.3. I've had no problems running this setup for a few prints around 1.5-2 hours each. But on older Pis this extra load may be a problem.

On the other hand, now that I know how easy this is to get running, my next step is to look into replacing the mjpeg stream with a low bitrate h264 stream, to take advantage of the hardware encoder on the Pi.

Getting started

This setup requires two components. First, an ffserver process is run that sets up the actual streaming server, which serves an mjpeg stream and static jpg images. Second, an ffmpeg process is launched that pulls video from the webcam and feeds it to ffserver.

:warning: We're setting up ffserver to use the same port as the default mjpg-streamer port. It's important to make sure mjpg-streamer is stopped before trying this.
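On a stock OctoPi image, mjpg-streamer is run by the webcamd service (that service name is an assumption; a manual install may use something else), so stopping it for this session looks something like:

```shell
# Stop mjpg-streamer for this session (OctoPi manages it via webcamd;
# adjust the service name if your install differs)
sudo service webcamd stop

# Confirm nothing is still listening on port 8080 before starting ffserver
sudo ss -tlnp | grep ':8080' || echo "port 8080 is free"
```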

Set up ffserver

The ffserver configuration file is located at /etc/ffserver.conf. Create it with these contents:

# ffserver configuration for an mjpeg stream
# Adapted from
# https://gist.github.com/peterhellberg/ebfc72147c2009ee720aafe57ce9c141
HTTPPort 8080
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 200
MaxClients 100
MaxBandWidth 500000
CustomLog -

<Feed camera.ffm>
File /tmp/camera.ffm
FileMaxSize 5M
</Feed>

<Stream camera.mjpeg>
Feed camera.ffm
Format mpjpeg
# Make sure frame rate and size
# match those passed to ffmpeg
VideoFrameRate 5
VideoSize 640x480
VideoGopSize 12
VideoBitRate 4096
VideoBufferSize 4096
VideoQMin 5
VideoQMax 51
NoAudio
Strict -1
</Stream>

<Stream static-camera.jpg>
Feed camera.ffm
Format jpeg
VideoFrameRate 2
VideoIntraOnly
VideoSize 640x480
NoAudio
NoDefaults
Strict -1
</Stream>

Then run the server (as a background process so we can keep using this session) with:

ffserver &

Running ffmpeg

I'm running ffmpeg to get video from the first attached webcam with this command:

ffmpeg -input_format mjpeg -video_size 640x480 -framerate 5 -i /dev/video0 -c:v copy http://localhost:8080/camera.ffm
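If ffmpeg fails to start, it's worth checking that the camera actually offers MJPEG at the requested size and framerate. Assuming the v4l-utils package is installed (sudo apt install v4l-utils), you can list what the camera supports:

```shell
# List the pixel formats, resolutions and frame intervals the camera offers.
# Look for MJPG at 640x480 with a 5 fps (or faster) interval before
# passing those values to ffmpeg.
v4l2-ctl --device /dev/video0 --list-formats-ext
```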

You can check everything's working by browsing to http://<Your Pi's IP>:8080/camera.mjpeg to view the 5fps live stream.
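You can also sanity-check the snapshot endpoint from the Pi itself, e.g. with curl (the /tmp path here is just an example):

```shell
# Grab a single frame from the ffserver snapshot stream and confirm
# the result actually is a JPEG image
curl -s -o /tmp/snapshot.jpg http://localhost:8080/static-camera.jpg
file /tmp/snapshot.jpg
```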

Running automatically at startup

:warning: Again, if you've got mjpg-streamer configured to run at boot, this is the part where you make sure that's been disabled before continuing.

I set up two systemd services to automatically run ffserver and ffmpeg.

First, create /etc/systemd/system/ffserver.service with these contents. Note that the User= line specifies the user to run ffserver as. This will work fine for a default Pi install, but may need to be tuned for your setup.

[Unit]
Description=FFMPEG streaming server service

[Service]
User=pi
ExecStart=/usr/bin/ffserver

[Install]
WantedBy=multi-user.target

Second, create /etc/systemd/system/ffmpeg.service with these contents. Again, the User= line may need to be adjusted. This service adds some extra flags to ffmpeg so it won't spam the logs with all the debug info that was dumped to your console when testing it above. :wink:

[Unit]
Description=FFMPEG transcoder service
After=ffserver.service

[Service]
User=pi
# -video_size and -framerate should match the settings in ffserver.conf
ExecStart=/usr/bin/ffmpeg -input_format mjpeg -video_size 640x480 -framerate 5 -i /dev/video0 -c:v copy -nostats -loglevel panic http://localhost:8080/camera.ffm

[Install]
WantedBy=multi-user.target

Then reload systemd and start our new services to ensure they start up properly:

sudo systemctl --system daemon-reload
sudo systemctl start ffserver.service
sudo systemctl start ffmpeg.service
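If either service fails to come up, systemd's status output and the journal will usually show why:

```shell
# Check that both services are active, and dig into recent log
# output if either of them has failed
systemctl status ffserver.service ffmpeg.service
sudo journalctl -u ffserver.service -u ffmpeg.service --since "5 minutes ago"
```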

If that goes well, then we can finally configure systemd to run them at startup:

sudo systemctl enable ffserver.service
sudo systemctl enable ffmpeg.service

Updating OctoPrint configuration

Navigate to the OctoPrint settings dialog. In the Webcam & Timelapse section you want these settings:

  • Stream URL: /webcam/camera.mjpeg
  • Snapshot URL: /webcam/static-camera.jpg

#2

And shortly after I posted this, I realised ffmpeg wasn't setting the input framerate properly, leading to much higher CPU usage. :man_facepalming: I've updated the ffmpeg commandline with the right settings, and on my Pi usage has dropped from nearly 50% CPU to hovering around 20%. Still thinking about how to improve things.


#3

Great one.
I'm looking into this right now.
Try this hint:
https://lists.ffmpeg.org/pipermail/ffmpeg-user/2014-October/024008.html
And pull x264 from the c920 - it has a HW encoder - 1920x1080x30 uses around 450kB/s.

I can't check this right now because I'm at work.


#4

Yep yep. I wanted to get something working streaming mjpeg to begin with, and it made the most sense to pull mjpeg from the camera with the same settings that I was going to be sending.

It would make the most sense to me to pull h264 from the camera and then send that format, and I was planning on working on that next. But given the current CPU load I'm seeing, it's almost certain that there's some transcoding going on, and maybe pulling h264 from the camera and transcoding to mjpeg might be worth experimenting with. I'll give it a go when I have time to look at this next.


#5

I am currently using Firefox and have run into a problem when opening the 'webcam' stream on the Pi, which doesn't affect Chrome. I think I have narrowed it down to bandwidth: if I lower Quality or Frame Rate I can get it to work, but if I leave it on "Default", Firefox runs into a massive memory leak.

I believe this is caused by MJPEG.

I am using a Pi3 and Pi v2 camera, so could this change help someone like me with my Firefox problems?


#6

This solution sends an MJPEG stream at the same framerate and size as the default mjpg-streamer setup. If the bandwidth of that is giving you problems, then just blindly switching to the same stream from ffmpeg won't help you.


#7

Bugger.. I was hoping this might move over from MJPEG to a more modern streaming technology (e.g. H.264 or H.265)...

Firefox does not seem to work well with high res, high framerate MJPEG and I am really suffering at the moment.


#8

Read the rest of the thread. I wanted to build a drop-in replacement for mjpg-streamer. So I did. And then talked about the possibility of adapting that technique for different formats.

I like h264 because the raspberry pi has hardware codecs for it. Just haven't had the time or inclination to properly test it myself yet. If you need it though, feel free to experiment with changing the format in the ffserver.conf file and ffmpeg commandline.


#9

The whole point of pulling x264 is not to have mjpeg. Pulling x264 and streaming it should take just a few % of CPU time and 10-20 times less bandwidth.


#10

Not saying sending h264 is off the table, but if h264->mjpeg ends up being easier for a pi to do than the pipeline that's in place now, then that's what I'll use.


#11

Subscribed, would like to pull an h264 stream from my Logitech C920, which natively supports this stream.

I just don't have the background to pull this off. Linux, Raspberry Pi, and OctoPrint newbie.

Thanks


#12

Excellent! This seems like an improvement already, but if the h264 streaming was working, maybe the load would be low enough to use it reliably on a Zero. If you're interested, you'd reach a very large audience by getting these changes into the OctoPi GitHub. As a professional sysadmin, I'm sure you'd be effective there. Guysoft's scripts are doing just what you would on the command line, but recording them and running them in a qemu environment to build the images. I'd love to see more stuff like this available.


#13

Weird thing: mjpg-streamer already used just 3-7% on my old RPi 1. It uses ~3% on my RPi 3 (streaming 1920x1080x30), and about the same on my Odroid XU4. I'm also testing my cameras on an x86 Dell mini PC with an i5 and I'm getting less than 1% CPU usage while streaming 3 cameras (1x C920 & 2x C930e) at once at 1920x1080x30.
This suggests there is something wrong with your config (note: I do NOT use OctoPi - all of the above were installed manually)


#14

To everyone waiting for h264 streaming: I'm not in any case a Linux streaming video guru, but from what I've searched and read, we can forget about it until OctoPrint starts using the <video> HTML tag instead of the <img> tag. After that it should be as easy as a one-liner per camera, something like:

cvlc v4l2:///dev/video0:chroma=h264:width=1920:height=1080:fps=30 --sout '#standard{access=http,mux=ts,dst=localhost:8080,name=stream,mime=video/ts}' -vvv

#15

Too bad a little jQuery couldn't change the tag into a video...?

$('img').replaceWith(function(){
    // An <img> has no inner HTML to reuse, so copy the stream URL across
    return $("<video />", {src: this.src, autoplay: true});
});

But I guess more work would need to be done on that.