How to simultaneously stream 3 streams from a raspberry pi camera

I am using a Raspberry Pi 4B with 4 GB of RAM.

I installed the latest version of Buster on my Pi.

I gave it a memory split of 128 MB to the GPU.

I followed this guide for installing OctoPrint onto the Pi:

I installed nginx as my reverse proxy. I followed this guide to install nginx:

Here is my reverse-proxy file, located in /etc/nginx/sites-enabled/3d-printer.conf. I deleted the default file located in /etc/nginx/sites-enabled:

server {
  listen 80;

  # webcam port, optional.
  location /webcam/ {
    proxy_pass http://localhost:8080/;
  }

  location /rtspcam/ {
    proxy_pass http://localhost:8555/;
  }

  location /octoprint/ {
    proxy_pass http://localhost:5000/;
    proxy_set_header Host $http_host;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Scheme $scheme;
    proxy_set_header X-Script-Name /octoprint;
    proxy_http_version 1.1;
    client_max_body_size 0;
  }
}

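A note on the trailing slashes above: with a location such as `location /webcam/ { proxy_pass http://localhost:8080/; }`, nginx substitutes the matched prefix with the proxy_pass path, so a request for /webcam/camera.mjpeg reaches the backend as /camera.mjpeg. A toy shell sketch of that mapping (plain string manipulation, not nginx itself; the names are illustrative):

```shell
# Toy model of nginx prefix substitution for:
#   location /webcam/ { proxy_pass http://localhost:8080/; }
request="/webcam/camera.mjpeg"
prefix="/webcam/"
upstream="http://localhost:8080/"
# nginx swaps the matched location prefix for the proxy_pass URI part
mapped="${upstream}${request#"$prefix"}"
echo "$mapped"
```

This is why http://localhost/webcam/camera.mjpeg ends up hitting ffserver's /camera.mjpeg stream.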

I did NOT install mjpg-streamer. The reason is that once one process has opened the /dev/video0 device (my Raspberry Pi camera), no other process can access it.

I originally installed mjpg-streamer and v4l2rtspserver, but these two would not play nicely together, so I had to find another solution for my camera.

I want my Pi camera to provide an MJPEG stream, snapshots (JPEG stills), and an RTSP stream (so I can add it to my surveillance screen with the rest of my cameras).
So I needed three streams coming from one input source, my Raspberry Pi camera. How to do this?

My solution:
Use ffserver to distribute the streams for me. This is how you get ffserver onto your Raspberry Pi 4B.

Download ffmpeg 3.4.6 from here (ffserver comes with ffmpeg 3.4.6):

Place the ffmpeg-3.4.6.tar.xz file into /usr/src

From now on, if you see a '$' in front of a line, it means you need to type the command in a terminal window. Do not type the '$' symbol with the command, or the command will not work.

$cd /usr/src
$sudo tar -xvf ffmpeg-3.4.6.tar.xz
$cd /usr/src
$sudo git clone git://
$cd x264
$sudo ./configure --host=arm-unknown-linux-gnueabi --enable-static --disable-opencl
If you have a Raspberry Pi 3B or newer you can add -j4; leave it off on a Raspberry Pi 1 or 2.
$sudo make -j4
On my Raspberry Pi 4B it took only about 30 minutes to compile.
$sudo make install
$cd /usr/src/ffmpeg-3.4.6

$sudo ./configure --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree --enable-omx --enable-omx-rpi

$sudo make -j4
$sudo make install

NOTE: we will use /tmp2 as the place to save our video, but it will NOT go on the SD card. The video will be saved to MEMORY (a tmpfs RAM disk).

$sudo mkdir /tmp2
$sudo nano /etc/fstab

add this to /etc/fstab:

tmpfs           /tmp2              tmpfs   defaults,noatime,mode=1777 0       0

mount the tmpfs in RAM:
$sudo mount /tmp2
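For reference, a tmpfs mounted with `defaults` can grow to at most half of physical RAM before writes start failing, so /tmp2 has plenty of headroom for the small feed file ffserver keeps there. A quick back-of-the-envelope check (assuming the 4 GB Pi 4B from this post):

```shell
# A tmpfs mounted with "defaults" caps out at half of physical RAM
ram_mb=4096                    # Pi 4B with 4 GB of RAM
ceiling_mb=$(( ram_mb / 2 ))
echo "tmpfs ceiling: ${ceiling_mb} MB"
```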

create a file with:
$sudo nano /etc/ffserver.conf
and paste this:

# ffserver configuration for an mjpeg stream and rtsp stream
# Adapted from
HTTPPort 8080
RTSPPort 8555
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 500000
CustomLog -
# Suppress that if you want to launch ffserver as a daemon.

<Feed camera.ffm>
   File /tmp2/camera.ffm
   FileMaxSize 5M
   ACL allow
   ACL allow
</Feed>

<Stream camera.sdp>
   Format rtp
   Feed camera.ffm
   VideoFrameRate 10
   VideoSize 480x360
   ACL allow
   ACL allow
</Stream>

<Stream camera.mjpeg>
   Format mpjpeg
   Feed camera.ffm
   # Make sure frame rate and size
   # match those passed to ffmpeg
   VideoFrameRate 10
   VideoSize 480x360
   VideoBitRate 4096
   VideoBufferSize 4096
   VideoQMin 5
   VideoQMax 51
   Strict -1
   ACL allow
   ACL allow
</Stream>

<Stream static-camera.jpg>
   Format jpeg
   Feed camera.ffm
   VideoFrameRate 2
   VideoSize 480x360
   Strict -1
   ACL allow
   ACL allow
</Stream>

<Stream stat.html>
   Format status
   ACL allow localhost
   ACL allow
</Stream>

<Redirect index.html>
</Redirect>
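As a rough sanity check on FileMaxSize: at the 4096 kb/s VideoBitRate configured for the camera.mjpeg stream, the 5 MB circular feed file only ever holds a few seconds of video, which is all ffserver needs:

```shell
# Seconds of video the 5 MB feed file can hold at 4096 kbit/s
bitrate_kbit=4096               # VideoBitRate from ffserver.conf
file_kbit=$(( 5 * 1024 * 8 ))   # FileMaxSize 5M expressed in kilobits
seconds=$(( file_kbit / bitrate_kbit ))
echo "feed buffer: about ${seconds}s of video"
```

So keeping the feed on the /tmp2 tmpfs costs almost nothing in RAM.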

now let's test to see if ffserver will work:
$ffserver -d -f /etc/ffserver.conf

now open another terminal window:

$ffmpeg -input_format mjpeg -video_size 480x360 -framerate 10 -i /dev/video0 -c:v copy http://localhost:8080/camera.ffm

Now go to Chromium and open these two addresses, each in its own tab:

http://localhost:8080/camera.mjpeg
http://localhost:8080/static-camera.jpg

If you see the camera output, then everything is running nicely.
So let's set it up to start automatically:
$sudo nano /etc/systemd/system/ffserver.service
paste this in:

[Unit]
Description=FFMPEG streaming server (ffserver) service
After=network.target

[Service]
ExecStart=/usr/local/bin/ffserver -f /etc/ffserver.conf

[Install]
WantedBy=multi-user.target

$sudo nano /etc/systemd/system/ffmpeg.service
paste this in:

[Unit]
Description=FFMPEG transcoder service
After=ffserver.service

[Service]
# -video_size and -framerate should match the settings in ffserver.conf
ExecStart=/usr/local/bin/ffmpeg -input_format mjpeg -video_size 480x360 -framerate 10 -i /dev/video0 -c:v copy -nostats -loglevel panic http://localhost:8080/camera.ffm

[Install]
WantedBy=multi-user.target


Then reload systemd and start our new services to ensure they start up properly:

$sudo systemctl --system daemon-reload
$sudo systemctl start ffserver.service
$sudo systemctl start ffmpeg.service

If that goes well, then we can finally configure systemd to run them at startup:

$sudo systemctl enable ffserver.service
$sudo systemctl enable ffmpeg.service

Updating OctoPrint configuration:
Navigate to the OctoPrint settings dialog. In the Webcam & Timelapse section you want these settings:

Stream URL: http://localhost/webcam/camera.mjpeg
Snapshot URL: http://localhost/webcam/static-camera.jpg
RTSP URL: rtsp://localhost:8555/camera.sdp

If you would like to control all the streams through OctoPrint:
$mkdir -p /home/pi/scripts
$cd /home/pi/scripts
$nano webcamStreams
paste this in:

#!/bin/sh
# Start / stop webcam streamer daemons

case "$1" in
    start)
        sudo service ffserver start >/dev/null 2>&1 &
        sudo service ffmpeg start >/dev/null 2>&1 &
        echo "$0: started"
        ;;
    stop)
        sudo service ffmpeg stop >/dev/null 2>&1 &
        sudo service ffserver stop >/dev/null 2>&1 &
        echo "$0: stopped"
        ;;
    *)
        echo "Usage: $0 {start|stop}" >&2
        ;;
esac

Then make the script executable:
$sudo chmod +x /home/pi/scripts/webcamStreams
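If you want to sanity-check the {start|stop} dispatch without touching the real services, the same case-statement pattern can be exercised with the service calls stubbed out (everything below is a throwaway stand-in, not the actual webcamStreams script):

```shell
# Self-contained check of the {start|stop} dispatch; the real
# `sudo service ...` calls are replaced by echoes.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
#!/bin/sh
case "$1" in
    start) echo "$0: started" ;;
    stop)  echo "$0: stopped" ;;
    *)     echo "Usage: $0 {start|stop}" >&2; exit 1 ;;
esac
EOF
chmod +x "$tmp"
started=$("$tmp" start)
stopped=$("$tmp" stop)
usage=$("$tmp" 2>&1) || true   # no argument -> usage message, exit 1
echo "$started"
echo "$stopped"
echo "$usage"
rm -f "$tmp"
```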

$cd /home/pi/.octoprint
$nano config.yaml
paste this in under the existing system: section (add the actions: key if it is not there yet):

system:
  actions:
    - action: streamon
      command: /home/pi/scripts/webcamStreams start
      confirm: false
      name: Start video streams
    - action: streamoff
      command: sudo /home/pi/scripts/webcamStreams stop
      confirm: false
      name: Stop video streams

$sudo reboot

when the system reboots you will have the following:
on http://localhost, you will find the octoprint UI
on http://localhost/webcam/camera.mjpeg , you will find the Stream URL
on http://localhost/webcam/static-camera.jpg , you will find the Snapshot URL
on rtsp://localhost:8555/camera.sdp, you will find the RTSP URL
In OctoPrint, if you go to the top menu under the Power button, you will see two entries: "Turn off the video streams" and "Turn on the video streams". These selections give you control over all three streams, so they are either all on or all off. On boot, all three streams will be ON.

I noticed I am only using 19% CPU, and the temperature on my Pi is at 44 °C (I have a heat sink and an active fan on the Pi 4B).

I hope this helps someone. If anyone notices errors please point them out. But my setup is working at this time.

Happy and safe 3d printing!


Since you've added nginx, I don't know whether it's necessary to visit the HAProxy configuration on there to adjust anything, but perhaps not. As long as they're not fighting over ports, it's probably alright.


Great write-up. I wonder why @guysoft doesn't use ffserver in lieu of mjpg-streamer in OctoPi. My best guess would be because of some issue with camera support, maybe?

Which codebase is that?
Welcome to contribute info to this issue:

@guysoft, looks like it's compiled on Buster.

Yes, ffmpeg is compiled on the Raspberry Pi, but the Buster OS is installed from the Raspberry Pi website.


Yes, FFmpeg was compiled on Buster.

Can someone give me a straightforward guide how to add this to OctoPi?

I do not know how you create the OctoPi image. The latest version of FFmpeg that OctoPi uses DOES NOT contain ffserver. That is why I point you to downloading version 3.4.6.

This version (3.4.6) is maintained by someone who wanted to keep ffserver around and usable. I would assume that, since ffmpeg is already installed with OctoPi, creating a separate directory for this version to live in and then renaming this version's ffmpeg binary to ffmpeg3.4.6 would allow both versions of ffmpeg to live on the OctoPi image.

So download ffmpeg v3.4.6, follow my instructions above for compiling it on the Raspberry Pi, then rename the resulting ffmpeg binary to ffmpeg3.4.6, grab the ffserver binary, and place them both into a directory on your OctoPi image. Make PATH aware of the location. I know I have oversimplified this; I do not create images for the Raspberry Pi, but I have worked as an instrumentation engineer.

BTW, the FFmpeg format for RTSP is something I am still adjusting. I am finding out that there are different RTSP streams: some transmit via the UDP protocol, others use TCP, and yet others use UDP for control messages and TCP for the video. I recently found that the FFmpeg format is RTMP, and I can see it using the VLC media player, but OMXPLAYER cannot play the stream; OMXPLAYER expects all video streams to be RTSP over TCP. So I have to figure out how to force FFmpeg to send the stream as RTSP over TCP. With the setup here, it is using RTSP over UDP (or a mix of UDP and TCP). If someone knows how to force FFmpeg to do RTSP over TCP, please post the solution here.

No need for you to build the image (though you are always welcome to learn, I can help with that).
What I do need is a step-by-step guide. What you wrote above looks like a good start; I don't think I saw it earlier.
If you have a single script that does it, it's even better.

Do you want the script to run under Buster OS? So that I understand your need: you want a bash script that will download ffmpeg v3.4.6, compile this version of ffmpeg, rename the compiled ffmpeg to ffmpeg3.4.6, and change the PATH variable to point to this version of ffmpeg and ffserver? Is that what you need?



If they have a `make install`, I'd go with what it does; you can also use checkinstall.
If you are unsure how to do this, just place it anywhere for now, and I will finalize it.

I don't think we should have the version name in the folder, because it means it will constantly change.

My concern is that I do not want to override the ffmpeg version that automatically gets installed with the Buster OS by default, so how do you want to handle that situation? Or maybe it does not matter, because octoprint.img is built with which OS?

Give me about a week to try to get a script written, OK?

Sure! No rush :purple_heart:

You can place it in /usr/local or /opt. That would not get overridden.

Usually in a case like this, a symlink in the right location can allow you to have multiple versions of the same software installed on a system. Run...

echo $PATH

...and then compare with what root sees:

sudo sh
echo $PATH

So then you could create two symlinks in /root/bin and /usr/local/sbin, for example, which could select the one you really want. But then, when referring to that program name later, you'd probably want to drop the explicit path prefix.
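That symlink idea can be sketched end-to-end with throwaway stand-ins (temporary paths and fake binaries; nothing here touches a real ffmpeg install):

```shell
# Two fake "ffmpeg" builds plus a symlink directory that decides which one
# a bare `ffmpeg` invocation resolves to via PATH.
dir=$(mktemp -d)
mkdir -p "$dir/stock/bin" "$dir/custom/bin" "$dir/selected"
printf '#!/bin/sh\necho stock-build\n'  > "$dir/stock/bin/ffmpeg"
printf '#!/bin/sh\necho 3.4.6-build\n' > "$dir/custom/bin/ffmpeg"
chmod +x "$dir/stock/bin/ffmpeg" "$dir/custom/bin/ffmpeg"
ln -s "$dir/custom/bin/ffmpeg" "$dir/selected/ffmpeg"   # point at the 3.4.6 build
result=$(PATH="$dir/selected:$PATH" ffmpeg)
echo "$result"
rm -rf "$dir"
```

On the real system the same pattern would apply: install the custom build under /opt or /usr/local, and put the symlink in a directory that comes earlier in PATH than the stock binary.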

@guysoft I did a lot of testing yesterday and have come to the opinion that replacing mjpg_streamer with FFmpeg and FFserver and streaming the webcam feed that way is not a good idea right now, because performance is actually not that great. Remember: the example above uses a resolution of 480x360 @ 10 fps, which causes a CPU load of 19% on an RPi4. FFmpeg would probably kill performance on anything older than that.

I followed the great tutorial by @GadgetAngel with my Pi4 (did a manual setup of Raspbian and OctoPrint and followed the steps above, but used HAProxy) and fiddled around with the settings, but it seems like FFmpeg refuses to just copy the MJPEG stream that comes from the webcam and re-encodes it instead.

The terminal output of the ffmpeg commands shows:

Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 68514.700224, bitrate: N/A
    Stream #0:0: Video: mjpeg, yuvj422p(pc, bt470bg/unknown/unknown), 1280x720, 20 fps, 20 tbr, 1000k tbn, 1000k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (mjpeg (native) -> mjpeg (native))
Press [q] to stop, [?] for help
Output #0, ffm, to 'http://localhost:8080/camera.ffm':
    creation_time   : now
    encoder         : Lavf57.83.100
    Stream #0:0: Video: mjpeg, yuvj422p(pc), 1280x720, q=3-3, 15360 kb/s, 20 fps, 1000k tbn, 20 tbc
      encoder         : Lavc57.107.100 mjpeg
    Side data:
      cpb: bitrate max/min/avg: 0/0/15360000 buffer size: 0 vbv_delay: -1

instead of:

Stream mapping:
  Stream #0:0 -> #0:0 (copy)

Nevertheless I ran some performance tests with the Pi4 and my Logitech C930e.

| Resolution | Framerate | Format | Threads | CPU Usage |
|---|---|---|---|---|
| 480x270 | 10 | MJPEG -> MJPEG | 1 | 25% |
| 1280x720 | 17 (15 from Cam) | MJPEG -> MJPEG | 1 | 100% |
| 1280x720 | 20 | MJPEG -> MJPEG | 2 | 70% + 40% |
| 1280x720 | 24 | MJPEG -> MJPEG | 2 | 80% + 50% |

Then I dug out my Pi3 with a default OctoPi installation and did some comparisons with mjpg_streamer:

| Resolution | Framerate | Format | CPU Usage |
|---|---|---|---|
| 960x540 | 10 | MJPEG -> MJPEG | 0.7% |
| 1280x720 | 20 | MJPEG -> MJPEG | 2% |

As you can see, the performance impact of mjpg_streamer on a Pi3 is negligible, while the Pi4 with FFmpeg could only push out 1280x720 at a maximum of 17 FPS when using only one thread. On the other hand, I've noticed that the required bandwidth for 20 FPS is less than half when using FFmpeg (6 MBit/s vs 16 MBit/s).

Because the Pi4 is just so fast, I've settled on 720p@20 FPS with 2 threads for now, as the decreased bandwidth from re-encoding is actually more beneficial for my setup. Printing works fine, and the OctoPrint interface still shows up a LOT quicker than with a Pi3. (Config in this gist)
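The bandwidth figures above can be double-checked with trivial arithmetic (6 MBit/s re-encoded really is well under half of 16 MBit/s pass-through):

```shell
# 6 Mbit/s (re-encoded FFmpeg) vs 16 Mbit/s (mjpg_streamer pass-through)
ffmpeg_mbit=6
mjpg_mbit=16
pct=$(( ffmpeg_mbit * 100 / mjpg_mbit ))   # integer percentage
echo "FFmpeg needs ${pct}% of mjpg_streamer's bandwidth"
```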

Where I think FFmpeg gets really interesting is streaming H264-encoded video, though this comes with its own intricacies, as you can't just push out an mp4 file for a live stream, and latency as well as protocol support is tricky. (RTSP and RTMP are not supported in browsers.)

I think one has the best chances with HLS (HTTP Live Streaming), as Android as well as iOS and Edge + Safari support it natively, and all other desktop browsers can be made compatible with a little bit of JavaScript.

I thought I could use the H264 output of my C930e directly, but that is unfortunately not supported in the Linux kernel right now, so I tested with the hardware-accelerated h264_omx codec and was able to achieve:

| Resolution | Framerate | Format | Threads | CPU Usage |
|---|---|---|---|---|
| 1280x720 | 24 | MJPEG -> H264 (OMX-Encoder) | 1 | 95% |

Most of the CPU usage, I guess, comes from decoding the MJPEG stream of the webcam. Latency was about 1 second. (Example config in this gist)

The only problem I currently see is software support in anything that provides access to OctoPrint. (Apps like OctoRemote and Printoid or slicer integration as in Cura) Support for H264 is nonexistent, as they seem to be written to strictly expect an MJPEG stream.


thanks @Noguai for the detailed tests and explanation!

Great thread, tyvm! I'm new here and debating where to start: which of my Pis to use for what, and whether it's best to stick with the 3B+ for now or pop in the 4 I just got. Honestly, after reading this I may try both. I have two 3D printers at the moment, and this was a great intro thread to this message board in general for me.

Thanks again,


I'll just throw this out there... I got interested in streaming h264/h265 directly to the browser (Chrome), and cobbled together something that worked well: 720p@30fps, smooth as butter, single-digit CPU. This was with v4l and some Python scripts I found. It was a proof of concept -- if it can be displayed by itself in the browser, it can be displayed in a window on a web page.

I'll see if I can put it together again and share; maybe it will help. In any case, this is how streaming should be implemented in OctoPrint.