Experimental HLS support

Yes, ffmpeg can stream directly, but I don’t know a way to stream it to the web page.

Sorry, I don’t have time to implement a better way to switch the streamer both in UI and at the source.

Thanks @Chudsaviet. Good to know about the hls_time setting. Will have a play. As a side test I did manage to stream via HLS using ffmpeg and rtsp-simple-server (GitHub: aler9/rtsp-simple-server, a ready-to-use RTSP/RTMP/HLS server and proxy), but I haven't managed to tweak the latency yet. The approach is the same, so naturally there will be lag.

The problem is HLS's segmented nature. Each segment is a separate file, and ‘hls_time’ controls its length. Also take into consideration that the player buffers some segments before it starts playing.
There is a newer Low-Latency HLS, but I found its implementation in ffmpeg unusable last year.
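To make the hls_time/latency trade-off concrete, here is a back-of-the-envelope sketch; the 3-segment buffer figure is a typical-player assumption, not a spec guarantee:

```shell
# Back-of-the-envelope HLS latency floor (a rule of thumb, not from the
# ffmpeg docs): the player typically buffers ~3 segments before it starts
# playing, so latency is at least hls_time * 3 plus encode/network overhead.
hls_time=4            # seconds per segment (ffmpeg's hls muxer defaults to 2)
buffered_segments=3   # typical player prefetch
echo "minimum latency: $((hls_time * buffered_segments))s plus overhead"
```

Lowering hls_time shrinks that floor, but each segment must start with a keyframe, so an hls_time below the keyframe interval has no effect.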

One question/confirmation: this should be safe to keep running 24/7 since video is written to /run and /run is tmpfs (in RAM)? It's not going to eat up an SD card?

Also, in case it's useful to someone else I was able to use my Logitech C920's native H264 stream to stream 1080p while using < 10% of CPU (according to top). I used this ffmpeg command:

sudo -u webcam ffmpeg -vcodec h264 -pix_fmt h264 -i /dev/video0 -c:v copy \
-f hls -hls_time 1 \
-hls_flags delete_segments+program_date_time+temp_file+independent_segments \
-hls_allow_cache 0 -hls_segment_type fmp4 \
-hls_list_size 32 -hls_delete_threshold 64 \
stream.m3u8

This of course isn't outputting the snapshot or lower-res variants, but I could probably add those back in. This probably won't work on newer C920s, since the newer revisions dropped the H264 stream.

I should also probably save it to /run/webcam/hls/1080p/stream.m3u8 instead :slight_smile:

  1. Yes, running 24/7 and using ‘tmpfs’ is the idea. Don’t use ‘/tmp’ though; it’s not always in RAM.
  2. You’ve got it right! I used re-encoding on the RPi for compatibility reasons. If streaming H264 directly from the cam works, that’s great! Yes, you can add decoding of the stream for the sole purpose of generating JPEGs. The only problem I see is that you can’t specify the keyframe frequency when streaming from the cam, so there could be problems with segment size and therefore latency.
  3. You can specify your own set of resolutions for HLS; it’s easy. I copy the root ‘stream.m3u8’ from permanent storage to ‘/run’ each time the systemd unit starts. See ‘/var/lib/ffmpeg_hls/stream.m3u8’.

Yep, I'm seeing ~30s latency. I've seen some info on the internet that suggests there exists code that can configure the bitrate and gop/keyframe interval on these cameras, but I don't think that code exists in ffmpeg. I'll probably just stick with the long latency, or try out 1080p with h264_omx encoder (like in the default ffmpeg_hls.service unit).

BTW @Chudsaviet thanks for doing this! It's really nice to have real HD video from Octoprint.


You are very welcome :slight_smile: It was just a fun project.
Also, look if your camera offers any controls via ‘v4l2-ctl’, maybe there is a keyframe setting.
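For reference, checking for such a control might look like this. The control name 'video_bitrate' is only an example and may well not exist on a given camera; these commands need a real V4L2 device, so no output is shown:

```shell
# List every control the driver exposes for this camera.
v4l2-ctl --device=/dev/video0 --list-ctrls

# If a keyframe/bitrate control appears in that list, set it like so.
# 'video_bitrate' is an illustrative name; use whatever --list-ctrls reported.
v4l2-ctl --device=/dev/video0 --set-ctrl=video_bitrate=4000000
```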


Yeah, I looked through those settings with v4l2-ctl --device=/dev/video0 --list-ctrls and don't see any keyframe or bitrate controls. I don't think v4l2 supports those settings on this camera without some new code.

BTW @Chudsaviet when you were working on this did you consider WebRTC as an alternative? It seems like WebRTC is probably the most universal low latency video protocol around. Alternatives are Apple's Low-Latency HLS, MPEG-DASH, and Twitter/Periscope's LHLS, but none of those have very widespread player or server support I don't think. Here's an example of using a Pi+WebRTC to make a low latency camera.

I tried a WebRTC implementation in the past but never got anything reliable. The link you provided is interesting and might be doable as a plugin. I know TSD does something similar, killing the default mjpg-streamer service and providing its own. Might be able to do something similar integrating balena-cam: https://github.com/balenalabs/balena-cam/blob/3ac79c3be2c5d2910e8ee5780f062a83222f3b01/balena-cam/app/server.py


Exactly what I thought. I tried to find a good implementation of a WebRTC cam back then, but no luck. Maybe balenaCam will do the trick.


Ok cool, if I stay motivated on this I'll poke at WebRTC next. In case it's helpful to others, here's the best I came up with for 1080p streaming direct from the webcam along with jpeg snapshots.

This is my /etc/systemd/system/ffmpeg_hls.service

[Unit]
Description=FFMPEG HLS webcam streaming service

[Service]
ExecStartPre=/bin/rm -rf /run/webcam
ExecStartPre=/bin/mkdir -p /run/webcam/hls
ExecStartPre=/bin/mkdir -p /run/webcam/hls/1080p
ExecStartPre=/bin/mkdir -p /run/webcam/jpeg
ExecStartPre=/bin/cp /var/lib/ffmpeg_hls/stream.m3u8 /run/webcam/hls/stream.m3u8
ExecStartPre=/bin/chown -R webcam:webcam /run/webcam
ExecStartPre=/bin/chmod -R 0755 /run/webcam

ExecStart=/usr/bin/sudo -u webcam \
    /usr/bin/ffmpeg \
    -vcodec h264 \
    -pix_fmt h264 \
    -video_size 1920x1080 \
    -i /dev/video0 \
    -c:v copy \
    -f hls -hls_time 1 \
    -hls_flags delete_segments+program_date_time+temp_file+independent_segments \
    -hls_allow_cache 0 -hls_segment_type fmp4 \
    -hls_list_size 5 -hls_delete_threshold 6 \
    /run/webcam/hls/1080p/stream.m3u8 \
    -vf "select=eq(pict_type\,I)" \
    -vsync vfr \
    -c:v mjpeg -q:v 0 \
    -f image2 -update 1 -atomic_writing 1 \
    /run/webcam/jpeg/frame.jpg


And my /var/lib/ffmpeg_hls/stream.m3u8


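The playlist content didn't make it into the post above; for reference, a minimal master playlist pointing at a single 1080p variant could look like this (the BANDWIDTH value is illustrative, not from the original post):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=4000000,RESOLUTION=1920x1080
1080p/stream.m3u8
```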
This takes the raw 1080p stream from the camera and packages it as HLS. Latency is currently ~22s after I shortened the playlist a bit, but it probably depends on the player implementation. Also, for every I-frame in the video (-vf "select=eq(pict_type\,I)") it dumps /run/webcam/jpeg/frame.jpg, which keeps the decode processing power low since it's only doing full H264 decodes on the I-frames. This uses ~120% CPU on my Pi 3B (1.2 GHz). I also tried h264_mmal and h264_v4l2m2m for decoding the H264, which should lower CPU requirements even more since they use hardware-accelerated decoding, but neither worked with the pict_type filter (possibly they don't forward frame metadata, just the decoded frames).

Short update on WebRTC: I was able to get the aiortc webcam example to run out of the box on OctoPi. I think it was reading the YUV stream from my camera and encoding it as H264 with OpenMAX. The colors were messed up, but not a bad start for nearly no work. Here's what I did:

git clone git@github.com:aiortc/aiortc.git
sudo apt update
sudo apt install python3-venv libsrtp2-dev
python3 -m venv venv
source venv/bin/activate
pip install google-crc32c==1.1.2
pip install aiohttp aiortc opencv-python
sudo systemctl stop ffmpeg_hls.service
cd aiortc/examples/webcam
python webcam.py

Latency is < 1s. Now just need to fix up the video settings to fix the color! Maybe I'll spin this off into a new thread to not clutter discussion about HLS.

Update: Spun off into a new thread.

Hey, thanks for the contribution. I recently found that the HLS stream stopped working in OctoPrint. I tried switching to safe mode but it still doesn't work. It creates the blob and the video frame on the page, but it just doesn't show anything.

Anyone know what's going on? It was working like 2 months ago. :frowning:

I don’t know which changes have been made since HLS was included in OctoPrint.
If you try to open the stream URL with Safari or VLC, is it working?
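As a sketch of that check from the command line (the URL below is a placeholder; substitute whatever address the OctoPrint page actually loads):

```shell
# Fetch the playlist and eyeball the first few lines; a healthy HLS playlist
# starts with #EXTM3U. The host/path here is a guess, not the real URL.
curl -s http://octopi.local/webcam/hls/stream.m3u8 | head -5

# ffprobe prints codec/resolution info if the playlist and segments resolve.
ffprobe -hide_banner http://octopi.local/webcam/hls/stream.m3u8
```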

Yes, I tried Chrome, Safari, and VLC; the stream loads correctly in all of them. I even switched to a different cam and streamer lib, no luck.

Interesting. What do you mean when you say “I tried a different streamer lib”?
Which camera are you using?
By the way, let me install the latest OctoPi image and see what happens. It will take some time, sorry.

Oh, I built my OctoPrint on Linux, FYI.

I use 'restreamer' as the streaming service to convert the RTSP feed to HLS.
I tried another lib (I forget its name) that also converts RTSP to HLS.

A Xiaofang webcam is the IP cam serving the RTSP feed.
I also tried a Samsung J7000 with an IP cam app installed.

Thanks for the kind help!

I've used restreamer for this before as well, and it works for me. @Chudsaviet here are instructions on how I personally set that up on OctoPi 0.18.

Yep, it worked well before, but it stopped working 1 or 2 months ago :frowning:
I even reinstalled everything (OS/OctoPrint).