Experimental HLS support

Let me try to reproduce this when I have time. It may come later rather than sooner :frowning:

Sorry for my ignorance...but how are you guys testing this?

I changed the following line in octopi.txt:

camera_streamer=mjpeg

to

camera_streamer=hls

What else am I missing?

I think you need to change your stream URL in OctoPrint's webcam settings to /hls/stream.m3u8 and the snapshot URL to /jpeg/snapshot.jpg (not 100% sure on this one).

/jpeg/frame.jpg

For the snapshot. Found it further up the thread :slight_smile:

Hi - awesome work, and thank you for your efforts on this plugin. I have managed to get my Creality USB camera working with it. Any ideas on how to reduce the lag?

I'm running it at 1920x1080 at 30 fps on an RPi 3B+, with CPU at ~44%.

Current lag is ~ 3 secs

Thanks

That's the nature of this implementation, as it has to store the images in order to serve them, and to be fair 3 secs is way less lag than the ~30 secs I was seeing.

Ah, I did wonder. I assume ffmpeg should be able to serve the stream directly as well if I try this manually, right?

Pretty sure the nginx proxy is necessary to middleman the stream and handle timings; it's built into the OctoPi 0.18 image.
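
For reference, the relevant nginx piece is roughly a location block like this (a sketch, not the actual config shipped on the OctoPi 0.18 image - paths and details may differ):

location /hls/ {
    # Serve the playlists/segments that ffmpeg writes into tmpfs.
    alias /run/webcam/hls/;
    types {
        application/vnd.apple.mpegurl m3u8;
        video/mp4 mp4 m4s;
    }
    # The playlist changes with every segment, so don't let clients cache it.
    add_header Cache-Control no-cache;
}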

Thanks. I have been researching several alternatives, but I can't see any that are easily implemented and produce a lag of less than 3 secs. This implementation does work really well, but I'm always keen to see if I can get it lower than 3 secs.

You can't make it 10x less, but you can experiment with ffmpeg settings, specifically the 'hls_time' setting. Please note that it interacts with the frame rate and the frequency of key frames: each segment has to contain at least one key frame.
Here is the file:

To apply the changes, you have to reload the systemd service - see: Systemd Essentials: Working with Services, Units, and the Journal | DigitalOcean
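
In practice that's something like the following (assuming the unit is named ffmpeg_hls.service, as mentioned later in this thread):

# Pick up edits to the unit file, then restart the streamer.
sudo systemctl daemon-reload
sudo systemctl restart ffmpeg_hls.service
# Confirm it came back up cleanly.
systemctl status ffmpeg_hls.service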

Yes, ffmpeg can stream directly, but I don't know of a way to serve that stream to the web page.

Sorry, I don’t have time to implement a better way to switch the streamer both in UI and at the source.

Thanks @Chudsaviet. Good to know about the hls_time setting - will have a play. As some side testing, I did manage to use ffmpeg and aler9/rtsp-simple-server (GitHub - aler9/rtsp-simple-server) to stream via HLS as well, but I haven't managed to tweak the latency yet. The approach is the same, so naturally there will be lag.

The problem is HLS's segmented nature. Each segment is a separate file, and 'hls_time' controls its length. Also take into consideration that the player caches some segments before playing.
There is the newer Low-Latency HLS, but I found its implementation in ffmpeg unusable as of last year.
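As a rule of thumb (my estimate, not a measurement): players buffer about three segments before starting playback, so with 'hls_time' of 2 seconds the latency floor is roughly 3 × 2 s = 6 s, plus encoding and network overhead; 'hls_time' of 1 second roughly halves that floor at the cost of more playlist churn.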

One question/confirmation: this should be safe to keep running 24/7 since video is written to /run and /run is tmpfs (in RAM)? It's not going to eat up an SD card?
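
A quick way to double-check that on your image:

# FSTYPE should come back as "tmpfs", i.e. the segments never touch the SD card.
findmnt -o TARGET,FSTYPE /run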

Also, in case it's useful to someone else I was able to use my Logitech C920's native H264 stream to stream 1080p while using < 10% of CPU (according to top). I used this ffmpeg command:

sudo -u webcam ffmpeg -f v4l2 -input_format h264 -i /dev/video0 -c:v copy \
    -f hls -hls_time 1 \
    -hls_flags delete_segments+program_date_time+temp_file+independent_segments \
    -hls_allow_cache 0 -hls_segment_type fmp4 \
    -hls_list_size 32 -hls_delete_threshold 64 \
    /run/webcam/hls/480p/stream.m3u8

This of course isn't outputting the snapshot or the lower-res variants, but I could probably add those back in. This probably won't work on newer C920s, because the new ones don't have onboard H264 streams.

I should also probably write it to /run/webcam/hls/1080p/stream.m3u8 instead :slight_smile:

  1. Yes, running 24/7 and using 'tmpfs' is the idea. Don't use '/tmp' though - it's not always in RAM.
  2. You have got it right! I used re-encoding on the RPi for compatibility reasons. If streaming H264 directly from the cam works - that's great! Yes, you can add decoding of the stream for the sole purpose of generating JPEGs (see the sketch after this list). The only problem I see here is that you can't specify the keyframe frequency when you are streaming from the cam, so there could be problems with segment size and therefore latency.
  3. You can specify your own resolution set for HLS - it's easy. I copy the root 'stream.m3u8' from permanent storage to '/run' each time I start the systemd unit. See '/var/lib/ffmpeg_hls/stream.m3u8'.
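
For point 2, a sketch of what that could look like (my illustration, not the plugin's shipped command - the snapshot path and rate are assumptions): add a second, decoded output to the same ffmpeg invocation that keeps overwriting a single JPEG:

# Output 1: passthrough HLS as before (HLS flags trimmed here for brevity).
# Output 2: decode, drop to 1 fps, keep rewriting one snapshot file.
sudo -u webcam ffmpeg -f v4l2 -input_format h264 -i /dev/video0 \
    -map 0:v -c:v copy -f hls -hls_time 1 /run/webcam/hls/1080p/stream.m3u8 \
    -map 0:v -vf fps=1 -update 1 /run/webcam/jpeg/frame.jpg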

Yep, I'm seeing ~30s latency. I've seen some info on the internet suggesting there is code that can configure the bitrate and GOP/keyframe interval on these cameras, but I don't think that code exists in ffmpeg. I'll probably just stick with the long latency, or try 1080p with the h264_omx encoder (like in the default ffmpeg_hls.service unit).
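
If it helps, the shape of that is something like this (a sketch, not the shipped unit - the resolution, bitrate, and whether h264_omx honors -g on a given build are all assumptions to verify):

# Re-encoding lets you force a keyframe every second (-g 30 at 30 fps),
# so every 1-second segment can start on a keyframe.
sudo -u webcam ffmpeg -f v4l2 -input_format mjpeg -video_size 1920x1080 -framerate 30 \
    -i /dev/video0 -c:v h264_omx -b:v 4M -g 30 \
    -f hls -hls_time 1 -hls_flags delete_segments \
    /run/webcam/hls/1080p/stream.m3u8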

BTW @Chudsaviet thanks for doing this! It's really nice to have real HD video from Octoprint.


You are very welcome :slight_smile: It was just a fun project.
Also, check whether your camera offers any controls via 'v4l2-ctl' - maybe there is a keyframe setting.


Yeah, I looked through those settings with v4l2-ctl --device=/dev/video0 --list-ctrls and don't see any keyframe or bitrate settings. I don't think v4l2 exposes those settings on this camera without some new code.

BTW @Chudsaviet, when you were working on this, did you consider WebRTC as an alternative? It seems like WebRTC is probably the most universal low-latency video protocol around. The alternatives are Apple's Low-Latency HLS, MPEG-DASH, and Twitter/Periscope's LHLS, but I don't think any of those have very widespread player or server support. Here's an example of using a Pi + WebRTC to make a low-latency camera.

I tried a WebRTC implementation in the past but never got anything reliable. The link you provided is interesting and might work as a plugin. I know TSD does something similar, killing the default mjpg-streamer service and providing its own. Might be able to do something similar integrating balena-cam... https://github.com/balenalabs/balena-cam/blob/3ac79c3be2c5d2910e8ee5780f062a83222f3b01/balena-cam/app/server.py
