Raspberry Pi Low-Latency Stream with GStreamer

Gary Thomas gary.thomas at piksel.com
Thu Oct 26 09:43:57 UTC 2017


I suggest you add queue elements downstream of the tee. I do something similar with an RPi, and the receive pipeline I use is:

gst-launch-1.0 tcpclientsrc host=pizero1 port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! h264parse \
    ! tee name=parsed \
    parsed. ! queue ! avdec_h264 ! autovideosink sync=false \
    parsed. ! queue ! mpegtsmux ! filesink location=cam.ts
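Applied to a udpsrc receive pipeline like yours, the same pattern would look something like this (untested sketch; note the tee happens on the parsed H.264, before decoding, so the file branch can mux without re-encoding):

```shell
# Untested sketch: tee the parsed H.264, with a queue on each branch so
# that a slow branch (e.g. the filesink) cannot block the display branch.
gst-launch-1.0 udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! h264parse \
    ! tee name=t \
    t. ! queue ! avdec_h264 ! videoconvert ! autovideosink sync=false \
    t. ! queue ! mpegtsmux ! filesink location=cam.ts
```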

Gary

-----Original Message-----
From: gstreamer-devel [mailto:gstreamer-devel-bounces at lists.freedesktop.org] On Behalf Of waymond91
Sent: 26 October 2017 08:52
To: gstreamer-devel at lists.freedesktop.org
Subject: Raspberry Pi Low-Latency Stream with GStreamer

Hello All!
I am trying to set up a Raspberry Pi as a low-latency network camera (max latency 200 ms).
In an ideal world, the Raspberry Pi would read the camera and send the video over WiFi to a host PC.
This host PC would both display the video stream and save it to a file. 
A key aspect I am really struggling with:
I would like each displayed frame to be saved to a file along with a timestamp of when that frame was actually displayed.
That being said, this is my first time doing any video processing whatsoever.
I was able to find and build a GStreamer source element for the Pi camera from the git repository here:
https://github.com/thaytan/gst-rpicamsrc

This seems to work. With the Raspberry Pi I am able to start a network pipeline with the following command:
$ gst-launch-1.0 rpicamsrc bitrate=1000000 \
    ! 'video/x-h264,width=640,height=480' \
    ! h264parse \
    ! queue \
    ! rtph264pay config-interval=1 pt=96 \
    ! gdppay \
    ! udpsink host=[MY IP] port=5000

And on my host PC I can display a live stream with about 150 ms latency with:
$ gst-launch-1.0 udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! avdec_h264 \
    ! videoconvert \
    ! autovideosink sync=false

Or save the stream with:
$ gst-launch-1.0 udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! avdec_h264 \
    ! videoconvert \
    ! filesink location=video.h264

So I am beginning to feel like I am on track. 
Ultimately, however, I feel like I need to get the H.264 into some other format (MP4?) so that I can actually start displaying and saving individual frames and associating them with the current time (or at least the time between frames).
At the very least, I would like to be able to save and display the h264 stream with one pipeline.
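One idea along those lines: a container mux would keep the stream compressed while still recording per-frame timestamps in the file. An untested sketch (Matroska is one container option):

```shell
# Untested sketch: mux the parsed H.264 into Matroska instead of writing
# decoded raw frames; -e sends EOS on Ctrl-C so the file is finalized.
gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! h264parse \
    ! matroskamux \
    ! filesink location=video.mkv
```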

On my host side, I have tried a tee element like so:
$ gst-launch-1.0 udpsrc port=5000 \
  ! gdpdepay \
  ! rtph264depay \
  ! avdec_h264 \
  ! videoconvert \
  ! tee name=t \
  t. ! autovideosink sync=false \
  t. ! filesink location=tee.h264

However, the video stream freezes up shortly after initialization. Any ideas of how I can fix this?
Any ideas of how I can save the decoded, displayed frames along with the time each one was displayed? I don't mind doing a little post-processing to get this lined up correctly :P Any ideas on how to cut latency?

Thanks again for the help!
PS Sorry I couldn't get code tags to work...



_______________________________________________
gstreamer-devel mailing list
gstreamer-devel at lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

