h264-streaming to website using Gstreamer-1.0

Nicolas Dufresne nicolas at ndufresne.ca
Thu Nov 1 17:46:10 UTC 2018


Le mercredi 31 octobre 2018 à 06:45 -0500, michi1994 a écrit :
> Dear helpful people,
> 
> I am very much new to the whole GStreamer thing, so I would be happy
> if you could help me.
> 
> I need to stream a near-zero-latency video signal from a webcam to a server
> and then be able to view the stream on a website.
> The webcam is attached to a Raspberry Pi 3, because there are
> space constraints on the mounting platform. As a result of using the Pi I
> really can't transcode the video on the Pi itself. Therefore I bought a
> Logitech C920 webcam, which can output a raw h264 stream.
> 
> By now I have managed to view the stream on my Windows machine, but didn't
> manage to get the whole website part working.
> My "achievements":
>  
>    -Sender:
>                 gst-launch-1.0 -e -v v4l2src device=/dev/video0 !
> video/x-h264,width=1920,height=1080,framerate=30/1 ! rtph264pay pt=96
> config-interval=5 mtu=60000 ! udpsink host=192.168.0.132 port=5000
> 
> My understanding of this command is: get the signal from video device 0,
> which is an h264 stream with a certain width, height and framerate. Then pack
> it into RTP packets with a high enough MTU to avoid artefacts, encapsulate
> the RTP packets into UDP packets and stream them to an IP + port.
> 
>   -Receiver:
>                  gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp,
> payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! fpsdisplaysink
> sync=false text-overlay=false
> 
> My understanding of this command is: receive UDP packets on port 5000. The
> application/x-rtp caps say there are RTP packets inside. I don't know what
> rtpjitterbuffer does, but it reduces the latency of the video a bit.
> rtph264depay says that inside the RTP is an h264-encoded stream. To get the
> raw data that fpsdisplaysink understands, we need to decode the h264 signal
> using avdec_h264.
> 
> My next step was to change the receiver sink to a local TCP sink and output
> that signal with the following HTML5 tag:
> 
> <video width=320 height=240 autoplay>
>                         <source src="http://localhost:#port#">;
> </video>
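A side note on the pipelines quoted above: an RTP mtu of 60000 is far larger than the ~1500-byte Ethernet MTU, so each UDP datagram would be IP-fragmented on the wire, and losing a single fragment discards the whole packet. Below is a sketch of both commands with rtph264pay left near its default mtu (1400) and the SPS/PPS headers repeated in-band via config-interval, so a receiver can join mid-stream. This is untested and assumes the C920 still enumerates as /dev/video0; the host, port and caps are taken from the original commands.

```shell
# Sender (Raspberry Pi) -- sketch; h264parse is added so the payloader
# gets properly framed access units from the camera's raw h264.
gst-launch-1.0 -e -v v4l2src device=/dev/video0 \
    ! video/x-h264,width=1920,height=1080,framerate=30/1 \
    ! h264parse \
    ! rtph264pay pt=96 config-interval=-1 \
    ! udpsink host=192.168.0.132 port=5000

# Receiver -- sketch; latency=50 (ms) on the jitterbuffer trades a little
# buffering for smoother playback, and the extra caps fields help the
# depayloader pick the stream up without out-of-band negotiation.
gst-launch-1.0 -e -v udpsrc port=5000 \
    ! application/x-rtp,media=video,encoding-name=H264,payload=96 \
    ! rtpjitterbuffer latency=50 \
    ! rtph264depay ! avdec_h264 \
    ! fpsdisplaysink sync=false text-overlay=false
```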

Only WebRTC will let you render a low-latency stream in your browser.
It is a tad more complicated and will require you to write some code.
You also need a very recent GStreamer (probably git master), as support
for this is really new. This repository shows briefly how it works and
how to set it up.

https://github.com/centricular/gstwebrtc-demos
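To give a feel for the code involved, here is a minimal sender-side skeleton in Python, along the lines of the demos linked above. It is a sketch, not a working app: it assumes PyGObject with a GStreamer that has webrtcbin, reuses the camera pipeline from the quoted command, and leaves out the signalling exchange (offer/answer and ICE candidates relayed to the browser, e.g. over a websocket), which is exactly the part the demo repository fills in. No test is included since it needs a camera and a signalling peer.

```python
# Skeleton only -- assumes PyGObject and a GStreamer build with webrtcbin.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Same C920 source as in the quoted pipeline, payloaded for WebRTC.
pipeline = Gst.parse_launch(
    "v4l2src device=/dev/video0 "
    "! video/x-h264,width=1920,height=1080,framerate=30/1 "
    "! h264parse ! rtph264pay config-interval=-1 pt=96 "
    "! webrtcbin name=send"
)
webrtc = pipeline.get_by_name("send")

def on_offer_created(promise, element, _unused):
    # The promise carries the SDP offer; make it our local description
    # and ship offer.sdp.as_text() to the browser over signalling (not shown).
    reply = promise.get_reply()
    offer = reply.get_value("offer")
    local = Gst.Promise.new()
    element.emit("set-local-description", offer, local)
    local.interrupt()

def on_negotiation_needed(element):
    # webrtcbin asks us to start the offer/answer dance.
    promise = Gst.Promise.new_with_change_func(on_offer_created, element, None)
    element.emit("create-offer", None, promise)

def on_ice_candidate(element, mlineindex, candidate):
    # Forward each local ICE candidate to the browser via signalling (not shown).
    pass

webrtc.connect("on-negotiation-needed", on_negotiation_needed)
webrtc.connect("on-ice-candidate", on_ice_candidate)
pipeline.set_state(Gst.State.PLAYING)
```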


> 
> If I view the website I can't see the stream, but when I analyse the data I
> can see that the video data arrived as plain text.
> 
> Am I missing a video container like MP4 for my video?
> Am I wrong with decoding?
> What am I doing wrong?
> How can I improve my solution?
> How would you solve that problem?
> 
> Best regards
> 
> 
> 
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

