Streaming video to a website

michi1994 mistmichaelhoeflmaier at gmx.at
Wed Oct 31 11:09:20 UTC 2018


Dear helpful people,

I am quite new to GStreamer, so I would be glad if you could help me.

I need to stream a near-zero-latency video signal from a webcam to a server
and then be able to view the stream on a website.
The webcam is attached to a Raspberry Pi 3, because of space constraints on
the mounting platform. As a result of using the Pi I really can't transcode
the video on the Pi itself, so I bought a Logitech C920 webcam, which can
output a raw H.264 stream.

So far I have managed to view the stream on my Windows machine, but I did
not manage to get the website part working.
My "achievements":
 
   -Sender:
                gst-launch-1.0 -e -v v4l2src device=/dev/video0 ! \
                    video/x-h264,width=1920,height=1080,framerate=30/1 ! \
                    rtph264pay pt=96 config-interval=5 mtu=60000 ! \
                    udpsink host=192.168.0.132 port=5000

My understanding of this command is: take the signal from video device 0,
which is an H.264 stream with the given width, height and framerate; pack it
into RTP packets with an MTU high enough to avoid artefacts; then
encapsulate the RTP packets in UDP and stream them to an IP address and
port.

  -Receiver:
                 gst-launch-1.0 -e -v udpsrc port=5000 ! \
                     application/x-rtp,payload=96 ! rtpjitterbuffer ! \
                     rtph264depay ! avdec_h264 ! \
                     fpsdisplaysink sync=false text-overlay=false

My understanding of this command is: receive UDP packets on port 5000; the
caps say that they carry RTP. I don't know exactly what rtpjitterbuffer
does, but it reduces the latency of the video a bit. rtph264depay extracts
the H.264-encoded stream from the RTP packets. To get the raw data that
fpsdisplaysink understands, we decode the H.264 signal with avdec_h264.

My next step was to change the receiver sink to a local TCP sink and play
that signal with the following HTML5 tag:

                        <video width="320" height="240" autoplay>
                                <source src="http://localhost:#port#">
                        </video>
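
For completeness, the TCP variant of the receiver I tried looked roughly
like the following sketch (written from memory; #port# is the same
placeholder as in the HTML above, and the h264parse element is an assumption
on my part, not something I have verified):

                 gst-launch-1.0 -e -v udpsrc port=5000 ! \
                     application/x-rtp,payload=96 ! rtpjitterbuffer ! \
                     rtph264depay ! h264parse ! \
                     tcpserversink host=127.0.0.1 port=#port#

This just depayloads the RTP and pushes the bare H.264 bytes out over TCP,
with no container around them.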

If I open the website I can't see the stream, but when I analyse the data I
can see that the video data arrives as plain bytes.

Am I missing a video container like MP4 for my video?
Am I wrong with decoding?
What am I doing wrong?
How can I improve my solution?
How would you solve that problem?

Best regards



--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/
