webrtc example: feeding video data from a pipeline

Guennadi Liakhovetski g.liakhovetski at gmx.de
Sun Oct 4 19:21:43 UTC 2020


Hi,

I'm trying to run the webrtc examples on a Raspberry Pi by feeding video
input to GStreamer from a named pipe that a raspivid process writes to. As
a first step I tried plain RTP streaming: I started a listener on a PC,
created a named pipe at /tmp/cam and launched GStreamer with

gst-launch-1.0 -v filesrc location=/tmp/cam ! h264parse ! rtph264pay ! udpsink host=192.168.1.16 port=9001

and then in a different window

raspivid -t 0 -h 480 -w 640 -fps 25 -hf -b 2000000 -o /tmp/cam

This works: GStreamer launches and waits for data on the pipe, and once
the camera process starts, streaming kicks off.

Next I tried to achieve the same with the sendonly and sendrecv GStreamer
webrtc examples. First I only modified the pipeline description in the
source, replacing videotestsrc with a filesrc reading from /tmp/cam as
above, but this didn't work. I also tried starting the raspivid process
from within the examples themselves by adding fork() and exec() calls to
them; I can see that both calls succeed, but streaming still doesn't
start. What could be the problem and how can I fix it?
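For reference, this is roughly the video branch I ended up with in my copy
of the sendrecv example (typed from memory, so it may not match my tree
character for character; the audio branch is unchanged):

  /* Sketch of the modified pipeline string passed to gst_parse_launch(),
   * with videotestsrc replaced by a filesrc reading the raw H.264 byte
   * stream from the named pipe. */
  pipe1 = gst_parse_launch (
      "webrtcbin bundle-policy=max-bundle name=sendrecv "
      "stun-server=stun://stun.l.google.com:19302 "
      "filesrc location=/tmp/cam ! h264parse ! rtph264pay config-interval=-1 ! "
      "queue ! application/x-rtp,media=video,encoding-name=H264,payload=96 ! "
      "sendrecv. ",
      &error);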
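And this is the kind of helper I added to spawn raspivid from the example
before the pipeline is set to PLAYING (again a sketch from memory; the
name start_raspivid() is just what I called it locally):

  #include <sys/stat.h>
  #include <sys/types.h>
  #include <unistd.h>
  #include <stdio.h>
  #include <stdlib.h>
  #include <errno.h>

  /* Make sure the FIFO exists, then fork() and exec() raspivid so it
   * writes the H.264 stream into it. The parent returns right away and
   * goes on to create the GStreamer pipeline. */
  static pid_t start_raspivid (void)
  {
    pid_t pid;

    if (mkfifo ("/tmp/cam", 0666) < 0 && errno != EEXIST) {
      perror ("mkfifo");
      return -1;
    }

    pid = fork ();
    if (pid == 0) {
      execlp ("raspivid", "raspivid", "-t", "0", "-h", "480", "-w", "640",
              "-fps", "25", "-hf", "-b", "2000000", "-o", "/tmp/cam",
              (char *) NULL);
      perror ("execlp");
      _exit (1);
    }

    return pid;
  }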

Thanks
Guennadi

