Break down pipelines using udpsrc/udpsink
pfarmer
flacone at gmx.de
Mon Mar 4 12:15:08 PST 2013
Gary Thomas wrote
> I'm trying to split up a pipeline using udpsrc/udpsink elements.
> All of the "network" traffic will be local. This is in anticipation
> of having some variable number of sinks for a single data source
> (I have a camera that I want to share amongst a number of applications)
>
> I have this working pipeline:
> gst-launch -e -v v4l2src device=/dev/video0 queue-size=16 \
>   ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>   ! tee name=v-t \
>   v-t. ! queue ! videoscale ! "video/x-raw-yuv,width=240,height=120" \
>   ! ffmpegcolorspace ! ximagesink force-aspect-ratio=true
> This pipeline just takes the output of the camera and displays it in
> a window on my desktop.
>
> I've tried splitting it up like this:
> # Start the source pipeline
> gst-launch -e -v v4l2src device=/dev/video0 queue-size=16 \
>   ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>   ! tee name=v-src \
>   v-src. ! queue ! rtpvrawpay pt=96 ! udpsink port=61000
>
> # Display the image
> gst-launch -e -v udpsrc port=61000 \
>   ! "application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW,sampling=(string)YCbCr-4:2:2,depth=(string)8,width=(string)1280,height=(string)720,colorimetry=(string)SMPTE240M,payload=(int)96" \
>   ! rtpvrawdepay \
>   ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>   ! tee name=v-t \
>   v-t. ! queue ! videoscale ! "video/x-raw-yuv,width=240,height=120" \
>   ! ffmpegcolorspace ! ximagesink force-aspect-ratio=true
>
> I can see that the packets are being sent between the two pipelines,
> but it looks like they are being quietly dropped (no messages). It's
> not totally clear that they are arriving at the second pipeline at all.
> I've also tried this with the source pipeline on one machine and
> the receiving pipeline on another, still no luck.
>
> I know that the CAPS match because I copied them from the output of
> a previous run.
>
> Any hints what I might be doing wrong & how to make this work?
Try making the imagesink not sync to the clock: sync=false
In my case ximagesink also does not work; I have to use xvimagesink.
For example:
Sender:
gst-launch -v v4l2src ! tee ! queue ! ffmpegcolorspace \
  ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
  ! rtpvrawpay ! udpsink port=61000 \
  tee0. ! queue ! xvimagesink sync=false
Receiver:
gst-launch udpsrc port=61000 \
  caps="application/x-rtp, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, width=(string)1280, height=(string)720, colorimetry=(string)BT709-2, depth=(string)8" \
  ! rtpvrawdepay ! xvimagesink
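If it is unclear whether the RTP packets even reach the receiver, one way to check (a debugging sketch, not from the original post) is to swap the display for a fakesink; with -v and silent=false, each incoming buffer shows up as a last-message notification:

```shell
# Debugging sketch: confirm packets arrive before worrying about caps
# or sinks. The port number is the one used elsewhere in this thread.
gst-launch -v udpsrc port=61000 ! fakesink silent=false
```

If nothing is printed, the problem is in delivery (firewall, port, address), not in the caps or the depayloader.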
Gary Thomas wrote
> Is this a reasonable approach to sharing this data? Eventually
> I need to grab & display it on the screen while at the same time
> sending it out over a [physical] network.
On the same machine, shared memory might be better suited, i.e. with
shmsink/shmsrc (at least on a Unix machine).
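A sketch of what that could look like, using the caps from the thread (the socket path and shm-size are illustrative values, not from the original post; note that no caps negotiation crosses the socket, so the consumer must repeat the producer's caps):

```shell
# Producer: push raw camera frames into a shared-memory segment.
# socket-path and shm-size are assumed/illustrative values.
gst-launch -v v4l2src device=/dev/video0 \
  ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
  ! shmsink socket-path=/tmp/camera wait-for-connection=false shm-size=20000000

# Consumer: shmsrc delivers untyped buffers, so the same caps must be
# stated again on the receiving side.
gst-launch shmsrc socket-path=/tmp/camera is-live=true \
  ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
  ! ffmpegcolorspace ! ximagesink sync=false
```

Any number of consumers can attach to the same socket, which fits the goal of sharing one camera among several applications on the same host.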
--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Break-down-pipelines-using-udpsrc-udpsink-tp4658873p4658878.html