Break down pipelines using udpsrc/udpsink

Gary Thomas gary at mlbassoc.com
Mon Mar 4 13:23:36 PST 2013


On 2013-03-04 13:15, pfarmer wrote:
> Gary Thomas wrote
>> I'm trying to split up a pipeline using udpsrc/udpsink elements.
>> All of the "network" traffic will be local.  This is in anticipation
>> of having some variable number of sinks for a single data source
>> (I have a camera that I want to share amongst a number of applications)
>>
>> I have this working pipeline:
>>     gst-launch -e -v v4l2src device=/dev/video0 queue-size=16 \
>>       ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>>       ! tee name=v-t \
>>         v-t. ! queue ! videoscale ! "video/x-raw-yuv,width=240,height=120" \
>>         ! ffmpegcolorspace ! ximagesink force-aspect-ratio=true
>> This pipeline just takes the output of the camera and displays it in
>> a window on my desktop.
>>
>> I've tried splitting it up like this:
>>     # Start the source pipeline
>>     gst-launch -e -v v4l2src device=/dev/video0 queue-size=16 \
>>        ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>>        ! tee name=v-src \
>>          v-src. ! queue ! rtpvrawpay pt=96 ! udpsink port=61000
>>
>>     # Display the image
>>     gst-launch -e -v udpsrc port=61000 \
>>       ! "application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW,sampling=(string)YCbCr-4:2:2,depth=(string)8,width=(string)1280,height=(string)720,colorimetry=(string)SMPTE240M,payload=(int)96" \
>>       ! rtpvrawdepay \
>>       ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>>       ! tee name=v-t \
>>         v-t. ! queue ! videoscale ! "video/x-raw-yuv,width=240,height=120" \
>>         ! ffmpegcolorspace ! ximagesink force-aspect-ratio=true
>>
>> I can see that the packets are being sent between the two pipelines,
>> but they seem to be quietly dropped (no messages).  It's not even
>> clear that they are arriving at the second pipeline at all.
>> I've also tried this where the source pipeline is on one machine and
>> the receiving pipeline is on another, still no luck.
>>
>> I know that the CAPS match because I copied them from the output of
>> a previous run.
>>
>> Any hints on what I might be doing wrong and how to make this work?
>
> Try making the image sink not sync to the clock: sync=false
> In my case ximagesink also does not work; I have to use xvimagesink.
> For example:
> Sender:
>   gst-launch -v v4l2src ! tee ! queue ! ffmpegcolorspace \
>     ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>     ! rtpvrawpay ! udpsink port=61000 \
>     tee0. ! queue ! xvimagesink sync=false
> Receiver:
>   gst-launch udpsrc port=61000 caps="application/x-rtp, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, width=(string)1280, height=(string)720, colorimetry=(string)BT709-2, depth=(string)8" \
>     ! rtpvrawdepay ! xvimagesink

That didn't make any difference, sorry.

I also tried adding the extra branch for a local display and running this
between two machines - the source machine displays fine, but the destination shows nothing.

Do the pipelines quoted above actually run for you?  Are you
using GStreamer 0.10 or 1.0?  (I'm stuck on 0.10 for the time being.)
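One quick sanity check (a debugging sketch I'd suggest here, assuming the
port 61000 from the pipelines above) is to take the depayloader out of the
picture entirely and just confirm that packets reach the receiving
pipeline at all:

```shell
# Diagnostic only: fakesink with silent=false makes -v print a
# "last-message" notification for every buffer that arrives, so you can
# see whether the RTP packets reach this machine/port before worrying
# about caps or depayloading.
gst-launch -v udpsrc port=61000 ! fakesink silent=false
```

If nothing prints while the sender is running, the problem is in the
network path (firewall, wrong interface, wrong host); if buffers do
arrive, the problem is downstream in the caps/depayloader chain.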

>
>
> Gary Thomas wrote
>> Is this a reasonable approach to sharing this data?  Eventually
>> I need to grab & display it on the screen while at the same time
>> sending it out over a [physical] network.
>
> On the same machine, shared memory might be better suited, i.e. with
> shmsink/shmsrc (at least on a Unix machine).
>

Thanks for the pointer, I'll check it out.
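For reference, a minimal shmsink/shmsrc sketch along the lines of that
suggestion might look like the following (untested here; the socket path,
shm-size, and restated caps are assumptions, not from the thread):

```shell
# Producer: write raw camera frames into a shared-memory segment.
# wait-for-connection=false lets it start before any consumer attaches;
# shm-size must hold several frames (UYVY 1280x720 is ~1.8 MB/frame).
gst-launch -e -v v4l2src device=/dev/video0 \
  ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
  ! shmsink socket-path=/tmp/camera wait-for-connection=false shm-size=20000000

# Consumer: attach to the same segment.  The caps must be restated on
# this side, since the shared-memory transport carries no caps of its own.
gst-launch -v shmsrc socket-path=/tmp/camera do-timestamp=true \
  ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
  ! ffmpegcolorspace ! ximagesink
```

Unlike the udpsrc/udpsink approach, this avoids RTP payloading and packet
loss entirely, but it only works between processes on the same machine.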

-- 
------------------------------------------------------------
Gary Thomas                 |  Consulting for the
MLB Associates              |    Embedded world
------------------------------------------------------------
