Break down pipelines using udpsrc/udpsink

Gary Thomas gary at mlbassoc.com
Tue Mar 5 07:43:35 PST 2013


On 2013-03-04 16:29, pfarmer wrote:
> On 03/04/2013 10:25 PM, Gary Thomas [via GStreamer-devel] wrote:
>> On 2013-03-04 13:15, pfarmer wrote:
>>
>> > Gary Thomas wrote
>> >> I'm trying to split up a pipeline using udpsrc/udpsink elements.
>> >> All of the "network" traffic will be local.  This is in anticipation
>> >> of having some variable number of sinks for a single data source
>> >> (I have a camera that I want to share amongst a number of applications)
>> >>
>> >> I have this working pipeline:
>> >>     gst-launch -e -v v4l2src device=/dev/video0 queue-size=16 \
>> >>       ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>> >>       ! tee name=v-t \
>> >>         v-t. ! queue ! videoscale ! "video/x-raw-yuv,width=240,height=120" ! ffmpegcolorspace ! ximagesink force-aspect-ratio=true
>> >> This pipeline just takes the output of the camera and displays it in
>> >> a window on my desktop.
>> >>
>> >> I've tried splitting it up like this:
>> >>     # Start the source pipeline
>> >>     gst-launch -e -v v4l2src device=/dev/video0 queue-size=16 \
>> >>        ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>> >>        ! tee name=v-src \
>> >>          v-src. ! queue ! rtpvrawpay pt=96 ! udpsink port=61000
>> >>
>> >>     # Display the image
>> >>     gst-launch -e -v udpsrc port=61000 \
>> >>       ! "application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW,sampling=(string)YCbCr-4:2:2,depth=(string)8,width=(string)1280,height=(string)720,colorimetry=(string)SMPTE240M,payload=(int)96" \
>> >>       ! rtpvrawdepay \
>> >>       ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
>> >>       ! tee name=v-t \
>> >>         v-t. ! queue ! videoscale ! "video/x-raw-yuv,width=240,height=120" ! ffmpegcolorspace ! ximagesink force-aspect-ratio=true
>> >>
>> >> I can see that the packets are being sent between the two pipelines,
>> >> but it looks like they are being quietly dropped (no messages).  It's
>> >> not totally clear that they are arriving at the second pipeline at all.
>> >> I've also tried this where the source pipeline is on one machine and
>> >> the receiving pipeline is on another, still no luck.
>> >>
>> >> I know that the CAPS match because I copied them from the output of
>> >> a previous run.
>> >>
>> >> Any hints what I might be doing wrong & how to make this work?
>> >
>> > Try making the imagesink not sync to the clock: sync=false
>> > In my case ximagesink also does not work; I have to use xvimagesink.
>> > For example:
>> > Sender:
>> >   gst-launch -v v4l2src ! tee ! queue ! ffmpegcolorspace ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 ! rtpvrawpay ! udpsink port=61000  tee0. ! queue ! xvimagesink sync=false
>> > Receiver:
>> >   gst-launch udpsrc port=61000 caps="application/x-rtp, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, width=(string)1280, height=(string)720, colorimetry=(string)BT709-2, depth=(string)8" ! rtpvrawdepay ! xvimagesink
>>
>> That didn't make any difference, sorry.
>>
>> I also tried adding an extra branch to do a local display and ran this
>> between machines: the source machine displays fine, but the destination shows nothing.
>>
>> Do the pipelines quoted above actually run for you?  Are you
>> using gstreamer-0.10 or 1.0?  (I'm stuck on 0.10 for the time being)
>>
>> >
>> >
>> > Gary Thomas wrote
>> >> Is this a reasonable approach to sharing this data?  Eventually
>> >> I need to grab & display it on the screen while at the same time
>> >> sending it out over a [physical] network.
>> >
>> > On the same machine, shared memory might be better suited, i.e. with
>> > shmsink/shmsrc (at least on a Unix machine).
>> >
>>
>> Thanks for the pointer, I'll check it out.
>>
>> --
>> ------------------------------------------------------------
>> Gary Thomas                 |  Consulting for the
>> MLB Associates              |    Embedded world
>> ------------------------------------------------------------
> I used gstreamer 0.10.
> "I also tried adding the extra branch" what do you mean by that? It should work more easy on the same machine. Also check your firewall.
> Does a minimalistic (without tee's and without a real src) example work?, like:
> Sender:
>   gst-launch videotestsrc horizontal-speed=5 ! video/x-raw-yuv ! rtpvrawpay ! udpsink port=5001
> Receiver:
>   gst-launch udpsrc port=5001 caps="application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)320, height=(string)240" ! rtpvrawdepay ! xvimagesink
>

Thanks for the pointers.  I started with your examples and adjusted
them to fit my needs.  In the end I rewrote my pipelines, adding
bits piecemeal, until I got something that worked.  I'm still unsure
why my originals did not work, but these pipelines function as expected:

Sender (camera source):
   gst-launch -e -v v4l2src device=/dev/video0 queue-size=16 \
    ! video/x-raw-yuv,format='(fourcc)UYVY',width=1280,height=720,framerate=30/1 \
    ! tee name=v-src \
      v-src. ! queue leaky=upstream ! rtpvrawpay pt=96 ! shmsink socket-path=/tmp/shm-stream.sock \
      v-src. ! queue leaky=upstream ! rtpvrawpay pt=96 ! shmsink socket-path=/tmp/shm-preview.sock \
      v-src. ! queue leaky=upstream ! rtpvrawpay pt=96 ! shmsink socket-path=/tmp/shm-record.sock

One receiver (display on screen):
   gst-launch -vv shmsrc socket-path=/tmp/shm-preview.sock \
     ! "application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)1280, height=(string)720, payload=(int)96" \
     ! rtpvrawdepay ! tee name=v-t \
       v-t. ! queue ! videoscale ! "video/x-raw-yuv,width=240,height=120" ! ffmpegcolorspace ! ximagesink force-aspect-ratio=true
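
For completeness, a consumer on the record socket could look roughly like
the sketch below (untested; the jpegenc/avimux choice and the output file
name are just placeholders, and the caps mirror the preview receiver):
   gst-launch -e shmsrc socket-path=/tmp/shm-record.sock \
     ! "application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)1280, height=(string)720, payload=(int)96" \
     ! rtpvrawdepay ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=/tmp/capture.avi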

Curiously, the sender pipeline hangs if the queues are not leaky.
I also ended up using the shared-memory connections because UDP, even
over the localhost (lo) interface, had far too much overhead once I had
full-size video to handle.
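
In case it helps anyone else, a minimal sanity check of the shared-memory
path (the shm analogue of the videotestsrc/UDP test above) would look
something like this (untested sketch; the socket path and test caps are
arbitrary):
   # Sender
   gst-launch videotestsrc \
     ! video/x-raw-yuv,format='(fourcc)UYVY',width=320,height=240,framerate=30/1 \
     ! rtpvrawpay pt=96 ! shmsink socket-path=/tmp/shm-test.sock
   # Receiver
   gst-launch shmsrc socket-path=/tmp/shm-test.sock \
     ! "application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)320, height=(string)240, payload=(int)96" \
     ! rtpvrawdepay ! ffmpegcolorspace ! ximagesink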

-- 
------------------------------------------------------------
Gary Thomas                 |  Consulting for the
MLB Associates              |    Embedded world
------------------------------------------------------------

