Confusing behaviour

Gary Thomas gary at mlbassoc.com
Wed Mar 6 10:31:45 PST 2013


I'm having a very hard time understanding why some of my pipelines
work while others just sit there.

For example, I have a video source coming from a separate pipeline
via shared memory.  This pipeline works great to display that video:
   gst-launch -e -vv shmsrc socket-path=/tmp/shm-record.sock \
     ! "application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96" \
     ! rtpvrawdepay \
     ! videoscale \
     ! "video/x-raw-yuv,width=240,height=120" \
     ! ffmpegcolorspace \
     ! 'video/x-raw-yuv,format=(fourcc)NV12' \
     ! queue ! ffmpegcolorspace ! ximagesink force-aspect-ratio=true

However, this very similar pipeline, which tries to compress & record
the video instead, never does anything.  There are no warnings, and as
far as I can tell the pipeline never actually starts.  With -vvv I can
normally watch the various components get stitched together, but in
this case that isn't happening.  Here's the pipeline:
   gst-launch -e -vvv shmsrc socket-path=/tmp/shm-record.sock  \
      ! capsfilter caps='application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96' \
      ! rtpvrawdepay \
      ! ffmpegcolorspace \
      ! capsfilter caps='video/x-raw-yuv,format=(fourcc)NV12' \
      ! queue \
      ! ducatih264enc \
      ! h264parse ! filesink location=/tmp/junk.mp4
When I run it, I get this output:
   Pipeline is PREROLLING ...
   /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW
   /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
   /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW
   /GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)NV12, width=(int)720, height=(int)480, framerate=(fraction)0/1
   /GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:sink: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
   /GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)NV12, width=(int)720, height=(int)480, framerate=(fraction)0/1
   /GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)NV12, width=(int)720, height=(int)480, framerate=(fraction)0/1
   /GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)NV12, width=(int)720, height=(int)480, framerate=(fraction)0/1
   /GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)NV12, width=(int)720, height=(int)480, framerate=(fraction)0/1
   /GstPipeline:pipeline0/GstDucatiH264Enc:ducatih264enc0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)NV12, width=(int)720, height=(int)480, framerate=(fraction)0/1

So the pipeline construction gets as far as the ducati element
(a hardware-assisted H.264 encoder): its 'sink' pad gets its caps
set, but the 'src' pad, which is needed to complete the rest of
the pipeline, never does.  There are no other messages.  I've
tried enabling various debug levels and I still don't see any
indication of why the src pad never comes up.
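
In case it matters, this is the sort of invocation I've been using to
crank up the debug output; the 'ducati*' category name is a guess on my
part ('gst-launch --gst-debug-help' should list the real one):
   # trace caps negotiation in core plus whatever the ducati plugin logs
   GST_DEBUG=GST_CAPS:5,ducati*:5 \
   gst-launch -e shmsrc socket-path=/tmp/shm-record.sock \
      ! capsfilter caps='application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96' \
      ! rtpvrawdepay ! ffmpegcolorspace \
      ! capsfilter caps='video/x-raw-yuv,format=(fourcc)NV12' \
      ! queue ! ducatih264enc ! h264parse ! filesink location=/tmp/junk.mp4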

What's really bizarre is that if I write this:
   gst-launch -e -vvv shmsrc socket-path=/tmp/shm-record.sock  \
      ! capsfilter caps='application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96' \
      ! rtpvrawdepay \
      ! ffmpegcolorspace \
      ! capsfilter caps='video/x-raw-yuv,format=(fourcc)NV12,framerate=(fraction)30/1' \
      ! queue \
      ! ducatih264enc \
      ! h264parse ! filesink location=/tmp/junk.mp4
then it doesn't even get that far in the construction!  It stops
after only this much output:
   Pipeline is PREROLLING ...
   /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW
   /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
   /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW

What's going on here?

Interestingly enough, this nearly identical pipeline does work:
   gst-launch -e -vvv v4l2src device=/dev/video1 num-buffers=100 \
    ! "video/x-raw-yuv, format=(fourcc)UYVY, width=720, height=480, framerate=(fraction)30/1" \
    ! ffmpegcolorspace \
    ! "video/x-raw-yuv, format=(fourcc)NV12, width=720, height=480, framerate=(fraction)30/1" \
    ! ducatih264enc ! h264parse ! qtmux ! filesink location=/tmp/testb.mov
From my perspective, the only difference is that the original source
comes from the raw device instead of going through an RTP chain.
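
The other difference I can see in the -vvv output is the framerate: the
caps coming out of rtpvrawdepay say framerate=(fraction)0/1, while the
v4l2src path carries 30/1.  If that's what the encoder objects to, then
something along these lines might force a fixed rate right after the
depayloader (an untested sketch on my side; I'm not even sure videorate
will accept a 0/1 input rate):
   gst-launch -e -vvv shmsrc socket-path=/tmp/shm-record.sock \
      ! capsfilter caps='application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96' \
      ! rtpvrawdepay \
      ! videorate \
      ! 'video/x-raw-yuv,framerate=(fraction)30/1' \
      ! ffmpegcolorspace \
      ! capsfilter caps='video/x-raw-yuv,format=(fourcc)NV12' \
      ! queue ! ducatih264enc ! h264parse ! filesink location=/tmp/junk.mp4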

How can I figure out why my pipelines are behaving like this?
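
One thing I still plan to try is dumping the pipeline graph to see
exactly where the negotiation stalls, assuming my gst-launch build
honours GST_DEBUG_DUMP_DOT_DIR:
   # gst-launch writes a .dot snapshot of the pipeline on state changes
   # when this variable points at an existing directory
   mkdir -p /tmp/gstdot
   GST_DEBUG_DUMP_DOT_DIR=/tmp/gstdot gst-launch -e shmsrc socket-path=/tmp/shm-record.sock \
      ! capsfilter caps='application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96' \
      ! rtpvrawdepay ! ffmpegcolorspace \
      ! capsfilter caps='video/x-raw-yuv,format=(fourcc)NV12' \
      ! queue ! ducatih264enc ! h264parse ! filesink location=/tmp/junk.mp4
   # render the snapshots with: dot -Tpng /tmp/gstdot/*.dot -O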

-- 
------------------------------------------------------------
Gary Thomas                 |  Consulting for the
MLB Associates              |    Embedded world
------------------------------------------------------------

