Still stuck

Gary Thomas gary at mlbassoc.com
Thu Mar 7 09:52:05 PST 2013


On 2013-03-07 09:26, Gary Thomas wrote:
> On 2013-03-07 08:39, David Röthlisberger wrote:
>> On 7 Mar 2013, at 15:20, Gary Thomas wrote:
>>> Sorry if this sounds like a broken record, but I'm getting nowhere...
>>>
>>> Why does this pipeline work:
>>>   gst-launch -e -vvv shmsrc socket-path=/tmp/shm-stream.sock \
>>>     ! "application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96" \
>>>     ! rtpvrawdepay \
>>>     ! fakesink
>>>
>>> and this one does not?
>>>   gst-launch -e -vvv shmsrc socket-path=/tmp/shm-stream.sock \
>>>     ! "application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96" \
>>>     ! rtpvrawdepay \
>>>     ! ffenc_mpeg4 \
>>>     ! fakesink
>>>
>>> Note: I only added the mpeg4 encoder element, which is compatible,
>>> and there are no errors/complaints.
>>>
>>> The elements all have compatible [default] caps, so it should
>>> just work.  If I swap 'shmsrc'+'rtpvrawdepay' for something
>>> like 'videotestsrc', both pipelines work.
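>>> (For reference, the videotestsrc comparison I mean is roughly the
>>> following - the exact pipeline is from memory, so treat it as a sketch:
>>>   gst-launch -e -vvv videotestsrc \
>>>     ! ffenc_mpeg4 \
>>>     ! fakesink
>>> and the same thing with ffenc_mpeg4 removed; both of those produce
>>> output at fakesink.)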
>>
>>
>> First of all I haven't read through your previous emails, so apologies
>> for that. What do you mean by "doesn't work"? What are the symptoms?
>> What steps have you taken to investigate the problem -- have you enabled
>> debug logging, and what has that told you?
>
> In this case "doesn't work" means that when I run the first pipeline,
> fakesink prints messages about the data it is consuming/discarding.
> In the case of the second pipeline, there are no such messages.
>
> I've tried generating debug dumps at various levels.  What I can see is
> that in both cases the pipeline I specify gets created and it moves to
> PLAYING.  In the non-working case, nothing useful seems to happen after
> that - shmsrc grabs data and rtpvrawdepay says it is dispatching it (there
> are line-by-line messages), but nothing ever comes out of ffenc_mpeg4.
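>
> (For reference, the dumps were generated with the standard GST_DEBUG
> environment variable, along the lines of
>    GST_DEBUG=*:4 GST_DEBUG_NO_COLOR=1 gst-launch -e -vvv shmsrc socket-path=/tmp/shm-stream.sock \
>      ! "application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96" \
>      ! rtpvrawdepay ! ffenc_mpeg4 ! fakesink 2> /tmp/gst.log
> with various category/level settings; the specific values above are
> just an example.)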
>
> I wonder if this is a case where ffenc_mpeg4 does not like being fed the
> video frame line by line (which is what comes out of the rtpvrawdepay element)?
> I tried adding a queue with a blocking threshold, but that didn't help either:
>    gst-launch -e -vvv shmsrc socket-path=/tmp/shm-stream.sock \
>      ! "application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96" \
>      ! rtpvrawdepay \
>      ! queue min-threshold-bytes=691200 \
>      ! ffenc_mpeg4 \
>      ! fakesink
> Where '691200' is the size of exactly one video frame (720 x 480 pixels x 2 bytes/pixel for UYVY 4:2:2).
>
> If I run this pipeline, I get this output:
> Setting pipeline to PAUSED ...
> Pipeline is PREROLLING ...
> /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW
> /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
> /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW
> /GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
> /GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
>     ... sits there
> ^CCaught interrupt -- handling interrupt.
> Interrupt: Stopping pipeline ...
> Setting pipeline to NULL ...
> /GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = NULL
> /GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = NULL
> /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = NULL
> /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = NULL
> /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = NULL
> Freeing pipeline ...
>
> If I remove the ffenc_mpeg4 element, I see this:
> Setting pipeline to PAUSED ...
> Pipeline is PREROLLING ...
> /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW
> /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
> /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW
> /GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
> /GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
> /GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw-yuv, width=(int)720, height=(int)480, format=(fourcc)UYVY, framerate=(fraction)0/1
> /GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "preroll   ******* "
> Pipeline is PREROLLED ...
> Setting pipeline to PLAYING ...
> /GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "event   ******* (fakesink0:sink) E (type: 102, GstEventNewsegment, update=(boolean)false,rate=(double)1,
> applied-rate=(double)1, format=(GstFormat)GST_FORMAT_TIME, start=(gint64)0, stop=(gint64)-1, position=(gint64)0;) 0xe4040"
> /GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "chain   ******* (fakesink0:sink) (691200 bytes, timestamp: none, duration: none, offset: -1, offset_end: -1, flags: 32
> discont ) 0xb5402478"
> New clock: GstSystemClock
> /GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "chain   ******* (fakesink0:sink) (691200 bytes, timestamp: none, duration: none, offset: -1, offset_end: -1, flags: 0
> ) 0xb5402548"
> /GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "chain   ******* (fakesink0:sink) (691200 bytes, timestamp: none, duration: none, offset: -1, offset_end: -1, flags: 0
> ) 0xb5402818"
> /GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = "chain   ******* (fakesink0:sink) (691200 bytes, timestamp: none, duration: none, offset: -1, offset_end: -1, flags: 0
> ) 0xb54024e0"
>     ... until I type ^C
>
> The debug logs for these two examples are at
>    http://www.mlbassoc.com/misc/log.fakesink
>    http://www.mlbassoc.com/misc/log.fakesink+ffenc
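>
> One more thought: I haven't checked whether ffenc_mpeg4 will accept
> UYVY input directly; if it only takes something like I420, maybe a
> converter in front of it would change the behaviour.  E.g. (an
> untested guess on my part):
>    gst-launch -e -vvv shmsrc socket-path=/tmp/shm-stream.sock \
>      ! "application/x-rtp, sampling=(string)YCbCr-4:2:2, width=(string)720, height=(string)480, payload=(int)96" \
>      ! rtpvrawdepay \
>      ! ffmpegcolorspace \
>      ! ffenc_mpeg4 \
>      ! fakesink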
>
> Any ideas?
>

I've tried reworking my whole setup to use TCP sockets and it
seems to behave more sanely (not 100% sure yet).  The biggest
problem I'm having now is that TCP delivers the data in arbitrary
bits and pieces, while all of the video elements want to see full
frames.  Even the RTP elements have problems, with many packets
not being decoded because of these framing/buffering issues.

Query: is there some way, perhaps a stand-alone element like
'queue', to read an arbitrary TCP stream and only push out
fixed-size blocks of data?  In my example above, I'd need to
read from the TCP stream until there are exactly 691200 bytes
and only then pass that buffer on to ffenc_mpeg4.
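
One thing I'm wondering about (an untested sketch on my part, so
the element choice may be wrong) is wrapping the TCP link with
gdppay/gdpdepay so that buffer boundaries and caps survive the
socket.  Something like this on the sending side

   gst-launch ... ! gdppay ! tcpserversink port=5000

and on the receiving side

   gst-launch -e -vvv tcpclientsrc host=<sender> port=5000 \
     ! gdpdepay \
     ! rtpvrawdepay \
     ! ffenc_mpeg4 \
     ! fakesink

where <sender> and the port are placeholders and '...' stands for
whatever currently produces the RTP stream on the sender.  I don't
know if that's the intended use of those elements, though.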

Ideas?

-- 
------------------------------------------------------------
Gary Thomas                 |  Consulting for the
MLB Associates              |    Embedded world
------------------------------------------------------------

