[gst-devel] rtpbin + mpegtsmux
Gary Thomas
gary at mlbassoc.com
Mon Sep 20 16:55:03 CEST 2010
On 09/20/2010 08:07 AM, Marc Leeman wrote:
>>> Because that's what the customer wants :-)
>
> Is this what the customer really wants (getting video reliably over the
> network) or is it what you've been told the customer wants :-)
That's always the $64,000 question!
>>> My understanding is that TS is a container that will eventually contain
>>> both video and audio and is not network-worthy by itself, hence the RTP
>>> (Real-time Transport Protocol) layer
>
> You might have problems with the timestamps in the RTP header versus
> those inside the TS. If either one is even slightly off, you'll run
> into problems.
Any hints on how to diagnose this?
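One way to get targeted diagnostics without drowning in level-4 output is GStreamer's per-category debug filtering. A sketch for a 0.10 setup (the category names here are assumptions; list the ones your build actually has with `--gst-debug-help`):

```shell
# Log only the RTP jitterbuffer and depayloader categories at level 5,
# everything else at the default level, and capture to a file.
# Category names vary between versions; verify with:
#   gst-launch-0.10 --gst-debug-help
GST_DEBUG=rtpjitterbuffer:5,basertpdepayload:5 \
  gst-launch-0.10 <your receiver pipeline> 2> rtp-debug.log
```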
>> That said, I've also tried this with a raw H264 stream and the same
>> thing happens.
>>
>> As I've pointed out, these pipelines do not even work reliably on
>> my desktop system all the time. Using just the raw H264 stream, I
>> stream out and back in on my desktop over the loopback interface
>> (127.0.0.1). While it may work, even for a while, after some time the
>> receiver no longer gets new frames (motion stops).
>>
>> Is there some way to get useful debug information on this? I don't
>> see any messages about the RTP stream until level 4 and then it's
>> too low level to interpret easily. I'd like to know when packets
>> come in, how they are parsed, passed on, etc, where the keyframes
>> are, etc. This sort of data doesn't seem to show up in the debug
>> data.
>
> We've been doing quite a bit of h.264 streaming ourselves, and I can't
> say we've had many problems.
>
> There are a number of things you need to take into account.
>
> There are a number of encoders that only send NAL 7/8 (SPS/PPS) once.
> You can configure the RTP payloader to re-multiplex those into your
> stream at a regular interval.
>
> In your case that won't help, since it's not plain h264 you're sending
> but h264-in-TS. Instead, you can instruct the h264 parser to re-include
> those parameter sets in the stream (before you add the MPEG-TS layer).
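For the TS case, a sketch of what that parser-side re-insertion might look like on a 0.10 install (the `config-interval` property on h264parse is an assumption here; confirm your version exposes it with gst-inspect, and the host address is a placeholder):

```shell
# H264-in-TS: re-insert SPS/PPS in the parser at a 1-second interval,
# before the stream enters the TS muxer
# (check: gst-inspect-0.10 h264parse | grep config-interval)
gst-launch-0.10 videotestsrc ! x264enc ! \
    h264parse config-interval=1 ! mpegtsmux ! \
    rtpmp2tpay ! udpsink host=192.168.1.10 port=5000
```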
>
> Our focus is mainly on rtp/h264, but AFAIK streaming is stable, both
> from hardware sources and from x264 sources (and file-based ones).
In this light, I'm going to concentrate a bit more on pure H264 streaming.
I have had a little success today, but it's still not great. I have to
start the client before the server, so I'm guessing that I have the
"only one NAL" issue you mention above. How can I change that behaviour?
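A sketch of the change being asked about, assuming the rtph264pay element from gst-plugins-good 0.10 (verify the property name with gst-inspect):

```shell
# Re-send the SPS/PPS parameter sets (NAL 7/8) every second so a
# receiver started after the sender can still pick up the stream
gst-launch-0.10 videotestsrc ! x264enc ! \
    rtph264pay config-interval=1 ! udpsink host=127.0.0.1 port=5000
```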
When it does run, I see messages like this on the client/receiver:
WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2686): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
There may be a timestamping problem, or this computer is too slow.
I tried adjusting the latency/jitterbuffer on the receiver, but it
didn't seem to change much.
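For reference, a receiver-side sketch with a larger jitterbuffer and sync relaxed on the sink (element names, caps, and decoder are assumptions for a 0.10 setup; adjust to your pipeline):

```shell
# A 500 ms jitterbuffer plus sync=false on the sink helps separate
# the two causes in the warning: if motion is smooth with sync=false,
# suspect timestamps rather than a too-slow machine
gst-launch-0.10 udpsrc port=5000 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264" ! \
    gstrtpjitterbuffer latency=500 ! rtph264depay ! \
    ffdec_h264 ! xvimagesink sync=false
```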
Thanks for the help
--
------------------------------------------------------------
Gary Thomas | Consulting for the
MLB Associates | Embedded world
------------------------------------------------------------