h264 stream from filesrc vs from live source (rtsp). HD playback

bamboosso laski.maciej at gmail.com
Mon Dec 10 14:48:00 PST 2012


Hello,

So, I've done some digging. *First, this is the playback pipeline:*

gst-launch -e rtspsrc
location=rtsp://admin:4321@192.168.0.140:554/profile5/media.smp !
gstrtpjitterbuffer ! rtph264depay ! legacyh264parse output-format=0
access-unit=true ! mfw_vpudecoder codec-type=std_mpeg4 ! mfw_v4lsink
sync=false

*And two debug options:*
*First:*
export GST_DEBUG=2,rtspsrc:5
rtspsrc.txt
<http://gstreamer-devel.966125.n4.nabble.com/file/n4657344/rtspsrc.txt>   or 
rtspsrc.txt <https://dl.dropbox.com/u/46679399/gst_debug/rtspsrc.txt>  

There are a lot of QoS events received on the rtspsrc src pad, but I don't
know what they mean. Artefacts are visible in the video from the first second
of playback. There is one session timeout near the 29th second, but I haven't
noticed its effect during playback.
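As far as I understand, QoS events are sent upstream when buffers arrive too
late at a synchronizing element, so a flood of them usually means something
downstream cannot keep up or the configured latency is too small. One thing
that might be worth trying (just a sketch, I have not verified these values)
is raising the latency on rtspsrc and the jitterbuffer so network jitter is
absorbed before decoding:

gst-launch -e rtspsrc latency=500
location=rtsp://admin:4321@192.168.0.140:554/profile5/media.smp !
gstrtpjitterbuffer latency=500 ! rtph264depay ! legacyh264parse
output-format=0 access-unit=true ! mfw_vpudecoder codec-type=std_mpeg4 !
mfw_v4lsink sync=false

Both rtspsrc and gstrtpjitterbuffer expose a latency property (in
milliseconds); 500 here is an arbitrary test value, not a recommendation.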

*Second:*
export GST_DEBUG=2,gstrtpjitterbuffer:5
rtspsrc_jitterbuffer.txt
<https://dl.dropbox.com/u/46679399/gst_debug/rtspsrc_jitterbuffer.txt>  

and there are a lot of messages like the following, which look good:


And a lot of:



and:



So, buffers are definitely lost somewhere... 

*Next, I have the save pipeline:*
gst-launch -e rtspsrc
location=rtsp://admin:4321@192.168.0.140:554/profile5/media.smp !
gstrtpjitterbuffer ! rtph264depay ! legacyh264parse output-format=0
access-unit=true ! mp4mux ! filesink location=h264_1280x1024.mp4

*and two outputs with the same debug options:*
rtspsrc_record.txt
<https://dl.dropbox.com/u/46679399/gst_debug/rtspsrc_record.txt>  
rtspsrc_gstrtpjitterbuffer_record.txt
<https://dl.dropbox.com/u/46679399/gst_debug/rtspsrc_gstrtpjitterbuffer_record.txt>  

And here:
1. No more QoS events sent to the rtspsrc src pad
2. No buffers lost in gstrtpjitterbuffer

*I'm a little bit confused.*
Is it clock propagation, the clock source, or the hardware being too slow?
(The bitrate is set to 5120 kbps, so 100 Mb Ethernet should be enough?? It
works for recording.)
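One way to check whether the decoder itself can keep up (a sketch, assuming
the file recorded by the save pipeline above is available) is to decode the
recording as fast as possible into fakesink and time it:

time gst-launch filesrc location=h264_1280x1024.mp4 ! qtdemux !
legacyh264parse output-format=0 access-unit=true ! mfw_vpudecoder
codec-type=std_mpeg4 ! fakesink sync=false

If decoding a 30-second recording finishes in well under 30 seconds, raw
decoder throughput is probably not the bottleneck, and the problem is more
likely in clocking or buffering.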

*I'm thinking that mfw_vpudecoder or mfw_v4lsink takes too much time??* But
then I would expect the buffer loss to happen somewhere between the parser
and the decoder. Here are outputs from the other elements of the pipeline:
rtph264depay.txt
<https://dl.dropbox.com/u/46679399/gst_debug/rtph264depay.txt>  
legacyh264parse.txt
<https://dl.dropbox.com/u/46679399/gst_debug/legacyh264parse.txt>  
mfw_vpudecoder.txt
<https://dl.dropbox.com/u/46679399/gst_debug/mfw_vpudecoder.txt>  
mfw_v4lsink.txt
<https://dl.dropbox.com/u/46679399/gst_debug/mfw_v4lsink.txt>  

*But I can't see any errors in those files.*

Next test: record and display video at the same time.

My pipeline:
gst-launch -e rtspsrc
location=rtsp://admin:4321@192.168.0.140:554/profile5/media.smp !
gstrtpjitterbuffer ! rtph264depay ! legacyh264parse output-format=0
access-unit=true ! tee name=t  t. ! queue ! mfw_vpudecoder
codec-type=std_mpeg4 ! mfw_v4lsink sync=false  t. ! queue ! mp4mux !
filesink location=temp.mp4
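Note that in a tee'd pipeline a full queue on one branch blocks the other
branch as well. A possible variant to try (just a sketch; leaky and
max-size-* are standard queue properties, but dropping encoded H.264 will
itself cause artefacts until the next keyframe) is to make the queues
unlimited, and optionally leaky on the display branch so it drops rather
than stalls the recording:

gst-launch -e rtspsrc
location=rtsp://admin:4321@192.168.0.140:554/profile5/media.smp !
gstrtpjitterbuffer ! rtph264depay ! legacyh264parse output-format=0
access-unit=true ! tee name=t  t. ! queue leaky=downstream
max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! mfw_vpudecoder
codec-type=std_mpeg4 ! mfw_v4lsink sync=false  t. ! queue
max-size-buffers=0 max-size-bytes=0 max-size-time=0 ! mp4mux !
filesink location=temp.mp4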

The video contains a lot of artefacts and delay. Below I attach two videos:
1. Record-only pipeline:  h264_1280x1024.mp4
<https://dl.dropbox.com/u/46679399/gst_debug/h264_1280x1024.mp4>  
2. Record and playback pipeline:  h264_1280x1024_bad.mp4
<https://dl.dropbox.com/u/46679399/gst_debug/h264_1280x1024_bad.mp4>  

So the main question is: *What is the difference between using rtspsrc and
filesrc???* From filesrc I can decode a 1920x1080 h264 stream without
delays, and I can decode the previously recorded RTSP stream just fine.

Regards
 Maciek


