[Bug 746822] test-mp4: Reports different duration after each seek

GStreamer (GNOME Bugzilla) bugzilla at gnome.org
Thu Apr 9 17:13:55 PDT 2015


https://bugzilla.gnome.org/show_bug.cgi?id=746822

Hyunjun <zzoon.ko at samsung.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |zzoon.ko at samsung.com

--- Comment #1 from Hyunjun <zzoon.ko at samsung.com> ---
When doing a key-unit seek, qtdemux calls gst_qtdemux_adjust_seek to get a
proper offset. This offset is then set as segment.position and segment.time in
gst_qtdemux_perform_seek.

After that, the app sends a segment query, and qtdemux sets start and stop on
the query. While doing so, it converts the values with
gst_segment_to_stream_time and sets the results on the query.

For example, the app seeks to 0:00:53.229000, which the keyframe seek aligns
to 0:00:50.000. The app then gets a query result with start=0:00:50.000 and
stop=0:00:59.721.
The app (rtsp-server) treats the stop time as the finish time,
but the real finish time is 0:01:02.950 (this gap is exactly the same as
the one between the start time and the position).

At this moment, qtdemux->segment is like below.
segment start=0:00:53.229000000,
offset=0:00:00.000000000,
stop=0:01:02.950000000,
rate=1.000000,
applied_rate=1.000000,
flags=0x01,
time=0:00:50.000000000,
base=0:00:00.000000000,
position=0:00:50.000000000,
duration=0:01:02.950000000

As discussed with Thiago, this is a bug in the qt demuxer:
its segment.start has to be updated after a key-unit seek, just like
segment.time is.

I'm working on this.

-- 
You are receiving this mail because:
You are the QA Contact for the bug.
You are the assignee for the bug.

