[gst-devel] seeking VBR

Wim Taymans wim.taymans at chello.be
Tue Nov 19 10:22:02 CET 2002


Hi,

> Since the ffmpeg plugin isn't ready yet ...
>
> in gst_mpeg2dec_convert_sink, i see that the time->byte offset
> conversion is done with the byte_rate.  This assumes a CBR. However,
> most streams are VBR.  At least, i only care about VBR.  i haven't
> found any open-source encoders which aren't VBR (which makes it hard
> to create crystal clear VCDs).

Yes, seeking is currently only done using the bitrate, which assumes CBR.


> How hard would it be to extract the time->byte offset mapping into an
> index file and use that (like cinelerra & mpeg3toc)?  It seems like a
> matter of:
>
> 1. Adding another sink pad (and mode?) to mpeg2dec which outputs the
> index at maximum parse speed .. then pipe the index to fdsink.
>
> 2. Add an "index-file" property to mpeg2dec.  This file could be
> mmap'd in whole .. and that is it.
>

The idea is to hand a GstTimeCache object (the object is already included
in the core) to the plugin. The timecache object could be prefilled with
previously saved data (XML format?), in which case the plugin would assume
that it contains byte->time mappings from a previous indexing round.

If the timecache object is empty (or only partially filled), the plugin
would update it as it receives data.

The procedure to fill the cache would go like:

a) connect filesrc->mpeg2parse/dec (don't connect any srcpads)
b) hand an empty timecache object to mpeg2parse/dec
c) run the pipeline to EOS
d) read/use/save the timecache
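
Roughly, in C that indexing run could look like the sketch below. The
"timecache" property and the gst_timecache_* calls are made up here (the
design isn't settled), and the element calls are written against a recent
core, so they may need adjusting (gst_element_link used to be
gst_element_connect):

  GstElement *pipeline, *src, *parse;
  GstTimeCache *cache;

  pipeline = gst_pipeline_new ("indexer");
  src = gst_element_factory_make ("filesrc", "src");
  parse = gst_element_factory_make ("mpeg2parse", "parse");
  g_object_set (G_OBJECT (src), "location", "movie.mpg", NULL);

  gst_bin_add (GST_BIN (pipeline), src);
  gst_bin_add (GST_BIN (pipeline), parse);
  gst_element_link (src, parse);              /* don't connect any srcpads */

  cache = gst_timecache_new ();               /* hypothetical constructor  */
  g_object_set (G_OBJECT (parse), "timecache", cache, NULL); /* hypothetical */

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  while (gst_bin_iterate (GST_BIN (pipeline)))   /* run to EOS */
    ;
  gst_element_set_state (pipeline, GST_STATE_NULL);

  gst_timecache_save (cache, "movie.idx");    /* hypothetical save (XML?)  */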

The procedure to hand an index to a plugin would be like:

a) create/fill timecache from previously saved data
b) create playback pipeline
c) hand index to plugin (mpeg2dec)
d) run/seek etc..
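
And the playback side, again with the hypothetical load call and
"timecache" property:

  GstTimeCache *cache;
  GstElement *dec;

  cache = gst_timecache_load ("movie.idx");   /* hypothetical loader       */

  /* ... build the normal playback pipeline ... */
  dec = gst_element_factory_make ("mpeg2dec", "dec");
  g_object_set (G_OBJECT (dec), "timecache", cache, NULL); /* hypothetical */

  /* set the pipeline to PLAYING; seeks on mpeg2dec can now use the cache */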

If no timecache is handed to mpeg2dec, it would simply not do any
indexing.

The design for the timecache is, however, not done yet. If you want to
give it a go, this is what I think the timecache API should (at least)
contain (a rough header sketch follows the two lists below):

- API to do the mapping from byte<->time offset
- API to get individual index records.
- API to get info about a record (is it a keyframe (I frame,... ), ..)
- API to load/save timecache (XML?, customizable?)

nice to have:

- mapping to other (arbitrary) formats (samples, frames, ...)
- metadata (who created the index, what is the corresponding URI
    of the master media file, timestamp, ...)
- API to specify the type of records to index (only keyframes, every
   N seconds, max size of index, ringbuffer, ...)
- certainty of index (for indexes created after a seek, ...)
- merging of index entries if certainty is known (playback reaches a
   previously seeked-to and indexed position from a region with higher
   certainty)
- indexing of time<->metadata (or other arbitrary properties)
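
As a starting point, here is the rough header sketch I mentioned above;
none of these names or signatures are decided yet:

  typedef struct _GstTimeCacheEntry {
    guint64      offset;       /* byte offset in the stream */
    GstClockTime timestamp;    /* corresponding time        */
    gboolean     is_keyframe;  /* I frame, ...              */
  } GstTimeCacheEntry;

  GstTimeCache      *gst_timecache_new            (void);

  /* byte<->time mapping */
  gboolean           gst_timecache_offset_to_time (GstTimeCache *cache,
                                                   guint64 offset,
                                                   GstClockTime *time);
  gboolean           gst_timecache_time_to_offset (GstTimeCache *cache,
                                                   GstClockTime time,
                                                   guint64 *offset);

  /* record access */
  void               gst_timecache_add_entry      (GstTimeCache *cache,
                                                   guint64 offset,
                                                   GstClockTime timestamp,
                                                   gboolean is_keyframe);
  GstTimeCacheEntry *gst_timecache_get_entry      (GstTimeCache *cache,
                                                   gint id);

  /* persistence (XML? customizable?) */
  gboolean           gst_timecache_save           (GstTimeCache *cache,
                                                   const gchar *location);
  GstTimeCache      *gst_timecache_load           (const gchar *location);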

> It sounds easy enough that i'd be willing to code something up.
>
> Is this design the gstreamer way of doing things?  What would be the
> _perfect_ way to support this type of index file option?  Can we just
> hack it temporarily?

I would not mind your hacking on GstTimeCache to implement the bare
minimum for your requirements; I'm sure it'll be time well spent. A
standard timecache potentially allows apps like an NLE to work more
efficiently too.

> Let's assume that all frames are I-frames.  What is the best byte-offset
> for a given frame?  Where in mpeg2dec is the "start of frame"
> identified?  Is this the correct byte-offset to use for seeking?

libmpeg2 CVS allows you to get a notification when an I frame is starting;
I'm not sure how many bytes of the frame it has already consumed by then,
or if you can even query that. For the plain indexing round, I would just
count the number of bytes handed to mpeg2dec, guesstimate the start code
of the picture and store that time<->offset pair in the cache.
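
As a sketch, that indexing could be a small helper the chain function
calls whenever libmpeg2 reports a new picture; bytes_consumed would be
the running count of bytes already handed to the decoder, and
gst_timecache_add_entry() is the hypothetical call from the sketch above:

  static void
  gst_mpeg2dec_index_picture (GstTimeCache *cache, guint64 bytes_consumed,
      GstClockTime timestamp, gboolean is_i_frame)
  {
    if (cache == NULL)
      return;                    /* no timecache handed in: no indexing */

    /* the guesstimated picture start code is somewhere near
     * bytes_consumed, so store that time<->offset pair */
    gst_timecache_add_entry (cache, bytes_consumed, timestamp, is_i_frame);
  }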

I'm also thinking that the indexing should actually happen both in the
mpeg demuxer and in the mpeg video decoder. The video decoder would map
frames (I frames) to PTS timestamps, while the mpeg demuxer would index
byte offsets to the PTS values of the different streams; it would
probably also index SCR timestamps to offsets (do we need to set
timecaches on pads? do we need to store the pad id in the cache entries
too? ...). Just a few issues I'm thinking of here...
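
If one cache has to cover the demuxer level too, each entry could carry
the pad/stream it belongs to; a purely hypothetical layout:

  typedef struct {
    gint         pad_id;      /* which elementary stream the entry is for */
    guint64      offset;      /* byte offset in the muxed (system) stream */
    GstClockTime pts;         /* PTS (or SCR) value at that offset        */
  } GstTimeCacheDemuxEntry;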

The seek event on the mpeg video decoder would then first figure out what
the nearest I frame is and convert the timestamp (or frame number) to its
PTS. It would then do a time seek on its sinkpad with that PTS value.
Mpegdemux would get the seek in PTS, map it to the byte offset of the
video packet with that PTS, and then forward the byte seek to filesrc.
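
A sketch of that seek path in mpeg2dec; gst_timecache_find_keyframe() is
hypothetical, the dec->cache and dec->sinkpad fields are assumed, and the
exact seek-event call may differ between core versions:

  static gboolean
  gst_mpeg2dec_handle_seek (GstMpeg2dec *dec, GstClockTime seek_time)
  {
    GstClockTime pts;

    /* 1) find the nearest indexed I frame at or before the target time
     *    and get its PTS */
    if (!gst_timecache_find_keyframe (dec->cache, seek_time, &pts))
      return FALSE;             /* no entry: fall back to the CBR guess */

    /* 2) forward a TIME seek with that PTS upstream on the sinkpad;
     *    mpegdemux maps the PTS to the byte offset of the matching
     *    video packet and forwards a BYTE seek to filesrc */
    return gst_pad_send_event (GST_PAD_PEER (dec->sinkpad),
        gst_event_new_seek (GST_FORMAT_TIME | GST_SEEK_METHOD_SET, pts));
  }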


Wim

> Everything else is tested & ready.  i just need accurate seek for VBR.