[gst-devel] seeking VBR

Joshua N Pritikin vishnu at pobox.com
Wed Nov 20 07:22:05 CET 2002

On Tue, Nov 19, 2002 at 07:24:44PM +0100, Wim Taymans wrote:
> > How hard would it be to extract the time->byte offset mapping into an
> > index file and use that (like cinelerra & mpeg3toc)?  It seems like a
> > matter of:
> >
> > 1. Adding another sink pad (and mode?) to mpeg2dec which outputs the
> > index at maximum parse speed .. then pipe the index to fdsink.
> >
> > 2. Add an "index-file" property to mpeg2dec.  This file could be
> > mmap'd in whole .. and that is it.
> The idea is to hand a GstTimeCache object (the object is already
> included in the core) to the plugin. The timecache object could be
> prefilled with previously saved data (XML format?),

XML seems a little bulky for (byte,time) tuples.  I guess it depends
on what you are optimizing for.
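For comparison, a flat binary format would be 16 bytes per entry versus
something like 60+ bytes for an equivalent XML element.  A minimal sketch
(all names here are hypothetical, and a real format would also want a
header with a magic number and an endianness marker):

```c
#include <stdint.h>
#include <stdio.h>

/* One index entry: a byte offset into the media file and the
 * corresponding timestamp in nanoseconds.  16 bytes per entry. */
typedef struct {
    uint64_t byte_offset;
    uint64_t time_ns;
} IndexEntry;

/* Write entries as a flat binary file (hypothetical format;
 * host byte order, so not portable as-is). */
int index_save(const char *path, const IndexEntry *e, size_t n)
{
    FILE *f = fopen(path, "wb");
    if (!f)
        return -1;
    size_t written = fwrite(e, sizeof(IndexEntry), n, f);
    fclose(f);
    return written == n ? 0 : -1;
}

/* Read back up to max entries; returns the number actually read. */
size_t index_load(const char *path, IndexEntry *e, size_t max)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return 0;
    size_t n = fread(e, sizeof(IndexEntry), max, f);
    fclose(f);
    return n;
}
```

A file like this could also be mmap'd whole, as suggested above, since
the records are fixed-size.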

> in which case the plugin would assume that it
> contains byte->time mappings of a previous indexing round.
> If the timecache object is empty (or partially filled) the plugin would
> update it as it receives data.

Yah, I'm sure some apps will work like that.  However, my app needs to
index the whole media stream before playback.

> The procedure to fill the cache would go like:
> a) connect filesrc->mpeg2parse/dec (don't connect any srcpads)
> b) hand an empty timecache object to mpeg2parse/dec
> c) run the pipeline to EOS
> d) read/use/save the timecache

Yah, I will try to implement that ASAP.
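As a rough sketch of that batch run: the cache grows as the parser
reports (offset, timestamp) pairs during steps (b) and (c), then you
read it out in step (d).  Everything below is hypothetical, since the
real GstTimeCache API isn't designed yet; the GStreamer pipeline is
replaced by a stub loop that pretends to see one picture every 4 KB at
25 fps:

```c
#include <stddef.h>
#include <stdint.h>

#define MAX_ENTRIES 1024

/* Stand-in for the not-yet-designed GstTimeCache. */
typedef struct {
    uint64_t offsets[MAX_ENTRIES];
    uint64_t times_ns[MAX_ENTRIES];
    size_t   count;
} TimeCache;

/* Called by the plugin each time it identifies a new picture. */
void time_cache_add(TimeCache *tc, uint64_t offset, uint64_t time_ns)
{
    if (tc->count < MAX_ENTRIES) {
        tc->offsets[tc->count]  = offset;
        tc->times_ns[tc->count] = time_ns;
        tc->count++;
    }
}

/* Simulated "run to EOS" (step c): pretend the parser finds one
 * picture every 4096 bytes and one frame every 40 ms (25 fps). */
void index_whole_stream(TimeCache *tc, uint64_t stream_len)
{
    uint64_t offset = 0, frame = 0;
    while (offset < stream_len) {
        time_cache_add(tc, offset, frame * 40000000ull);
        offset += 4096;
        frame++;
    }
}
```

In the real thing, `index_whole_stream` would of course be the pipeline
running filesrc->mpeg2parse/dec to EOS with no srcpads connected.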

> The procedure to hand an index to a plugin would be like:
> a) create/fill timecache from previously saved data
> b) create playback pipeline
> c) hand index to plugin (mpeg2dec)
> d) run/seek etc..
> If no timecache is handed to mpeg2dec, it would simply not do any
> indexing.

Yah, I don't need incremental indexing.

> The design for the timecache is however not done yet. If you want to
> give it a go,

What I will try to do is provide a sample implementation of the batch
mode indexing.

> this is what I think the timecache API should (at least)
> contain:
> - API to do mapping from byte<->timeoffset
> - API to get individual index records.
> - API to get info about a record (is it a keyframe (I frame,... ), ..)
> - API to load/save timecache (XML?, customizable?)
> nice to have:
> - mapping to other (arbitrary) formats (samples, frames, ...)
> - metadata (who created the index, what is the corresponding URI
>     of the master media file, timestamp, ...)
> - API to specify the type of records to index (only keyframes, every
>    N seconds, max size of index, ringbuffer, ...)
> - certainty of index (for indexes created after a seek, ...)
> - merging of index entries if certainty is known (playback reaches
>    previously seeked and indexed position from region with higher
>    certainty)
> - indexing of time<->metadata (or other arbitrary properties)

Yah, whatever.  Let apps drive the API design.
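For the first item on that list, the byte<->time mapping is basically a
binary search over entries sorted by time.  A sketch of the time->byte
direction (names hypothetical; for seeking you want the last indexed
record at or before the requested time, so decoding can start there):

```c
#include <stddef.h>
#include <stdint.h>

typedef struct {
    uint64_t byte_offset;
    uint64_t time_ns;
} Record;

/* time -> byte mapping: return the byte offset of the last record
 * whose timestamp is <= t, or 0 if t precedes the first record.
 * Records must be sorted by time_ns. */
uint64_t cache_time_to_offset(const Record *r, size_t n, uint64_t t)
{
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        if (r[mid].time_ns <= t)
            lo = mid + 1;   /* r[mid] is still at or before t */
        else
            hi = mid;       /* r[mid] is past t */
    }
    return lo ? r[lo - 1].byte_offset : 0;
}
```

The byte->time direction is the same search with the fields swapped,
which is one argument for keeping the records sorted on both axes (they
naturally are, for a single monotone stream).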

> > Let's assume that all frames are I-frames.  What's the best byte-offset
> > for a given frame?  Where in mpeg2dec is the "start of frame"
> > identified?  Is this the correct byte-offset to use for seeking?
> libmpeg2 CVS allows you to get notification when an I frame is starting,
> I'm not sure how many bytes of the frame it has already consumed by then
> or if you can even query that. For the plain indexing round, I would
> just count the number of bytes handed to mpeg2dec, guesstimate the
> offset of the picture's start code and store that time<->offset pair in
> the cache.

Yah, that will work well enough given that the current seeking
just throws random byte offsets into mpeg2dec.
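Since the MPEG-2 video picture start code is the fixed sequence
00 00 01 00, the plugin could even avoid the guesstimate and scan each
buffer itself, keeping a running count of bytes already handed to the
decoder.  A sketch (function name is made up; it doesn't handle a start
code straddling two buffers, which the real thing would need to):

```c
#include <stddef.h>
#include <stdint.h>

/* Scan a buffer for the MPEG picture start code 00 00 01 00 and
 * return the absolute stream offset of its first byte, or
 * (uint64_t)-1 if this buffer contains no picture start.
 * `base` is the count of bytes handed to the decoder before
 * this buffer, i.e. the stream offset of buf[0]. */
uint64_t find_picture_start(const uint8_t *buf, size_t len, uint64_t base)
{
    for (size_t i = 0; i + 4 <= len; i++) {
        if (buf[i] == 0x00 && buf[i + 1] == 0x00 &&
            buf[i + 2] == 0x01 && buf[i + 3] == 0x00)
            return base + i;
    }
    return (uint64_t)-1;
}
```

That offset is exactly the thing to pair with the picture's timestamp
in the cache.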

> I'm also thinking that the indexing should actually happen both in the
> mpeg demuxer and the mpeg video decoder. The video decoder would map
> ...
> of the videopacket with that PTS and would then forward the byteseek to
> filesrc.

That's too much work for me.  I need a quick solution!

Victory to the Divine Mother!!         after all,
  http://sahajayoga.org                  http://why-compete.org
