[gst-devel] RFC: A Timeline

MDK mdk at mdk.org.pl
Mon Jun 27 03:15:01 CEST 2005


On Mon, 27-06-2005 at 11:17 +0200, Edward Hervey wrote:
> > 
> > An arbitrary number of SOURCES is linked to a TIMELINE element, which
> > is a modified videomixer. It has an unlimited number of sink pads and
> > one src pad. In its loopfunc, TIMELINE pulls all of its sink pads,
> > trying to find the first one that returns a buffer (i.e. != "no
> > data"), and pushes that buffer out the src pad.
> 
>   Two problems here:
>    1_ gnonlin tries to be as media-agnostic as possible; in your case
> you would have to do a different kind of timeline per media class.

I actually see this (in my implementation) as an advantage. The SOURCE
can have two pads - video and audio. Video is linked to the
VIDEOTIMELINE (explained above) and audio is linked to an AUDIOTIMELINE
(a simpler "adder" in this case).
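The pull-the-first-available-buffer logic could be sketched roughly like
this (plain pseudocode with hypothetical names, not real GStreamer API):

```python
NO_DATA = object()  # stand-in for the cheap "no data" reply a SOURCE can give

def timeline_pick(inputs):
    """One loopfunc iteration: return the first input's real buffer, if any."""
    for pull in inputs:        # inputs ordered by track priority
        buf = pull()           # each SOURCE answers with a buffer or NO_DATA
        if buf is not NO_DATA:
            return buf         # this is the buffer pushed downstream
    return None                # every track was silent this round
```

A SOURCE whose clip isn't active at the current time would answer
NO_DATA immediately, so inactive tracks cost almost nothing per
iteration.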

>    2_ the timeline loopfunc would have to go over more pads than are
> actually needed.

Yes... it's a tangle of pads and unnecessary links...

> > The advantages I see:
> > 
> >         * Simplicity
> 
>   Simplicity in what sense ? Code-wise ? API-wise ?

Well, in terms of using it further in (my) real application. I mean,
I'm trying to build a video app that doesn't have "video tracks" and
"audio tracks". It just has "tracks".

> > In other news: I have the playskipper plugin ready (for automatically
> > dropping frames when playback isn't "smooth"). I'm going to post it
> > for wider testing soon.
> 
>   If I'm not mistaken, that's what videorate does.

No, I think it actually does the opposite... though I'm not sure.
Ronald said some time ago that we don't have play-skipping (for when the
CPU can't handle real-time processing), so I wrote it.
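For what it's worth, the core of such play-skipping usually boils down
to a lateness check against the clock. A minimal sketch (my own
hypothetical helper, not the actual playskipper code):

```python
def should_drop(frame_ts, clock_now, max_lateness=0.04):
    """Drop a frame whose timestamp already lags the clock by more than
    one PAL frame period (~40 ms), instead of rendering it late."""
    return clock_now - frame_ts > max_lateness
```

So a frame stamped 1.00 s that only becomes ready when the clock reads
1.10 s gets dropped, while one that is a mere 20 ms late still gets
rendered.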

> > Well, my problems with gnonlin are also related to performance.
> > GnlSource (the basic source element in gnonlin) can wrap a bin that has
> > just one src pad. That means you need two GnlSources for an AVI clip
> > that has audio and video. And that means doubling the filesrc ! avidemux
> > part. I don't know if the scheduler optimizes that (so that double
> > disk-reading doesn't happen), but it seems strange to me.
> 
> Indeed, you end up with the demuxer existing twice (or more). But in
> order to suit every scenario, this is needed. Imagine you play the
> audio and video at different times in the timeline; you'll need that
> anyway. The overhead cost of a demuxer is not that big anyway,
> compared to the timeline possibilities you can create this way.

It's not the demuxer I'm worried about. It's the filesrc and the disk
reading... Let's say I'm working with full-PAL, MJPEG-compressed AVIs.
That's about 8 MB/s of throughput. Doubling that will choke the disk.
Sure, it will (probably) work out because of kernel data caching, but
relying on that is not good.

> Imagine you play the audio and video at different times in the 
> timeline, you'll need that anyway.

I was thinking about that. It seems to me that for "typical" (home)
video editing this is a corner case. Usually you just use the video in
sync with the audio that comes with it. You might add "background
music", but that's a different story.

> The fact of having N source elements generating data at every loop
> takes more processing than that :)

Right, right, right... but I'm still not convinced :> The N source
elements would not generate "empty buffers" but some kind of (fast)
signal that no data is available...
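To illustrate the cost difference (a hypothetical sketch, not gnonlin
code): an inactive track can answer with a constant sentinel instead of
allocating a zeroed frame on every loop:

```python
NO_DATA = object()                    # shared constant: no allocation at all

def idle_source_signal():
    """The cheap reply: just hand back the sentinel."""
    return NO_DATA

def idle_source_empty_buffer(frame_bytes=720 * 576 * 2):
    """The expensive alternative: allocate a full PAL-sized zeroed
    buffer on every iteration, only for the mixer to discard it."""
    return bytearray(frame_bytes)
```

The first variant does no per-iteration work; the second allocates and
zeroes ~830 KB per inactive track per frame.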

> 
> > There are no examples for gnonlin, so I might be plain wrong here.
>
>   True, I need to create some new ones; I'm adding that to the gnonlin
> 0.9 TODO.

While experimenting with gnonlin I wrote a simple three-source player.
I'll clean it up and send it to you.

Regards,

-- 
Michał Dominik K.
mdk at mdk.org.pl
www.mdk.org.pl

