[gst-devel] GANSO for gstreamer

Ruben ryu at gpul.org
Sat Mar 24 17:51:51 CET 2001


On 2001/Mar/23, Wim Taymans wrote:

> setup for the timeline. The interesting part is how one would convert the
> timeline hierarchy to a graph based implementation...

Think of a simple structure (like Premiere's):

------------------------------------------------------
Layer 0 | <---------------Corner Logo.jpg----------->                      
------------------------------------------------------
Layer 1 | <----Video 1---->
------------------------------------------------------
  Mix   |      <Transition>
------------------------------------------------------
Layer 2 |      <-----Video 2---------><---Video 3--->
------------------------------------------------------
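
As a side note, the whole timeline above reduces to a flat list of clips;
a minimal sketch of such a data model in plain C (struct and field names
are hypothetical, not GAnSO's actual ones):

/* Sketch of the timeline model behind the layout above. */
typedef struct {
    const char *source;    /* media file, e.g. "video1.avi"      */
    double      start;     /* position on the timeline (seconds) */
    double      duration;  /* length of the clip (seconds)       */
    int         layer;     /* 0 = topmost layer                  */
} TimelineClip;

typedef struct {
    TimelineClip *clips;   /* flat list of clips, all layers     */
    int           n_clips;
} Timeline;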

It's easy to convert it to a graph based implementation:

,----------.  ,---------.
| Logo.jpg [--] JPEGdec [---------------------------------···
`----------'  `---------'                      
,----------.  ,---------.                      ,------------. 
| Video 1  [--] AVIdec  [----------------------] Transition [-···
`----------'  `---------'                     ,]            |
,----------.  ,---------.                    / `------------'
| Video 2  [--] MPEGdec [.                  /
`----------'  `---------' \ ,------------. /
                           `] Time mixer ['
                           ,]            |
,----------.  ,---------. / `------------'
| Video 3  [--] AVIdec  ['
`----------'  `---------'
========================
,----------.
| JPEGdec  [-----.  ,-------------.
`----------'      `-] Layer Mixer [-------···
,------------.    ,-]             |
| Transition [---'  `-------------'
`------------'

Time mixer: takes several video inputs and outputs only one of them at
any given time. It should probably be combined with a delay element for
each input.

Transition: Mixes two videos with a nice effect.

Layer Mixer: Alpha blends two layers.
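
To illustrate the time mixer idea, here is a rough sketch of its
selection logic in plain C (not a real GStreamer element; the names and
the per-input activity window are assumptions of mine):

/* Sketch of the time-mixer selection logic: given the current
 * timestamp, forward the frame of whichever input is active. */
typedef struct {
    double start;                 /* input becomes active (seconds) */
    double end;                   /* input stops being active       */
    const unsigned char *frame;   /* current decoded frame (RGBA)   */
} MixerInput;

/* Return the frame of the input active at time t, or NULL if none. */
static const unsigned char *
time_mixer_select (const MixerInput *inputs, int n_inputs, double t)
{
    int i;
    for (i = 0; i < n_inputs; i++)
        if (t >= inputs[i].start && t < inputs[i].end)
            return inputs[i].frame;
    return NULL;
}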

> >       - Most of the time, the user will need to play back small
> > portions of the composition, or even single frames. The user will click
> > in the timeline and want the frame corresponding to that instant to
> > appear in a preview window. One solution is to seek, but you said that
> > it isn't a good idea, so what should I do then?
> 
> I think seeking is a good idea. With a clever caching mechanism this
> will work quite nicely.

	Ok.

>  timeline
>    !
>    !
> 0  - activate atrack1
>    !
>    !
> 5  - activate (or insert) mixer and atrack2
>    !  (mixing happens here)
>    !
> 10 - inactivate (or remove) atrack1
>    !
>    V

	There is no need to modify the pipeline dynamically; I can insert
delay elements instead. In fact, I can build the pipeline while the user
builds the timeline, and when he moves a video track left or right I only
need to change the value of the corresponding delay element.
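
Something along these lines, for example (hypothetical names; I am
assuming the delay element exposes a single offset parameter):

/* Sketch: moving a clip on the timeline only changes the offset of
 * its delay element; the graph topology stays untouched. */
typedef struct {
    double offset;   /* seconds of black/silence emitted before the clip */
} DelayElement;

static void
clip_moved (DelayElement *delay, double new_start_on_timeline)
{
    /* The delay is simply the clip's new start position. */
    delay->offset = new_start_on_timeline;
}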

> rendering a thumbnail preview of atrack1 would involve running the
> atrack1 bin and sending the output to the screen (using an element with
> 50x40 video capabilities and a scaler, possibly seeking to every nth
> frame etc..). I would tee these frames to a cache (preview cache with
> an index etc..).

	I will need to put a display element at every node of the graph,
because the user will want to see what is happening at any point. I could
attach a cache element before every display element, but I think it would
be a good idea to give the GStreamer display elements a built-in cache,
because it will be needed almost always.
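
As a toy illustration of that cache (a fixed-size table keyed by frame
number; the names are mine, this is not an existing GStreamer element):

/* Toy preview cache: keep the last few decoded preview frames keyed
 * by frame number, so seeking back to a frame the user already saw
 * does not require decoding it again.  Assumes the struct is
 * zero-initialized (e.g. with calloc). */
#include <stdlib.h>
#include <string.h>

#define CACHE_SLOTS 64

typedef struct {
    long           frame_no;
    unsigned char *rgba;       /* small preview frame, RGBA */
    size_t         size;
} PreviewCacheSlot;

typedef struct {
    PreviewCacheSlot slots[CACHE_SLOTS];
} PreviewCache;

static const unsigned char *
preview_cache_lookup (PreviewCache *c, long frame_no)
{
    PreviewCacheSlot *s = &c->slots[frame_no % CACHE_SLOTS];
    return (s->rgba && s->frame_no == frame_no) ? s->rgba : NULL;
}

static void
preview_cache_store (PreviewCache *c, long frame_no,
                     const unsigned char *rgba, size_t size)
{
    PreviewCacheSlot *s = &c->slots[frame_no % CACHE_SLOTS];
    free (s->rgba);
    s->rgba = malloc (size);
    memcpy (s->rgba, rgba, size);
    s->size = size;
    s->frame_no = frame_no;
}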

> >       - If I have two video sources and I want to see them one after
> > another, which structure should I use?
> 
> not sure what you mean, I assume you want to play one media stream after
> the other...

	Yes.

> In this case you would have the scheduler activate the first bin until EOS
> and then activate the second one.

	What do you think of the idea I gave before: using a time mixer
that outputs only one of its inputs at any given time? It should be
combined with the delay elements, if those are possible.
 
> >       - What is the format of the video/raw data?
> 
> It's still undefined :( We currently use fourcc's to define the type (RGB,
> YUV, etc..) 

	Ok, then I will need to create an element that performs *-to-RGBA
conversion.
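
For instance, for a packed YUY2 input the per-line conversion could look
roughly like this (BT.601 integer math; a sketch of the pixel work such
an element would do, not the eventual plugin):

/* Convert one line of YUY2 (Y0 U Y1 V, 2 pixels per 4 bytes) to RGBA. */
static unsigned char clamp8 (int v)
{
    return v < 0 ? 0 : (v > 255 ? 255 : (unsigned char) v);
}

static void
yuy2_line_to_rgba (const unsigned char *src, unsigned char *dst, int width)
{
    int x, i;
    for (x = 0; x < width; x += 2) {
        int y0 = src[0], u = src[1], y1 = src[2], v = src[3];
        int d = u - 128, e = v - 128;
        int ys[2];
        ys[0] = y0;  ys[1] = y1;

        for (i = 0; i < 2; i++) {
            int c = ys[i] - 16;
            dst[0] = clamp8 ((298 * c + 409 * e + 128) >> 8);           /* R */
            dst[1] = clamp8 ((298 * c - 100 * d - 208 * e + 128) >> 8); /* G */
            dst[2] = clamp8 ((298 * c + 516 * d + 128) >> 8);           /* B */
            dst[3] = 255;                                               /* A */
            dst += 4;
        }
        src += 4;
    }
}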

> >       - Is there any plugin to alpha blend two sources into one? and to
> 
> not yet.

	If the input is in RGBA it's very simple to create one. I have the
code in GAnSO; I only need to port it.
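
The per-pixel part is essentially this (a sketch of a straight "over"
blend on 8-bit RGBA, not the actual GAnSO code):

/* RGBA "over" blending: composite the foreground layer on top of the
 * background layer, one 8-bit RGBA pixel at a time. */
static void
alpha_blend_rgba (const unsigned char *fg, const unsigned char *bg,
                  unsigned char *out, int n_pixels)
{
    int i, c;
    for (i = 0; i < n_pixels; i++) {
        int a = fg[3];                          /* foreground alpha */
        for (c = 0; c < 3; c++)
            out[c] = (unsigned char) ((fg[c] * a + bg[c] * (255 - a)) / 255);
        out[3] = 255;                           /* result is opaque */
        fg += 4;  bg += 4;  out += 4;
    }
}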

Thanks 
-- 
"There is nothing better distributed than the reason, everybody is convinced of having enough" 
                                                                  Descartes
 __
 )_) \/ / /  email: mailto:ryu at gpul.org
/ \  / (_/   www  : http://pinguino.dyndns.org
[ GGL + GAnSO developer ] & [ GPUL member ]



