[gst-devel] [RFC] Video blending interface

Benjamin Otte in7y118 at public.uni-hamburg.de
Fri Jan 14 05:44:22 CET 2005


Some random comments that came to my mind while reading the mail:

On Fri, 14 Jan 2005, Gergely Nagy wrote:

> The third requirement is that blending should be an interface, not one
> plugin that does all kinds of blending. There are a few reasons I think
> this is a good idea: first, it makes the plugin far less complex;
> second, makes it easier to reuse; third, this design still allows a
> larger, bin-like plugin which has on-demand sinks, and dispatches the
> work to other blenders (together with z-order & stuff).
>
I'm assuming you are talking about a base class here when you say
"interface" - say GstVideoblender, kind of like GstVideofilter already
is a base class for all sorts of filter effects.
The word "interface" in the GStreamer sense means a GObject interface
used to provide a common API from an element to applications (see
GstOverlay or GstMixer for examples).
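
To illustrate, here's a rough sketch of what such a base class could
look like, modelled loosely on GstVideofilter (all names here are made
up, this is not existing API):

  #include <gst/gst.h>

  typedef struct _GstVideoblender      GstVideoblender;
  typedef struct _GstVideoblenderClass GstVideoblenderClass;

  struct _GstVideoblender {
    GstElement element;

    GstPad *background_pad;     /* primary sink pad */
    GstPad *foreground_pad;     /* secondary sink pad */
    GstPad *srcpad;
  };

  struct _GstVideoblenderClass {
    GstElementClass parent_class;

    /* subclasses only implement the actual per-frame blending:
     * merge one foreground frame onto one background frame */
    void (*blend) (GstVideoblender *blender,
                   const guint8 *background, const guint8 *foreground,
                   guint8 *dest, gint width, gint height);
  };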

> Videomixer supports the xpos/ypos pad properties, which is one half of
> the bounding_box approach I need. It also supports an alpha pad
> property, which I did not mention, but is important too. However, as far
> as I see, it always adds a background to the image at its srcpad. It is
> also one single plugin doing too much (imho, at least). It would make
> the code simpler and more reusable if different blending methods were
> different plugins implementing the same interface. Then videomixer's job
> would only be co-ordination.
>
I'm not sure I understand why there's a need for a bounding box. Given
two video streams, you just need xpos and ypos to know how to compose
them. If you only want to compose a partial image, you could use an
element beforehand that removes the unwanted parts.
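
To illustrate why xpos/ypos is enough: the region the blender actually
touches falls out of clipping the foreground rectangle against the
background. A plain C sketch (made-up names, no GStreamer types):

  #include <glib.h>

  typedef struct { gint x, y, w, h; } Rect;

  /* clip the foreground placed at (xpos, ypos) against the
   * background; a negative xpos/ypos or an oversized foreground
   * simply shrinks the region that gets blended */
  static Rect
  overlap_region (gint xpos, gint ypos,
                  gint fg_w, gint fg_h, gint bg_w, gint bg_h)
  {
    Rect r;

    r.x = MAX (xpos, 0);
    r.y = MAX (ypos, 0);
    r.w = MIN (xpos + fg_w, bg_w) - r.x;
    r.h = MIN (ypos + fg_h, bg_h) - r.y;
    if (r.w < 0) r.w = 0;
    if (r.h < 0) r.h = 0;
    return r;
  }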

> Plugins implementing this would take I420 or AYUV as input,
> and output AYUV.
>
I'm not sure why a blending plugin should be bothered with doing
colorspace conversion from I420 to AYUV. That's clearly the job of a
colorspace converter.
Also, it sounds useful to abstract away colorspaces so that you can
later extend this to other implementations, for example RGBA or even
color-key blending with non-alpha colorspaces. (Or even OpenGL or
cairo, if we ever start passing data this abstracted.)
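
A pad template fixed to AYUV could then look like this (0.8-style caps
string; extending the element to RGBA etc. later just means widening
the caps):

  static GstStaticPadTemplate background_template =
  GST_STATIC_PAD_TEMPLATE ("background",
      GST_PAD_SINK,
      GST_PAD_ALWAYS,
      GST_STATIC_CAPS ("video/x-raw-yuv, format = (fourcc) AYUV")
      );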

> They would not have
> on-demand sinks, but two static ones: the background, and the
> foreground. On the srcpad, the result would always have the same size as
> the background (this is because I want to be able to merge a part of a
> larger element onto a smaller one). Basically, that's all. Except, I am
> not sure how the framerate should be handled...
>
Using two static pads sounds like a good idea here. I would make the
pad providing the background image the "primary" pad (for lack of a
better word) and take the output image size and framerate from this
pad.
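
The pad setup would then be the usual static-pad boilerplate, with
"foreground" and "src" templates analogous to the "background" one
sketched above (again, all names are made up):

  static void
  gst_videoblender_init (GstVideoblender *blend)
  {
    blend->background_pad = gst_pad_new_from_template (
        gst_static_pad_template_get (&background_template),
        "background");
    blend->foreground_pad = gst_pad_new_from_template (
        gst_static_pad_template_get (&foreground_template),
        "foreground");
    blend->srcpad = gst_pad_new_from_template (
        gst_static_pad_template_get (&src_template), "src");

    gst_element_add_pad (GST_ELEMENT (blend), blend->background_pad);
    gst_element_add_pad (GST_ELEMENT (blend), blend->foreground_pad);
    gst_element_add_pad (GST_ELEMENT (blend), blend->srcpad);

    /* output size and framerate follow whatever gets negotiated on
     * the background (primary) pad */
  }
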
Now there are two approaches for the framerate on the secondary pad:
1) Force it to be equal to the framerate of the primary pad. This
sounds most useful given that we already have the videorate element,
and it keeps the blending plugin from also being a rate-converting
plugin (see the pipeline sketch below).
2) Simply duplicate/drop images from the secondary pad to match the
timestamps of buffers coming in from the primary pad. This approach is
certainly more complicated, but may provide more flexibility.
I'd start with approach 1 and only change if I encounter serious
obstacles.
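
Approach 1 in pipeline form could look roughly like this (assuming the
hypothetical "videoblender" element sketched above, and assuming the
colorspace converters already handle AYUV - see the PS below):

  GError *error = NULL;
  GstElement *pipeline;

  /* videorate repeats/drops foreground frames so that branch matches
   * the background's framerate (hardcoded to 25.0 here; you'd pick
   * whatever the background actually runs at) */
  pipeline = gst_parse_launch (
      "videoblender name=b ! ffmpegcolorspace ! ximagesink "
      "filesrc location=bg.avi ! decodebin ! ffmpegcolorspace "
      "    ! b.background "
      "filesrc location=fg.avi ! decodebin ! ffmpegcolorspace "
      "    ! videorate ! video/x-raw-yuv,framerate=(double)25.0 "
      "    ! b.foreground",
      &error);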

> One idea is to not care about it at all, and on the src pad, always have
> a frame rate of 0 (a variant of this is to have the framerate of the
> slower sink pad). This would mean that if one of the sinks does not have
> new data, the previous buffer we got from it would be reused.
>
This is probably not a good idea, because there are elements that
expect a constant framerate (like encoder elements). I think it's
better to just use the framerate from the primary pad here.

Benjamin


PS: To any GStreamer dev desperately looking for something to do: I
think we still need support for AYUV (and probably RGBA) in quite a
few important elements, like the colorspace converters or
videotestsrc.
Also, shouldn't videotestsrc provide a "transparent" pattern?




