Accelerated subtitle rendering in 1.0
Arnaud Vrac
rawoul at gmail.com
Mon Aug 6 13:41:38 PDT 2012
On Mon, Aug 6, 2012 at 10:02 PM, Tim-Philipp Müller <t.i.m at zen.co.uk> wrote:
> On Mon, 2012-08-06 at 20:51 +0200, Arnaud Vrac wrote:
>
> Hi,
>
>> 1/ Composition metadata is added to video buffers by the textoverlay element
>> 2/ The buffers are sent to both:
>> - a video sink that renders the video data on one plane
>> - an image sink that renders the composition metadata on another
>> plane. The composition metadata is rendered to an ARGB surface by the
>> generic overlayrender element in this pipeline.
>>
>> First, what do you think of this pipeline? Would it fit the use cases
>> that are (or will be) handled by subtitleoverlay? My goal is to make
>> this work with playbin.
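
For reference, the pipeline described above is roughly this (just a
sketch; hwvideosink is a placeholder name for the platform video sink,
and the queues I actually use are omitted here, as noted below):

  ... ! textoverlay ! tee name=t \
      t. ! hwvideosink \
      t. ! overlayrender ! hwimagesink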
>
> If you want to make it work well with playbin, it would be much easier
> if you made a hwsink element that takes care of both the video data and
> the subtitle data in one go, i.e. it would render the video data to
> plane 1 and upload the subtitle composition data to plane 2 as needed.
>
Ok, I wanted to avoid that so that I could switch the overlay renderer
from a low-level one that blits directly to the plane to a composited
renderer (Wayland or X11). I guess your solution is much simpler.
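
Something like this in the sink's show_frame() then, I suppose (a rough
sketch; the hw_plane_* calls stand for the platform-specific plane API):

  static GstFlowReturn
  hw_sink_show_frame (GstVideoSink * vsink, GstBuffer * buf)
  {
    GstVideoOverlayCompositionMeta *cmeta;

    /* scan out the raw video data on plane 1 (platform specific) */
    hw_plane_show_video (vsink, buf);

    /* if composition metadata is attached, hand it to the ARGB overlay
     * plane, otherwise clear that plane */
    cmeta = gst_buffer_get_video_overlay_composition_meta (buf);
    if (cmeta != NULL)
      hw_plane_show_overlay (vsink, cmeta->overlay);
    else
      hw_plane_clear_overlay (vsink);

    return GST_FLOW_OK;
  }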
>
>> Second, here are the problems I have with this pipeline:
>> - Allocation queries are not forwarded by the tee element, so I had
>> to hack the textoverlay element to always attach composition metadata
>> to the video buffers.
>> - I have to force async=false on the hwimagesink element, otherwise
>> the pipeline will not preroll. Maybe I need to do something special
>> with GAP events or the GAP flag on buffers?
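
(Regarding the allocation query issue: the sink would normally advertise
the composition meta in its reply to the ALLOCATION query, roughly

  gst_query_add_allocation_meta (query,
      GST_VIDEO_OVERLAY_COMPOSITION_META_API_TYPE, NULL);

in its propose_allocation function, but tee does not forward the query,
so this never reaches textoverlay, hence the hack.)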
>
> You need a queue in each branch after tee (at least until we fix tee to
> include them..).
I did do that; as I said, I only removed the queues from the pipeline
description above for simplicity. However, it still stalls. I guess this
won't happen if I have a single element for rendering.
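
For reference, what I actually run looks more like this (same element
names as in the sketch above, with async=false forced on the image
sink):

  ... ! textoverlay ! tee name=t \
      t. ! queue ! hwvideosink \
      t. ! queue ! overlayrender ! hwimagesink async=false

and it still stalls at preroll if I drop the async=false.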
>
>> - The video API only has blending functions, but in this case my
>> overlayrender element needs to render the composition metadata
>> directly without blending, since the allocated frame to render to is
>> cleared. It would also be nice to have a function to fill a frame with
>> any color in any pixel format (for clearing for example).
>
> What pixel format do you need for your API?
>
> The blending funcs are mostly meant as utility functions for overlay
> elements that blend the subtitles on top of raw video data directly.
>
> For your use case, the thought was that just getting the ARGB pixels
> with
>
> gst_video_overlay_rectangle_get_pixels_argb() or
> gst_video_overlay_rectangle_get_pixels_unscaled_argb()
>
> would usually be enough. Is your overlay plane in a different format
> then?
ARGB is fine; however, I still need to blend all the rectangles into a
single transparent framebuffer, which means I have to use a library like
pixman. That's OK, but it wouldn't take much code to add a
gst_video_overlay_composition_render function that does not blend with
the destination surface pixels. I think it would be a good addition to
the public API.
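
For reference, what I do right now to flatten the composition into a
cleared ARGB surface is roughly this (a simplified sketch, no error
handling, and the pixman details are from memory):

  #include <gst/video/video-overlay-composition.h>
  #include <gst/video/gstvideometa.h>
  #include <pixman.h>

  /* flatten all rectangles of a composition into a pre-cleared ARGB
   * framebuffer of fb_width x fb_height pixels, fb_stride bytes per row */
  static void
  render_composition (GstVideoOverlayComposition * comp, guint32 * fb,
      guint fb_width, guint fb_height, guint fb_stride)
  {
    pixman_image_t *dest;
    guint i, n;

    dest = pixman_image_create_bits (PIXMAN_a8r8g8b8, fb_width, fb_height,
        fb, fb_stride);

    n = gst_video_overlay_composition_n_rectangles (comp);
    for (i = 0; i < n; i++) {
      GstVideoOverlayRectangle *rect;
      GstVideoMeta *vmeta;
      GstBuffer *pixels;
      GstMapInfo map;
      pixman_image_t *src;
      gint x, y, stride;
      guint w, h;

      rect = gst_video_overlay_composition_get_rectangle (comp, i);
      gst_video_overlay_rectangle_get_render_rectangle (rect, &x, &y, &w, &h);

      /* ARGB pixels, already scaled to the render size */
      pixels = gst_video_overlay_rectangle_get_pixels_argb (rect,
          GST_VIDEO_OVERLAY_FORMAT_FLAG_NONE);
      vmeta = gst_buffer_get_video_meta (pixels);
      stride = vmeta ? vmeta->stride[0] : (gint) (w * 4);
      gst_buffer_map (pixels, &map, GST_MAP_READ);

      src = pixman_image_create_bits (PIXMAN_a8r8g8b8, w, h,
          (uint32_t *) map.data, stride);

      /* OVER-blend the rectangle at its render position */
      pixman_image_composite32 (PIXMAN_OP_OVER, src, NULL, dest,
          0, 0, 0, 0, x, y, w, h);

      pixman_image_unref (src);
      gst_buffer_unmap (pixels, &map);
    }

    pixman_image_unref (dest);
  }

A gst_video_overlay_composition_render function doing essentially this
internally would let me drop the pixman dependency entirely.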
Thanks for your comments!
--
Arnaud Vrac