[gst-devel] Compositing and GStreamer

Edward Hervey bilboed at gmail.com
Sun Nov 28 09:58:25 CET 2010


Hi,

On Sat, 2010-11-27 at 13:50 -0500, Timothy Braun wrote:
> Kapil,
>   Thanks for the suggestion, but with multifilesrc I would have to
> have static files with incrementally numbered names.  A single box in
> the 3x2 grid may contain the same file multiple times, so I'm afraid it
> won't be the best solution.
> 
>   I guess, ultimately, there are multiple ways to attack this one from
> what I've been able to find.  Here are the two that I've been looking
> at; I'm just not sure which is the better solution at this point:
>       * A single gnonlin composition with 4 gnlsources similar in
>         setup to the gst-launch text I have below.

  Using one composition would actually be the 'proper' way.

>       * 6 gnonlin compositions, each feeding to a single videomixer
>         which combines them into the final frame.
>               * This is the path I'm currently investigating.  I have
>                 a test written in C, but I'm having some difficulties
>                 with pad linkage, as I still don't have a complete
>                 understanding of when certain things will exist and
>                 how to get them.
>               * Here's what's currently happening:
>                       * Create a new pipeline
>                       * Create a videomixer
>                       * Create 6 gnonlin compositions each with a
>                         pad-added signal callback to connect
>                         gnlcomposition pad to videomixer.
>                       * ... (this is where it's going wrong)
>               * In the pad-added callback I have:
>
>                     static void onPad(GstElement *comp, GstPad *pad,
>                                       GstElement *sink) {
>                         GstPad *v = gst_element_get_pad(sink, "sink");
>                         gst_pad_link(pad, v);
>                         gst_object_unref(v);
>                     }
>                       * gst_element_get_pad is not returning a pad
>                         from the video mixer (sink) which leads me to
>                         believe that I'm either not asking in the
>                         right manner or the pad doesn't exist.  (I'm
>                         aware that gst_element_get_pad is deprecated,
>                         I'm just looking to test at the moment)
>                       * I noticed in a unit test in one of the
>                         repositories that the videomixer was attached
>                         as a gnloperation.  Is this the better path to
>                         take?
> 
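
  A side note on the pad problem above: videomixer only exposes
*request* sink pads (the pad template is "sink_%d" in 0.10), so asking
for a static pad named "sink" will always return NULL.  A minimal
sketch of the callback using a request pad instead (untested, keeping
the signature from your snippet):

    static void onPad (GstElement * comp, GstPad * pad, GstElement * sink)
    {
      /* videomixer sink pads are request pads, so ask for a new one
       * instead of looking up a static pad called "sink" */
      GstPad *sinkpad = gst_element_get_request_pad (sink, "sink_%d");

      if (gst_pad_link (pad, sinkpad) != GST_PAD_LINK_OK)
        g_warning ("could not link composition pad to videomixer");

      gst_object_unref (sinkpad);
    }

  That said, the gnloperation approach described further down avoids
having to do this linking by hand.
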
>   This all leads me to a couple more questions as well:
>       * A videomixer pad has xpos and ypos properties.  This would
>         let me shift the video around without needing a videobox,
>         which I believe may be more efficient?

  Yes, it will be more efficient.

>       * If I use the xpos and ypos properties, is the video mixer
>         smart enough to change the frame size appropriately or will it
>         simply crop the frame to the size of the largest input frame?
>               * If so, would it be better to add a videobox to do the
>                 adjustments for me, or feed in a solid color
>                 background of the required output size?

  No, it won't change the size, but what you could do is mix the streams
at their original sizes with the proper offsets and then downscale the
video later.

  Example for one 3x2 segment:

  Create a gnloperation containing a videomixer, with a gnl priority
of 0.
  Create a gnlfilesource for each clip with increasing priorities (1->6)
going from left to right and then top to bottom:
     1  2  3
     4  5  6
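
  A rough sketch of that construction in C (0.10-era API; "filenames"
is a hypothetical array holding the six clip paths, and gnlfilesource
is assumed to take a "location" property):

    GstElement *comp = gst_element_factory_make ("gnlcomposition", "comp");
    GstElement *oper = gst_element_factory_make ("gnloperation", "mixer-op");
    GstElement *mix  = gst_element_factory_make ("videomixer", "mix");
    gint i;

    gst_bin_add (GST_BIN (oper), mix);         /* the operation wraps the mixer */
    g_object_set (oper, "priority", 0, NULL);  /* priority 0 = operation on top */
    gst_bin_add (GST_BIN (comp), oper);

    for (i = 0; i < 6; i++) {
      GstElement *src = gst_element_factory_make ("gnlfilesource", NULL);

      g_object_set (src,
          "location", filenames[i],
          "priority", i + 1,         /* 1..6: left to right, top to bottom */
          NULL);
      gst_bin_add (GST_BIN (comp), src);
    }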

  Connect to the gnloperation's 'input-priority-changed' signal. When
your callback is called, you will know which priority is being connected
to which gnloperation ghostpad. You can get the videomixer sink pad with
the gst_ghost_pad_get_target() method and then set the proper xpos/ypos
properties on that pad based on the priority of the feed being connected.
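
  A minimal sketch of that callback, assuming the signal hands you the
ghostpad and the priority (do check the exact signature in the
gnloperation documentation):

    static void
    on_input_priority_changed (GstElement * operation, GstPad * ghostpad,
        guint priority, gpointer user_data)
    {
      /* the ghostpad's target is the videomixer sink pad inside the
       * gnloperation */
      GstPad *mixpad = gst_ghost_pad_get_target (GST_GHOST_PAD (ghostpad));

      if (mixpad == NULL)
        return;

      if (priority >= 1 && priority <= 6) {
        gint idx = priority - 1;

        /* 240x240 clips laid out in a 3x2 grid */
        g_object_set (mixpad,
            "xpos", (idx % 3) * 240,   /* column: 0, 240 or 480 */
            "ypos", (idx / 3) * 240,   /* row: 0 or 240 */
            NULL);
      }
      gst_object_unref (mixpad);
    }

    /* hook it up on the gnloperation created earlier */
    g_signal_connect (oper, "input-priority-changed",
        G_CALLBACK (on_input_priority_changed), NULL);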

  Set 'video/x-raw-yuv;video/x-raw-rgb' as the caps property on all your
sources.

  Set duration and media-duration of *all* gnlobjects to the same
duration.
  If you want to add another segment of 3x2 clips, you'll need to re-add
all those 7 objects with a modified 'start' property.
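
  For the first 30-second segment that could look roughly like the
following (times in nanoseconds via GST_SECOND; "src" and "oper" are the
objects from the sketch above, and for the next segment you would add a
fresh set of objects with "start" moved to 30 * GST_SECOND, and so on):

    GstCaps *caps = gst_caps_from_string ("video/x-raw-yuv;video/x-raw-rgb");

    /* on each of the six gnlfilesources */
    g_object_set (src,
        "caps", caps,
        "start", (guint64) 0,
        "duration", 30 * GST_SECOND,
        "media-start", (guint64) 0,
        "media-duration", 30 * GST_SECOND,
        NULL);

    /* the gnloperation has to cover the same window */
    g_object_set (oper,
        "start", (guint64) 0,
        "duration", 30 * GST_SECOND,
        "media-duration", 30 * GST_SECOND,
        NULL);

    gst_caps_unref (caps);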

  First connect your composition to an imagesink to make sure the result
is what you want. When it is, insert a videoscale element followed by a
capsfilter with your target resolution.
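
  Something along these lines for the scaling branch (note that a 3x2
grid of 240x240 clips is already 720x480, so here the capsfilter mostly
pins the format; put whatever target resolution you need in the caps
string):

    GstElement *csp    = gst_element_factory_make ("ffmpegcolorspace", NULL);
    GstElement *scale  = gst_element_factory_make ("videoscale", NULL);
    GstElement *filter = gst_element_factory_make ("capsfilter", NULL);
    GstCaps *outcaps =
        gst_caps_from_string ("video/x-raw-yuv,width=720,height=480");

    g_object_set (filter, "caps", outcaps, NULL);
    gst_caps_unref (outcaps);

    /* composition (pad-added) -> csp -> scale -> filter -> encoder/sink */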

  Hope this helps.

> 
>   Thanks again for the time.  I know there are a lot of questions above,
> but any help of any kind is greatly appreciated.
> 
>   All the best,
>   Tim
> 
> 
> On Fri, Nov 26, 2010 at 1:04 AM, Kapil Agrawal <kapil.agl at gmail.com>
> wrote:
>         Just a quick clue that might help: try using multifilesrc?
>         
>         
>         On Thu, Nov 25, 2010 at 9:47 PM, Timothy Braun
>         <braunsquared at gmail.com> wrote:
>         
>                 
>                 Hello Everyone,
>                   I'm fairly new to GStreamer, so any input you can
>                 provide is much appreciated.  I'm working on a project
>                 where we need to generate a 2 minute video which is a
>                 composite of a total of 24 input videos.  The output
>                 video will have 4 different 30 second sections, each
>                 containing a 3x2 grid of the smaller input videos.
>                 The input videos are all naturally at 240x240 with the
>                 goal of having a final output frame size of 720x480.
>                 
>                   Using gst-launch, I've been able to construct a
>                 sample 30 second clip using a combination of inputs,
>                 videoboxes and a videomixer.  Here is what I've come
>                 up with so far:
>                 
>                 videomixer name=mix ! ffmpegcolorspace !
>                 ffenc_mpeg1video ! ffmux_mpeg name=mux ! queue !
>                 filesink location=output.mpg
>                 adder name=adder ! audioconvert ! ffenc_mp2 ! mux.
>                 filesrc location=loop1.mp4 ! decodebin name=decode1
>                 decode1. ! videobox border-alpha=0 top=-240 left=0 !
>                 queue ! mix.
>                 decode1. ! adder.
>                 filesrc location=loop2.mp4 ! decodebin name=decode2
>                 decode2. ! videobox border-alpha=0 top=-240
>                 left=-240 ! queue ! mix.
>                 decode2. ! adder.
>                 filesrc location=loop3.mp4 ! decodebin name=decode3
>                 decode3. ! videobox border-alpha=0 top=-240
>                 left=-480 ! queue ! mix.
>                 decode3. ! adder.
>                 filesrc location=loop4.mp4 ! decodebin name=decode4
>                 decode4. ! videobox border-alpha=0 top=0 left=0 !
>                 queue ! mix.
>                 decode4. ! adder.
>                 filesrc location=loop5.mp4 ! decodebin name=decode5
>                 decode5. ! videobox border-alpha=0 top=0 left=-240 !
>                 queue ! mix.
>                 decode5. ! adder.
>                 filesrc location=loop6.mp4 ! decodebin name=decode6
>                 decode6. ! videobox border-alpha=0 top=0 left=-480 !
>                 queue ! mix.
>                 decode6. ! adder.
>                 
>                   Now I need to do this 4 times, each time with a
>                 potentially different video in each box.  I've started
>                 looking into C interfaces, as there are other pieces of
>                 the puzzle which need to be tied into this, and I am
>                 trying to determine the best way to tackle this.  I
>                 was originally looking at Gnonlin, but the
>                 documentation is lacking with regard to how
>                 gnloperations work.  I also recently stumbled upon the
>                 GES library by Edward Hervey; this looks promising as
>                 well, but I haven't been able to spend much time on
>                 it.
>                 
>                   If I go the Gnonlin route, I believe I would need 6
>                 compositions, one for each box.  At the 30 second
>                 marker, I would swap the filesource to a new one using
>                 dynamic pads and listening for messages on the
>                 pipeline bus.  Am I far off on this?  Any suggestions?
>                 
>                   As for the GES library, it looks very promising and
>                 powerful from the little I read on it.  Would this be
>                 the smarter route to take?  If so, does anyone have
>                 any suggestions for how the pipeline would be
>                 structured?
>                 
>                   Thank you in advance for your time on this and I
>                 truly appreciate any information you are willing to
>                 share with me.
>                 
>                   Happy Thanksgiving,
>                   Tim
>                 
>                 
>                 
>         
>         
>         
>         -- 
>         www.mediamagictechnologies.com (Gstreamer, ffmpeg, Red5,
>         Streaming)
>         twitter handle: @gst_kaps
>         http://www.linkedin.com/in/kapilagrawal
>         
>         
> 

More information about the gstreamer-devel mailing list