[gst-devel] Need help using gst-launch to display a PNG image, Part 3+
Donald Poole
donald.poole at swri.org
Tue Nov 16 17:45:19 CET 2010
Tim-Philipp Müller <t.i.m <at> zen.co.uk> writes:
>
> On Thu, 2010-08-12 at 15:56 -0700, Bill West wrote:
>
> > The command I'm using is:
> > gst-launch-0.10 filesrc location=overlay.png ! pngdec !
> > ffmpegcolorspace ! videoscale ! imagefreeze ! videomixer name=mix !
> > autovideosink videotestsrc ! video/x-raw-yuv, width=320, height=240 !
> > mix.
> >
>
> > Looking at the documentation, the only possible solution I could find
> > is an element called "alphacolor", but it doesn't work anywhere I place
> > it in the pipeline, so I'm stumped again.
>
> Try the 'alpha' element.
>
> Something like this works for me:
>
> $ gst-launch-0.10 filesrc location=foo.png ! pngdec ! alpha alpha=0.5 !
> videoscale ! imagefreeze ! videomixer name=mix ! ffmpegcolorspace !
> autovideosink videotestsrc ! alpha alpha=0.5 ! mix.
>
> or if the png already has an alpha channel:
>
> gst-launch-0.10 filesrc location=alpha.png ! pngdec ! videoscale !
> imagefreeze ! videomixer name=mix ! ffmpegcolorspace ! autovideosink
> videotestsrc ! alpha alpha=0.5 ! video/x-raw-rgb,bpp=32,depth=32 ! mix.
>
> >
> > 2. we have ALSA audio, and I see there's an element called
> > "alsasink", though I couldn't find much documentation on it. Is there
> > a way with alsasink to specify the output channel? (our card has 8,
> > so 4 stereo pairs).
>
> No. You can specify an output device though using the "device" property.
> And you can probably do some .asoundrc magic to route stuff to the right
> outputs (ie. expose the 4 stereo pairs as 4 stereo devices or 8 mono
> devices or whatever).
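>
> A rough, untested .asoundrc sketch of that idea (assuming the 8-channel
> card is card 0; the device name "pair1" and the channel numbers are made
> up for illustration):
>
>     # expose hardware channels 2/3 of 8-channel card 0
>     # as a stereo device called "pair1"
>     pcm.pair1 {
>         type plug
>         slave {
>             pcm "hw:0"
>             channels 8
>         }
>         ttable.0.2 1   # app left  -> hw channel 2
>         ttable.1.3 1   # app right -> hw channel 3
>     }
>
> which could then be used as e.g. alsasink device=pair1.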
>
> >
> > 3. Is there a way to have the other PNG images fade in and fade out
> > of the window to/from black?
>
> Yes, but for this it would be best to write some code (python script or
> C or whatever).
>
> The "alpha" property of the alpha element is controllable (see
> GstController interface), and videomixer's pads also have properties
> like "xpos", "ypos", "zorder" and "alpha", which are also controllable
> and can be changed at runtime.
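>
> For example, a minimal untested C sketch of changing mixer pad
> properties at runtime (0.10 API; it assumes the mixer was given name=mix
> and that its first sink pad, "sink_0", already exists):
>
>     /* fetch the mixer's first sink pad and reposition/fade that stream */
>     GstElement *mix = gst_bin_get_by_name (GST_BIN (pipeline), "mix");
>     GstPad *pad = gst_element_get_static_pad (mix, "sink_0");
>
>     g_object_set (pad, "xpos", 100, "ypos", 50, "alpha", 0.5, NULL);
>
>     gst_object_unref (pad);
>     gst_object_unref (mix);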
>
> > 4. Is there a way to loop between two particular frames on a video?
> >
> > 5. Is videobox the best way to position my video behind the overlay
> > PNG? I got it to work kind of, but there were artifacts, like
> > everything to the left and above the video is black.
>
> videobox? No, just use videomixer and set xpos/ypos/zorder/alpha via the
> pad properties.
>
> >
> > 6. I thought someone must have written a translator that takes a
> > gst-launch command line and converts it to the equivalent C code, but
> > couldn't find one. Did I just not look in the right place?
>
> There isn't one, and it's probably not a good idea either (RIP Glade
> code generator).
>
> You can use gst_parse_launch() from C code though (give elements a name
> with name=foo and then use gst_bin_get_by_name() on the pipeline to
> retrieve the elements by name to manipulate them).
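>
> A minimal, untested sketch of that approach (the pipeline string and the
> element name "a" are just examples):
>
>     #include <gst/gst.h>
>
>     int
>     main (int argc, char **argv)
>     {
>       GstElement *pipeline, *a;
>       GError *err = NULL;
>
>       gst_init (&argc, &argv);
>       pipeline = gst_parse_launch ("videotestsrc ! alpha name=a alpha=0.5 ! "
>           "ffmpegcolorspace ! autovideosink", &err);
>       if (pipeline == NULL) {
>         g_printerr ("Parse error: %s\n", err->message);
>         return 1;
>       }
>
>       /* retrieve the named element and tweak it at runtime */
>       a = gst_bin_get_by_name (GST_BIN (pipeline), "a");
>       g_object_set (a, "alpha", 0.8, NULL);
>       gst_object_unref (a);
>
>       gst_element_set_state (pipeline, GST_STATE_PLAYING);
>       g_main_loop_run (g_main_loop_new (NULL, FALSE));
>       return 0;
>     }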
>
> Cheers
> -Tim
>
Hello All,
I have been reading this thread for the last couple of days because I am trying
to do the same thing Bill is doing. But instead of overlaying the PNG image on
the videotestsrc, I'm trying to overlay it on a live video stream from a
network video encoder (an AXIS M7001, to be specific). It's an MJPEG HTTP
stream that I read with souphttpsrc. I first used this pipeline to make sure I
could get something working:
$ gst-launch filesrc location=images/cross_hair.png ! pngdec ! videoscale !
imagefreeze ! ffmpegcolorspace ! videomixer name=mix ! ffmpegcolorspace !
xvimagesink videotestsrc ! alpha alpha=1 ! mix.
I had to add another ffmpegcolorspace to get it to work, but it displays the
PNG image overlaid on the videotestsrc with alpha blending. Next, I tried
replacing the videotestsrc with souphttpsrc so that I could get the video stream
to show underneath the PNG image. So I tried this pipeline, but received the
following errors:
$ gst-launch filesrc location=images/cross_hair.png ! pngdec ! videoscale !
imagefreeze ! ffmpegcolorspace ! videomixer name=mix ! ffmpegcolorspace !
xvimagesink souphttpsrc location=http://10.200.30.30/axis-cgi/mjpg/video.cgi !
alpha alpha=1 ! mix.
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstAlpha:alpha0: not negotiated
Additional debug info:
gstbasetransform.c(2073): gst_base_transform_handle_buffer ():
/GstPipeline:pipeline0/GstAlpha:alpha0:
not negotiated
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
I'm fairly new to GStreamer, so these error messages are still rather arcane to
me. Is there something I'm not understanding about getting the PNG to overlay
on my live video the way it does on the videotestsrc? I thank everyone in
advance for any information and advice.
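One guess I still need to verify: since the camera sends a multipart MJPEG
stream, perhaps it has to be demuxed and decoded into raw video before an
element like alpha can negotiate with it, e.g. something like:
$ gst-launch filesrc location=images/cross_hair.png ! pngdec ! videoscale !
imagefreeze ! ffmpegcolorspace ! videomixer name=mix ! ffmpegcolorspace !
xvimagesink souphttpsrc location=http://10.200.30.30/axis-cgi/mjpg/video.cgi !
multipartdemux ! jpegdec ! ffmpegcolorspace ! alpha alpha=1 ! mix.
but I have not confirmed whether that is the right fix.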
Thanks,
Donald