From katcipis at inf.ufsc.br Wed Sep 1 00:08:11 2010 From: katcipis at inf.ufsc.br (Tiago Katcipis) Date: Tue, 31 Aug 2010 19:08:11 -0300 Subject: [gst-devel] trouble with tee element In-Reply-To: References: <1283289313.31364.0.camel@zingle> Message-ID: http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-tee.html best regards, Katcipis On Tue, Aug 31, 2010 at 6:34 PM, Bert Douglas wrote: > Thanks much. > > Is this in docs anywhere? > > > On Tue, Aug 31, 2010 at 4:15 PM, Tim-Philipp M?ller wrote: > >> On Tue, 2010-08-31 at 15:37 -0500, Bert Douglas wrote: >> >> > Forgot to put in the script. Here it is: >> > >> > gst-launch \ >> > videomixer name="mix" \ >> > ! ffmpegcolorspace \ >> > ! xvimagesink name=sink force-aspect-ratio=1 \ >> > \ >> > videotestsrc name=src1 pattern="smpte" \ >> > ! video/x-raw-rgb, bpp=32, framerate=10/1, width=400, height=300 \ >> > ! tee name=t1 \ >> > ! mix.sink_1 \ >> >> You need a queue here for each branch after a tee. >> >> > videotestsrc name=src2 pattern="checkers-8" \ >> > ! video/x-raw-rgb, bpp=32, framerate=10/1, width=400, height=300 \ >> > ! tee name=t2 \ >> > ! mix.sink_2 \ >> >> (so here too) >> >> > \ >> > t1. ! fakesink \ >> >> (and here) >> >> > t2. ! fakesink \ >> >> (and here) >> >> Cheers >> -Tim >> >> >> >> ------------------------------------------------------------------------------ >> This SF.net Dev2Dev email is sponsored by: >> >> Show off your parallel programming skills. >> Enter the Intel(R) Threading Challenge 2010. >> http://p.sf.net/sfu/intel-thread-sfd >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. 
> Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- http://www.getgnulinux.org/windows -------------- next part -------------- An HTML attachment was scrubbed... URL: From hoyt6 at llnl.gov Wed Sep 1 01:47:33 2010 From: hoyt6 at llnl.gov (Hoyt, David) Date: Tue, 31 Aug 2010 16:47:33 -0700 Subject: [gst-devel] Windows GUI with Gstreamer In-Reply-To: <1283262471921-2401605.post@n4.nabble.com> References: <1283262471921-2401605.post@n4.nabble.com> Message-ID: It's not incredibly simple, but here you go: http://code.google.com/p/ossbuild/source/browse/trunk/Main/GStreamer/Windows/Build/Tests/xoverlay-start-stop.c It's designed to test multiple xoverlay windows with pipelines starting and stopping a hundred times/second. But it's very minimalistic in the dialog it creates for hosting the xoverlay windows. It's entirely built by VC++. -----Original Message----- From: Wes Miller [mailto:wmiller at sdr.com] Sent: Tuesday, August 31, 2010 6:48 AM To: gstreamer-devel at lists.sourceforge.net Subject: [gst-devel] Windows GUI with Gstreamer Can someone point me to simple examples of using a gst pipeline that send its output to a Windows video widget embedded in some application's GUI? I know vlc has a windows version, but i was hoping for simpler examples. Something that'll make VC++ happy. Wes -- View this message in context: http://*gstreamer-devel.966125.n4.nabble.com/Windows-GUI-with-Gstreamer-tp2401605p2401605.html Sent from the GStreamer-devel mailing list archive at Nabble.com. ------------------------------------------------------------------------------ This SF.net Dev2Dev email is sponsored by: Show off your parallel programming skills. Enter the Intel(R) Threading Challenge 2010. 
http://*p.sf.net/sfu/intel-thread-sfd _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://*lists.sourceforge.net/lists/listinfo/gstreamer-devel From gurpreet at tataelxsi.co.in Wed Sep 1 06:30:39 2010 From: gurpreet at tataelxsi.co.in (Gurpreet) Date: Wed, 01 Sep 2010 10:00:39 +0530 Subject: [gst-devel] Problem in ffmux_mov Message-ID: <4C7DD6EF.5020701@tataelxsi.co.in> Hi All.. I m encoding pcm file with faac and then muxing it with ffmux_mov. I am using this pipeline. file is being played in VLC and QT player. gst-launch filesrc blocksize=2048 location=/home/Gurpreet/inputfilesmp4/Sample_Files/New_Folder/CH1_48000_mono_16bit.pcm ! audio/x-raw-int, channels=1, rate=48000, width=16, depth=16, endianness=1234, signed=true ! faac ! aacparse ! audio/mpeg,rate=48000, channels=1, mpegversion=4, layer=3 ! ffmux_mov ! filesink location=/home/Gurpreet/outputfilesmp4/exp_mov.mov But Whenever i m setting faac property "outputformat=1" ( for ADTS Header ) then gst-launch filesrc blocksize=2048 location=/home/Gurpreet/inputfilesmp4/Sample_Files/New_Folder/CH1_48000_mono_16bit.pcm ! audio/x-raw-int, channels=1, rate=48000, width=16, depth=16, endianness=1234, signed=true ! faac outputformat=1 ! aacparse ! audio/mpeg,rate=48000, channels=1, mpegversion=4, layer=3 ! ffmux_mov ! filesink location=/home/Gurpreet/outputfilesmp4/exp_mov.mov then the generated file is being played in vlc but in QT bar is moving but no audio is coming. what could be the error ? outputformat=0 is for RAW Aac outputformat=1 is for ADTS is this bug in ffmux_mov ? what could be the reason ? Thanks Gurpreet -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From acandido at hi-iberia.es Wed Sep 1 09:31:47 2010 From: acandido at hi-iberia.es (Andres Gonzalez) Date: Wed, 01 Sep 2010 09:31:47 +0200 Subject: [gst-devel] Windows GUI with Gstreamer In-Reply-To: <1283262471921-2401605.post@n4.nabble.com> References: <1283262471921-2401605.post@n4.nabble.com> Message-ID: <4C7E0163.8010600@hi-iberia.es> On 31/08/10 15:47, Wes Miller wrote: > Can someone point me to simple examples of using a gst pipeline that send its > output to a Windows video widget embedded in some application's GUI? I know > vlc has a windows version, but i was hoping for simpler examples. Something > that'll make VC++ happy. > > Wes Just for completeness, there is an example of sending output to a QT window in the gstxoverlay documentation: http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-libs/html/gst-plugins-base-libs-gstxoverlay.html#id644266 But you'll need QT Creator instead of VC++. From ensonic at hora-obscura.de Wed Sep 1 11:16:31 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Wed, 01 Sep 2010 12:16:31 +0300 Subject: [gst-devel] Debugging pipeline locks In-Reply-To: References: Message-ID: <4C7E19EF.5090504@hora-obscura.de> On 26.08.2010 10:42, Antoni Silvestre Padrós wrote: > Hi, I am still trying to figure out how to receive different payloads > using gstrtpbin, any idea regarding that would be really helpful. > > On the other hand, to be more precise in the question, I don't know if > anyone has a method to debug why a pipeline is locked, I mean when it > is in the playing state but for some reason it's stopped waiting on a > pad to receive data. It would be really useful to be able to debug > which pad is causing this state and have a way to fix it. I have been using GST_DEBUG_BIN_TO_DOT_FILE and in the image look for e.g. unlinked or flushing pads. Also checking the GST_DEBUG log at level :2 (warnings) is a good first check. 
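The GST_DEBUG_BIN_TO_DOT_FILE approach mentioned above only produces output if the dump directory is in the environment before GStreamer initializes. A minimal pygst 0.10 sketch, not a definitive recipe: the binding names gst.DEBUG_BIN_TO_DOT_FILE and gst.DEBUG_GRAPH_SHOW_ALL are assumed to be available in your pygst version, and the /tmp path is arbitrary.

```python
import os

# GST_DEBUG_DUMP_DOT_DIR must be set before GStreamer is initialized,
# otherwise the dump calls silently write nothing.
os.environ.setdefault("GST_DEBUG_DUMP_DOT_DIR", "/tmp")


def dump_pipeline(pipeline, name="locked-pipeline"):
    """Write <GST_DEBUG_DUMP_DOT_DIR>/<name>.dot describing the pipeline."""
    # Imported late so the environment variable above is already in place.
    # gst.DEBUG_BIN_TO_DOT_FILE / gst.DEBUG_GRAPH_SHOW_ALL mirror the C
    # macros (assumption: a pygst 0.10 release recent enough to expose them).
    import gst
    gst.DEBUG_BIN_TO_DOT_FILE(pipeline, gst.DEBUG_GRAPH_SHOW_ALL, name)
```

Render the result with e.g. `dot -Tpng /tmp/locked-pipeline.dot > pipeline.png` and look for unlinked or flushing pads as suggested above.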
If the pipeline is not too complex you can break in gdb and check what all the threads are doing. This is difficult as it is hard to spot a lock that has no chance of being released automatically soon. > > Also, is there a way to have a pipeline like this (i've just made this > pipeline up in order to show the concept but I haven't tested it): > > avimux name=mux > pulsesrc ! lame ! queue ! mux. > v4l2src ! queue ! x264 ! mux. > mux. ! queue ! filesink location=video.avi > > That when it does not receive both the audio and the video at the same > time but only one of them the pipeline doesn't get locked? One more queue after x264 maybe. Stefan > > Thanks, > > Antoni Silvestre > > > ------------------------------------------------------------------------------ > Sell apps to millions through the Intel(R) Atom(Tm) Developer Program > Be part of this innovative community and reach millions of netbook users > worldwide. Take advantage of special opportunities to increase revenue and > speed time-to-market. Join now, and jumpstart your future. > http://p.sf.net/sfu/intel-atom-d2d > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From ensonic at hora-obscura.de Wed Sep 1 11:18:47 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Wed, 01 Sep 2010 12:18:47 +0300 Subject: [gst-devel] TextOverlay in Video In-Reply-To: References: Message-ID: <4C7E1A77.4090401@hora-obscura.de> On 27.08.2010 10:10, LIJIN SYAM K wrote: > Hi , > > I tried the textoverlay plug-in and it's working fine. Is it > possible to make the text in the video scrolling...?? > > I tried the pipeline > > gst-launch -v videotestsrc ! textoverlay text="Room A" valign=top halign=left ! xvimagesink > > Regards > Lijin Not as it is now. One could make all the position properties controllable. Then you could animate them from the application. 
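Animating the overlay from the application, as suggested above, can be approximated with a GLib timeout that keeps re-setting an offset property. A hedged pygst 0.10 sketch: `deltax` is assumed to be textoverlay's horizontal pixel-offset property, and the width/speed/interval numbers are made up for illustration.

```python
def scroll_offset(tick, width=400, speed=4):
    # Pixel offset for a right-to-left crawl that wraps around, so the
    # text leaves on the left edge and re-enters from the right.
    return width - ((tick * speed) % (2 * width))


def attach_scroller(overlay, interval_ms=40):
    # Re-position the overlay ~25 times/second from the GLib main loop.
    # "deltax" is assumed to be textoverlay's horizontal offset property.
    import gobject
    state = {"tick": 0}

    def step():
        state["tick"] += 1
        overlay.set_property("deltax", scroll_offset(state["tick"]))
        return True  # returning True keeps the timeout installed

    gobject.timeout_add(interval_ms, step)
```

After building a pipeline such as videotestsrc ! textoverlay name=overlay text="Room A" ! xvimagesink, something like attach_scroller(pipeline.get_by_name("overlay")) would start the crawl, assuming a running GLib main loop.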
Stefan From ensonic at hora-obscura.de Wed Sep 1 11:20:11 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Wed, 01 Sep 2010 12:20:11 +0300 Subject: [gst-devel] Crash in v4l2src when device is busy In-Reply-To: References: Message-ID: <4C7E1ACB.7020709@hora-obscura.de> On 27.08.2010 10:24, 4ernov wrote: > Hello, > > I've faced some strange behavior of v4l2src when capture device is > busy and I try to set it to READY state. The program crashes in some > of GStreamer routines with the following error message: > > ERROR! Error: Device '/dev/video0' cannot capture at 640x480 > DEBUG Error: gstv4l2object.c(1920): gst_v4l2_object_set_format (): > /GstPipeline:camera/GstV4l2Src:v4l2src0: > Call to S_FMT failed for YUYV @ 640x480: Device or resource busy > please file a bug with the backtrace and standalone test app attached. thanks Stefan > Backtrace is like this: > > #1 0x02581e1e in gst_v4l2src_create (src=0x8353000, buf=0xb50fd1cc) > at gstv4l2src.c:914 > #2 0x025c3765 in gst_push_src_create (bsrc=0x8353000, > offset=18446744073709551615, length=4096, ret=0xb50fd1cc) at > gstpushsrc.c:117 > #3 0x025b0881 in gst_base_src_get_range (src=, > offset=18446744073709551615, length=4096, buf=0xb50fd1cc) at > gstbasesrc.c:2081 > #4 0x025b3327 in gst_base_src_loop (pad=0x8354010) at gstbasesrc.c:2334 > #5 0x0024ed6b in gst_task_func (task=0x83bd0c0) at gsttask.c:238 > #6 0x00250377 in default_func (tdata=0x809b7c0, pool=0x8076810) at > gsttaskpool.c:70 > > My code is just plain as in tutorial I think: > > GstElement *v4l2src, *deinterlace, *videoscale, *tee, > *ffmpegcolorspace, *queue, *xvideoscale, *ximagesink; > > GstBus* bus; > > GstCaps* caps = gst_caps_new_simple ("video/x-raw-yuv", "width", > G_TYPE_INT, 400, > "height", > G_TYPE_INT, 300, NULL); > > /* Create gstreamer elements */ > > _pipeline = gst_pipeline_new("camera"); > > v4l2src = gst_element_factory_make("v4l2src", NULL); > > deinterlace = gst_element_factory_make("deinterlace", NULL); > > videoscale = 
gst_element_factory_make("videoscale", NULL); > > tee = gst_element_factory_make("tee", NULL); > > queue = gst_element_factory_make("queue", NULL); > > xvideoscale = gst_element_factory_make("videoscale", NULL); > > ffmpegcolorspace = gst_element_factory_make("ffmpegcolorspace", NULL); > > ximagesink = gst_element_factory_make("ximagesink", NULL); > > if (!_pipeline || !v4l2src || !videoscale || !tee || !queue || > !ffmpegcolorspace || !xvideoscale || !ximagesink) > { > qDebug()<<"Elements could not be created. Exiting."; > } > > gst_bin_add_many (GST_BIN (_pipeline), v4l2src, deinterlace, > videoscale, tee, queue, ffmpegcolorspace, xvideoscale, ximagesink, > NULL); > > gst_element_link_many(v4l2src, deinterlace, videoscale, NULL); > > gst_element_link_filtered(videoscale, tee, caps); > > gst_caps_unref(caps); > > > gst_element_link_many(tee, queue, ffmpegcolorspace, xvideoscale, > ximagesink, NULL); > > > g_object_set(G_OBJECT (ximagesink), "force-aspect-ratio", true, NULL); > > g_object_set(G_OBJECT (deinterlace), "mode", 0, NULL); > > if (_winid) > { > gst_x_overlay_set_xwindow_id ((GstXOverlay*)ximagesink, _winid); > } > > bus = gst_pipeline_get_bus (GST_PIPELINE (_pipeline)); > gst_bus_add_watch (bus, bus_call, this); > gst_object_unref (bus); > > /* Set the pipeline to "playing" state*/ > // gst_element_set_state (_pipeline, GST_STATE_READY); > > /* Iterate */ > g_print ("Running...\n"); > > g_object_set(G_OBJECT (_v4l2src), "device", str.c_str(), NULL); > > gst_element_set_state (_pipeline, GST_STATE_READY); > > Is it a bug and so I should post it or perhaps I did something unsafe? > > ------------------------------------------------------------------------------ > Sell apps to millions through the Intel(R) Atom(Tm) Developer Program > Be part of this innovative community and reach millions of netbook users > worldwide. Take advantage of special opportunities to increase revenue and > speed time-to-market. Join now, and jumpstart your future. 
> http://p.sf.net/sfu/intel-atom-d2d > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From ensonic at hora-obscura.de Wed Sep 1 11:24:44 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Wed, 01 Sep 2010 12:24:44 +0300 Subject: [gst-devel] queue or queue2 In-Reply-To: References: Message-ID: <4C7E1BDC.5070409@hora-obscura.de> On 28.08.2010 16:03, Bert Douglas wrote: > Hi All, > > Sorry to keep bothering you with questions. > > Does "queue" work ok? Is it deprecated, in favor of "queue2"? Both work okay. queue is in memory buffering and queue2 can do disk buffering. queue can work with live-sources, queue2 can't (right now). If you just need to decouple elements into different threads use queue. Stefan > > Thanks, > Bert Douglas > > > > ------------------------------------------------------------------------------ > Sell apps to millions through the Intel(R) Atom(Tm) Developer Program > Be part of this innovative community and reach millions of netbook users > worldwide. Take advantage of special opportunities to increase revenue and > speed time-to-market. Join now, and jumpstart your future. > http://p.sf.net/sfu/intel-atom-d2d > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From ensonic at hora-obscura.de Wed Sep 1 11:29:43 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Wed, 01 Sep 2010 12:29:43 +0300 Subject: [gst-devel] plugins for editing In-Reply-To: <1283152190.2667.22.camel@deumeu> References: <1283152190.2667.22.camel@deumeu> Message-ID: <4C7E1D07.3030803@hora-obscura.de> On 30.08.2010 10:09, Edward Hervey wrote: > Hi, > > On Sun, 2010-08-29 at 22:47 -0500, Vasilis Liaskovitis wrote: > >> Hi, >> >> has anyone worked on the plugins for editing project? 
>> http://gstreamer.freedesktop.org/wiki/PluginsForEditing >> >> If there is still work to be done, >> > Definitely :) Nobody tackled that task for this summer of code. > > >> are there any existing open source >> filters/apps that people would like to use as gstreamer elements? >> > You just made me realize I never got round to putting down a list of > the existing available FOSS projects that could be wrapped :( > > At the top of my head: > * audio noise removal: audacity has a quite decent noise > estimation/removal filter. > Also Gnome Wave Cleaner comes to my mind here: http://gwc.sourceforge.net/ Stefan > * video filters: > ** you should have a look at the avisynth filters (most of them are > open-source but not free software though, and exist only for windows, > but should be easy to wrap as gstreamer plugins) > ** you could also leverage the existing open-cv gstplugins for the > motion estimation > > * color correction: > ** Thibault Saunier (thiblahute, SoC student on PiTiVi) was interested > on working on this, you could maybe contact him. Having these elements > in GStreamer is the top-ranked requirement for PiTiVi in regards to > effects :) > ** The detection part is more the ability of being able to represent > the colors in various spaces (HSL, CMYKRGB, ...) by either emitting it > as messages on the bus or as a separate vectorscope/waveform video > stream. > ** The correction part is (mostly) the ability to change the colors > using HSL (Hue/Saturation/Luminance) in different configurable ranges > (Overall, Shadows, Midtones, Light). > > * once a pair of those plugins are available (ex: audio noise > detection/removal) one could also add a bin element that, by re-using > them and working over a number of previous/future frames, applies some > smart correction. > > Don't hesitate to ask for help, > > Edward > > >> I 'd >> be willing to help out with development / testing. 
>> >> thanks, >> >> -Vasilis >> > > > ------------------------------------------------------------------------------ > Sell apps to millions through the Intel(R) Atom(Tm) Developer Program > Be part of this innovative community and reach millions of netbook users > worldwide. Take advantage of special opportunities to increase revenue and > speed time-to-market. Join now, and jumpstart your future. > http://p.sf.net/sfu/intel-atom-d2d > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From julien.isorce at gmail.com Wed Sep 1 12:34:05 2010 From: julien.isorce at gmail.com (Julien Isorce) Date: Wed, 1 Sep 2010 12:34:05 +0200 Subject: [gst-devel] GstBaseSink - getcaps Message-ID: Hi, If my X11 settings are 32 bpp then an annoying thing is that running "gst-launch-0.10 videotestsrc ! "video/x-raw-rgb, bpp=16, depth=16" ! ximagesink" gives me an error from videotestsrc but I think the error should come from ximagesink. In ximagesink::gst_ximagesink_getcaps, the xcontext->caps is setup to bpp=32 and depth=24, so I think at this point it should check that this is not compatible with the required caps from my capfilter. The error is: "videotestsrc0 : Could not negotiate format" and I think it should be: "ximagesink0 : Could not negotiate format" Any comment ? Sincerely Julien -------------- next part -------------- An HTML attachment was scrubbed... URL: From bisht.sudarshan at gmail.com Wed Sep 1 12:53:15 2010 From: bisht.sudarshan at gmail.com (sudarshan bisht) Date: Wed, 1 Sep 2010 13:53:15 +0300 Subject: [gst-devel] GstBaseSink - getcaps In-Reply-To: References: Message-ID: Use ffmpegcolorspace between caps filter and ximagesink. On Wed, Sep 1, 2010 at 1:34 PM, Julien Isorce wrote: > Hi, > > If my X11 settings are 32 bpp then an annoying thing is that running > "gst-launch-0.10 videotestsrc ! "video/x-raw-rgb, bpp=16, depth=16" ! 
> ximagesink" gives me an error from videotestsrc but I think the error should > come from ximagesink. > > In ximagesink::gst_ximagesink_getcaps, the xcontext->caps is setup to > bpp=32 and depth=24, so I think at this point it should check that this is > not compatible with the required caps from my capfilter. > > The error is: > "videotestsrc0 : Could not negotiate format" > and I think it should be: > "ximagesink0 : Could not negotiate format" > > Any comment ? > > Sincerely > Julien > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Regards, Sudarshan Bisht -------------- next part -------------- An HTML attachment was scrubbed... URL: From t.i.m at zen.co.uk Wed Sep 1 12:54:02 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Wed, 01 Sep 2010 11:54:02 +0100 Subject: [gst-devel] GstBaseSink - getcaps In-Reply-To: References: Message-ID: <1283338442.6384.6.camel@zingle> On Wed, 2010-09-01 at 12:34 +0200, Julien Isorce wrote: > If my X11 settings are 32 bpp then an annoying thing is that running > "gst-launch-0.10 videotestsrc ! "video/x-raw-rgb, bpp=16, depth=16" ! > ximagesink" gives me an error from videotestsrc but I think the error > should come from ximagesink. > > In ximagesink::gst_ximagesink_getcaps, the xcontext->caps is setup to > bpp=32 and depth=24, so I think at this point it should check that > this is not compatible with the required caps from my capfilter. > > The error is: > "videotestsrc0 : Could not negotiate format" > and I think it should be: > "ximagesink0 : Could not negotiate format" > > Any comment ? Well, yes. 
We should find a way to report errors like this better, no doubt. Currently there's no way to communicate error state to the upstream element that drives the pipeline and eventually errors out though, it just knows the flow return and that's that. I think there's a bug about this somewhere in bugzilla, but can't find it right now. Feel free to file a new one. Cheers -Tim From bertd at tplogic.com Wed Sep 1 13:41:32 2010 From: bertd at tplogic.com (Bert Douglas) Date: Wed, 1 Sep 2010 06:41:32 -0500 Subject: [gst-devel] queue or queue2 In-Reply-To: <4C7E1BDC.5070409@hora-obscura.de> References: <4C7E1BDC.5070409@hora-obscura.de> Message-ID: Understood. Thanks much, Bert Douglas On Wed, Sep 1, 2010 at 4:24 AM, Stefan Kost wrote: > On 28.08.2010 16:03, Bert Douglas wrote: > > Hi All, > > > > Sorry to keep bothering you with questions. > > > > Does "queue" work ok? Is it deprecated, in favor of "queue2"? > > Both work okay. queue is in memory buffering and queue2 can do disk > buffering. queue can work with live-sources, queue2 can't (right now). > If you just need to decouple elements into different threads use queue. > > Stefan > > > > Thanks, > > Bert Douglas > > > > > > > > > ------------------------------------------------------------------------------ > > Sell apps to millions through the Intel(R) Atom(Tm) Developer Program > > Be part of this innovative community and reach millions of netbook users > > worldwide. Take advantage of special opportunities to increase revenue > and > > speed time-to-market. Join now, and jumpstart your future. > > http://p.sf.net/sfu/intel-atom-d2d > > > > > > _______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.sourceforge.net > > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. 
> Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gregory.petrosyan at gmail.com Wed Sep 1 15:18:16 2010 From: gregory.petrosyan at gmail.com (Gregory Petrosyan) Date: Wed, 1 Sep 2010 17:18:16 +0400 Subject: [gst-devel] Capturing DV video over FireWire in OS X Message-ID: Hello! Is there any way to capture DV video over FireWire in OS X, since dv1394src is not available there? Maybe, anybody have used DVFamily.bundle (http://www.opensource.apple.com/source/IOFWDVComponents/IOFWDVComponents-199.4.1/DVFamily.h) for this already? ? ? ? ? ? ? ? ? Gregory From bertd at tplogic.com Wed Sep 1 16:33:03 2010 From: bertd at tplogic.com (Bert Douglas) Date: Wed, 1 Sep 2010 09:33:03 -0500 Subject: [gst-devel] how to load aravis plugin with python-gstreamer Message-ID: Hi All, I managed to use aravis plugin from shell script as shown below. How can I do same sort of thing from python? Thanks much, Bert Douglas ----------------------------------- LD_PRELOAD=/usr/lib/libaravis.so \ gst-launch --gst-plugin-load=/usr/lib/gstreamer-0.10/libgstaravis.so \ aravissrc camera-name="" \ ! ffmpegcolorspace \ ! autovideosink -------------- next part -------------- An HTML attachment was scrubbed... URL: From todd.fischer at ridgerun.com Thu Sep 2 03:15:27 2010 From: todd.fischer at ridgerun.com (Todd Fischer) Date: Wed, 01 Sep 2010 19:15:27 -0600 Subject: [gst-devel] Debugging a blocked GStreamer pipeline Message-ID: <1283390127.20100.3195.camel@sax-lx> Hi, We are seeing a behavior where we run a GStreamer application (doing audio / video decoding), that runs continuously for several days, then suddenly locks up in the middle of an A/V stream. 
Our best guess is there is a defect in the ALSA output driver. We believe this because if we exit the application and try aplay, it doesn't work. I am wondering if there is a debug GStreamer logger element in existence or if one is even possible or helpful. Such a logger element could be put anywhere in the pipeline. The logger would have circular buffers to keep track of all potentially interesting recent history, such as pad activity, bus activity, and any other relevant information. The circular buffer entries would all be timestamped. When some event occurs (a file exists, a message/signal is received, etc), the element would dump the history, and continue capturing new data. This idea is after the pipeline locks up, you could cause the history logger to dump it data, and then get an idea of what is suppose to be happening that isn't not occurring. Does such a logging element exist? If not, does it make any sense to develop? Todd -------------- next part -------------- An HTML attachment was scrubbed... URL: From daniel.saul at gmail.com Thu Sep 2 03:32:23 2010 From: daniel.saul at gmail.com (Dan Saul) Date: Wed, 1 Sep 2010 20:32:23 -0500 Subject: [gst-devel] Trouble changing subtitles while stream is running. Message-ID: Hi all, I've been developing a simple video player. I've got to the point where I need to enable track changing. I've been having some trouble with this. For the video and audio tracks, initially I wasn't able to change the tracks without the video pausing. I was able to solve this however with a call gst_element_seek_simple with the same time as current but with the flush option. Unfortunately this hasn't worked for the text track. When I set the current-text property the playback will freeze. Sometimes the program will dead-lock. If it doesn't dead lock and just freezes I can unfreeze it by seeking to a time far away from the current time. At this point the subtitles will have changed. Here is the code I am using to change the subtitle. 
I must be missing something. http://code.google.com/p/ude-movie-player/source/browse/trunk/PlayerWidget.py?spec=svn81&r=79#167 Thanks for taking a look, Dan From nico at inattendu.org Thu Sep 2 07:43:19 2010 From: nico at inattendu.org (Nicolas Bertrand) Date: Thu, 02 Sep 2010 09:43:19 +0400 Subject: [gst-devel] Debugging a blocked GStreamer pipeline In-Reply-To: <1283390127.20100.3195.camel@sax-lx> References: <1283390127.20100.3195.camel@sax-lx> Message-ID: <4C7F3977.7020403@inattendu.org> Hi, Maybe you can use the GST_DEBUG env variable. cf : http://www.gstreamer.net/data/doc/gstreamer/head/gstreamer/html/gst-running.html http://www.gstreamer.net/data/doc/gstreamer/head/manual/html/section-checklist-debug.html Nico Todd Fischer a ?crit : > Hi, > > We are seeing a behavior where we run a GStreamer application (doing > audio / video decoding), that runs continuously for several days, then > suddenly locks up in the middle of an A/V stream. Our best guess is > there is a defect in the ALSA output driver. We believe this because > if we exit the application and try aplay, it doesn't work. > > I am wondering if there is a debug GStreamer logger element in > existence or if one is even possible or helpful. Such a logger > element could be put anywhere in the pipeline. The logger would have > circular buffers to keep track of all potentially interesting recent > history, such as pad activity, bus activity, and any other relevant > information. The circular buffer entries would all be timestamped. > When some event occurs (a file exists, a message/signal is received, > etc), the element would dump the history, and continue capturing new data. > > This idea is after the pipeline locks up, you could cause the history > logger to dump it data, and then get an idea of what is suppose to be > happening that isn't not occurring. > > Does such a logging element exist? If not, does it make any sense to > develop? 
> > Todd > > > ------------------------------------------------------------------------ > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > ------------------------------------------------------------------------ > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From julien.isorce at gmail.com Thu Sep 2 09:53:06 2010 From: julien.isorce at gmail.com (Julien Isorce) Date: Thu, 2 Sep 2010 09:53:06 +0200 Subject: [gst-devel] GstBaseSink - getcaps In-Reply-To: <1283338442.6384.6.camel@zingle> References: <1283338442.6384.6.camel@zingle> Message-ID: 2010/9/1 Tim-Philipp M?ller > On Wed, 2010-09-01 at 12:34 +0200, Julien Isorce wrote: > > > If my X11 settings are 32 bpp then an annoying thing is that running > > "gst-launch-0.10 videotestsrc ! "video/x-raw-rgb, bpp=16, depth=16" ! > > ximagesink" gives me an error from videotestsrc but I think the error > > should come from ximagesink. > > > > In ximagesink::gst_ximagesink_getcaps, the xcontext->caps is setup to > > bpp=32 and depth=24, so I think at this point it should check that > > this is not compatible with the required caps from my capfilter. > > > > The error is: > > "videotestsrc0 : Could not negotiate format" > > and I think it should be: > > "ximagesink0 : Could not negotiate format" > > > > Any comment ? > > Well, yes. We should find a way to report errors like this better, no > doubt. Currently there's no way to communicate error state to the > upstream element that drives the pipeline and eventually errors out > though, it just knows the flow return and that's that. I think there's a > bug about this somewhere in bugzilla, but can't find it right now. 
Feel > free to file a new one. > > Cheers > -Tim > Hi, Could you suggest me a tittle for bugzilla ? So we can find it easier. And in core or base ? Julien -------------- next part -------------- An HTML attachment was scrubbed... URL: From gurpreet at tataelxsi.co.in Thu Sep 2 13:07:28 2010 From: gurpreet at tataelxsi.co.in (Gurpreet) Date: Thu, 02 Sep 2010 16:37:28 +0530 Subject: [gst-devel] Problem in qtmux Message-ID: <4C7F8570.50108@tataelxsi.co.in> Hi All.. I m encoding pcm file with faac and then muxing it with qtmux. I am using this pipeline. file is being played in VLC and QT player. gst-launch filesrc blocksize=2048 location=/home/Gurpreet/inputfilesmp4/Sample_Files/New_Folder/CH1_48000_mono_16bit.pcm ! audio/x-raw-int, channels=1, rate=48000, width=16, depth=16, endianness=1234, signed=true ! faac ! aacparse ! audio/mpeg,rate=48000, channels=1, mpegversion=4, layer=3 ! qtmux ! filesink location=/home/Gurpreet/outputfilesmp4/exp_mov.mov But Whenever i m setting faac property "outputformat=1" ( for ADTS Header ) then gst-launch filesrc blocksize=2048 location=/home/Gurpreet/inputfilesmp4/Sample_Files/New_Folder/CH1_48000_mono_16bit.pcm ! audio/x-raw-int, channels=1, rate=48000, width=16, depth=16, endianness=1234, signed=true ! faac outputformat=1 ! aacparse ! audio/mpeg,rate=48000, channels=1, mpegversion=4, layer=3 ! qtmux ! filesink location=/home/Gurpreet/outputfilesmp4/exp_mov.mov then the generated file is being played in vlc but in QT bar is moving but no audio is coming. what could be the error ? outputformat=0 is for RAW Aac outputformat=1 is for ADTS is this bug in qtmux ? what could be the reason ? instead of that if i m reading aac (RAW) file with filesrc and then passing it to qtmux then mov file is being played in vlc as well as in quicktime. but if i m reading aac (ADTS) file with filesrc then passing it to qtmux then mov file is playing fine in vlc player but in qt,,bar is moving but no audio is coming.. where is the problem ?? 
In qtmux, or does QuickTime not support ADTS files in a MOV container? Thanks Gurpreet -- / Gurpreet Singh Tata Elxsi Ltd. Bangalore +91-8022984127 +91-9019219511 / Rabba Mehar Kari Asi Ud-de Aasre Tere -------------- next part -------------- An HTML attachment was scrubbed... URL: From bertd at tplogic.com Thu Sep 2 13:22:04 2010 From: bertd at tplogic.com (Bert Douglas) Date: Thu, 2 Sep 2010 06:22:04 -0500 Subject: [gst-devel] GStreamer-CRITICAL assertion failure Message-ID: Greetings ! I am getting an assertion failure. I am writing in the hope that this is a well-known issue and you can point me in the right direction. The problem happens when I do a "switch" on an input-selector element and the aravissrc becomes the new active pad. Thanks much, Bert Douglas ------------------------------------------------------------------------------------------ (panocam5.py:10442): GStreamer-CRITICAL **: gst_segment_set_newsegment_full: assertion `segment->format == format' failed (panocam5.py:10442): GStreamer-CRITICAL **: gst_segment_set_newsegment_full: assertion `segment->format == format' failed (panocam5.py:10442): GStreamer-CRITICAL **: gst_segment_set_newsegment_full: assertion `segment->format == format' failed (panocam5.py:10442): GStreamer-CRITICAL **: gst_segment_set_newsegment_full: assertion `segment->format == format' failed Error: Internal GStreamer error: negotiation problem. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer. xvimagesink.c(2251): gst_xvimagesink_setcaps (): /GstPipeline:pipeline0/GstXvImageSink:xdisplay-pz: Error calculating the output display ratio of the video. Error: GStreamer encountered a general stream error. gstbasesrc.c(2562): gst_base_src_loop (): /GstPipeline:pipeline0/GstAravis:src1: streaming task paused, reason not-negotiated (-4) -------------- next part -------------- An HTML attachment was scrubbed...
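[A note on the "Error calculating the output display ratio of the video" line in the log above: xvimagesink derives that ratio from the width, height and pixel-aspect-ratio in the negotiated caps, and the message appears when one of them is missing or unusable after the switch. The arithmetic itself is just a product of fractions; the sketch below (plain Python, illustrative only — not the actual xvimagesink code) shows the computation the sink is trying to perform.]

```python
from fractions import Fraction

def display_aspect_ratio(width, height, par=Fraction(1, 1)):
    # Display aspect ratio = (width / height) * pixel-aspect-ratio.
    # If width, height or PAR is absent from the caps that reached the
    # sink, this computation is impossible -- which is roughly what the
    # "Error calculating the output display ratio" message reports.
    return Fraction(width, height) * par

# 720x576 PAL video with 16:15 pixels displays as 4:3
print(display_aspect_ratio(720, 576, Fraction(16, 15)))  # -> 4/3
```

[So a first thing to check is what caps are actually negotiated on the sink pad when aravissrc becomes the active input.]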
URL: From bertd at tplogic.com Thu Sep 2 14:32:22 2010 From: bertd at tplogic.com (Bert Douglas) Date: Thu, 2 Sep 2010 07:32:22 -0500 Subject: [gst-devel] input-selector Message-ID: Hi All, Is there an alternative to the input selector element ? Thanks, Bert Douglas -------------- next part -------------- An HTML attachment was scrubbed... URL: From bertd at tplogic.com Thu Sep 2 14:45:12 2010 From: bertd at tplogic.com (Bert Douglas) Date: Thu, 2 Sep 2010 07:45:12 -0500 Subject: [gst-devel] have written new "effect" for geometrictransform Message-ID: Hi All, I have written a new "effect" for geometrictransform. It does image rotation. The code is appended below. If there is any interest, I will do the work to push it upstream. But I need a mentor to help guide the way. On the code itself. I modified the "circle" element, to do rotation. The changes were confined to the "map" function. I used the same parameters, but with different meaning. Thanks, Bert Douglas ------------------------------------------------------------------------ static gboolean circle_map (GstGeometricTransform * gt, gint x, gint y, gdouble * in_x, gdouble * in_y) { //GstCircleGeometricTransform *cgt = GST_CIRCLE_GEOMETRIC_TRANSFORM_CAST (gt); GstCircle *circle = GST_CIRCLE_CAST (gt); gint w, h; gdouble pad, sa; gdouble cix, ciy, cox, coy; // centers, in/out x/y gdouble ai, ao, ar; // angles, in/out/rotate (radians) gdouble r; // radius gdouble xi, yi, xo, yo; // positions in/out x/y // input and output image height and width w = gt->width; h = gt->height; // our parameters pad = circle->height; // padding on bottom and right of input image ar = circle->angle * M_PI / 180.0; // angle of rotation, degrees to radians sa = circle->spread_angle; // not used // get in and out centers cox = 0.5 * w; coy = 0.5 * h; cix = 0.5 * (w - pad); ciy = 0.5 * (h - pad); // convert output image position to polar form xo = x - cox; yo = y - coy; ao = atan2 (yo, xo); r = sqrt (xo * xo + yo * yo); // perform 
rotation backward to get input image rotation // this seems wrong, but rotation from in-->out is counterclockwise ai = ao + ar; // back to rectangular for input image position xi = r * cos (ai); yi = r * sin (ai); // restore center offset, return values to caller *in_x = xi + cix; *in_y = yi + ciy; GST_DEBUG_OBJECT (circle, "Inversely mapped %d %d into %lf %lf", x, y, *in_x, *in_y); return TRUE; } -------------- next part -------------- An HTML attachment was scrubbed... URL: From thiagossantos at gmail.com Thu Sep 2 15:35:12 2010 From: thiagossantos at gmail.com (thiagossantos at gmail.com) Date: Thu, 2 Sep 2010 10:35:12 -0300 Subject: [gst-devel] have written new "effect" for geometrictransform In-Reply-To: References: Message-ID: On Thu, Sep 2, 2010 at 9:45 AM, Bert Douglas wrote: > Hi All, > > I have written a new "effect" for geometrictransform. It does image > rotation. > The code is appended below. > If there is any interest, I will do the work to push it upstream. > But I need a mentor to help guide the way. > > On the code itself. > I modified the "circle" element, to do rotation. > The changes were confined to the "map" function. > I used the same parameters, but with different meaning. > Great! I can help you getting this in. Can you provide this as a git patch in bugzilla? In case you've never done this before, I could guide you through the details so you'd learn how to do it. I'm available on #gstreamer on freenode as 'thiagoss'. You can talk to me there or send me a private mail. 
> > Thanks, > Bert Douglas > > ------------------------------------------------------------------------ > > static gboolean > circle_map (GstGeometricTransform * gt, gint x, gint y, gdouble * in_x, > gdouble * in_y) > { > //GstCircleGeometricTransform *cgt = GST_CIRCLE_GEOMETRIC_TRANSFORM_CAST > (gt); > GstCircle *circle = GST_CIRCLE_CAST (gt); > > gint w, h; > gdouble pad, sa; > gdouble cix, ciy, cox, coy; // centers, in/out x/y > gdouble ai, ao, ar; // angles, in/out/rotate (radians) > gdouble r; // radius > gdouble xi, yi, xo, yo; // positions in/out x/y > > // input and output image height and width > w = gt->width; > h = gt->height; > > // our parameters > pad = circle->height; // padding on bottom and right of input > image > ar = circle->angle * M_PI / 180.0; // angle of rotation, degrees to > radians > sa = circle->spread_angle; // not used > > // get in and out centers > cox = 0.5 * w; > coy = 0.5 * h; > cix = 0.5 * (w - pad); > ciy = 0.5 * (h - pad); > > // convert output image position to polar form > xo = x - cox; > yo = y - coy; > ao = atan2 (yo, xo); > r = sqrt (xo * xo + yo * yo); > > // perform rotation backward to get input image rotation > // this seems wrong, but rotation from in-->out is counterclockwise > ai = ao + ar; > > // back to rectangular for input image position > xi = r * cos (ai); > yi = r * sin (ai); > > // restore center offset, return values to caller > *in_x = xi + cix; > *in_y = yi + ciy; > > GST_DEBUG_OBJECT (circle, "Inversely mapped %d %d into %lf %lf", > x, y, *in_x, *in_y); > > return TRUE; > } > > > > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. 
> http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Thiago Sousa Santos -------------- next part -------------- An HTML attachment was scrubbed... URL: From mjoachimiak at gmail.com Thu Sep 2 16:49:10 2010 From: mjoachimiak at gmail.com (Michael Joachimiak) Date: Thu, 2 Sep 2010 17:49:10 +0300 Subject: [gst-devel] queue or queue2 In-Reply-To: References: <4C7E1BDC.5070409@hora-obscura.de> Message-ID: I think it works. I can run gst-launch v4l2src ! queue2 max-size-bytes=1000 ! xvimagesink with this output: ////////////////////////////////////////// 0:00:00.019022180 [335m12928 [00m 0x1fc1080 [32;01mINFO [00m [00;01;31;41m GST_PADS gstpad.c:1886:gst_pad_link_prepare: [00m trying to link v4l2src0:src and queue20:sink 0:00:00.019535863 [335m12928 [00m 0x1fc1080 [32;01mINFO [00m [00;01;31;41m GST_PADS gstpad.c:2059:gst_pad_link: [00m linked v4l2src0:src and queue20:sink, successful 0:00:00.019568199 [335m12928 [00m 0x1fc1080 [32;01mINFO [00m [00;01;37;41m GST_PIPELINE ./grammar.y:568:gst_parse_perform_link: [00m linking queue20:(any) to xvimagesink0:(any) (0/0) with caps "(NULL)" 0:00:00.019582936 [335m12928 [00m 0x1fc1080 [32;01mINFO [00m [00;01;37;41m GST_ELEMENT_PADS gstutils.c:1585:gst_element_link_pads: [00m trying to link element queue20:(any) to element xvimagesink0:(any) 0:00:00.019597463 [335m12928 [00m 0x1fc1080 [32;01mINFO [00m [00;01;31;41m GST_PADS gstutils.c:1046:gst_pad_check_link: [00m trying to link queue20:src and xvimagesink0:sink 0:00:00.020158638 [335m12928 [00m 0x1fc1080 [32;01mINFO [00m [00;01;31;41m GST_PADS gstutils.c:1493:prepare_link_maybe_ghosting: [00m queue20 and xvimagesink0 in same bin, no need for ghost pads 0:00:00.020177215 [335m12928 [00m 0x1fc1080 [32;01mINFO [00m [00;01;31;41m GST_PADS gstpad.c:1886:gst_pad_link_prepare: [00m trying to link queue20:src
and xvimagesink0:sink 0:00:00.020702631 [335m12928 [00m 0x1fc1080 [32;01mINFO [00m [00;01;31;41m GST_PADS gstpad.c:2059:gst_pad_link: [00m linked queue20:src and xvimagesink0:sink, successful //////////////////////////// Am I missing smth? 2010/9/1 Bert Douglas > Understood. > > Thanks much, > Bert Douglas > > > On Wed, Sep 1, 2010 at 4:24 AM, Stefan Kost wrote: > >> On 28.08.2010 16:03, Bert Douglas wrote: >> > Hi All, >> > >> > Sorry to keep bothering you with questions. >> > >> > Does "queue" work ok? Is it deprecated, in favor of "queue2"? >> >> Both work okay. queue is in memory buffering and queue2 can do disk >> buffering. queue can work with live-sources, queue2 can't (right now). >> If you just need to decouple elements into different threads use queue. >> >> Stefan >> > >> > Thanks, >> > Bert Douglas >> > >> > >> > >> > >> ------------------------------------------------------------------------------ >> > Sell apps to millions through the Intel(R) Atom(Tm) Developer Program >> > Be part of this innovative community and reach millions of netbook users >> > worldwide. Take advantage of special opportunities to increase revenue >> and >> > speed time-to-market. Join now, and jumpstart your future. >> > http://p.sf.net/sfu/intel-atom-d2d >> > >> > >> > _______________________________________________ >> > gstreamer-devel mailing list >> > gstreamer-devel at lists.sourceforge.net >> > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> > >> >> >> >> ------------------------------------------------------------------------------ >> This SF.net Dev2Dev email is sponsored by: >> >> Show off your parallel programming skills. >> Enter the Intel(R) Threading Challenge 2010. 
>> http://p.sf.net/sfu/intel-thread-sfd >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Your Sincerely Michael Joachimiak -------------- next part -------------- An HTML attachment was scrubbed... URL: From mjoachimiak at gmail.com Thu Sep 2 17:45:06 2010 From: mjoachimiak at gmail.com (Michael Joachimiak) Date: Thu, 2 Sep 2010 18:45:06 +0300 Subject: [gst-devel] gstreamer video conferencig In-Reply-To: References: Message-ID: Have you tried to put the queue before decoding, and limit buffer-size? What are your pipelinies currently? 2010/8/17 Nitin Das > Hello, > > i would like to use gstreamer pipeline for video conferencing. Currently > i m using v4l2src for camera capture and alsasrc for audio capture then > encode to h264 and aac respectively and mux into mpegts and send it over > rtp. But this process puts delay of around 3-4 seconds and doesn't give the > real feeling of video conferencing system. Is there any combination of > pipeline , plugins available so that the delay would be in some > milliseconds. 
> > --nitin > > > ------------------------------------------------------------------------------ > This SF.net email is sponsored by > > Make an app they can't live without > Enter the BlackBerry Developer Challenge > http://p.sf.net/sfu/RIM-dev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Your Sincerely Michael Joachimiak -------------- next part -------------- An HTML attachment was scrubbed... URL: From pub at ciemborowicz.pl Thu Sep 2 15:40:30 2010 From: pub at ciemborowicz.pl (mc) Date: Thu, 2 Sep 2010 06:40:30 -0700 (PDT) Subject: [gst-devel] Displaying waveform of an audiofile Message-ID: <1283434830729-2521365.post@n4.nabble.com> Hi. I have to write an audio editor (it will serve to cut samples from audio files). I'm thinking about writing it in Vala. I'm looking for a ready-made widget which: - displays the waveform of an audio file (scrollable) - displays the timeline/frame number - has a zoom - gives the ability to select part of a track Can you suggest something? -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Displaying-waveform-of-an-audiofile-tp2521365p2521365.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From pedro.faria at grupofox.com.br Thu Sep 2 16:24:57 2010 From: pedro.faria at grupofox.com.br (Pedro.henrique) Date: Thu, 2 Sep 2010 07:24:57 -0700 (PDT) Subject: [gst-devel] Picture Seetings In-Reply-To: <1283287101356-2402283.post@n4.nabble.com> References: <1283284182923-2402216.post@n4.nabble.com> <1283287101356-2402283.post@n4.nabble.com> Message-ID: <1283437497283-2524247.post@n4.nabble.com> Someone please. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Picture-Seetings-tp2402216p2524247.html Sent from the GStreamer-devel mailing list archive at Nabble.com.
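[A footnote on the waveform-widget question a little above: I can't point to a specific ready-made widget with certainty, but the data model behind such views is simple — reduce the raw samples to one (min, max) pair per display column and draw a vertical line per pair; scrolling and zooming just change which samples map to which column. A toolkit-agnostic sketch (plain Python with synthetic samples; `waveform_peaks` is a hypothetical helper, not an existing API):]

```python
def waveform_peaks(samples, columns):
    # Reduce audio samples to (min, max) pairs, one pair per display
    # column -- the usual backing data for a scrollable waveform view.
    n = len(samples)
    peaks = []
    for c in range(columns):
        lo = c * n // columns
        hi = max(lo + 1, (c + 1) * n // columns)  # at least one sample
        block = samples[lo:hi]
        peaks.append((min(block), max(block)))
    return peaks

# 8 samples squeezed into 4 columns
print(waveform_peaks([0, 3, -2, 5, 1, -4, 2, 0], 4))
# -> [(0, 3), (-2, 5), (-4, 1), (0, 2)]
```

[In a real editor you would precompute these peaks at a few zoom levels so redraws stay cheap; the drawing itself can then be any Cairo/GTK canvas.]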
From t.i.m at zen.co.uk Thu Sep 2 18:01:48 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Thu, 02 Sep 2010 17:01:48 +0100 Subject: [gst-devel] gstreamer video conferencig In-Reply-To: References: Message-ID: <1283443308.8365.14.camel@zingle> On Tue, 2010-08-17 at 23:11 +0530, Nitin Das wrote: > i would like to use gstreamer pipeline for video conferencing. > Currently i m using v4l2src for camera capture and alsasrc for audio > capture then encode to h264 and aac respectively and mux into mpegts > and send it over rtp. But this process puts delay of around 3-4 > seconds and doesn't give the real feeling of video conferencing > system. Is there any combination of pipeline , plugins available so > that the delay would be in some milliseconds. It would be useful to know what the exact pipeline is that you're using. If you are encoding to h.264 with x264enc, you may want to use x264enc from gst-plugins-ugly git (or latest pre-release) and then set the tune=zerolatency property on it. Lastly: have you looked at farsight? Cheers -Tim From t.i.m at zen.co.uk Thu Sep 2 18:08:49 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Thu, 02 Sep 2010 17:08:49 +0100 Subject: [gst-devel] Picture Settings In-Reply-To: <1283287101356-2402283.post@n4.nabble.com> References: <1283284182923-2402216.post@n4.nabble.com> <1283287101356-2402283.post@n4.nabble.com> Message-ID: <1283443729.8365.18.camel@zingle> On Tue, 2010-08-31 at 13:38 -0700, Pedro.henrique wrote: > I need to open the image from the camera at any resolution, and also want > to know how I can move the image, example: put running on the top right, and > appears this error, can someone help me? > > gst-launch ksvideosrc device-index=0 ! video/x-raw-rgb, width=1024, > height=720 ! directdrawsink > > I tried also > > gst-launch ksvideosrc device-index=0 ! video/x-raw-rgb, width=1024, > height=720 ! dshowvideosink > Seeting Pipeline to Paused . . . 
> ERROR : Pipeline doesn't want Pause. > ERROR : from element /GstPipeline:pipeline0/GstKsVideoSrc:ksvideosrc0: Could > not negotiate format > Addictional debug info : > > ..\..\..\Source\gstreamer\libs\gst\base\gstbasesrc.c(2719): > gst_base_src_start(): /GstPipeline:pipeline0/GstKsVideoSrc:ksvideosrc0 > > Check your filtered caps, if any Are you sure that the camera supports this resolution natively? Are you sure that the camera/driver supports RGB output? What do these commands output: gst-launch-0.10 -v ksvideosrc device-index=0 ! fakesink gst-launch-0.10 -v ksvideosrc device-index=0 ! video/x-raw-rgb ! fakesink gst-launch-0.10 -v ksvideosrc device-index=0 ! video/x-raw-rgb,width=1024 ! fakesink gst-launch-0.10 -v ksvideosrc device-index=0 ! video/x-raw-rgb,width=1024,height=720 ! fakesink gst-launch-0.10 -v ksvideosrc device-index=0 ! video/x-raw-yuv ! fakesink gst-launch-0.10 -v ksvideosrc device-index=0 ! video/x-raw-yuv ! fakesink gst-launch-0.10 -v ksvideosrc device-index=0 ! video/jpeg ! fakesink gst-launch-0.10 -v ksvideosrc device-index=0 ! video/x-dv ! fakesink Also: have you tried adding ffmpegcolorspace ! videoscale in front of your video sinks? Cheers -Tim From katcipis at inf.ufsc.br Thu Sep 2 18:15:54 2010 From: katcipis at inf.ufsc.br (Tiago Katcipis) Date: Thu, 2 Sep 2010 13:15:54 -0300 Subject: [gst-devel] Picture Seetings In-Reply-To: <1283287101356-2402283.post@n4.nabble.com> References: <1283284182923-2402216.post@n4.nabble.com> <1283287101356-2402283.post@n4.nabble.com> Message-ID: On Tue, Aug 31, 2010 at 5:38 PM, Pedro.henrique wrote: > > I need to open the image from the camera at any resolution, and also want > to know how I can move the image, example: put running on the top right, > and > appears this error, can someone help me? > > > gst-launch ksvideosrc device-index=0 ! video/x-raw-rgb, width=1024, > height=720 ! directdrawsink > > I tried also > > gst-launch ksvideosrc device-index=0 ! video/x-raw-rgb, width=1024, > height=720 ! 
dshowvideosink > i worked very little with video... on windows even less :-). But you could try ffmpegcolorspace: http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-ffmpegcolorspace.html like this: gst-launch -v ksvideosrc device-index=0 ! video/x-raw-rgb, width=1024, height=720 ! ffmpegcolorspace ! directdrawsink hope this helps. Best regards, Katcipis > > > > Seeting Pipeline to Paused . . . > ERROR : Pipeline doesn't want Pause. > ERROR : from element /GstPipeline:pipeline0/GstKsVideoSrc:ksvideosrc0: > Could > not negotiate format > Addictional debug info : > > ..\..\..\Source\gstreamer\libs\gst\base\gstbasesrc.c(2719): > gst_base_src_start(): /GstPipeline:pipeline0/GstKsVideoSrc:ksvideosrc0 > > Check your filtered caps, if any > Seeting pipeline to Null . . . > Freeing Pipeline . . . > -- > View this message in context: > http://gstreamer-devel.966125.n4.nabble.com/Picture-Seetings-tp2402216p2402283.html > Sent from the GStreamer-devel mailing list archive at Nabble.com. > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -- http://www.getgnulinux.org/windows -------------- next part -------------- An HTML attachment was scrubbed... 
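[For context on why ffmpegcolorspace can help in the pipeline above: a camera that only delivers YUV cannot satisfy a video/x-raw-rgb caps filter on its own, so a converter element has to recompute every pixel. The sketch below shows the full-range BT.601 RGB-to-YCbCr math such a converter applies (illustrative Python, not ffmpegcolorspace's actual code, which works on whole frames with fixed-point arithmetic):]

```python
def rgb_to_ycbcr(r, g, b):
    # Full-range BT.601 coefficients: the per-pixel work a colorspace
    # converter does when the source offers RGB but the sink wants YUV
    # (or the other way around).
    y  =         0.299    * r + 0.587    * g + 0.114    * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128.0 + 0.5      * r - 0.418688 * g - 0.081312 * b
    clamp = lambda v: max(0, min(255, int(round(v))))
    return clamp(y), clamp(cb), clamp(cr)

print(rgb_to_ycbcr(255, 255, 255))  # white -> (255, 128, 128)
print(rgb_to_ycbcr(0, 0, 0))        # black -> (0, 128, 128)
```

[This is also why the element costs CPU: it touches every pixel of every frame, so it is best placed only where a format change is actually required.]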
URL: From mjoachimiak at gmail.com Thu Sep 2 18:52:50 2010 From: mjoachimiak at gmail.com (Michael Joachimiak) Date: Thu, 2 Sep 2010 19:52:50 +0300 Subject: [gst-devel] Picture Seetings In-Reply-To: References: <1283284182923-2402216.post@n4.nabble.com> <1283287101356-2402283.post@n4.nabble.com> Message-ID: you can also use videoparse to set width,height and framerate 2010/9/2 Tiago Katcipis > > > On Tue, Aug 31, 2010 at 5:38 PM, Pedro.henrique < > pedro.faria at grupofox.com.br> wrote: > >> >> I need to open the image from the camera at any resolution, and also want >> to know how I can move the image, example: put running on the top right, >> and >> appears this error, can someone help me? >> >> >> gst-launch ksvideosrc device-index=0 ! video/x-raw-rgb, width=1024, >> height=720 ! directdrawsink >> >> I tried also >> >> gst-launch ksvideosrc device-index=0 ! video/x-raw-rgb, width=1024, >> height=720 ! dshowvideosink >> > > i worked very little with video... on windows even less :-). But you could > try ffmpegcolorspace: > > > http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-ffmpegcolorspace.html > > like this: > gst-launch -v ksvideosrc device-index=0 ! video/x-raw-rgb, width=1024, > height=720 ! ffmpegcolorspace ! directdrawsink > > hope this helps. > > Best regards, > Katcipis > > >> >> >> >> Seeting Pipeline to Paused . . . >> ERROR : Pipeline doesn't want Pause. >> ERROR : from element /GstPipeline:pipeline0/GstKsVideoSrc:ksvideosrc0: >> Could >> not negotiate format >> Addictional debug info : >> >> ..\..\..\Source\gstreamer\libs\gst\base\gstbasesrc.c(2719): >> gst_base_src_start(): /GstPipeline:pipeline0/GstKsVideoSrc:ksvideosrc0 >> >> Check your filtered caps, if any >> Seeting pipeline to Null . . . >> Freeing Pipeline . . . 
>> -- >> View this message in context: >> http://gstreamer-devel.966125.n4.nabble.com/Picture-Seetings-tp2402216p2402283.html >> Sent from the GStreamer-devel mailing list archive at Nabble.com. >> >> >> ------------------------------------------------------------------------------ >> This SF.net Dev2Dev email is sponsored by: >> >> Show off your parallel programming skills. >> Enter the Intel(R) Threading Challenge 2010. >> http://p.sf.net/sfu/intel-thread-sfd >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> > > > > -- > http://www.getgnulinux.org/windows > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Your Sincerely Michael Joachimiak -------------- next part -------------- An HTML attachment was scrubbed... URL: From bertd at tplogic.com Thu Sep 2 20:08:04 2010 From: bertd at tplogic.com (Bert Douglas) Date: Thu, 2 Sep 2010 13:08:04 -0500 Subject: [gst-devel] trying to get to "hello world" with RTP Message-ID: Hi All, Trying to get some basic rtp working. No luck. Thanks for looking. -------------------------------------------------------- # rtp-source gst-launch \ videotestsrc pattern=snow \ ! video/x-raw-rgb, width=400, height=300, frame-rate=10/1 \ ! rtpvrawpay \ ! udpsink host=127.0.0.1 port=51234 # rtp-sink gst-launch \ udpsrc uri=udp://127.0.0.1:51234 \ ! rtpvrawdepay \ ! video/x-raw-rgb, width=400, height=300, frame-rate=10/1 \ ! ffmpegcolorspace \ ! ximagesink bertd at bertd-laptop:~/gstreamer/panocam$ . 
rtp-sink.sh Setting pipeline to PAUSED ... Pipeline is live and does not need PREROLL ... Setting pipeline to PLAYING ... New clock: GstSystemClock ERROR: from element /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0: Internal GStreamer error: negotiation problem. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer. Additional debug info: gstbasertpdepayload.c(361): gst_base_rtp_depayload_chain (): /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0: Not RTP format was negotiated Execution ended after 12744218 ns. Setting pipeline to PAUSED ... Setting pipeline to READY ... Setting pipeline to NULL ... Freeing pipeline ... -------------- next part -------------- An HTML attachment was scrubbed... URL: From 4ernov at gmail.com Thu Sep 2 22:29:46 2010 From: 4ernov at gmail.com (Alexey Chernov) Date: Fri, 3 Sep 2010 00:29:46 +0400 Subject: [gst-devel] Crash in v4l2src when device is busy In-Reply-To: <4C7E1ACB.7020709@hora-obscura.de> References: <4C7E1ACB.7020709@hora-obscura.de> Message-ID: <201009030029.46501.4ernov@gmail.com> Thanks for response and suggestion. Posted a bug here: https://bugzilla.gnome.org/show_bug.cgi?id=628640 On Wednesday 01 September 2010 13:20:11 you wrote: > On 27.08.2010 10:24, 4ernov wrote: > > Hello, > > > > I've faced some strange behavior of v4l2src when capture device is > > busy and I try to set it to READY state. The program crashes in some > > of GStreamer routines with the following error message: > > > > ERROR! Error: Device '/dev/video0' cannot capture at 640x480 > > DEBUG Error: gstv4l2object.c(1920): gst_v4l2_object_set_format (): > > /GstPipeline:camera/GstV4l2Src:v4l2src0: > > Call to S_FMT failed for YUYV @ 640x480: Device or resource busy > > please file a bug with the backtrace and standalone test app attached. 
> > thanks > Stefan > > > Backtrace is like this: > > > > #1 0x02581e1e in gst_v4l2src_create (src=0x8353000, buf=0xb50fd1cc) > > at gstv4l2src.c:914 > > #2 0x025c3765 in gst_push_src_create (bsrc=0x8353000, > > offset=18446744073709551615, length=4096, ret=0xb50fd1cc) at > > gstpushsrc.c:117 > > #3 0x025b0881 in gst_base_src_get_range (src=, > > offset=18446744073709551615, length=4096, buf=0xb50fd1cc) at > > gstbasesrc.c:2081 > > #4 0x025b3327 in gst_base_src_loop (pad=0x8354010) at gstbasesrc.c:2334 > > #5 0x0024ed6b in gst_task_func (task=0x83bd0c0) at gsttask.c:238 > > #6 0x00250377 in default_func (tdata=0x809b7c0, pool=0x8076810) at > > gsttaskpool.c:70 > > > > My code is just plain as in tutorial I think: > > GstElement *v4l2src, *deinterlace, *videoscale, *tee, > > > > *ffmpegcolorspace, *queue, *xvideoscale, *ximagesink; > > > > GstBus* bus; > > > > GstCaps* caps = gst_caps_new_simple ("video/x-raw-yuv", "width", > > > > G_TYPE_INT, 400, > > > > "height", > > > > G_TYPE_INT, 300, NULL); > > > > /* Create gstreamer elements */ > > > > _pipeline = gst_pipeline_new("camera"); > > > > v4l2src = gst_element_factory_make("v4l2src", NULL); > > > > deinterlace = gst_element_factory_make("deinterlace", NULL); > > > > videoscale = gst_element_factory_make("videoscale", NULL); > > > > tee = gst_element_factory_make("tee", NULL); > > > > queue = gst_element_factory_make("queue", NULL); > > > > xvideoscale = gst_element_factory_make("videoscale", NULL); > > > > ffmpegcolorspace = gst_element_factory_make("ffmpegcolorspace", > > NULL); > > > > ximagesink = gst_element_factory_make("ximagesink", NULL); > > > > if (!_pipeline || !v4l2src || !videoscale || !tee || !queue || > > > > !ffmpegcolorspace || !xvideoscale || !ximagesink) > > > > { > > > > qDebug()<<"Elements could not be created. 
Exiting."; > > > > } > > > > gst_bin_add_many (GST_BIN (_pipeline), v4l2src, deinterlace, > > > > videoscale, tee, queue, ffmpegcolorspace, xvideoscale, ximagesink, > > NULL); > > > > gst_element_link_many(v4l2src, deinterlace, videoscale, NULL); > > > > gst_element_link_filtered(videoscale, tee, caps); > > > > gst_caps_unref(caps); > > > > > > gst_element_link_many(tee, queue, ffmpegcolorspace, xvideoscale, > > > > ximagesink, NULL); > > > > g_object_set(G_OBJECT (ximagesink), "force-aspect-ratio", true, > > NULL); > > > > g_object_set(G_OBJECT (deinterlace), "mode", 0, NULL); > > > > if (_winid) > > { > > > > gst_x_overlay_set_xwindow_id ((GstXOverlay*)ximagesink, _winid); > > > > } > > > > bus = gst_pipeline_get_bus (GST_PIPELINE (_pipeline)); > > gst_bus_add_watch (bus, bus_call, this); > > gst_object_unref (bus); > > > > /* Set the pipeline to "playing" state*/ > > > > // gst_element_set_state (_pipeline, GST_STATE_READY); > > > > /* Iterate */ > > g_print ("Running...\n"); > > > > g_object_set(G_OBJECT (_v4l2src), "device", str.c_str(), NULL); > > > > gst_element_set_state (_pipeline, GST_STATE_READY); > > > > Is it a bug and so I should post it or perhaps I did something unsafe? > > > > ------------------------------------------------------------------------- > > ----- Sell apps to millions through the Intel(R) Atom(Tm) Developer > > Program Be part of this innovative community and reach millions of > > netbook users worldwide. Take advantage of special opportunities to > > increase revenue and speed time-to-market. Join now, and jumpstart your > > future. 
> > http://p.sf.net/sfu/intel-atom-d2d
> > _______________________________________________
> > gstreamer-devel mailing list
> > gstreamer-devel at lists.sourceforge.net
> > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

From rob at ti.com Fri Sep 3 01:17:17 2010
From: rob at ti.com (Rob Clark)
Date: Thu, 2 Sep 2010 18:17:17 -0500
Subject: [gst-devel] Gstreamer And Eclipse
In-Reply-To: <1283240280787-2401173.post@n4.nabble.com>
References: <1283240280787-2401173.post@n4.nabble.com>
Message-ID: <7704D28D-BA9D-49EB-8E84-0669E1484C2A@ti.com>

Maybe not really a gst topic.. these comments apply to any autotools C
project.  That said, you might find the Linux Tools plugin useful:

  http://www.eclipse.org/linuxtools/

It includes support for autotools projects, among other niceties.

Set up one autotools C project per gst tree (and optionally glib), and
spend a bit of time setting up header file search paths (i.e. in
gst-plugins-good, right-click on the project, choose Properties, C/C++
General -> Paths and Symbols -> Includes -> Add..., then click all the
checkboxes, hit the 'Workspace...' button, and add gstreamer,
gstreamer/lib, gst-plugins-base, and gst-plugins-base/gst-libs, assuming
you already set up projects for gstreamer and gst-plugins-base).  And so
on.

When you close the properties dialog, it will re-index everything.
Hopefully your PC is not short on RAM.

BR,
-R

On Aug 31, 2010, at 2:38 AM, frknml wrote:

> Hi everyone,
>
> I'm very new to GStreamer. I want to use GStreamer with Eclipse on
> Linux. My Eclipse is Galileo and my Linux is Ubuntu 9.04. I installed
> GStreamer with the Synaptic package manager. The packages I installed
> are listed below:
>
> libgstreamer-plugins-base0.10-0
> libgstreamer-plugins-base0.10-dev
> libgstreamer0.10-0
> libgstreamer0.10-dev
> libgstreamer0.10-0-dbg
>
> I have created a C++ project in my Eclipse, and a gstreamer-0.10 folder in my
> /usr/include directory.
> > I'm only trying to include library but it gives > hundreds of errors :). > > I looking forward to hearing your response which may help me . > > Faruk Naml? > > ----------------------------error----------------------------------------------------------- > > In file included from /usr/include/glib-2.0/glib/galloca.h:34, > from /usr/include/glib-2.0/glib.h:32, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gtypes.h:397:2: error: #error unknown ENDIAN type > In file included from /usr/include/glib-2.0/glib.h:33, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/garray.h:50: error: expected \u2018;\u2019 before > \u2018*\u2019 token > /usr/include/glib-2.0/glib/garray.h:144: error: expected constructor, > destructor, or type conversion before \u2018*\u2019 token > /usr/include/glib-2.0/glib/garray.h:147: error: expected \u2018,\u2019 or > \u2018...\u2019 before \u2018*\u2019 token > /usr/include/glib-2.0/glib/garray.h:150: error: expected \u2018,\u2019 or > \u2018...\u2019 before \u2018*\u2019 token > In file included from /usr/include/glib-2.0/glib/gerror.h:28, > from /usr/include/glib-2.0/glib/gthread.h:34, > from /usr/include/glib-2.0/glib/gasyncqueue.h:34, > from /usr/include/glib-2.0/glib.h:34, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gquark.h:38: error: \u2018guint32\u2019 does not > name a type > /usr/include/glib-2.0/glib/gquark.h:42: error: \u2018GQuark\u2019 does not > name a type > /usr/include/glib-2.0/glib/gquark.h:43: error: \u2018GQuark\u2019 does not > name a type > /usr/include/glib-2.0/glib/gquark.h:44: error: \u2018GQuark\u2019 does not > name a type > /usr/include/glib-2.0/glib/gquark.h:45: error: \u2018GQuark\u2019 was not > declared in this scope > In file included from /usr/include/glib-2.0/glib/gthread.h:34, > from /usr/include/glib-2.0/glib/gasyncqueue.h:34, > from 
/usr/include/glib-2.0/glib.h:34, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gerror.h:36: error: \u2018GQuark\u2019 does not > name a type > /usr/include/glib-2.0/glib/gerror.h:41: error: \u2018GQuark\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gerror.h:42: error: expected primary-expression > before \u2018code\u2019 > /usr/include/glib-2.0/glib/gerror.h:43: error: expected primary-expression > before \u2018const\u2019 > /usr/include/glib-2.0/glib/gerror.h:44: error: expected primary-expression > before \u2018...\u2019 token > /usr/include/glib-2.0/glib/gerror.h:44: error: initializer expression list > treated as compound expression > /usr/include/glib-2.0/glib/gerror.h:46: error: \u2018GQuark\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gerror.h:47: error: expected primary-expression > before \u2018code\u2019 > /usr/include/glib-2.0/glib/gerror.h:48: error: expected primary-expression > before \u2018const\u2019 > /usr/include/glib-2.0/glib/gerror.h:48: error: initializer expression list > treated as compound expression > /usr/include/glib-2.0/glib/gerror.h:54: error: \u2018GQuark\u2019 has not > been declared > /usr/include/glib-2.0/glib/gerror.h:61: error: \u2018GQuark\u2019 has not > been declared > /usr/include/glib-2.0/glib/gerror.h:67: error: \u2018GQuark\u2019 has not > been declared > In file included from /usr/include/glib-2.0/glib/gasyncqueue.h:34, > from /usr/include/glib-2.0/glib.h:34, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gthread.h:44: error: \u2018GQuark\u2019 does not > name a type > /usr/include/glib-2.0/glib/gthread.h:120: error: > \u2018g_thread_gettime\u2019 was not declared in this scope > /usr/include/glib-2.0/glib/gthread.h:120: error: expected \u2018,\u2019 or > \u2018;\u2019 before \u2018(\u2019 token > /usr/include/glib-2.0/glib/gthread.h:246: error: variable or field > 
\u2018g_static_mutex_init\u2019 declared void > /usr/include/glib-2.0/glib/gthread.h:246: error: \u2018GStaticMutex\u2019 > was not declared in this scope > /usr/include/glib-2.0/glib/gthread.h:246: error: \u2018mutex\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gthread.h:247: error: variable or field > \u2018g_static_mutex_free\u2019 declared void > /usr/include/glib-2.0/glib/gthread.h:247: error: \u2018GStaticMutex\u2019 > was not declared in this scope > /usr/include/glib-2.0/glib/gthread.h:247: error: \u2018mutex\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gthread.h:266: error: \u2018GStaticMutex\u2019 > does not name a type > /usr/include/glib-2.0/glib/gthread.h:268: error: \u2018GSystemThread\u2019 > does not name a type > /usr/include/glib-2.0/glib/gthread.h:285: error: \u2018GStaticMutex\u2019 > does not name a type > /usr/include/glib-2.0/glib/gthread.h:336: error: expected \u2018,\u2019 or > \u2018...\u2019 before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gthread.h:337: error: expected \u2018,\u2019 or > \u2018...\u2019 before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gthread.h:338: error: expected \u2018,\u2019 or > \u2018...\u2019 before \u2018*\u2019 token > In file included from /usr/include/glib-2.0/glib.h:37, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gbase64.h:32: error: \u2018gsize\u2019 does not > name a type > /usr/include/glib-2.0/glib/gbase64.h:38: error: \u2018gsize\u2019 does not > name a type > /usr/include/glib-2.0/glib/gbase64.h:43: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gbase64.h:44: error: \u2018gsize\u2019 does not > name a type > /usr/include/glib-2.0/glib/gbase64.h:50: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gbase64.h:52: error: \u2018gsize\u2019 has not > been declared > In file included from /usr/include/glib-2.0/glib.h:38, > from 
/usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gbookmarkfile.h:48: error: \u2018GQuark\u2019 > does not name a type > /usr/include/glib-2.0/glib/gbookmarkfile.h:63: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/gbookmarkfile.h:70: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/gbookmarkfile.h:97: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/gbookmarkfile.h:107: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/gbookmarkfile.h:119: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/gbookmarkfile.h:172: error: \u2018gsize\u2019 has > not been declared > In file included from /usr/include/glib-2.0/glib/gmem.h:34, > from /usr/include/glib-2.0/glib/glist.h:34, > from /usr/include/glib-2.0/glib/gcache.h:34, > from /usr/include/glib-2.0/glib.h:39, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gslice.h:37: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gslice.h:38: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gslice.h:39: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gslice.h:40: error: expected primary-expression > before \u2018mem_block\u2019 > /usr/include/glib-2.0/glib/gslice.h:40: error: initializer expression list > treated as compound expression > /usr/include/glib-2.0/glib/gslice.h:41: error: variable or field > \u2018g_slice_free1\u2019 declared void > /usr/include/glib-2.0/glib/gslice.h:41: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gslice.h:42: error: expected primary-expression > before \u2018mem_block\u2019 > /usr/include/glib-2.0/glib/gslice.h:43: error: variable or field > \u2018g_slice_free_chain_with_offset\u2019 declared void > 
/usr/include/glib-2.0/glib/gslice.h:43: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gslice.h:44: error: expected primary-expression > before \u2018mem_chain\u2019 > /usr/include/glib-2.0/glib/gslice.h:45: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gslice.h:84: error: \u2018gint64\u2019 has not > been declared > /usr/include/glib-2.0/glib/gslice.h:85: error: \u2018gint64\u2019 does not > name a type > /usr/include/glib-2.0/glib/gslice.h:86: error: expected constructor, > destructor, or type conversion before \u2018*\u2019 token > In file included from /usr/include/glib-2.0/glib/glist.h:34, > from /usr/include/glib-2.0/glib/gcache.h:34, > from /usr/include/glib-2.0/glib.h:39, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gmem.h:51: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gmem.h:52: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gmem.h:54: error: \u2018gsize\u2019 has not been > declared > /usr/include/glib-2.0/glib/gmem.h:56: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gmem.h:57: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gmem.h:59: error: \u2018gsize\u2019 has not been > declared > /usr/include/glib-2.0/glib/gmem.h:85: error: \u2018gsize\u2019 has not been > declared > /usr/include/glib-2.0/glib/gmem.h:87: error: \u2018gsize\u2019 has not been > declared > /usr/include/glib-2.0/glib/gmem.h:90: error: \u2018gsize\u2019 has not been > declared > /usr/include/glib-2.0/glib/gmem.h:91: error: \u2018gsize\u2019 has not been > declared > /usr/include/glib-2.0/glib/gmem.h:92: error: \u2018gsize\u2019 has not been > declared > /usr/include/glib-2.0/glib/gmem.h:94: error: \u2018gsize\u2019 has not been > declared > /usr/include/glib-2.0/glib/gmem.h:130: error: 
\u2018gsize\u2019 has not been > declared > In file included from /usr/include/glib-2.0/glib.h:40, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gchecksum.h:63: error: \u2018gssize\u2019 does > not name a type > /usr/include/glib-2.0/glib/gchecksum.h:71: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gchecksum.h:74: error: \u2018guint8\u2019 has not > been declared > /usr/include/glib-2.0/glib/gchecksum.h:75: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gchecksum.h:79: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gchecksum.h:82: error: \u2018gssize\u2019 has not > been declared > In file included from /usr/include/glib-2.0/glib.h:41, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gcompletion.h:47: error: \u2018gsize\u2019 has > not been declared > In file included from /usr/include/glib-2.0/glib.h:42, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gconvert.h:49: error: \u2018GQuark\u2019 does not > name a type > /usr/include/glib-2.0/glib/gconvert.h:57: error: \u2018gsize\u2019 does not > name a type > /usr/include/glib-2.0/glib/gconvert.h:66: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:69: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:70: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:73: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:75: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:76: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:79: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:83: error: \u2018gsize\u2019 
has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:84: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:91: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:92: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:93: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:96: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:97: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:98: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:112: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:113: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:114: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:117: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:118: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gconvert.h:119: error: \u2018gsize\u2019 has not > been declared > In file included from /usr/include/glib-2.0/glib.h:43, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gdataset.h:40: error: typedef > \u2018GDataForeachFunc\u2019 is initialized (use __typeof__ instead) > /usr/include/glib-2.0/glib/gdataset.h:40: error: \u2018GQuark\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdataset.h:41: error: expected primary-expression > before \u2018data\u2019 > /usr/include/glib-2.0/glib/gdataset.h:42: error: expected primary-expression > before \u2018user_data\u2019 > /usr/include/glib-2.0/glib/gdataset.h:49: error: \u2018GQuark\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdataset.h:51: error: \u2018GQuark\u2019 has not > been 
declared > /usr/include/glib-2.0/glib/gdataset.h:55: error: \u2018GQuark\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdataset.h:57: error: > \u2018GDataForeachFunc\u2019 has not been declared > /usr/include/glib-2.0/glib/gdataset.h:95: error: \u2018GQuark\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdataset.h:97: error: \u2018GQuark\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdataset.h:101: error: \u2018GQuark\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdataset.h:103: error: > \u2018GDataForeachFunc\u2019 has not been declared > In file included from /usr/include/glib-2.0/glib.h:44, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gdate.h:50: error: \u2018gint32\u2019 does not > name a type > /usr/include/glib-2.0/glib/gdate.h:51: error: \u2018guint16\u2019 does not > name a type > /usr/include/glib-2.0/glib/gdate.h:52: error: \u2018guint8\u2019 does not > name a type > /usr/include/glib-2.0/glib/gdate.h:123: error: \u2018GDateDay\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdate.h:124: error: expected primary-expression > before \u2018month\u2019 > /usr/include/glib-2.0/glib/gdate.h:125: error: \u2018GDateYear\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdate.h:125: error: initializer expression list > treated as compound expression > /usr/include/glib-2.0/glib/gdate.h:126: error: \u2018guint32\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdate.h:135: error: \u2018GDateDay\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdate.h:137: error: \u2018GDateYear\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdate.h:139: error: \u2018guint32\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdate.h:140: error: \u2018GDateDay\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdate.h:141: error: expected 
primary-expression > before \u2018month\u2019 > /usr/include/glib-2.0/glib/gdate.h:142: error: \u2018GDateYear\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdate.h:142: error: initializer expression list > treated as compound expression > /usr/include/glib-2.0/glib/gdate.h:146: error: \u2018GDateYear\u2019 does > not name a type > /usr/include/glib-2.0/glib/gdate.h:147: error: \u2018GDateDay\u2019 does not > name a type > /usr/include/glib-2.0/glib/gdate.h:148: error: \u2018guint32\u2019 does not > name a type > /usr/include/glib-2.0/glib/gdate.h:179: error: \u2018GTime\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdate.h:184: error: \u2018GDateDay\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdate.h:186: error: \u2018GDateYear\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdate.h:188: error: \u2018GDateDay\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdate.h:190: error: \u2018GDateYear\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdate.h:192: error: \u2018guint32\u2019 has not > been declared > /usr/include/glib-2.0/glib/gdate.h:213: error: \u2018GDateYear\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gdate.h:214: error: \u2018guint8\u2019 does not > name a type > /usr/include/glib-2.0/glib/gdate.h:216: error: \u2018guint8\u2019 does not > name a type > /usr/include/glib-2.0/glib/gdate.h:217: error: \u2018guint8\u2019 does not > name a type > /usr/include/glib-2.0/glib/gdate.h:240: error: \u2018gsize\u2019 does not > name a type > In file included from /usr/include/glib-2.0/glib.h:47, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gfileutils.h:76: error: \u2018GQuark\u2019 does > not name a type > /usr/include/glib-2.0/glib/gfileutils.h:91: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gfileutils.h:95: error: \u2018gssize\u2019 has > not been declared > 
/usr/include/glib-2.0/glib/gfileutils.h:108: error: \u2018goffset\u2019 was > not declared in this scope > In file included from /usr/include/glib-2.0/glib/giochannel.h:35, > from /usr/include/glib-2.0/glib.h:50, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gmain.h:40: error: typedef > \u2018GChildWatchFunc\u2019 is initialized (use __typeof__ instead) > /usr/include/glib-2.0/glib/gmain.h:40: error: \u2018GPid\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gmain.h:41: error: expected primary-expression > before \u2018status\u2019 > /usr/include/glib-2.0/glib/gmain.h:42: error: expected primary-expression > before \u2018data\u2019 > /usr/include/glib-2.0/glib/gmain.h:223: error: \u2018GPid\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gmain.h:277: error: \u2018GPid\u2019 has not been > declared > /usr/include/glib-2.0/glib/gmain.h:278: error: \u2018GChildWatchFunc\u2019 > has not been declared > /usr/include/glib-2.0/glib/gmain.h:281: error: \u2018GPid\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gmain.h:282: error: \u2018GChildWatchFunc\u2019 > was not declared in this scope > /usr/include/glib-2.0/glib/gmain.h:283: error: expected primary-expression > before \u2018data\u2019 > /usr/include/glib-2.0/glib/gmain.h:283: error: initializer expression list > treated as compound expression > In file included from /usr/include/glib-2.0/glib/gstring.h:35, > from /usr/include/glib-2.0/glib/giochannel.h:36, > from /usr/include/glib-2.0/glib.h:50, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gunicode.h:34: error: \u2018guint32\u2019 does > not name a type > /usr/include/glib-2.0/glib/gunicode.h:35: error: \u2018guint16\u2019 does > not name a type > /usr/include/glib-2.0/glib/gunicode.h:220: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:221: error: 
\u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:222: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:223: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:224: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:225: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:226: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:227: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:228: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:229: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:230: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:231: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:232: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:233: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:234: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:235: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:236: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:240: error: \u2018gunichar\u2019 does > not name a type > /usr/include/glib-2.0/glib/gunicode.h:241: error: \u2018gunichar\u2019 does > not name a type > /usr/include/glib-2.0/glib/gunicode.h:242: error: \u2018gunichar\u2019 does > not name a type > /usr/include/glib-2.0/glib/gunicode.h:246: error: 
\u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:248: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:251: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:254: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:257: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:263: error: variable or field > \u2018g_unicode_canonical_ordering\u2019 declared void > /usr/include/glib-2.0/glib/gunicode.h:263: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:263: error: \u2018string\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:264: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:269: error: expected constructor, > destructor, or type conversion before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gunicode.h:278: error: \u2018gunichar\u2019 does > not name a type > /usr/include/glib-2.0/glib/gunicode.h:279: error: \u2018gunichar\u2019 does > not name a type > /usr/include/glib-2.0/glib/gunicode.h:293: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:298: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:303: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:304: error: \u2018gunichar\u2019 has > not been declared > /usr/include/glib-2.0/glib/gunicode.h:306: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:307: error: \u2018gunichar\u2019 has > not been declared > /usr/include/glib-2.0/glib/gunicode.h:309: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:311: error: expected constructor, > destructor, 
or type conversion before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gunicode.h:316: error: expected constructor, > destructor, or type conversion before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gunicode.h:321: error: expected constructor, > destructor, or type conversion before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gunicode.h:324: error: expected constructor, > destructor, or type conversion before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gunicode.h:329: error: expected \u2018,\u2019 or > \u2018...\u2019 before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gunicode.h:334: error: expected constructor, > destructor, or type conversion before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gunicode.h:339: error: expected \u2018,\u2019 or > \u2018...\u2019 before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gunicode.h:349: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:350: error: expected > primary-expression before \u2018*\u2019 token > /usr/include/glib-2.0/glib/gunicode.h:350: error: \u2018outbuf\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:350: error: initializer expression > list treated as compound expression > /usr/include/glib-2.0/glib/gunicode.h:357: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:361: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:364: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:366: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:368: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:382: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:388: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:390: error: 
\u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gunicode.h:392: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:393: error: \u2018gunichar\u2019 was > not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:393: error: \u2018mirrored_ch\u2019 > was not declared in this scope > /usr/include/glib-2.0/glib/gunicode.h:393: error: initializer expression > list treated as compound expression > /usr/include/glib-2.0/glib/gunicode.h:395: error: \u2018gunichar\u2019 was > not declared in this scope > In file included from /usr/include/glib-2.0/glib/giochannel.h:36, > from /usr/include/glib-2.0/glib.h:50, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/gstring.h:46: error: \u2018gsize\u2019 does not > name a type > /usr/include/glib-2.0/glib/gstring.h:47: error: \u2018gsize\u2019 does not > name a type > /usr/include/glib-2.0/glib/gstring.h:52: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gstring.h:59: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:68: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:69: error: \u2018gsize\u2019 was not > declared in this scope > /usr/include/glib-2.0/glib/gstring.h:78: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:80: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:82: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:84: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:89: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:93: error: \u2018gunichar\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:99: error: \u2018gunichar\u2019 has not > been declared > 
/usr/include/glib-2.0/glib/gstring.h:102: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:104: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:107: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:110: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:111: error: \u2018gunichar\u2019 has > not been declared > /usr/include/glib-2.0/glib/gstring.h:113: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:116: error: \u2018gsize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:118: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:120: error: \u2018gssize\u2019 has not > been declared > /usr/include/glib-2.0/glib/gstring.h:121: error: \u2018gssize\u2019 has not > been declared > In file included from /usr/include/glib-2.0/glib.h:50, > from /usr/include/gstreamer-0.10/gst/gst.h:27, > from ../main.cpp:1: > /usr/include/glib-2.0/glib/giochannel.h:88: error: expected `}' before > \u2018GLIB_SYSDEF_POLLIN\u2019 > /usr/include/glib-2.0/glib/giochannel.h:89: error: expected initializer > before \u2018GLIB_SYSDEF_POLLOUT\u2019 > /usr/include/glib-2.0/glib/giochannel.h:94: error: expected constructor, > destructor, or type conversion before \u2018;\u2019 token > /usr/include/glib-2.0/glib/giochannel.h:120: error: \u2018gsize\u2019 does > not name a type > /usr/include/glib-2.0/glib/giochannel.h:140: error: \u2018GIOCondition\u2019 > has not been declared > /usr/include/glib-2.0/glib/giochannel.h:146: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:147: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:151: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:152: error: \u2018gsize\u2019 has > not been 
declared > /usr/include/glib-2.0/glib/giochannel.h:155: error: \u2018gint64\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:161: error: \u2018GIOCondition\u2019 > has not been declared > /usr/include/glib-2.0/glib/giochannel.h:176: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:177: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:180: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:181: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:183: error: \u2018gint64\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:193: error: \u2018GIOCondition\u2019 > has not been declared > /usr/include/glib-2.0/glib/giochannel.h:198: error: \u2018GIOCondition\u2019 > has not been declared > /usr/include/glib-2.0/glib/giochannel.h:200: error: \u2018GIOCondition\u2019 > has not been declared > /usr/include/glib-2.0/glib/giochannel.h:208: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:209: error: \u2018gsize\u2019 does > not name a type > /usr/include/glib-2.0/glib/giochannel.h:210: error: \u2018GIOCondition\u2019 > does not name a type > /usr/include/glib-2.0/glib/giochannel.h:236: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:237: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:241: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:245: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:249: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:250: error: \u2018gsize\u2019 has > not been declared > /usr/include/glib-2.0/glib/giochannel.h:253: error: \u2018gunichar\u2019 has > not been declared > 
> /usr/include/glib-2.0/glib/giochannel.h:257: error: ‘gssize’ has
> not been declared
> /usr/include/glib-2.0/glib/giochannel.h:258: error: ‘gsize’ has
> not been declared
> /usr/include/glib-2.0/glib/giochannel.h:261: error: ‘gunichar’ has
> not been declared
> /usr/include/glib-2.0/glib/giochannel.h:264: error: ‘gint64’ has
> not been declared
> /usr/include/glib-2.0/glib/giochannel.h:277: error: ‘GQuark’ does
> not name a type
> /usr/include/glib-2.0/glib/giochannel.h:364: error: expected declaration
> before ‘}’ token
> make: *** [main.o] Error 1
>
>
> --
> View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Gstreamer-And-Eclipse-tp2401173p2401173.html
> Sent from the GStreamer-devel mailing list archive at Nabble.com.
>
> ------------------------------------------------------------------------------
> This SF.net Dev2Dev email is sponsored by:
>
> Show off your parallel programming skills.
> Enter the Intel(R) Threading Challenge 2010.
> http://p.sf.net/sfu/intel-thread-sfd
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

From t.i.m at zen.co.uk Fri Sep 3 01:30:03 2010
From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=)
Date: Fri, 03 Sep 2010 00:30:03 +0100
Subject: [gst-devel] Gstreamer And Eclipse
In-Reply-To: <1283240280787-2401173.post@n4.nabble.com>
References: <1283240280787-2401173.post@n4.nabble.com>
Message-ID: <1283470204.28334.3.camel@zingle>

On Tue, 2010-08-31 at 00:38 -0700, frknml wrote:
> I have created a c++ project in my eclipse,and gstreamer-0.10 folder in my
> /usr/include directory.
>
> I'm only trying to include library but it gives
> hundreds of errors :)

That's not going to work, and not how it's supposed to work.
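[Editor's note: the wall of glib errors above is the classic symptom of adding /usr/include/gstreamer-0.10 to the include path by hand: GLib keeps a platform-specific header directory under /usr/lib, so the compiler flags need to come from pkg-config. A minimal sketch; the fallback path list is an assumption about a typical 0.10 install and is only there so the shape of the command is visible on machines without GStreamer.]

```shell
# Ask pkg-config for the complete include-path set; fall back to the
# directories a typical GStreamer 0.10 install uses (an assumption) so
# the command is still printable when gstreamer-0.10 is not installed.
FLAGS=$(pkg-config --cflags gstreamer-0.10 2>/dev/null) || \
  FLAGS="-I/usr/include/gstreamer-0.10 -I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include -I/usr/include/libxml2"
echo "g++ -c main.cpp $FLAGS"
```

In an IDE the directories printed by `pkg-config --cflags` are what belong in the project's include-path settings, rather than a single hand-added `/usr/include/gstreamer-0.10` entry.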
You need to get the include paths to pass to the compiler using pkg-config --cflags gstreamer-0.10 and then #include How this has to be done exactly in Eclipse I don't know, but pkg-config is a pretty standard tool on linux so I'm sure there's a solution for this. (The reason this doesn't work is that there are includes in other directories as well - glib for example may have platform-specific headers in /usr/lib/glib-2.0/include that are needed). Cheers -Tim From tristan at sat.qc.ca Fri Sep 3 02:09:52 2010 From: tristan at sat.qc.ca (Tristan Matthews) Date: Thu, 2 Sep 2010 20:09:52 -0400 Subject: [gst-devel] trying to get to "hello world" with RTP In-Reply-To: References: Message-ID: 2010/9/2 Bert Douglas > Hi All, > > Trying to get some basic rtp working. No luck. > Thanks for looking. > > -------------------------------------------------------- > > # rtp-source > gst-launch \ > videotestsrc pattern=snow \ > ! video/x-raw-rgb, width=400, height=300, frame-rate=10/1 \ > ! rtpvrawpay \ > ! udpsink host=127.0.0.1 port=51234 > > > # rtp-sink > gst-launch \ > udpsrc uri=udp://127.0.0.1:51234 \ > ! rtpvrawdepay \ > ! video/x-raw-rgb, width=400, height=300, frame-rate=10/1 \ > ! ffmpegcolorspace \ > ! ximagesink > > > You need to specify the caps property on your "rtp-sink" property. See gst-plugins-good/tests/examples/rtp/client-H264.sh and the other rtp examples in that directory. > bertd at bertd-laptop:~/gstreamer/panocam$ . rtp-sink.sh > Setting pipeline to PAUSED ... > Pipeline is live and does not need PREROLL ... > Setting pipeline to PLAYING ... > New clock: GstSystemClock > ERROR: from element /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0: > Internal GStreamer error: negotiation problem. Please file a bug at > http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer. 
> Additional debug info: > gstbasertpdepayload.c(361): gst_base_rtp_depayload_chain (): > /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0: > Not RTP format was negotiated > Execution ended after 12744218 ns. > Setting pipeline to PAUSED ... > Setting pipeline to READY ... > Setting pipeline to NULL ... > Freeing pipeline ... > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Tristan Matthews email: tristan at sat.qc.ca web: http://tristanswork.blogspot.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From tristan at sat.qc.ca Fri Sep 3 03:55:03 2010 From: tristan at sat.qc.ca (Tristan Matthews) Date: Thu, 2 Sep 2010 21:55:03 -0400 Subject: [gst-devel] trying to get to "hello world" with RTP In-Reply-To: References: Message-ID: 2010/9/2 Tristan Matthews > 2010/9/2 Bert Douglas > > You need to specify the caps property on your "rtp-sink" property. > Sorry, you need to set the "caps" property on your "rtp-sink" pipeline's udpsrc. -- Tristan Matthews email: tristan at sat.qc.ca web: http://tristanswork.blogspot.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From bertd at tplogic.com Fri Sep 3 06:54:37 2010 From: bertd at tplogic.com (Bert Douglas) Date: Thu, 2 Sep 2010 23:54:37 -0500 Subject: [gst-devel] simple raw video RTP -- still no joy Message-ID: Tristan tried to help me. I tried setting caps as recommended. Still not working. Thanks for looking. # rtp-server gst-launch \ videotestsrc pattern=red \ ! video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, frame-rate=1/1 \ ! 
rtpvrawpay \ ! udpsink host=127.0.0.1 port=51234 # rtp-client CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW," CAPS=$CAPS"sampling=(string)RGB,depth=(int)8,width=(int)4,height=(int)4" gst-launch \ udpsrc uri=udp://127.0.0.1:51234 caps=$CAPS \ ! rtpvrawdepay \ ! video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, frame-rate=1/1 \ ! ffmpegcolorspace \ ! ximagesink Setting pipeline to PAUSED ... Pipeline is live and does not need PREROLL ... Setting pipeline to PLAYING ... New clock: GstSystemClock 0:00:00.056868267 14918 0x1188c90 ERROR rtpvrawdepay gstrtpvrawdepay.c:240:gst_rtp_vraw_depay_setcaps: no width specified ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data flow error. Additional debug info: gstbasesrc.c(2562): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: streaming task paused, reason not-negotiated (-4) Execution ended after 18091128 ns. Setting pipeline to PAUSED ... Setting pipeline to READY ... Setting pipeline to NULL ... Freeing pipeline ... -------------- next part -------------- An HTML attachment was scrubbed... URL: From gune.harshada at gmail.com Fri Sep 3 08:49:09 2010 From: gune.harshada at gmail.com (harshada gune) Date: Fri, 3 Sep 2010 12:19:09 +0530 Subject: [gst-devel] plugins not found by gst-inspect Message-ID: Hi, I have built some of my plugins. I think they are successfully built as I can see .so files being generated. I set GST_PLUGIN_PATH to the directory containing these .so files. But when I do gst-inspect, I am not finding any of them. What could be the reason? thanks, Harshada -------------- next part -------------- An HTML attachment was scrubbed... 
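[Editor's note: the "no width specified" failure in the log above is a caps-typing problem: `rtpvrawdepay` parses the RTP payload parameters the way they arrive in SDP, i.e. as strings, so `width=(int)4` is invisible to it. A hedged sketch of a corrected client-side caps string follows, per my reading of the gst-plugins-good RTP README (string-typed `sampling`, `depth`, `width`, `height`); note also that the raw-video caps field is spelled `framerate`, not `frame-rate` as in the scripts above.]

```shell
# RTP caps for rtpvrawdepay: sampling/depth/width/height are typed as
# (string), mirroring how they would arrive in an SDP description.
CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW"
CAPS="$CAPS,sampling=(string)RGB,depth=(string)8,width=(string)4,height=(string)4"
echo "$CAPS"

# Run the receiver only where gst-launch (0.10) is actually installed.
if command -v gst-launch >/dev/null 2>&1; then
  gst-launch udpsrc uri=udp://127.0.0.1:51234 caps="$CAPS" \
      ! rtpvrawdepay ! ffmpegcolorspace ! ximagesink
fi
```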
URL: From aki.suihkonen at symbio.com Fri Sep 3 09:22:54 2010 From: aki.suihkonen at symbio.com (Aki Suihkonen) Date: Fri, 3 Sep 2010 10:22:54 +0300 (EEST) Subject: [gst-devel] plugins not found by gst-inspect In-Reply-To: References: Message-ID: <54253.10.118.250.9.1283498574.squirrel@wm.flander.com> Hi, Depending on which existing plugins you have taken as the template or source, you might have missed some crucial declaration, such as GST_PLUGIN_DEFINE and a call to gst_element_register Take a look at for instance plugins-good/gst/effectv, where the gst_element_register is called for all the subtypes in gsteffectv.c > Hi, > > I have built some of my plugins. I think they are successfully built as I > can see .so files being generated. I set GST_PLUGIN_PATH to the directory > containing these .so files. > But when I do gst-inspect, I am not finding any of them. > > What could be the reason? > > > thanks, > Harshada > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd_______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From gune.harshada at gmail.com Fri Sep 3 10:09:16 2010 From: gune.harshada at gmail.com (harshada gune) Date: Fri, 3 Sep 2010 13:39:16 +0530 Subject: [gst-devel] plugins not found by gst-inspect In-Reply-To: <54253.10.118.250.9.1283498574.squirrel@wm.flander.com> References: <54253.10.118.250.9.1283498574.squirrel@wm.flander.com> Message-ID: Can you please elaborate further? Should I attach sample plugin C file for reference? I could not understand the saying "Depending on which existing plugins you have taken as the template" . I thought writing new plugins should follow the same template. 
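[Editor's note: a quick way to narrow down why gst-inspect skips a freshly built plugin. The directory, library name and element name below are hypothetical placeholders, and the registry-cache path assumes GStreamer 0.10.]

```shell
# Hypothetical location of the freshly built plugin -- adjust to taste.
PLUGIN_DIR="$HOME/myplugins/src/.libs"
export GST_PLUGIN_PATH="$PLUGIN_DIR"
echo "GST_PLUGIN_PATH=$GST_PLUGIN_PATH"

if command -v gst-inspect >/dev/null 2>&1; then
  # Pointing gst-inspect at the .so itself prints the load error
  # (missing GST_PLUGIN_DEFINE, unresolved symbols, wrong ABI) instead
  # of silently leaving the plugin out of the listing.
  gst-inspect "$PLUGIN_DIR/libgstmyplugin.so"

  # A stale 0.10 registry cache can also mask a rebuilt plugin.
  rm -f "$HOME"/.gstreamer-0.10/registry*.bin
  gst-inspect myelement
fi
```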
Thanks, Harshada On Fri, Sep 3, 2010 at 12:52 PM, Aki Suihkonen wrote: > Hi, > > Depending on which existing plugins you have taken as the template > or source, you might have missed some crucial declaration, such as > GST_PLUGIN_DEFINE and a call to gst_element_register > Take a look at for instance plugins-good/gst/effectv, > where the gst_element_register is called for all the subtypes > in gsteffectv.c > > > Hi, > > > > I have built some of my plugins. I think they are successfully built as > I > > can see .so files being generated. I set GST_PLUGIN_PATH to the directory > > containing these .so files. > > But when I do gst-inspect, I am not finding any of them. > > > > What could be the reason? > > > > > > thanks, > > Harshada > > > ------------------------------------------------------------------------------ > > This SF.net Dev2Dev email is sponsored by: > > > > Show off your parallel programming skills. > > Enter the Intel(R) Threading Challenge 2010. > > > http://p.sf.net/sfu/intel-thread-sfd_______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.sourceforge.net > > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From wim.taymans at gmail.com Fri Sep 3 10:10:32 2010 From: wim.taymans at gmail.com (Wim Taymans) Date: Fri, 03 Sep 2010 10:10:32 +0200 Subject: [gst-devel] simple raw video RTP -- still no joy In-Reply-To: References: Message-ID: <1283501432.4299.6.camel@metal> On Thu, 2010-09-02 at 23:54 -0500, Bert Douglas wrote: > Tristan tried to help me. I tried setting caps as recommended. Still > not working. You're doing it wrong. read the instructions in this document carefully: http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/gst/rtp/README#n251 Wim > Thanks for looking. > > # rtp-server > gst-launch \ > videotestsrc pattern=red \ > ! video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, > frame-rate=1/1 \ > ! rtpvrawpay \ > ! udpsink host=127.0.0.1 port=51234 > > # rtp-client > CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW," > CAPS= > $CAPS"sampling=(string)RGB,depth=(int)8,width=(int)4,height=(int)4" > gst-launch \ > udpsrc uri=udp://127.0.0.1:51234 caps=$CAPS \ > ! rtpvrawdepay \ > ! video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, > frame-rate=1/1 \ > ! ffmpegcolorspace \ > ! ximagesink > > Setting pipeline to PAUSED ... > Pipeline is live and does not need PREROLL ... > Setting pipeline to PLAYING ... > New clock: GstSystemClock > 0:00:00.056868267 14918 0x1188c90 ERROR rtpvrawdepay > gstrtpvrawdepay.c:240:gst_rtp_vraw_depay_setcaps: no > width specified > ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal > data flow error. > Additional debug info: > gstbasesrc.c(2562): gst_base_src_loop > (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: > streaming task paused, reason not-negotiated (-4) > Execution ended after 18091128 ns. > Setting pipeline to PAUSED ... > Setting pipeline to READY ... > Setting pipeline to NULL ... > Freeing pipeline ... 
> > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From bilboed at gmail.com Fri Sep 3 11:12:59 2010 From: bilboed at gmail.com (Edward Hervey) Date: Fri, 03 Sep 2010 11:12:59 +0200 Subject: [gst-devel] Pre-release of GNonLin GStreamer editing plugins 0.10.15.2 Message-ID: <1283505179.14358.25.camel@localhost> Hi all, I just made a pre-release of GNonLin since it was long overdue. Please test it thoroughly and file blocker bugs for all regressions or other major issues you find at http://gstreamer.freedesktop.org/bugs/ http://gstreamer.freedesktop.org/data/src/gnonlin/pre/gnonlin-0.10.15.2.tar.bz2 http://gstreamer.freedesktop.org/data/src/gnonlin/pre/gnonlin-0.10.15.2.tar.gz md5sums in the same directory. Unless any blocker bug or regression pops up, expect the 0.10.16 release on monday. Edward From aki.suihkonen at symbio.com Fri Sep 3 11:54:07 2010 From: aki.suihkonen at symbio.com (Aki Suihkonen) Date: Fri, 3 Sep 2010 12:54:07 +0300 (EEST) Subject: [gst-devel] plugins not found by gst-inspect In-Reply-To: References: <54253.10.118.250.9.1283498574.squirrel@wm.flander.com> Message-ID: <57397.10.118.250.9.1283507647.squirrel@wm.flander.com> Please do attach. > Can you please elaborate further? Should I attach sample plugin C file for > reference? > I could not understand the saying "Depending on which existing plugins you > have taken as the template" . > I thought writing new plugins should follow the same template. 
> > Thanks, > Harshada > > > On Fri, Sep 3, 2010 at 12:52 PM, Aki Suihkonen > wrote: > >> Hi, >> >> Depending on which existing plugins you have taken as the template >> or source, you might have missed some crucial declaration, such as >> GST_PLUGIN_DEFINE and a call to gst_element_register >> Take a look at for instance plugins-good/gst/effectv, >> where the gst_element_register is called for all the subtypes >> in gsteffectv.c >> >> > Hi, >> > >> > I have built some of my plugins. I think they are successfully built >> as >> I >> > can see .so files being generated. I set GST_PLUGIN_PATH to the >> directory >> > containing these .so files. >> > But when I do gst-inspect, I am not finding any of them. >> > >> > What could be the reason? >> > >> > >> > thanks, >> > Harshada >> > >> ------------------------------------------------------------------------------ >> > This SF.net Dev2Dev email is sponsored by: >> > >> > Show off your parallel programming skills. >> > Enter the Intel(R) Threading Challenge 2010. >> > >> http://p.sf.net/sfu/intel-thread-sfd_______________________________________________ >> > gstreamer-devel mailing list >> > gstreamer-devel at lists.sourceforge.net >> > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> > >> >> >> >> >> ------------------------------------------------------------------------------ >> This SF.net Dev2Dev email is sponsored by: >> >> Show off your parallel programming skills. >> Enter the Intel(R) Threading Challenge 2010. >> http://p.sf.net/sfu/intel-thread-sfd >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. 
> http://p.sf.net/sfu/intel-thread-sfd
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

From t.i.m at zen.co.uk Fri Sep 3 11:54:24 2010
From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=)
Date: Fri, 03 Sep 2010 10:54:24 +0100
Subject: [gst-devel] RELEASE: GStreamer Good Plug-ins 0.10.25 "Woe to You Oh Earth and Sea"
Message-ID: <1283507664.4196.27.camel@zingle>

This mail announces the release of GStreamer Good Plug-ins 0.10.25 "Woe to You Oh Earth and Sea".

GStreamer Good Plug-ins is a set of plug-ins that we consider to have good quality code and correct functionality, under our preferred license (LGPL for the plug-in code, LGPL or LGPL-compatible for the supporting library).

Highlights of this release:

* v4l2src: massive performance improvement in many cases
* streaming mode fixes for avi and matroska/webm
* seeking in matroska and webm files that don't have an index
* new cpureport element for debugging

For more information, see http://gstreamer.freedesktop.org/modules/gst-plugins-good.html

To file bugs, request features or submit patches, please go to https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer&component=gst-plugins-good

Direct links:
http://gstreamer.freedesktop.org/src/gst-plugins-good/gst-plugins-good-0.10.25.tar.bz2
http://gstreamer.freedesktop.org/src/gst-plugins-good/gst-plugins-good-0.10.25.tar.gz

MD5 sums (for tarballs downloaded from gstreamer.freedesktop.org):
d734bc866788d1d6fc74c4ff1318926c gst-plugins-good-0.10.25.tar.bz2
7cf6984ef6ee0c205b3b08bb49bc0e40 gst-plugins-good-0.10.25.tar.gz

Enjoy!

-------------- next part --------------

Release notes for GStreamer Good Plug-ins 0.10.25 "Woe to You Oh Earth and Sea"

The GStreamer team is proud to announce a new release in the 0.10.x stable series of the GStreamer Good Plug-ins. The 0.10.x series is a stable series targeted at end users.
It is not API or ABI compatible with the stable 0.8.x series. It is, however, parallel installable with the 0.8.x series. "Such ingratitude. After all the times I've saved your life." A collection of plug-ins you'd want to have right next to you on the battlefield. Shooting sharp and making no mistakes, these plug-ins have it all: good looks, good code, and good licensing. Documented and dressed up in tests. If you're looking for a role model to base your own plug-in on, here it is. If you find a plot hole or a badly lip-synced line of code in them, let us know - it is a matter of honour for us to ensure Blondie doesn't look like he's been walking 100 miles through the desert without water. This module contains a set of plug-ins that we consider to have good quality code, correct functionality, our preferred license (LGPL for the plug-in code, LGPL or LGPL-compatible for the supporting library). We believe distributors can safely ship these plug-ins. People writing elements should base their code on these elements. Other modules containing plug-ins are: gst-plugins-base contains a basic set of well-supported plug-ins gst-plugins-ugly contains a set of well-supported plug-ins, but might pose problems for distributors gst-plugins-bad contains a set of less supported plug-ins that haven't passed the rigorous quality testing we expect Features of this release * v4l2src: massive performance improvement in many cases * streaming mode fixes for avi and matroska/webm * seeking in matroska and webm files that don't have an index * new cpureport element for debugging * avidemux: improve VBR audio stream handling * avidemux: streaming mode fixes: use proper offset for movi-based index, handle 0-size data chunks * debugutils: new element cpureport, posts "cpu-report" element messages on bus * flacdec, rtspsrc, rtph264pay, rtpmp4vdepay: memory leak fixes * gconfvideosrc: use correct GConf key (ie. 
not the audiosrc key) * gdkpixbuf: remove gdkpixbuf3 plugin again, gdk-pixbuf was split out of gtk+ and will stay at 2.x * id3v2mux: write beats-per-minute tag using TBPM frame * jpegdec: fix markers parsing regression * matroskademux: do not error out on a block with unknown tracknumber * matroskademux: fix streaming in case where the size in bytes is unknown * matroskademux: handle bogus files storing ADTS AAC data * matroskademux: support seeking in local files even if they don't have an index * matroskamux: don't try to seek back and fix up headers if streamable=TRUE * pulsesink: fix race when creating multiple pulsesinks at the same time * qtdemux: also calculate PAR using track width and height for QT files * qtdemux: fix the max/avg in btrt atom reading * qtdemux: improve reverse playback * qtdemux: parse 64-bit version of mvhd atom as well instead of erroring out * qtdemux: prevent reading past avc1 atom when parsing * rtpg729pay: avoid basertppayload perfect-rtptime mode * rtph263pdepay: allow more clock-rates as input * rtpL16depay: also parse encoding-params for the number of channels * rtpL16depay: default to 1 channel if number of channels not specified * rtpmp4gpay: implement perfect timestamps * rtspsrc: add "port-range" property, useful for setups with firewall/ipsec * rtspsrc: don't reuse udp sockets (avoids odd errors when data from previous streams is received) * udpsrc: add "reuse" property to enable or disable port reuse (enabled by default, but disabled in rtspsrc) * v4l2: sort formats in the right order so that non-emulated formats are prefered * videobalance: fix wrong locking order that could lead to a deadlock * videomixer: only reset QoS information and send a NEWSEGMENT event downstream for NEWSEGMENT events on the master pad Bugs fixed in this release * 626463 : [matroskademux] " reading large block of size 14688496 not supported " * 593117 : [avidemux] Support AVF files * 618535 : [avidemux] fails to stream eva_2010_2.22_promo1.avi * 
621520 : [id3v2mux] write beats-per-minute tag * 622017 : [GstRtpMP4GDepay] Packet payload was too short. * 622577 : rtspsrc has confusing error messages * 623209 : bug in rtpL16depay * 623357 : avidemux: push mode doesn't work for some http streaming avi files * 623629 : Reverse Playback Issue in QtDemux * 624173 : [qtdemux] qt file with dimension data in tkhd does not get pixel-aspect-ratio in caps * 624331 : videobalance: deadlocks/freezes when changing " brightness " property * 624455 : The matroska muxer seeks even when used with streamable=TRUE * 624770 : rtspsrc: memory leak in gst_rtspsrc_handle_request * 625002 : [examples] Don't use GdkDraw * 625153 : rtspsrc: add property to set client port range * 625302 : [qtdemux] Set the pixel-aspect-ratio field also for par=1/1 * 625371 : [matroskademux] critical warnings when playing live webm with progressive download enabled * 625442 : pulsesink: crash - pa_threaded_mainloop_stop is called from the pa thread * 625452 : [videomixer] Pipeline from the docs doesn't work anymore * 626467 : matroskademux: CRITICAL **: file matroska-demux.c: line 578 (gst_matroska_demux_get_length): should not be reached * 626609 : [qtdemux] segfault when parsing avc1 atom * 626619 : [imagefreeze] Incorrect seek behaviour * 627689 : [deinterlace] Broken timestamps * 617368 : [matroska] Implement push-mode seeking and non-cue seeking Download You can find source releases of gst-plugins-good in the download directory: http://gstreamer.freedesktop.org/src/gst-plugins-good/ GStreamer Homepage More details can be found on the project's website: http://gstreamer.freedesktop.org/ Support and Bugs We use GNOME's bugzilla for bug reports and feature requests: http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer Developers GStreamer is stored in Git, hosted at git.freedesktop.org, and can be cloned from there. Interested developers of the core library, plug-ins, and applications should subscribe to the gstreamer-devel list. 
If there is sufficient interest we will create more lists as necessary.

Applications

Contributors to this release

* Alessandro Decina
* Andoni Morales Alastruey
* Arnaud Vrac
* David Schleef
* Jonathan Matthew
* Mark Nauwelaerts
* Philippe Normand
* Sebastian Dröge
* Sjoerd Simons
* Stefan Kost
* Thiago Santos
* Tim-Philipp Müller
* Wim Taymans
* Zaheer Abbas Merali

From t.i.m at zen.co.uk Fri Sep 3 11:54:31 2010
From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=)
Date: Fri, 03 Sep 2010 10:54:31 +0100
Subject: [gst-devel] RELEASE: GStreamer Ugly Plug-ins 0.10.16 "Because He Knows the Time is Short"
Message-ID: <1283507671.4196.28.camel@zingle>

This mail announces the release of GStreamer Ugly Plug-ins 0.10.16 "Because He Knows the Time is Short".

GStreamer Ugly Plug-ins is a set of plug-ins that have good quality and correct functionality, but distributing them might pose problems. The license on either the plug-ins or the supporting libraries might not be how we'd like. The code might be widely known to present patent problems.

Highlights of this release:

* refactor x264enc to use current x264 API correctly, add new properties and new defaults
* liboil is no longer required, but orc is now required

For more information, see http://gstreamer.freedesktop.org/modules/gst-plugins-ugly.html

To file bugs, request features or submit patches, please go to https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer&component=gst-plugins-ugly

Direct links:
http://gstreamer.freedesktop.org/src/gst-plugins-ugly/gst-plugins-ugly-0.10.16.tar.gz
http://gstreamer.freedesktop.org/src/gst-plugins-ugly/gst-plugins-ugly-0.10.16.tar.bz2

MD5 sums (for tarballs downloaded from gstreamer.freedesktop.org):
989e1b0fab010f73f76912f70ec5f62a gst-plugins-ugly-0.10.16.tar.bz2
3671366525edaf9838a7a03b1b4a797e gst-plugins-ugly-0.10.16.tar.gz

Enjoy!
-------------- next part --------------

Release notes for GStreamer Ugly Plug-ins 0.10.16 "Because He Knows the Time is Short"

The GStreamer team is proud to announce a new release in the 0.10.x stable series of the GStreamer Ugly Plug-ins. The 0.10.x series is a stable series targeted at end users. It is not API or ABI compatible with the stable 0.8.x series. It is, however, parallel installable with the 0.8.x series.

"When you have to shoot, shoot. Don't talk."

There are times when the world needs a color between black and white. Quality code to match the good's, but two-timing, backstabbing and ready to sell your freedom down the river. These plug-ins might have a patent noose around their neck, or a lock-up license, or any other problem that makes you think twice about shipping them.

We don't call them ugly because we like them less. Does a mother love her son less because he's not as pretty as the other ones? No - she commends him on his great personality. These plug-ins are the life of the party. And we'll still step in and set them straight if you report any unacceptable behaviour - because there are two kinds of people in the world, my friend: those with a rope around their neck and the people who do the cutting.

This module contains a set of plug-ins that have good quality and correct functionality, but distributing them might pose problems. The license on either the plug-ins or the supporting libraries might not be how we'd like. The code might be widely known to present patent problems. Distributors should check if they want/can ship these plug-ins.
Other modules containing plug-ins are: gst-plugins-base contains a basic set of well-supported plug-ins gst-plugins-good contains a set of well-supported plug-ins under our preferred license gst-plugins-bad contains a set of less supported plug-ins that haven't passed the rigorous quality testing we expect Features of this release * refactor x264enc to use current x264 API correctly, add new properties and new defaults * liboil is no longer required, but orc is now required * build: require orc >= 0.4.5, GLib >= 2.20, automake >= 1.10, autoconf >= 2.60; liboil is no longer required * asfdemux: fix playback of files or streams that are shorter than the advertised preroll value * asfdemux: fix sending eos event for chained asfs in pull mode (exotic) * asfdemux: fix playback of files or streams that advertise miniscule preroll values * lamemp3enc: implement latency query * rmdemux: fix playback of sipro audio streams * x264enc: refactor code in preparation for presets/tunings * x264enc: add "profile" property (and default to MAIN profile) * x264enc: improve defaults: medium speed/quality preset; auto mode for threads * x264enc: add "speed-preset", "tune" and "psy-tune" properties * x264enc: add "option-string" property to specify advanced parameters * x264enc: set time base if needed, fixes visual artifacts * x264enc: add "sliced-threads", "sync-lookahead", "intra-refresh", "mb-tree", and "rc-lookahead" properties * x264enc: fix compilation against ancient x264 versions (X264_BUILD <= 75) * x264enc: speed up first pass of multi-pass encoding (has no impact on quality) * x264enc: fix flushing of delayed frames with new default settings Bugs fixed in this release * 599718 : [asf] support chained asfs * 600412 : [asfdemux] Wrong handling of downstream GstFlowReturn * 607798 : x264enc needs updating to support new features and use x264 correctly * 618896 : lamemp3enc doesn't implement latency query * 620007 : Gibberish sound when playing a certain RealMedia file of 
Sipro/ACELP.net audio codec
* 620344 : Update gst-plugins-ugly docs on website
* 622407 : [asfdemux] doesn't detect some streams if preroll value is very small
* 624786 : x264enc time base is wrong
* 625557 : x264enc doesn't flush delayed frames properly
* 626577 : [x264enc] regression: doesn't work with older versions of x264
* 627946 : mp3parse misuses GST_FLOW_IS_FATAL, doesn't forward GST_FLOW_UNEXPECTED upstream

Download

You can find source releases of gst-plugins-ugly in the download directory:
http://gstreamer.freedesktop.org/src/gst-plugins-ugly/

GStreamer Homepage

More details can be found on the project's website:
http://gstreamer.freedesktop.org/

Support and Bugs

We use GNOME's bugzilla for bug reports and feature requests:
http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer

Developers

GStreamer is stored in Git, hosted at git.freedesktop.org, and can be cloned from there. Interested developers of the core library, plug-ins, and applications should subscribe to the gstreamer-devel list. If there is sufficient interest we will create more lists as necessary.

Applications

Contributors to this release

* Alessandro Decina
* David Hoyt
* David Schleef
* Edward Hervey
* Mark Nauwelaerts
* Olivier Crête
* Robert Swain
* Sebastian Dröge
* Stefan Kost
* Thiago Santos
* Tim-Philipp Müller
* Tristan Matthews
* Wim Taymans

From t.i.m at zen.co.uk Fri Sep 3 11:54:39 2010
From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=)
Date: Fri, 03 Sep 2010 10:54:39 +0100
Subject: [gst-devel] RELEASE: GStreamer Bad Plug-ins 0.10.20 "For it is a Human Number"
Message-ID: <1283507679.4196.29.camel@zingle>

This mail announces the release of GStreamer Bad Plug-ins 0.10.20 "For it is a Human Number".

GStreamer Bad Plugins is a set of plugins that aren't up to par compared to the rest.
They might be close to being good quality, but they're missing something - be it a good code review, some documentation, a set of tests, a real live maintainer, or some actual wide use.

Highlights of this release:

* asfmux streaming fixes and improvements
* new video effects plugins: coloreffects, gaudieffects, geometrictransform
* new gsettings plugin with gsettings{audio,video}{src,sink} elements
* new ivfparse element
* new rtmpsrc element
* new shmsink and shmsrc elements for IPC using shared memory
* new videomaxrate element
* dshowvideosink improvements
* vdpau: H.264 and MPEG-4 decoder (not enabled for autoplugging yet though)
* vp8enc: support multipass encoding and keyframe-only mode
* neonhttpsrc: timeout properties and cookie support
* h264parse and mpegvideoparse: can periodically insert codec data into stream now

For more information, see http://gstreamer.freedesktop.org/modules/gst-plugins-bad.html

To file bugs, request features or submit patches, please go to http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer&component=gst-plugins-bad

Direct links:
http://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-0.10.20.tar.gz
http://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-0.10.20.tar.bz2

MD5 sums (for tarballs downloaded from gstreamer.freedesktop.org):
7c84766f6d24f41ba90c3f6141012ab8 gst-plugins-bad-0.10.20.tar.bz2
cafb4aee08a1994dd0ba3d119756ef0f gst-plugins-bad-0.10.20.tar.gz

Enjoy!

-------------- next part --------------

Release notes for GStreamer Bad Plug-ins 0.10.20 "For it is a Human Number"

The GStreamer team is proud to announce a new release in the 0.10.x stable series of the GStreamer Bad Plug-ins. The 0.10.x series is a stable series targeted at end users. It is not API or ABI compatible with the stable 0.8.x series. It is, however, parallel installable with the 0.8.x series.

"That an accusation?"
No perfectly groomed moustache or any amount of fine clothing is going to cover up the truth - these plug-ins are Bad with a capital B. They look fine on the outside, and might even appear to get the job done, but at the end of the day they're a black sheep. Without a golden-haired angel to watch over them, they'll probably land in an unmarked grave at the final showdown. Don't bug us about their quality - exercise your Free Software rights, patch up the offender and send us the patch on the fastest steed you can steal from the Confederates. Because you see, in this world, there's two kinds of people, my friend: those with loaded guns and those who dig. You dig. This module contains a set of plug-ins that aren't up to par compared to the rest. They might be close to being good quality, but they're missing something - be it a good code review, some documentation, a set of tests, a real live maintainer, or some actual wide use. If the blanks are filled in they might be upgraded to become part of either gst-plugins-good or gst-plugins-ugly, depending on the other factors. If the plug-ins break, you can't complain - instead, you can fix the problem and send us a patch, or bribe someone into fixing them for you. New contributors can start here for things to work on. 
Other modules containing plug-ins are: gst-plugins-base contains a basic set of well-supported plug-ins gst-plugins-good contains a set of well-supported plug-ins under our preferred license gst-plugins-ugly contains a set of well-supported plug-ins, but might pose problems for distributors Features of this release * asfmux streaming fixes and improvements * new video effects plugins: coloreffects, gaudieffects, geometrictransform * new gsettings plugin with gsettings{audio,video}{src,sink} elements * new ivfparse element * new rtmpsrc element * new shmsink and shmsrc elements for IPC using shared memory * new videomaxrate element * dshowvideosink improvements * vdpau: H.264 and MPEG-4 decoder (not enabled for autoplugging yet though) * vp8enc: support multipass encoding and keyframe-only mode * neonhttpsrc: timeout properties and cookie support * h264parse and mpegvideoparse: can periodically insert codec data into stream now * build: require GLib >= 2.20, automake >= 1.10, autoconf >= 2.60, want orc * asfmux: deprecate "is-live" property, replaced by new "streamable" property * asfmux: don't set the 'seekable' flag in headers if we are streaming * asfmux: put headers into "streamheader" field in output caps for streaming * asfmux: write preroll info in the header at initialization * bayer: support more formats in bayer2rgb, add rgb2bayer element * camerabin: make viewfinder-sink property work with bins * celt: add support for celt 0.8, remove support for celt < 0.5 * celtenc: add "prediction" and "start band" properties * coloreffects: new element with heat, sepia, xray and cross-process effects * dshowvideosink: many fixes and improvements * fpsdisplaysink: add "fps-update-interval" and read-only "max-fps"/"min-fps" properties * frei0r: check for plugins in /usr/{local/,}lib{32,64}/frei0r-1 too * gaudieffects: new plugin with burn, chromium, dilate, dodge, exclusion, gaussianblur and solarize video effect elements * geometrictransform: new plugin with circle, 
diffuse, kaleidoscope, marble, pinch, sphere, twirl, and waterripple, fisheye, mirror, square, tunnel, bulge, stretch video effect elements * gsettings: new GSettings plugin with audio/video sources and sinks (to replace gconf plugin) * h264parse: add "config-interval" property to insert SPS/PPS at periodic intervals * h264parse: handle 3-byte bytestream sync codes; process incoming timestamps more correctly * id3mux: add support for beats-per-minute tag * invtelecine: support more video formats, more pulldown formats, add "verify-field-flags" property * ivfparse: add simple IVF parser element (simple framing for VP8 video data) * jpegformat: add exif writing to jifmux and exif parsing to jpegparse * jpegparse: skip extra 0xff markers, optimize jpeg image parsing * mimic: lots of fixes and clean-ups * mpeg4videoparse: add "config-interval" property to re-insert config in stream * mpegtsmux: start pmt at 0x020; take all the pmt in the streamheaders * mpegtsparse: actually work when we have small buffers coming in * mpegvideoparse: apply previous timestamp when there isn't any newer * neonhttpsrc: add "connect-timeout", "read-timeout" and "cookies" properties * qtmux: write audio/video stream bitrates into header, if available * qtmux: write track-number etc. tags even if count is not available * rtmpsrc: new RTMP source element based on librtmp * rtpdtmfmux: add priority sink pads and drop buffers on non-priority sink pads when something is incoming on the priority sink * rtpmux: add support for GstBufferLists; aggregate incoming segments; fix buffer leak * shm: add new shm-based shmsink and shmsrc elements for IPC using shared memory * vdpau: add H.264 decoder and MPEG-4 part 2 decoder; countless other fixes and improvements * videomaxrate: new plugin/element to limit videorate conditionally based on threshold * vp8dec: mark discont buffers, set decoder deadline from the QoS information * vp8enc: allow a maximum keyframe distance of 0, i.e. 
all frames are keyframes * vp8enc: fix handling of invisible/alt ref frames * vp8enc: add support for enabling automatic insertion of alt-ref frames by the encoder * vp8enc: implement multipass encoding * wildmidi: Add support for wildmidi 0.2.3 Bugs fixed in this release * 625908 : [geometrictransform] Some more configuration options for effects * 625076 : neonhttpsrc: add connect-timeout and read-timeout properties * 620746 : basevideodecoder: remove spurious warning * 566614 : bayer2rgb: Make first line configurable * 570428 : autogen.sh fails * 574290 : [dshowvideosink] make set_xwindow_id() in PLAYING state work * 579926 : [directshowvideosink] Doesn't update the last frame after a seek with the pipeline in PAUSED state. * 580967 : shared memory based sink and source * 591622 : [vdpau] needs better error/failure handling * 602551 : dshowvideosink window close doesn't cause gst-launch to exit * 602936 : [ mp4mux] Lipsync issue when converting mkv to mp4 using h264/aac * 613346 : [dshowvideosink] Add support for updating video caps * 616265 : Add a GSettings plugin that provides the same services as the GConf plugin * 618336 : [mpegvideoparse] mpegvideoparse makes some streams unplayable * 618522 : [asfmux][patch] Improve support for streaming * 618921 : [dshowvideosink] Replace CoIntialize with CoInitializeEx for bettrer integration with GStreamer threads * 618936 : [dshowvideosink] close the created window in ::stop() * 620324 : Format warning in ivfparse * 620717 : [geometrytransform] Incomplete template caps * 620825 : [geometrytransform] Make properties controllable and threadsafe * 620978 : insert NAL7/8 always when encountering I frame * 621205 : [mpeg4videoparse] add config-interval property to insert mpeg4video config data in regular intervals * 621348 : [vp8enc] Implement multipass encoding * 621523 : [id3mux] write beats-per-minute tag * 622369 : [rtmpsrc] crash if correct server but wrong video file name * 622484 : [qtmux] missing track number tag 
when transcoding to aac * 622690 : elements/jpegparse check fails * 622692 : pipelines/metadata check fails * 622725 : [mpgtsparse] Doesn't remove pids from pes_pids * 623272 : [dshowvideosink] setting force-aspect-ratio has no effect after the sink's renderer has been configured * 623365 : [qtmux and variant] Don't store codec tags * 623550 : doesn't compile with celt 0.8 * 623678 : qtmux: Write AAC/H.264 bitrate if available * 623710 : [frei0r] Load frei0r plugins in /usr/lib64/frei0r-1 too * 623713 : [dshowaudiodec][patch] Fix compilation error * 623722 : gstwildmidi element update to newer library version * 623802 : camerabin: Bin based viewfinder sink support is broken * 623854 : jpegparse reads a wrong EXIF section size * 623881 : aiffmux.c divide by zero * 623883 : [winks] gstksvideosrc.c error on MSVC using gst_element_class_set_details() * 625003 : [examples] Don't use GdkDraw * 625138 : [dshowvideosrc] Don't use a range in the caps if min==max * 625174 : neonhttpsrc: adds cookies support * 625496 : qtmux - misc fix on btrt box * 625722 : [geometrictransform] Some new effect elements for cheese * 625817 : coloreffects: new plugin for lookup table color mapping * 625959 : geometrictransform: make CircleGeometricTransform " radius " property relative * 626049 : [vdpau] crashes in states.check unit test * 626603 : generic/states check fails with gsettings element installed * 626670 : gaudieffects: Fails to link inline functions properly * 626815 : vp8dec: infinite loop if EOS event before started * 627413 : jifmux causes broken jpeg images at least with some rgb pixel format * 627918 : Do not install gst-camera.ui * 627991 : rtpmux will freeze whenever a flush is sent * 624212 : mp4mux produces incorrect frame rates when h264 input uses bframes * 619158 : IVF parser plugin * 619484 : vp8dec: s/IMG_FMT_I420/VPX_IMG_FMT_I420/ * 621404 : [dvbsrc] Set stats-reporting-interval on construction Download You can find source releases of gst-plugins-bad in the download 
directory: http://gstreamer.freedesktop.org/src/gst-plugins-bad/ GStreamer Homepage More details can be found on the project's website: http://gstreamer.freedesktop.org/ Support and Bugs We use GNOME's bugzilla for bug reports and feature requests: http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer Developers GStreamer is stored in Git, hosted at git.freedesktop.org, and can be cloned from there. Interested developers of the core library, plug-ins, and applications should subscribe to the gstreamer-devel list. If there is sufficient interest we will create more lists as necessary. Applications Contributors to this release * Alessandro Decina * Andoni Morales * Andoni Morales Alastruey * Arun Raghavan * Austin Lund * Bastien Nocera * Benjamin Otte * Carl-Anton Ingmarsson * David Hoyt * David Schleef * Edward Hervey * Filippo Argiolas * Jan Schmidt * Jonathan Matthew * Julien Moutte * Luis de Bethencourt * Marc-André Lureau * Mark Nauwelaerts * Michael Smith * Olivier Crête * Philip Jägenstedt * Philippe Normand * Raimo Jarvi * Robert Swain * Sameer Naik * Sebastian Dröge * Sebastian Pölsterl * Stefan Kost * Thiago Santos * Thijs Vermeir * Tim-Philipp Müller * Víctor Manuel Jáquez Leal * Youness Alaoui * Zaheer Abbas Merali * ?????? ????????? From thomazavila at gmail.com Fri Sep 3 13:37:20 2010 From: thomazavila at gmail.com (Thomaz Barros) Date: Fri, 3 Sep 2010 08:37:20 -0300 Subject: [gst-devel] Ubuntu and Gstreamer In-Reply-To: References: Message-ID: Hi Rob, thanks for your response. I only got around to trying your suggestion today because I had other problems during the week. I built the latest GStreamer core and plugins versions and the latest libx264 release, but it didn't work either, and the options you mentioned aren't displayed to me. 2010/8/28 Rob > On 27 August 2010 21:23, Thomaz Barros wrote: > > Hi all, I'm having some problems with Gstreamer in a Ubuntu 10.04 > desktop. > > I'm trying to make an H.264 streaming application but there is a delay > about > > 2-3 seconds.
> > > VENC=" timeoverlay ! x264enc byte-stream=true bitrate=2000 cabac=false ! > > I'm not certain but I suspect it could be x264enc introducing the > delay. Ideally you need an x264 newer than the one in 10.04 (newer > than API version 85) and then the next GStreamer release compiled > against it. Then you could use the tune=0x4 option which enables a > zero latency tuning. However, the zero latency tuning corresponds to: > > else if( !strncasecmp( s, "zerolatency", 11 ) ) > { > param->rc.i_lookahead = 0; > param->i_sync_lookahead = 0; > param->i_bframe = 0; > param->b_sliced_threads = 1; > param->b_vfr_input = 0; > param->rc.b_mb_tree = 0; > } > > (from x264/common/common.c) > > Which, in GStreamer properties with newer -ugly from the GStreamer > developers' PPA could also be given as: > > rc-lookahead=0 sync-lookahead=0 bframes=0 sliced-threads=1 mb-tree=0 > (variable frame rate input is not exposed through the properties). > Perhaps some of those options are only available in newer x264 APIs > and so are not relevant to older x264 as they were not yet > implemented. > > Give it a go. Hopefully it will help. :) I think lookahead introduces > the largest delay when encoding. > > Regards, > Rob > > > ------------------------------------------------------------------------------ > Sell apps to millions through the Intel(R) Atom(Tm) Developer Program > Be part of this innovative community and reach millions of netbook users > worldwide. Take advantage of special opportunities to increase revenue and > speed time-to-market. Join now, and jumpstart your future. > http://p.sf.net/sfu/intel-atom-d2d > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... 
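[Editor's note] Rob's mapping from x264's zerolatency C code to x264enc properties can be collected into one place. The sketch below only assembles and prints a hypothetical variant of the original poster's VENC line; whether each property actually exists depends on the x264 API version your gst-plugins-ugly was built against, so nothing is executed here.

```shell
# Rob's zerolatency mapping as x264enc properties (sketch; availability
# depends on your x264 / gst-plugins-ugly versions):
X264_LOWLAT="rc-lookahead=0 sync-lookahead=0 bframes=0 sliced-threads=1 mb-tree=0"

# Hypothetical variant of the VENC line from the original post.
# The pipeline is echoed, not run:
VENC="timeoverlay ! x264enc byte-stream=true bitrate=2000 cabac=false $X264_LOWLAT"
echo "gst-launch videotestsrc ! $VENC ! fakesink"
```

With a new enough x264 and -ugly, `tune=0x4` (zerolatency) should be equivalent to setting these properties individually.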
URL: From mjoachimiak at gmail.com Fri Sep 3 13:45:22 2010 From: mjoachimiak at gmail.com (Michael Joachimiak) Date: Fri, 3 Sep 2010 14:45:22 +0300 Subject: [gst-devel] Gstreamer And Eclipse In-Reply-To: <1283470204.28334.3.camel@zingle> References: <1283240280787-2401173.post@n4.nabble.com> <1283470204.28334.3.camel@zingle> Message-ID: You should be able to set up additional include directories in Project properties -> C/C++ Build -> Settings -> Tool settings tab -> Directories. Libraries you can add in the same tab, but look under Cross G++ Linker -> Libraries. On the right you'll have libraries (-l) and library search path. Put there what pkg-config gives you for the library you need. 2010/9/3 Tim-Philipp Müller > On Tue, 2010-08-31 at 00:38 -0700, frknml wrote: > > > I have created a c++ project in my eclipse, and gstreamer-0.10 folder in > my > > /usr/include directory. > > > > I'm only trying to include library but it > gives > > hundreds of errors :) > > That's not going to work, and not how it's supposed to work. > > You need to get the include paths to pass to the compiler using > > pkg-config --cflags gstreamer-0.10 > > and then > > #include > > How this has to be done exactly in Eclipse I don't know, but pkg-config > is a pretty standard tool on linux so I'm sure there's a solution for > this. > > (The reason this doesn't work is that there are includes in other > directories as well - glib for example may have platform-specific > headers in /usr/lib/glib-2.0/include that are needed). > > Cheers > -Tim > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010.
> http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -- Your Sincerely Michael Joachimiak -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.swain at gmail.com Fri Sep 3 14:48:12 2010 From: robert.swain at gmail.com (Rob) Date: Fri, 3 Sep 2010 14:48:12 +0200 Subject: [gst-devel] Ubuntu and Gstreamer In-Reply-To: References: Message-ID: On 3 September 2010 13:37, Thomaz Barros wrote: > Hi Rob, thanks for your response. I just tried your suspect today because i > had other problems during the week. I built the lastest gstreamer core and > plugins version and the latest libx264 release but it didn't work as well > but your options aren't displayed to me. Which packages did you build? How did you build them? Did you install them or are you running them uninstalled or using JHBuild? What do you mean by the latest libx264 release? The latest release as far as the x264 developers are concerned is always the current tip of the x264 git repository. With current x264 (from git) and current core -base and -ugly (also from git repositories) I see all the options I mentioned. Regards, Rob From thiagossantos at gmail.com Fri Sep 3 15:29:22 2010 From: thiagossantos at gmail.com (thiagossantos at gmail.com) Date: Fri, 3 Sep 2010 10:29:22 -0300 Subject: [gst-devel] input-selector In-Reply-To: References: Message-ID: On Thu, Sep 2, 2010 at 9:32 AM, Bert Douglas wrote: > Hi All, > > Is there an alternative to the input selector element ? > We'd need more context to try to help here. What is the problem with input-selector? Why can't you use it? What are you trying to do? > > Thanks, > Bert Douglas > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. 
> Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Thiago Sousa Santos -------------- next part -------------- An HTML attachment was scrubbed... URL: From sawan.das at videonetics.com Fri Sep 3 17:00:55 2010 From: sawan.das at videonetics.com (sawan das) Date: Fri, 3 Sep 2010 08:00:55 -0700 (PDT) Subject: [gst-devel] rtsp server pipeline pointer Message-ID: <1283526055327-2525809.post@n4.nabble.com> Hi all, i want to access the pipeline pointer of GstRTSPMediaFactory....so I can push my own streaming data to the pipeline...Is there any doc or suggestion regarding this.(The GstRTSPMediaFactory internally use the GstRTSPMedia which contain a pipeline)... waiting for the answer... Sawan -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/rtsp-server-pipeline-pointer-tp2525809p2525809.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From bertd at tplogic.com Fri Sep 3 17:07:32 2010 From: bertd at tplogic.com (Bert Douglas) Date: Fri, 3 Sep 2010 10:07:32 -0500 Subject: [gst-devel] simple raw video RTP -- still no joy In-Reply-To: <1283501432.4299.6.camel@metal> References: <1283501432.4299.6.camel@metal> Message-ID: Thanks much to Wim for the excellent pointer. --Bert On Fri, Sep 3, 2010 at 3:10 AM, Wim Taymans wrote: > On Thu, 2010-09-02 at 23:54 -0500, Bert Douglas wrote: > > Tristan tried to help me. I tried setting caps as recommended. Still > > not working. > > You're doing it wrong. read the instructions in this document carefully: > > > http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/gst/rtp/README#n251 > > Wim > > > > Thanks for looking. > > > > # rtp-server > > gst-launch \ > > videotestsrc pattern=red \ > > ! 
video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, > > frame-rate=1/1 \ > > ! rtpvrawpay \ > > ! udpsink host=127.0.0.1 port=51234 > > > > # rtp-client > > > CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW," > > CAPS= > > $CAPS"sampling=(string)RGB,depth=(int)8,width=(int)4,height=(int)4" > > gst-launch \ > > udpsrc uri=udp://127.0.0.1:51234 caps=$CAPS \ > > ! rtpvrawdepay \ > > ! video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, > > frame-rate=1/1 \ > > ! ffmpegcolorspace \ > > ! ximagesink > > > > Setting pipeline to PAUSED ... > > Pipeline is live and does not need PREROLL ... > > Setting pipeline to PLAYING ... > > New clock: GstSystemClock > > 0:00:00.056868267 14918 0x1188c90 ERROR rtpvrawdepay > > gstrtpvrawdepay.c:240:gst_rtp_vraw_depay_setcaps: no > > width specified > > ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal > > data flow error. > > Additional debug info: > > gstbasesrc.c(2562): gst_base_src_loop > > (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: > > streaming task paused, reason not-negotiated (-4) > > Execution ended after 18091128 ns. > > Setting pipeline to PAUSED ... > > Setting pipeline to READY ... > > Setting pipeline to NULL ... > > Freeing pipeline ... > > > > > > > > > ------------------------------------------------------------------------------ > > This SF.net Dev2Dev email is sponsored by: > > > > Show off your parallel programming skills. > > Enter the Intel(R) Threading Challenge 2010. > > http://p.sf.net/sfu/intel-thread-sfd > > _______________________________________________ gstreamer-devel mailing > list gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. 
> http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From sawan.das at videonetics.com Fri Sep 3 05:50:52 2010 From: sawan.das at videonetics.com (sawan das) Date: Thu, 2 Sep 2010 20:50:52 -0700 (PDT) Subject: [gst-devel] stream my encoded mp4 video using gst-rtsp library. Message-ID: <1283485852068-2525079.post@n4.nabble.com> I am new to GStreamer and want to stream live MP4-encoded video using gst-rtsp. I tried using a callback function but couldn't link the source to the factory. Can somebody send me a sample source code for the above? -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/stream-my-encoded-mp4-video-using-gst-rtsp-library-tp2525079p2525079.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From m_g_singh at yahoo.com Fri Sep 3 07:26:45 2010 From: m_g_singh at yahoo.com (MGS) Date: Fri, 3 Sep 2010 05:26:45 +0000 (UTC) Subject: [gst-devel] Stream on iPad, iPhone, Android... References: <4C73F939.60909@simulation3d.biz> <4C7572AA.20000@hora-obscura.de> <4C76B03F.8090608@hora-obscura.de> Message-ID: Stefan Kost hora-obscura.de> writes: > > Am 25.08.2010 22:54, schrieb Admin simulation3d.biz: > > That is correct, it just help with accelerating encoding among other > > things > > > > Now that this is corrected, do you have anything to suggest related to > > solving my problem? That is code or library or doc that would help > > doing the encoding using neon instructions without programming an > > encoder in neon from scratch? > > gcc autovectorizer might use neon a bit, but chances are low. GStreamer is using > orc a lot, but that still mean to modify a software encoder to make use of it. > > Stefan > > > > > Have a nice day!
> > Thank you > > > > Yohan Baillot > > > > Augmented Reality consulting > > > > yohan simulation3d.biz > > (425) BAILLOT > > > > Stefan is correct that gcc auto-vectorization will not give any significant gain in acceleration of the video encoders. Moreover, if you are thinking of encoding on mobile devices, you would need to spend several months writing and optimizing an encoder using NEON, and at the end of the day you might still end up with something that cannot encode in real time. FFmpeg's usefulness is limited to desktop PCs. You may contact a codec company which provides an encoder on iPhone or Android. That should help. MGS From pedro.faria at grupofox.com.br Fri Sep 3 13:22:36 2010 From: pedro.faria at grupofox.com.br (Pedro.henrique) Date: Fri, 3 Sep 2010 04:22:36 -0700 (PDT) Subject: [gst-devel] Picture Settings In-Reply-To: <1283443729.8365.18.camel@zingle> References: <1283284182923-2402216.post@n4.nabble.com> <1283287101356-2402283.post@n4.nabble.com> <1283443729.8365.18.camel@zingle> Message-ID: <1283512956478-2525493.post@n4.nabble.com> Thanks for the reply!! What if the camera doesn't support RGB? What should I do? Can I use YUV? Thanks -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Picture-Seetings-tp2402216p2525493.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From ashok.mpally at gmail.com Fri Sep 3 15:42:17 2010 From: ashok.mpally at gmail.com (ashokm) Date: Fri, 3 Sep 2010 06:42:17 -0700 (PDT) Subject: [gst-devel] does ASFDEMUX support the parsing of VC1 advanced profile streams Message-ID: <1283521337332-2525681.post@n4.nabble.com> Hi, I am writing a gst-omx plugin for a WMV decoder. I am using asfdemux from the ugly plug-ins. Does it support the parsing of VC1 advanced profile streams? I found that, on sink_setcaps, the codec data is coming through. This has a sequence layer header. After passing this to the decoder, the decoder detected the stream as a VC1 advanced profile stream. But after this, the decoder throws an error.
-Ashok -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/is-ASFDEMUX-supports-the-parsing-of-of-VC1-advanced-profile-streams-tp2525681p2525681.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From bertd at tplogic.com Fri Sep 3 17:35:04 2010 From: bertd at tplogic.com (Bert Douglas) Date: Fri, 3 Sep 2010 10:35:04 -0500 Subject: [gst-devel] input-selector In-Reply-To: References: Message-ID: I am trying to use the aravissrc element to get data from a gig-e-vision camera. http://git.gnome.org/browse/aravis/ But whenever I use it in conjunction with the input-selector element, I get this error message: (panocam5-shrink.py:410): GStreamer-CRITICAL **: gst_segment_set_newsegment_full: assertion `segment->format == format' failed Code is attached. Thanks much, Bert Douglas ps. I know it is very hard for anyone to help, because you have to have an actual camera to see the error. Maybe I can find a software simulator for the camera. On Fri, Sep 3, 2010 at 8:29 AM, thiagossantos at gmail.com < thiagossantos at gmail.com> wrote: > > > On Thu, Sep 2, 2010 at 9:32 AM, Bert Douglas wrote: > >> Hi All, >> >> Is there an alternative to the input selector element ? >> > > We'd need more context to try to help here. What is the problem with > input-selector? Why can't you use it? What are you trying to do? > > >> >> Thanks, >> Bert Douglas >> >> >> ------------------------------------------------------------------------------ >> This SF.net Dev2Dev email is sponsored by: >> >> Show off your parallel programming skills. >> Enter the Intel(R) Threading Challenge 2010. 
>> http://p.sf.net/sfu/intel-thread-sfd >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> >> > > > -- > Thiago Sousa Santos > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: panocam5-shrink.py Type: text/x-python Size: 3197 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: panocam5.sh Type: application/x-sh Size: 69 bytes Desc: not available URL: From bertd at tplogic.com Fri Sep 3 18:35:01 2010 From: bertd at tplogic.com (Bert Douglas) Date: Fri, 3 Sep 2010 11:35:01 -0500 Subject: [gst-devel] simple raw video RTP -- still no joy In-Reply-To: References: <1283501432.4299.6.camel@metal> Message-ID: I followed Wim's very helpful advice. It goes a bit further than before. Sadly, then there is a segment fault. # rtp-server gst-launch -v \ videotestsrc pattern=red \ ! video/x-raw-rgb, width=400, height=300, framerate=\(fraction\)10/1 \ ! rtpvrawpay ssrc=1 timestamp-offset=0 seqnum-offset=0 \ ! 
udpsink host=127.0.0.1 port=51234 # rtp-client caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW,sampling=(string)RGBA,depth=(string)8,width=(string)4,height=(string)4,colorimetry=(string)SMPTE240M,payload=(int)96,ssrc=(uint)1,clock-base=(uint)0,seqnum-base=(uint)0,framerate=(fraction)10/1" gst-launch -v \ udpsrc uri=udp://127.0.0.1:51234 caps=$caps \ ! rtpvrawdepay \ ! video/x-raw-rgb,format=\(fourcc\)RGBA,framerate=\(fraction\)10/1 \ ! ffmpegcolorspace \ ! ximagesink $ . rtp-client.sh Setting pipeline to PAUSED ... Pipeline is live and does not need PREROLL ... Setting pipeline to PLAYING ... New clock: GstSystemClock /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw-rgb, width=(int)4, height=(int)4, format=(fourcc)0x00000000, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGBA, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, payload=(int)96, ssrc=(uint)1, clock-base=(uint)0, seqnum-base=(uint)0, framerate=(fraction)10/1 Segmentation fault On Fri, Sep 3, 2010 at 10:07 AM, Bert Douglas wrote: > Thanks much to Wim for the excellent pointer. > > --Bert > > > On Fri, Sep 3, 2010 at 3:10 AM, Wim Taymans wrote: > >> On Thu, 2010-09-02 at 23:54 -0500, Bert Douglas wrote: >> > Tristan tried to help me. I tried setting caps as recommended. Still >> > not working. >> >> You're doing it wrong. read the instructions in this document carefully: >> >> >> http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/gst/rtp/README#n251 >> >> Wim >> >> >> > Thanks for looking. >> > >> > # rtp-server >> > gst-launch \ >> > videotestsrc pattern=red \ >> > ! video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, >> > frame-rate=1/1 \ >> > ! rtpvrawpay \ >> > ! 
udpsink host=127.0.0.1 port=51234 >> > >> > # rtp-client >> > >> CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW," >> > CAPS= >> > $CAPS"sampling=(string)RGB,depth=(int)8,width=(int)4,height=(int)4" >> > gst-launch \ >> > udpsrc uri=udp://127.0.0.1:51234 caps=$CAPS \ >> > ! rtpvrawdepay \ >> > ! video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, >> > frame-rate=1/1 \ >> > ! ffmpegcolorspace \ >> > ! ximagesink >> > >> > Setting pipeline to PAUSED ... >> > Pipeline is live and does not need PREROLL ... >> > Setting pipeline to PLAYING ... >> > New clock: GstSystemClock >> > 0:00:00.056868267 14918 0x1188c90 ERROR rtpvrawdepay >> > gstrtpvrawdepay.c:240:gst_rtp_vraw_depay_setcaps: no >> > width specified >> > ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal >> > data flow error. >> > Additional debug info: >> > gstbasesrc.c(2562): gst_base_src_loop >> > (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: >> > streaming task paused, reason not-negotiated (-4) >> > Execution ended after 18091128 ns. >> > Setting pipeline to PAUSED ... >> > Setting pipeline to READY ... >> > Setting pipeline to NULL ... >> > Freeing pipeline ... >> > >> > >> > >> > >> ------------------------------------------------------------------------------ >> > This SF.net Dev2Dev email is sponsored by: >> > >> > Show off your parallel programming skills. >> > Enter the Intel(R) Threading Challenge 2010. >> > http://p.sf.net/sfu/intel-thread-sfd >> > _______________________________________________ gstreamer-devel mailing >> list gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> >> >> >> >> ------------------------------------------------------------------------------ >> This SF.net Dev2Dev email is sponsored by: >> >> Show off your parallel programming skills. >> Enter the Intel(R) Threading Challenge 2010. 
>> http://p.sf.net/sfu/intel-thread-sfd >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From wim.taymans at gmail.com Fri Sep 3 18:47:07 2010 From: wim.taymans at gmail.com (Wim Taymans) Date: Fri, 03 Sep 2010 18:47:07 +0200 Subject: [gst-devel] simple raw video RTP -- still no joy In-Reply-To: References: <1283501432.4299.6.camel@metal> Message-ID: <1283532427.12018.6.camel@metal> On Fri, 2010-09-03 at 11:35 -0500, Bert Douglas wrote: > I followed Wim's very helpful advice. It goes a bit further than > before. > Sadly, then there is a segment fault. You didn't copy the caps correctly, the width and height are not right in the receiver. The crash is because of a bug in the depayloader (fixing now). Wim > > # rtp-server > gst-launch -v \ > videotestsrc pattern=red \ > ! video/x-raw-rgb, width=400, height=300, framerate=\(fraction > \)10/1 \ > ! rtpvrawpay ssrc=1 timestamp-offset=0 seqnum-offset=0 \ > ! udpsink host=127.0.0.1 port=51234 > > # rtp-client > caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW,sampling=(string)RGBA,depth=(string)8,width=(string)4,height=(string)4,colorimetry=(string)SMPTE240M,payload=(int)96,ssrc=(uint)1,clock-base=(uint)0,seqnum-base=(uint)0,framerate=(fraction)10/1" > gst-launch -v \ > udpsrc uri=udp://127.0.0.1:51234 caps=$caps \ > ! rtpvrawdepay \ > ! video/x-raw-rgb,format=\(fourcc\)RGBA,framerate=\(fraction\)10/1 > \ > ! ffmpegcolorspace \ > ! ximagesink > > > $ . rtp-client.sh > Setting pipeline to PAUSED ... > Pipeline is live and does not need PREROLL ... > Setting pipeline to PLAYING ... 
> New clock: GstSystemClock > /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps > = video/x-raw-rgb, width=(int)4, height=(int)4, > format=(fourcc)0x00000000, framerate=(fraction)0/1 > /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps > = application/x-rtp, media=(string)video, clock-rate=(int)90000, > encoding-name=(string)RAW, sampling=(string)RGBA, depth=(string)8, > width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, > payload=(int)96, ssrc=(uint)1, clock-base=(uint)0, > seqnum-base=(uint)0, framerate=(fraction)10/1 > Segmentation fault > > > On Fri, Sep 3, 2010 at 10:07 AM, Bert Douglas > wrote: > Thanks much to Wim for the excellent pointer. > > --Bert > > > > On Fri, Sep 3, 2010 at 3:10 AM, Wim Taymans > wrote: > On Thu, 2010-09-02 at 23:54 -0500, Bert Douglas wrote: > > Tristan tried to help me. I tried setting caps as > recommended. Still > > not working. > > > You're doing it wrong. read the instructions in this > document carefully: > > http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/gst/rtp/README#n251 > > Wim > > > > > Thanks for looking. > > > > # rtp-server > > gst-launch \ > > videotestsrc pattern=red \ > > ! video/x-raw-rgb, format=\(fourcc\)RGB, > width=4, height=4, > > frame-rate=1/1 \ > > ! rtpvrawpay \ > > ! udpsink host=127.0.0.1 port=51234 > > > > # rtp-client > > > CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW," > > CAPS= > > > $CAPS"sampling=(string)RGB,depth=(int)8,width=(int)4,height=(int)4" > > gst-launch \ > > udpsrc uri=udp://127.0.0.1:51234 caps=$CAPS \ > > ! rtpvrawdepay \ > > ! video/x-raw-rgb, format=\(fourcc\)RGB, > width=4, height=4, > > frame-rate=1/1 \ > > ! ffmpegcolorspace \ > > ! ximagesink > > > > Setting pipeline to PAUSED ... > > Pipeline is live and does not need PREROLL ... > > Setting pipeline to PLAYING ... 
> > New clock: GstSystemClock > > 0:00:00.056868267 14918 0x1188c90 ERROR > rtpvrawdepay > > > gstrtpvrawdepay.c:240:gst_rtp_vraw_depay_setcaps: no > > width specified > > ERROR: from > element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: > Internal > > data flow error. > > Additional debug info: > > gstbasesrc.c(2562): gst_base_src_loop > > (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: > > streaming task paused, reason not-negotiated (-4) > > Execution ended after 18091128 ns. > > Setting pipeline to PAUSED ... > > Setting pipeline to READY ... > > Setting pipeline to NULL ... > > Freeing pipeline ... > > > > > > > > > > ------------------------------------------------------------------------------ > > This SF.net Dev2Dev email is sponsored by: > > > > Show off your parallel programming skills. > > Enter the Intel(R) Threading Challenge 2010. > > http://p.sf.net/sfu/intel-thread-sfd > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. 
> http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From thibault.jochem at amundis.fr Fri Sep 3 18:35:34 2010 From: thibault.jochem at amundis.fr (Thibault Jochem) Date: Fri, 3 Sep 2010 18:35:34 +0200 Subject: [gst-devel] Gstreamer on Intel CE Message-ID: <7492277DDE612C4BA6C2278A6E452E3005A399F74C@FR-TH2-MB01.cwu.vpn> Hi, Does anyone had a look on Intel's gstreamer support on sodaville ? I'm struggling hooking a soft sink ( fakesink ) to their hardware pipeline ... and I can't find any clues ... Cheers, -- Thibault Jochem R&D Developer From emmanuel at gnome.org Fri Sep 3 19:00:49 2010 From: emmanuel at gnome.org (Emmanuel Pacaud) Date: Fri, 03 Sep 2010 19:00:49 +0200 Subject: [gst-devel] input-selector In-Reply-To: References: Message-ID: <1283533249.1938.3.camel@lappc-p348> Le vendredi 03 septembre 2010 ? 17:35 +0200, Bert Douglas a ?crit : > I am trying to use the aravissrc element to get data from a > gig-e-vision camera. > http://git.gnome.org/browse/aravis/ > ... > ps. I know it is very hard for anyone to help, because you have to > have an actual camera to see the error. Maybe I can find a software > simulator for the camera. Aravis provides a (very basic) camera simulator, arv-fake-gv-camera. It's may be enough to reproduce the issue. Emmanuel. From bertd at tplogic.com Fri Sep 3 19:11:36 2010 From: bertd at tplogic.com (Bert Douglas) Date: Fri, 3 Sep 2010 12:11:36 -0500 Subject: [gst-devel] simple raw video RTP -- still no joy In-Reply-To: <1283532427.12018.6.camel@metal> References: <1283501432.4299.6.camel@metal> <1283532427.12018.6.camel@metal> Message-ID: Hi Wim, Thanks for the quick reply. I fixed the width and height to be the same. Now it just sits there after going to playing state. Cpu utilization is about 10 percent. I expected it to open a window for ximagesink, but it does not. 
I see some strange caps: /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw-rgb, width=(int)400, height=(int)300, format=(fourcc)0x00000000, framerate=(fraction)0/1 The framerate is 0, despite the fact that I set it to 10. Thanks again, Bert Douglas On Fri, Sep 3, 2010 at 11:47 AM, Wim Taymans wrote: > On Fri, 2010-09-03 at 11:35 -0500, Bert Douglas wrote: > > I followed Wim's very helpful advice. It goes a bit further than > > before. > > Sadly, then there is a segment fault. > > You didn't copy the caps correctly, the width and height are not right > in the receiver. The crash is because of a bug in the depayloader > (fixing now). > > Wim > > > > # rtp-server > > gst-launch -v \ > > videotestsrc pattern=red \ > > ! video/x-raw-rgb, width=400, height=300, framerate=\(fraction > > \)10/1 \ > > ! rtpvrawpay ssrc=1 timestamp-offset=0 seqnum-offset=0 \ > > ! udpsink host=127.0.0.1 port=51234 > > > > # rtp-client > > > caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW,sampling=(string)RGBA,depth=(string)8,width=(string)4,height=(string)4,colorimetry=(string)SMPTE240M,payload=(int)96,ssrc=(uint)1,clock-base=(uint)0,seqnum-base=(uint)0,framerate=(fraction)10/1" > > gst-launch -v \ > > udpsrc uri=udp://127.0.0.1:51234 caps=$caps \ > > ! rtpvrawdepay \ > > ! video/x-raw-rgb,format=\(fourcc\)RGBA,framerate=\(fraction\)10/1 > > \ > > ! ffmpegcolorspace \ > > ! ximagesink > > > > > > $ . rtp-client.sh > > Setting pipeline to PAUSED ... > > Pipeline is live and does not need PREROLL ... > > Setting pipeline to PLAYING ... 
> > New clock: GstSystemClock > > /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps > > = video/x-raw-rgb, width=(int)4, height=(int)4, > > format=(fourcc)0x00000000, framerate=(fraction)0/1 > > /GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps > > = application/x-rtp, media=(string)video, clock-rate=(int)90000, > > encoding-name=(string)RAW, sampling=(string)RGBA, depth=(string)8, > > width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, > > payload=(int)96, ssrc=(uint)1, clock-base=(uint)0, > > seqnum-base=(uint)0, framerate=(fraction)10/1 > > Segmentation fault > > > > > > On Fri, Sep 3, 2010 at 10:07 AM, Bert Douglas > > wrote: > > Thanks much to Wim for the excellent pointer. > > > > --Bert > > > > > > > > On Fri, Sep 3, 2010 at 3:10 AM, Wim Taymans > > wrote: > > On Thu, 2010-09-02 at 23:54 -0500, Bert Douglas wrote: > > > Tristan tried to help me. I tried setting caps as > > recommended. Still > > > not working. > > > > > > You're doing it wrong. read the instructions in this > > document carefully: > > > > > http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/gst/rtp/README#n251 > > > > Wim > > > > > > > > > Thanks for looking. > > > > > > # rtp-server > > > gst-launch \ > > > videotestsrc pattern=red \ > > > ! video/x-raw-rgb, format=\(fourcc\)RGB, > > width=4, height=4, > > > frame-rate=1/1 \ > > > ! rtpvrawpay \ > > > ! udpsink host=127.0.0.1 port=51234 > > > > > > # rtp-client > > > > > > CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW," > > > CAPS= > > > > > > $CAPS"sampling=(string)RGB,depth=(int)8,width=(int)4,height=(int)4" > > > gst-launch \ > > > udpsrc uri=udp://127.0.0.1:51234 caps=$CAPS \ > > > ! rtpvrawdepay \ > > > ! video/x-raw-rgb, format=\(fourcc\)RGB, > > width=4, height=4, > > > frame-rate=1/1 \ > > > ! ffmpegcolorspace \ > > > ! ximagesink > > > > > > Setting pipeline to PAUSED ... > > > Pipeline is live and does not need PREROLL ... 
> > > Setting pipeline to PLAYING ... > > > New clock: GstSystemClock > > > 0:00:00.056868267 14918 0x1188c90 ERROR > > rtpvrawdepay > > > > > > gstrtpvrawdepay.c:240:gst_rtp_vraw_depay_setcaps: no > > > width specified > > > ERROR: from > > element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: > > Internal > > > data flow error. > > > Additional debug info: > > > gstbasesrc.c(2562): gst_base_src_loop > > > (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: > > > streaming task paused, reason not-negotiated (-4) > > > Execution ended after 18091128 ns. > > > Setting pipeline to PAUSED ... > > > Setting pipeline to READY ... > > > Setting pipeline to NULL ... > > > Freeing pipeline ... > > > > > > > > > > > > > > > > > ------------------------------------------------------------------------------ > > > This SF.net Dev2Dev email is sponsored by: > > > > > > Show off your parallel programming skills. > > > Enter the Intel(R) Threading Challenge 2010. > > > http://p.sf.net/sfu/intel-thread-sfd > > > _______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.sourceforge.net > > > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > > > > > ------------------------------------------------------------------------------ > > This SF.net Dev2Dev email is sponsored by: > > > > Show off your parallel programming skills. > > Enter the Intel(R) Threading Challenge 2010. > > http://p.sf.net/sfu/intel-thread-sfd > > _______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.sourceforge.net > > > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > > > > > ------------------------------------------------------------------------------ > > This SF.net Dev2Dev email is sponsored by: > > > > Show off your parallel programming skills. > > Enter the Intel(R) Threading Challenge 2010. 
> > http://p.sf.net/sfu/intel-thread-sfd > > _______________________________________________ gstreamer-devel mailing > list gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bertd at tplogic.com Sat Sep 4 13:59:49 2010 From: bertd at tplogic.com (Bert Douglas) Date: Sat, 4 Sep 2010 06:59:49 -0500 Subject: [gst-devel] RTP and VRAWDEPAY --- found a new crasher Message-ID: This one crashes immediately on start up before you even send any data to it. Thanks for looking. Bert Douglas # rtp-client.sh caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW,sampling=(string)YCbCr-4:2:2,depth=(string)8,width=(string)400,height=(string)300,colorimetry=(string)SMPTE240M,payload=(int)96,ssrc=(uint)1,clock-base=(uint)1,seqnum-base=(uint)1" gst-launch -v \ udpsrc name=src1 uri=udp://127.0.0.1:51234 caps=$caps \ ! rtpvrawdepay name=depay1 \ ! ffmpegcolorspace name=cs1 \ ! ximagesink name=xis1 $ . rtp-client.sh Setting pipeline to PAUSED ... Pipeline is live and does not need PREROLL ... Setting pipeline to PLAYING ... 
New clock: GstSystemClock /GstPipeline:pipeline0/GstRtpVRawDepay:depay1.GstPad:src: caps = video/x-raw-yuv, width=(int)400, height=(int)300, format=(fourcc)UYVY, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstRtpVRawDepay:depay1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8, width=(string)400, height=(string)300, colorimetry=(string)SMPTE240M, payload=(int)96, ssrc=(uint)1, clock-base=(uint)1, seqnum-base=(uint)1 /GstPipeline:pipeline0/GstFFMpegCsp:cs1.GstPad:src: caps = video/x-raw-rgb, bpp=(int)32, depth=(int)24, endianness=(int)4321, red_mask=(int)65280, green_mask=(int)16711680, blue_mask=(int)-16777216, width=(int)400, height=(int)300, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1 /GstPipeline:pipeline0/GstFFMpegCsp:cs1.GstPad:sink: caps = video/x-raw-yuv, width=(int)400, height=(int)300, format=(fourcc)UYVY, framerate=(fraction)0/1 /GstPipeline:pipeline0/GstXImageSink:xis1.GstPad:sink: caps = video/x-raw-rgb, bpp=(int)32, depth=(int)24, endianness=(int)4321, red_mask=(int)65280, green_mask=(int)16711680, blue_mask=(int)-16777216, width=(int)400, height=(int)300, framerate=(fraction)0/1, pixel-aspect-ratio=(fraction)1/1 *** glibc detected *** /usr/bin/gst-launch-0.10: free(): invalid next size (normal): 0x0000000000e02200 *** ======= Backtrace: ========= /lib/libc.so.6(+0x775b6)[0x7f1c7315f5b6] /lib/libc.so.6(cfree+0x73)[0x7f1c73165e53] /usr/lib/libgstreamer-0.10.so.0(+0x37ec9)[0x7f1c743faec9] /usr/lib/libgstreamer-0.10.so.0(gst_mini_object_unref+0x153)[0x7f1c7441ebf3] /usr/lib/libgstbase-0.10.so.0(+0x24f4c)[0x7f1c72998f4c] /usr/lib/libgstbase-0.10.so.0(+0x25c22)[0x7f1c72999c22] /usr/lib/libgstreamer-0.10.so.0(+0x605bd)[0x7f1c744235bd] /usr/lib/libgstreamer-0.10.so.0(+0x60eae)[0x7f1c74423eae] /usr/lib/gstreamer-0.10/libgstrtp.so(+0x31b0a)[0x7f1c6eb3fb0a] /usr/lib/libgstrtp-0.10.so.0(+0x100c7)[0x7f1c6e4bd0c7] 
/usr/lib/libgstreamer-0.10.so.0(+0x605bd)[0x7f1c744235bd] /usr/lib/libgstreamer-0.10.so.0(+0x60eae)[0x7f1c74423eae] /usr/lib/libgstbase-0.10.so.0(+0x1ee11)[0x7f1c72992e11] /usr/lib/libgstreamer-0.10.so.0(+0x8b9b3)[0x7f1c7444e9b3] /lib/libglib-2.0.so.0(+0x69a5f)[0x7f1c73afda5f] /lib/libglib-2.0.so.0(+0x67b84)[0x7f1c73afbb84] /lib/libpthread.so.0(+0x69ca)[0x7f1c734719ca] /lib/libc.so.6(clone+0x6d)[0x7f1c731ce6fd] ======= Memory map: ======== 00400000-00406000 r-xp 00000000 08:01 42624 /usr/bin/gst-launch-0.10 00606000-00607000 r--p 00006000 08:01 42624 /usr/bin/gst-launch-0.10 00607000-00608000 rw-p 00007000 08:01 42624 /usr/bin/gst-launch-0.10 00c5c000-00e5d000 rw-p 00000000 00:00 0 [heap] 7f1c677ff000-7f1c67800000 ---p 00000000 00:00 0 7f1c67800000-7f1c68000000 rw-p 00000000 00:00 0 7f1c68000000-7f1c68021000 rw-p 00000000 00:00 0 7f1c68021000-7f1c6c000000 ---p 00000000 00:00 0 7f1c6c100000-7f1c6c116000 r-xp 00000000 08:01 6843 /lib/libgcc_s.so.1 7f1c6c116000-7f1c6c315000 ---p 00016000 08:01 6843 /lib/libgcc_s.so.1 7f1c6c315000-7f1c6c316000 r--p 00015000 08:01 6843 /lib/libgcc_s.so.1 7f1c6c316000-7f1c6c317000 rw-p 00016000 08:01 6843 /lib/libgcc_s.so.1 7f1c6c317000-7f1c6c38d000 rw-s 00000000 00:04 460095511 /SYSV00000000 (deleted) 7f1c6c38d000-7f1c6c403000 rw-s 00000000 00:04 460062741 /SYSV00000000 (deleted) 7f1c6c403000-7f1c6c404000 ---p 00000000 00:00 0 7f1c6c404000-7f1c6cc04000 rw-p 00000000 00:00 0 7f1c6cc04000-7f1c6cc05000 ---p 00000000 00:00 0 7f1c6cc05000-7f1c6d405000 rw-p 00000000 00:00 0 7f1c6d405000-7f1c6d409000 r-xp 00000000 08:01 1951 /lib/libuuid.so.1.3.0 7f1c6d409000-7f1c6d608000 ---p 00004000 08:01 1951 /lib/libuuid.so.1.3.0 7f1c6d608000-7f1c6d609000 r--p 00003000 08:01 1951 /lib/libuuid.so.1.3.0 7f1c6d609000-7f1c6d60a000 rw-p 00004000 08:01 1951 /lib/libuuid.so.1.3.0 7f1c6d60a000-7f1c6d61b000 r-xp 00000000 08:01 2576 /usr/lib/libXext.so.6.4.0 7f1c6d61b000-7f1c6d81a000 ---p 00011000 08:01 2576 /usr/lib/libXext.so.6.4.0 7f1c6d81a000-7f1c6d81b000 r--p 
00010000 08:01 2576 /usr/lib/libXext.so.6.4.0 7f1c6d81b000-7f1c6d81c000 rw-p 00011000 08:01 2576 /usr/lib/libXext.so.6.4.0 7f1c6d81c000-7f1c6d833000 r-xp 00000000 08:01 7426 /usr/lib/libICE.so.6.3.0 7f1c6d833000-7f1c6da32000 ---p 00017000 08:01 7426 /usr/lib/libICE.so.6.3.0 7f1c6da32000-7f1c6da33000 r--p 00016000 08:01 7426 /usr/lib/libICE.so.6.3.0 7f1c6da33000-7f1c6da34000 rw-p 00017000 08:01 7426 /usr/lib/libICE.so.6.3.0 7f1c6da34000-7f1c6da37000 rw-p 00000000 00:00 0 7f1c6da37000-7f1c6da3f000 r-xp 00000000 08:01 7429 /usr/lib/libSM.so.6.0.1 7f1c6da3f000-7f1c6dc3e000 ---p 00008000 08:01 7429 /usr/lib/libSM.so.6.0.1 7f1c6dc3e000-7f1c6dc3f000 r--p 00007000 08:01 7429 /usr/lib/libSM.so.6.0.1 7f1c6dc3f000-7f1c6dc40000 rw-p 00008000 08:01 7429 /usr/lib/libSM.so.6.0.1 7f1c6dc40000-7f1c6dc4c000 r-xp 00000000 08:01 8465 /usr/lib/gstreamer-0.10/libgstximagesink.so 7f1c6dc4c000-7f1c6de4b000 ---p 0000c000 08:01 8465 /usr/lib/gstreamer-0.10/libgstximagesink.so 7f1c6de4b000-7f1c6de4c000 r--p 0000b000 08:01 8465 /usr/lib/gstreamer-0.10/libgstximagesink.so 7f1c6de4c000-7f1c6de4d000 rw-p 0000c000 08:01 8465 /usr/lib/gstreamer-0.10/libgstximagesink.so 7f1c6de4d000-7f1c6de55000 r-xp 00000000 08:01 5876 /usr/lib/libgstvideo-0.10.so.0.21.0 7f1c6de55000-7f1c6e054000 ---p 00008000 08:01 5876 /usr/lib/libgstvideo-0.10.so.0.21.0 7f1c6e054000-7f1c6e055000 r--p 00007000 08:01 5876 /usr/lib/libgstvideo-0.10.so.0.21.0 7f1c6e055000-7f1c6e056000 rw-p 00008000 08:01 5876 /usr/lib/libgstvideo-0.10.so.0.21.0 7f1c6e056000-7f1c6e098000 r-xp 00000000 08:01 7001 /usr/lib/gstreamer-0.10/libgstffmpegcolorspace.so 7f1c6e098000-7f1c6e297000 ---p 00042000 08:01 7001 /usr/lib/gstreamer-0.10/libgstffmpegcolorspace.so 7f1c6e297000-7f1c6e298000 r--p 00041000 08:01 7001 /usr/lib/gstreamer-0.10/libgstffmpegcolorspace.so 7f1c6e298000-7f1c6e29a000 rw-p 00042000 08:01 7001 /usr/lib/gstreamer-0.10/libgstffmpegcolorspace.so 7f1c6e29a000-7f1c6e29b000 rw-p 00000000 00:00 0 7f1c6e29b000-7f1c6e2ab000 r-xp 00000000 
08:01 6208 /usr/lib/libgstinterfaces-0.10.so.0.21.0 7f1c6e2ab000-7f1c6e4ab000 ---p 00010000 08:01 6208 /usr/lib/libgstinterfaces-0.10.so.0.21.0 7f1c6e4ab000-7f1c6e4ac000 r--p 00010000 08:01 6208 /usr/lib/libgstinterfaces-0.10.so.0.21.0 7f1c6e4ac000-7f1c6e4ad000 rw-p 00011000 08:01 6208 /usr/lib/libgstinterfaces-0.10.so.0.21.0 7f1c6e4ad000-7f1c6e4c2000 r-xp 00000000 08:01 6813 /usr/lib/libgstrtp-0.10.so.0.21.0 7f1c6e4c2000-7f1c6e6c1000 ---p 00015000 08:01 6813 /usr/lib/libgstrtp-0.10.so.0.21.0 7f1c6e6c1000-7f1c6e6c3000 r--p 00014000 08:01 6813 /usr/lib/libgstrtp-0.10.so.0.21.0 7f1c6e6c3000-7f1c6e6c4000 rw-p 00016000 08:01 6813 /usr/lib/libgstrtp-0.10.so.0.21.0 7f1c6e6c4000-7f1c6e6e9000 r-xp 00000000 08:01 42605 /usr/lib/libgsttag-0.10.so.0.21.0 7f1c6e6e9000-7f1c6e8e8000 ---p 00025000 08:01 42605 /usr/lib/libgsttag-0.10.so.0.21.0 Aborted -------------- next part -------------- An HTML attachment was scrubbed... URL: From t.i.m at zen.co.uk Sat Sep 4 14:23:21 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Sat, 04 Sep 2010 13:23:21 +0100 Subject: [gst-devel] RTP and VRAWDEPAY --- found a new crasher In-Reply-To: References: Message-ID: <1283603001.9201.0.camel@zingle> On Sat, 2010-09-04 at 06:59 -0500, Bert Douglas wrote: > This one crashes immediately on start up before you even send any data > to it. > Thanks for looking. Could you file a bug in bugzilla please, so it's not missed or forgotten about? Thanks! http://gstreamer.freedesktop.org/bugs/ Cheers -Tim From bertd at tplogic.com Sat Sep 4 14:57:37 2010 From: bertd at tplogic.com (Bert Douglas) Date: Sat, 4 Sep 2010 07:57:37 -0500 Subject: [gst-devel] RTP and VRAWDEPAY --- found a new crasher In-Reply-To: <1283603001.9201.0.camel@zingle> References: <1283603001.9201.0.camel@zingle> Message-ID: Ok. My first bug filed.
https://bugzilla.gnome.org/show_bug.cgi?id=628768 On Sat, Sep 4, 2010 at 7:23 AM, Tim-Philipp Müller wrote: > On Sat, 2010-09-04 at 06:59 -0500, Bert Douglas wrote: > > > This one crashes immediately on start up before you even send any data > > to it. > > Thanks for looking. > > Could you file a bug in bugzilla please, so it's not missed or > forgotten about? Thanks! > > http://gstreamer.freedesktop.org/bugs/ > > Cheers > -Tim > -------------- next part -------------- An HTML attachment was scrubbed... URL: From t.i.m at zen.co.uk Sat Sep 4 15:19:53 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Sat, 04 Sep 2010 14:19:53 +0100 Subject: [gst-devel] RELEASE: GStreamer OpenGL Plug-ins 0.10.2 "Long forgotten rules" Message-ID: <1283606393.9201.7.camel@zingle> This mail announces the release of GStreamer OpenGL Plug-ins 0.10.2 "Long forgotten rules". The GStreamer OpenGL Plug-ins module contains integration libraries and plug-ins for using OpenGL within GStreamer pipelines. This module contains elements for, among others: output: glimagesink adapters: glupload, gldownload video processing: gldeinterlace, glcolorscale GL effects: glfiltersobel, glfilterblur, gleffects, others sources: gltestsrc Please note that at this time, the GStreamer GL plugins module is not considered API/ABI stable, and public interfaces may change from release to release. This release is not API or ABI compatible with the previous release.
Highlights of this release: * New elements and examples * Internal changes to GL context management * Support RGB format in gloverlay * Many fixes and improvements * Cocoa backend for OS/X * Dependency on libpng instead of gdkpixbuf For more information, see http://gstreamer.freedesktop.org/modules/gst-plugins-gl.html To file bugs, request features or submit patches, please go to http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer&component=gst-plugins-gl Direct links: http://gstreamer.freedesktop.org/src/gst-plugins-gl/gst-plugins-gl-0.10.2.tar.gz http://gstreamer.freedesktop.org/src/gst-plugins-gl/gst-plugins-gl-0.10.2.tar.bz2 MD5 sums (for tarballs downloaded from gstreamer.freedesktop.org): 878fe4199be1c94f8aa2f7f23891cc95 gst-plugins-gl-0.10.2.tar.bz2 2b39bac4d521fd016f6feb16f04bb765 gst-plugins-gl-0.10.2.tar.gz Enjoy! -------------- next part -------------- Release notes for GStreamer OpenGL Plug-ins 0.10.2 "Long forgotten rules" The GStreamer team is proud to announce a new release in the 0.10.x stable series of the GStreamer OpenGL Plug-ins. The 0.10.x series is a stable series targeted at end users. It is not API or ABI compatible with the stable 0.8.x series. It is, however, parallel installable with the 0.8.x series. Please note that at this time, the GStreamer GL plugins module is not considered API/ABI stable, and public interfaces may change from release to release. This module contains integration libraries and plug-ins for using OpenGL within GStreamer pipelines.
This module contains elements for, among others: output: glimagesink adapters: glupload, gldownload video processing: gldeinterlace, glcolorscale GL effects: glfiltersobel, glfilterblur, gleffects, others sources: gltestsrc Other modules containing plug-ins are: gst-plugins-base contains a basic set of well-supported plug-ins gst-plugins-good contains a set of well-supported plug-ins under our preferred license gst-plugins-ugly contains a set of well-supported plug-ins, but might pose problems for distributors gst-plugins-bad contains a set of less supported plug-ins that haven't passed the rigorous quality testing we expect This gst-plugins-gl release isn't API/ABI compatible with 0.10.1. Features of this release * New elements and examples * Internal changes to GL context management * Support RGB format in gloverlay * Many fixes and improvements * Cocoa backend for OS/X * Dependency on libpng instead of gdkpixbuf Bugs fixed in this release * 591591 : Gloverlay makes it hard to manipulate the position of the image * 595303 : Plugin gloverlay to put video over png image * 601277 : patch for gloverlay * 559135 : Add five more opengl elements to plugin docs * 562699 : configure.ac doesn't find opengl on os x * 578811 : crash with cocoa backend * 588510 : No copyright/license information in a lot of files * 590351 : Compile issue on GLEW version. 
* 593786 : support OpenGL es in autofoo * 597253 : [gst-plugins-gl] rewrite the way a gstgldisplay is forwarded * 599883 : [differencematte] regressions after gdkpixbuf to libpng migration * 599885 : [gtk examples] unstable behaviour with recent gtk (post csw merge) * 600630 : gldownload not working anymore * 600797 : New example illustrates texture sharing between glupload and Qt * 602153 : gstglmixer prints warnings when it has no parent * 602771 : qglwtextureshare example crashes sometimes on startup * 605121 : Bug when you configure gst-plugins-gl * 608643 : libpng 1.4.x incompatibilities * 611067 : Shared opengl context not shared correctly * 612157 : [gst-plugins-gl] enable stencil buffer's use * 612159 : [gst-plugins-gl] enable global stencil buffer use * 612163 : [gst-plugins-gl] new gl filter: gstglfilterreflectedscreen * 613186 : -gl can't be built against libpng 1.4.x * 613371 : gst_element_class_set_details => gst_element_class_set_details_simple * 615696 : Shaders using GLSL 1.20 without #version.
* 616748 : multiply_fragment_source shader compilation error with Apple GLSL compiler * 625144 : [gst-plugins-gl] fixed qglwtextureshare example to work on a Mac * 626708 : Add OpenGL checks for Solaris, *BSD and GNU Hurd * 559131 : Have client-draw-callback accept a gpointer of user data * 588454 : Cygwin also uses GLX * 588653 : glimagesink fails to render I420 and YV12 frames properly * 593165 : glimagesink bug decoding from ts * 593486 : sdlshare example test on linux * 595588 : Added patch to manage RGB or RGBA Download You can find source releases of gst-plugins-gl in the download directory: http://gstreamer.freedesktop.org/src/gst-plugins-gl/ GStreamer Homepage More details can be found on the project's website: http://gstreamer.freedesktop.org/ Support and Bugs We use GNOME's bugzilla for bug reports and feature requests: http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer Developers GStreamer is stored in Git, hosted at git.freedesktop.org, and can be cloned from there. Interested developers of the core library, plug-ins, and applications should subscribe to the gstreamer-devel list. If there is sufficient interest we will create more lists as necessary. Applications Contributors to this release * Andrey Nechypurenko * Anthony Violo * Benjamin Otte * Christian Schaller * Cygwin Ports maintainer * David Hoyt * David Schleef * Edward Hervey * Eric Anholt * Filippo Argiolas * Jan Schmidt * Julien Isorce * Mark Nauwelaerts * Miquel Àngel Farré * Nicholas Panayis * Nuno Santos * Olivier Crête * Pierre Pouzol * Pratheesh Gangadhar * Roland Peffer * Sebastian Dröge * Stefan Kost * Tim-Philipp Müller * Vinson Lee * ?????? ????????? ? From bertd at tplogic.com Sat Sep 4 16:08:53 2010 From: bertd at tplogic.com (Bert Douglas) Date: Sat, 4 Sep 2010 09:08:53 -0500 Subject: [gst-devel] are my libraries installed in the right place? Message-ID: Really strange things are happening, and I am wondering if I have libraries installed correctly.
Does this look right to you? Thanks Bert Douglas bertd at bertd-laptop:/usr/lib/gstreamer-0.10$ ls -l libgstrtp* -rwxr-xr-x 1 root root 1269 2010-09-03 14:10 libgstrtp.la -rwxr-xr-x 1 root root 1254 2010-09-03 14:10 libgstrtpmanager.la -rwxr-xr-x 1 root root 616395 2010-09-03 14:10 libgstrtpmanager.so -rwxr-xr-x 1 root root 1197 2010-09-03 14:14 libgstrtpmux.la -rwxr-xr-x 1 root root 104298 2010-09-03 14:14 libgstrtpmux.so -rwxr-xr-x 1 root root 1918328 2010-09-03 14:10 libgstrtp.so bertd at bertd-laptop:/usr/lib/gstreamer-0.10$ cd .. bertd at bertd-laptop:/usr/lib$ ls -l libgstrtp* -rw-r--r-- 1 root root 496810 2010-09-03 14:06 libgstrtp-0.10.a -rwxr-xr-x 1 root root 1195 2010-09-03 14:06 libgstrtp-0.10.la lrwxrwxrwx 1 root root 24 2010-09-03 14:06 libgstrtp-0.10.so -> libgstrtp-0.10.so.0.21.0 lrwxrwxrwx 1 root root 24 2010-09-03 14:06 libgstrtp-0.10.so.0 -> libgstrtp-0.10.so.0.21.0 -rw-r--r-- 1 root root 93248 2010-03-09 17:42 libgstrtp-0.10.so.0.19.2 -rwxr-xr-x 1 root root 292636 2010-09-03 14:06 libgstrtp-0.10.so.0.21.0 -------------- next part -------------- An HTML attachment was scrubbed... URL: From t.i.m at zen.co.uk Sat Sep 4 19:43:51 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Sat, 04 Sep 2010 18:43:51 +0100 Subject: [gst-devel] are my libraries installed in the right place? In-Reply-To: References: Message-ID: <1283622231.1195.2.camel@zingle> On Sat, 2010-09-04 at 09:08 -0500, Bert Douglas wrote: > Really strange things are happening, and I am wondering if I have > libraries installed correctly. What strange things? Cheers -Tim From mat30.mail at gmail.com Sat Sep 4 20:20:17 2010 From: mat30.mail at gmail.com (=?ISO-8859-1?Q?Mateo_Matachana_L=F3pez?=) Date: Sat, 04 Sep 2010 20:20:17 +0200 Subject: [gst-devel] Documentation for create a src element Message-ID: <4C828DE1.6090405@gmail.com> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 Hi, I am creating a GStreamer source element to get a live stream from a P2P network.
I am stuck at one point. My SrcObject inherits from GstPushSrc, as recommended in the GStreamer documentation for live sources. I have implemented the _start, _query and _change_state functions, where I mostly do the setup of the protocol (contact the tracker, get the media description, etc.), but I don't know how to pass the streamed (received) data to the next GStreamer element. How can I pass data to the next GStreamer element? Should I push the data, or does GStreamer tell the SrcObject when it should push data? Is there any documentation that explains in detail the process of building a source element? Thanks in advance, Mateo Matachana -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.10 (GNU/Linux) Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org/ iEYEARECAAYFAkyCjdwACgkQ3zaBWsUKsPsbvQCfbKgXbgrz5fZlc/hsIGcTe96r cAUAnRAEbA6ONy8zdOC6bbvznw+jSz4T =4btj -----END PGP SIGNATURE----- From t.i.m at zen.co.uk Sat Sep 4 20:41:29 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Sat, 04 Sep 2010 19:41:29 +0100 Subject: [gst-devel] Documentation for create a src element In-Reply-To: <4C828DE1.6090405@gmail.com> References: <4C828DE1.6090405@gmail.com> Message-ID: <1283625689.1195.6.camel@zingle> On Sat, 2010-09-04 at 20:20 +0200, Mateo Matachana López wrote: Hi, > I have implemented the _start, _query and _change_state functions where > I mostly do the setup of the protocol (contact the tracker, get the > media description, etc.) but I don't know how to pass the streamed > (received) data to the next GStreamer element. It shouldn't be necessary to implement/override GstElement::change_state(); just implement GstBaseSrc::start() and ::stop() instead. > How can I pass data to the next GStreamer element? Should I push the > data, or does GStreamer tell the SrcObject when it should push data? You should implement GstPushSrc::create(), from where you can return your buffer. (Note: not GstBaseSrc::create()).
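Tim's answer above can be sketched roughly as follows. This is an untested illustration against the GStreamer 0.10 base-class API, not Mateo's actual element: my_p2p_recv and the 4096-byte buffer size are placeholder assumptions standing in for the P2P protocol layer.

```c
#include <gst/gst.h>
#include <gst/base/gstpushsrc.h>

/* Hypothetical receive function standing in for the P2P protocol layer;
 * assumed to block until data arrives and to return the byte count,
 * 0 on end-of-stream, or a negative value on error. */
extern gssize my_p2p_recv (guint8 * data, gsize max_size);

/* GstPushSrc::create() is called repeatedly from the base class's
 * streaming thread; each buffer returned here is pushed downstream
 * by the base class, so the element never pushes by itself. */
static GstFlowReturn
my_src_create (GstPushSrc * psrc, GstBuffer ** outbuf)
{
  GstBuffer *buf = gst_buffer_new_and_alloc (4096);
  gssize n = my_p2p_recv (GST_BUFFER_DATA (buf), 4096);

  if (n <= 0) {
    gst_buffer_unref (buf);
    /* in 0.10, GST_FLOW_UNEXPECTED signals end-of-stream from a source */
    return (n == 0) ? GST_FLOW_UNEXPECTED : GST_FLOW_ERROR;
  }

  GST_BUFFER_SIZE (buf) = n;
  *outbuf = buf;
  return GST_FLOW_OK;
}
```

The vfunc would be wired up in the class_init function with pushsrc_class->create = my_src_create; alongside the GstBaseSrc start/stop vfuncs that do the protocol setup and teardown.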
Cheers -Tim From delvauxbernard at yahoo.fr Sat Sep 4 21:18:16 2010 From: delvauxbernard at yahoo.fr (victor) Date: Sat, 4 Sep 2010 12:18:16 -0700 (PDT) Subject: [gst-devel] how to set an audiofirfilter? Message-ID: <1283627896731-2526929.post@n4.nabble.com> Hello to all, I'm brand new to programming and to gstreamer; I find it great, and it allows me to manipulate sound. I've just applied an audiofirfilter to a file that I'm playing from disk. But are there any explanations somewhere on how to set up the filter? I mean, how do I define the cutoff frequencies and, if it is possible, the slope of the cutoff? I've seen the link to the documentation in which the FIR filter is explained, but as a musician I cannot get it at all... On the same subject, regarding the audioiirfilter, is there a documented example of a high-pass filter? This seems a mathematical challenge to me... Thank you very much! victor -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/how-to-set-an-audiofirfilter-tp2526929p2526929.html Sent from the GStreamer-devel mailing list archive at Nabble.com.
From bug-track at fisher-privat.net Sun Sep 5 10:28:21 2010 From: bug-track at fisher-privat.net (Alexey Fisher) Date: Sun, 05 Sep 2010 10:28:21 +0200 Subject: [gst-devel] code with optional plugins Message-ID: <1283675301.2160.5.camel@zwerg> Hello all, I have this piece of code: ======================================================================= GstElement *maxrate, *scale, *colorspace, *noisefilter, *capsfilter; GstPad *ghost, *src; GstCaps *caps; maxrate = gst_element_factory_make ("videomaxrate", NULL); scale = gst_element_factory_make ("videoscale", NULL); colorspace = gst_element_factory_make ("ffmpegcolorspace", NULL); noisefilter = gst_element_factory_make ("postproc_tmpnoise", NULL); gst_bin_add_many (GST_BIN (obj), priv->src, maxrate, scale, colorspace, noisefilter, capsfilter, NULL); gst_element_link_many (priv->src, maxrate, scale, colorspace, noisefilter, capsfilter, NULL); ======================================================================== I want to make videomaxrate and postproc_tmpnoise optional: if they are not installed, the pipeline should still work. Is there an easy way to do this? Or do I need different gst_bin_add_many and gst_element_link_many calls for different situations? Regards, Alexey From t.i.m at zen.co.uk Sun Sep 5 17:22:41 2010 From: t.i.m at zen.co.uk (Tim-Philipp Müller) Date: Sun, 05 Sep 2010 16:22:41 +0100 Subject: [gst-devel] code with optional plugins In-Reply-To: <1283675301.2160.5.camel@zwerg> References: <1283675301.2160.5.camel@zwerg> Message-ID: <1283700161.4071.2.camel@zingle> On Sun, 2010-09-05 at 10:28 +0200, Alexey Fisher wrote: > I want to make videomaxrate and postproc_tmpnoise optional: if they are > not installed, the pipeline should still work. > Is there an easy way to do this? Or do I need different > gst_bin_add_many and gst_element_link_many calls for different situations?
You don't have to gst_bin_add* all elements to the pipeline in one go, so just keep your gst_bin_add_many() and then gst_bin_add() the optional elements one-by-one if they exist. Alternatively, just create "identity" elements if those optional elements don't exist, then you can gst_bin_add*() and _link*() unconditionally. Cheers -Tim From ensonic at hora-obscura.de Sun Sep 5 21:33:25 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Sun, 05 Sep 2010 22:33:25 +0300 Subject: [gst-devel] Debugging a blocked GStreamer pipeline In-Reply-To: <1283390127.20100.3195.camel@sax-lx> References: <1283390127.20100.3195.camel@sax-lx> Message-ID: <4C83F085.8040801@hora-obscura.de> On 02.09.2010 04:15, Todd Fischer wrote: > Hi, > > We are seeing a behavior where we run a GStreamer application (doing audio / > video decoding) that runs continuously for several days, then suddenly locks up > in the middle of an A/V stream. Our best guess is that there is a defect in the ALSA > output driver. We believe this because if we exit the application and try > aplay, it doesn't work. > > I am wondering if there is a debug GStreamer logger element in existence, or if > one is even possible or helpful. Such a logger element could be put anywhere in > the pipeline. The logger would have circular buffers to keep track of all > potentially interesting recent history, such as pad activity, bus activity, and > any other relevant information. The circular buffer entries would all be > timestamped. When some event occurs (a file exists, a message/signal is > received, etc.), the element would dump the history and continue capturing new data. > > The idea is that after the pipeline locks up, you could cause the history logger to > dump its data, and then get an idea of what is supposed to be happening that isn't > occurring. > > Does such a logging element exist? If not, does it make any sense to develop? You can use gst-tracelib. It is an LD_PRELOAD thing for Linux boxes.
It can log all dataflow (buffers, events, messages and queries) as well as structural changes. Logging can be to a file, memory or a socket. Stefan > > Todd From ensonic at hora-obscura.de Sun Sep 5 21:36:29 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Sun, 05 Sep 2010 22:36:29 +0300 Subject: [gst-devel] Displaying waveform of an audiofile In-Reply-To: <1283434830729-2521365.post@n4.nabble.com> References: <1283434830729-2521365.post@n4.nabble.com> Message-ID: <4C83F13D.8080002@hora-obscura.de> On 02.09.2010 16:40, mc wrote: > > Hi. I have to write an audio editor (it will serve to cut samples from audio > files). I'm thinking about writing it in Vala. I'm looking for a ready-made > widget which: > > - displays the waveform of an audio file (scrollable) > - displays a timeline/frame number > - has a zoom > - gives the ability to select part of a track > > Can you suggest something? I have one in buzztard, but it does not zoom right now. I think such a widget is always somewhat tied to the feature scope of the app, and thus apps do their own. Stefan From todd.fischer at ridgerun.com Mon Sep 6 00:07:41 2010 From: todd.fischer at ridgerun.com (Todd Fischer) Date: Sun, 05 Sep 2010 16:07:41 -0600 Subject: [gst-devel] Debugging a blocked GStreamer pipeline In-Reply-To: <4C83F085.8040801@hora-obscura.de> References: <1283390127.20100.3195.camel@sax-lx> <4C83F085.8040801@hora-obscura.de> Message-ID: <1283724461.13098.2882.camel@sax-lx> Hi Stefan, Thanks for the information about gst-tracelib.
We have used this library in the past to produce some great performance graphs. I see the library supports a GSTTL_LOG_SIZE variable. Does that keep the first log data until the buffer is full, or does it overwrite old data with the most current information? Todd On Sun, 2010-09-05 at 22:33 +0300, Stefan Kost wrote: > On 02.09.2010 04:15, Todd Fischer wrote: > > Hi, > > > > We are seeing a behavior where we run a GStreamer application (doing audio / > > video decoding) that runs continuously for several days, then suddenly locks up > > in the middle of an A/V stream. Our best guess is that there is a defect in the ALSA > > output driver. We believe this because if we exit the application and try > > aplay, it doesn't work. > > > > I am wondering if there is a debug GStreamer logger element in existence, or if > > one is even possible or helpful. Such a logger element could be put anywhere in > > the pipeline. The logger would have circular buffers to keep track of all > > potentially interesting recent history, such as pad activity, bus activity, and > > any other relevant information. The circular buffer entries would all be > > timestamped. When some event occurs (a file exists, a message/signal is > > received, etc.), the element would dump the history and continue capturing new data. > > > > The idea is that after the pipeline locks up, you could cause the history logger to > > dump its data, and then get an idea of what is supposed to be happening that isn't > > occurring. > > > > Does such a logging element exist? If not, does it make any sense to develop? > > You can use gst-tracelib. It is an LD_PRELOAD thing for Linux boxes. It can log > all dataflow (buffers, events, messages and queries) as well as structural > changes. Logging can be to a file, memory or a socket.
> > Stefan > > > > > Todd From gabrbedd at gmail.com Mon Sep 6 03:59:47 2010 From: gabrbedd at gmail.com (Gabriel M. Beddingfield) Date: Sun, 5 Sep 2010 20:59:47 -0500 Subject: [gst-devel] Position info when doing time stretch Message-ID: <201009052059.48215.gabrbedd@gmail.com> Hello, I'm writing a time-stretching app, and I want the UI to show position/length info in terms of the original media. However, gst_element_query_position() and _duration() always give me playback times relative to the wall clock. This time is affected by the playback speed. Does anyone know a way to handle this?[1] Here are the details: Application: StretchPlayer[2] Implementation: C++ Current APIs: sndfile, rubberband, JACK My GStreamer port is setting up this pipeline: playbin --> audioconvert --> pitch --> volume --> autoaudiosink Pitch is the soundtouch 'pitch' plugin that does time stretching and pitch shifting. The playbin is only given local files. Whenever I query playbin for position and duration, it always reports time in terms of the wall clock. That is, if I've been listening for 30 secs, it tells me 30 secs. However, if I'm playing at 1/2 speed, I want it to tell me "15 secs".
At the moment, the only solution I see is to break up my pipeline like this: playbin --> appsink --> memory memory --> appsrc --> pitch --> volume --> autoaudiosink This would allow me to precisely track the position of the audio, since I'm metering it out from memory. However, when I tried to set up appsrc/appsink in a Qt application it gave me fits last week.[3] Does anyone have any suggestions? Thanks in advance, Gabriel [1] I did search the archives, though I had to use something like Nabble. For some reason, I couldn't make SF's search work. The only thing I found was the app playitslowly. [2] http://www.teuton.org/~gabriel/stretchplayer/ [3] I started off trying to do appsrc because it would allow me to wrap gstreamer into a form that looked like JACK or ALSA. From pranay.samanta at gmail.com Mon Sep 6 07:30:30 2010 From: pranay.samanta at gmail.com (pranay.samanta) Date: Sun, 5 Sep 2010 22:30:30 -0700 (PDT) Subject: [gst-devel] stream my encoded mp4 video using gst-rtsp library. Message-ID: <1283751030500-2527921.post@n4.nabble.com> I am a newbie to gstreamer and am wondering how to stream my MPEG-4 encoded video, using x264 or anything else, with the gst-rtsp library. I read a lot of posts and examples but have not found a way to register my custom callback in the rtsp library. I am actually trying to use this library to stream RTSP video to my cellphone. Any help will be highly appreciated. Thanks in advance. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/stream-my-encoded-mp4-video-using-gst-rtsp-library-tp2527921p2527921.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From praveen.pande1 at gmail.com Mon Sep 6 09:59:55 2010 From: praveen.pande1 at gmail.com (praveen pandey) Date: Mon, 6 Sep 2010 13:29:55 +0530 Subject: [gst-devel] plugins not found by gst-inspect In-Reply-To: References: Message-ID: Hi, There could be two possible reasons: 1) Your .so files might have undefined symbols in them. Just validate this.
2) GST_PLUGIN_DEFINE might be missing - Praveen On Fri, Sep 3, 2010 at 12:19 PM, harshada gune wrote: > Hi, > > I have built some of my plugins. I think they are successfully built as I > can see .so files being generated. I set GST_PLUGIN_PATH to the directory > containing these .so files. > But when I do gst-inspect, I am not finding any of them. > > What could be the reason? > > > thanks, > Harshada > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ds at entropywave.com Mon Sep 6 10:39:14 2010 From: ds at entropywave.com (David Schleef) Date: Mon, 6 Sep 2010 04:39:14 -0400 Subject: [gst-devel] Orc-0.4.9 Message-ID: <20100906083914.GA28265@cooker.entropywave.com> New Orc release. http://code.entropywave.com/projects/orc/ 0.4.9 ===== This is primarily a bug fixing release. Changes: - Added handling for 64-bit constants - Fix building and use of static library - Fix register allocation on Win64 (still partly broken, however) - Quiet some non-errors printed by orcc in 0.4.8. - Fix implementation of several opcodes. Until this release, the shared libraries all had the same versioning information. This should be fixed going forward. 
David From weian.chen at intel.com Mon Sep 6 10:38:35 2010 From: weian.chen at intel.com (Chen, Weian) Date: Mon, 6 Sep 2010 16:38:35 +0800 Subject: [gst-devel] Questions on dynamic resolution/frame rate changes for video capture/encode pipeline Message-ID: Hi All, I am running a video capture/encode pipeline. If I want to change the resolution and frame rate dynamically, whose responsibility is it to initiate these changes (I suppose it is the application?), and if so, what do the camera source element (such as v4l2src) and the video encoder element (such as x264enc) need to do to respond to these changes? Thanks a lot. Weian From n770galaxy at gmail.com Mon Sep 6 14:53:55 2010 From: n770galaxy at gmail.com (Josep Torra) Date: Mon, 06 Sep 2010 14:53:55 +0200 Subject: [gst-devel] Gstreamer on Intel CE In-Reply-To: <7492277DDE612C4BA6C2278A6E452E3005A399F74C@FR-TH2-MB01.cwu.vpn> References: <7492277DDE612C4BA6C2278A6E452E3005A399F74C@FR-TH2-MB01.cwu.vpn> Message-ID: <4C84E463.7050401@gmail.com> Hi, I need more context to give you some useful info. Please could you copy the pipeline? Which plugins are you using, the Intel or the Fluendo variant? Which SDK version are you running? Which gstreamer version is installed on your Sodaville system? Best regards, Josep On 09/03/2010 06:35 PM, Thibault Jochem wrote: > Hi, > > Has anyone had a look at Intel's gstreamer support on Sodaville? > > I'm struggling to hook a soft sink (fakesink) to their hardware pipeline ... and I can't find any clues ... > > Cheers, > > -- > Thibault Jochem > R&D Developer > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010.
> http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From wl2776 at gmail.com Mon Sep 6 15:52:37 2010 From: wl2776 at gmail.com (wl2776) Date: Mon, 6 Sep 2010 06:52:37 -0700 (PDT) Subject: [gst-devel] Cannot set caps. Message-ID: <1283781157023-2528411.post@n4.nabble.com> Hi all. I am writing a small function that makes screenshots. I have studied several examples already, and now I'm trying to copy them into my application. My function gets the last displayed buffer from the videosink, then builds the pipeline: appsrc -> ffmpegcolorspace -> videoscale -> encoder -> filesink I have to save the screenshot in one of several formats, so I use one of several encoders: jpegenc, ffenc_bmp, pngenc, ffenc_tiff. All elements are created and added to the pipeline. The problem is that my function cannot set caps on the encoder's sink pad. Even the following simple statements return FALSE: caps=gst_caps_from_string("video/x-raw-rgb" ",bpp=24,depth=24" ); pad=gst_element_get_static_pad(enc,"sink"); rb=gst_pad_set_caps(pad,caps); rb is always 0 (FALSE). However, I need this, because my function also has to scale the screenshot if a new width and height are given. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Cannot-set-caps-tp2528411p2528411.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From wl2776 at gmail.com Mon Sep 6 16:54:01 2010 From: wl2776 at gmail.com (wl2776) Date: Mon, 6 Sep 2010 07:54:01 -0700 (PDT) Subject: [gst-devel] Error 'not negotiated' from ffmpegcolorspace. Message-ID: <1283784841406-2528483.post@n4.nabble.com> I am writing a small function that makes screenshots; I wrote about it in my previous message. Now, instead of setting the pad's caps, I'm trying to use a capsfilter.
My function gets the last displayed buffer from the videosink, then builds the pipeline: appsrc -> ffmpegcolorspace -> videoscale -> capsfilter -> encoder -> filesink However, my problem is in another place. After adding some debug messages, I've found that ffmpegcolorspace for some reason cannot negotiate. Why? What does it want? My code: /* get last-buffer from videosink */ pipeline=(GstPipeline *)gst_pipeline_new("ipipeline"); bus=gst_pipeline_get_bus(pipeline); appsrc=(GstAppSrc *)gst_element_factory_make("appsrc","appsrc0"); colorconv=gst_element_factory_make("ffmpegcolorspace","colorconv"); videoscale=gst_element_factory_make("videoscale","videoscale"); capsflt=gst_element_factory_make("capsfilter","capsfilter"); if(width && height){ caps=gst_caps_new_simple("video/x-raw-rgb", "bpp",G_TYPE_INT,24, "depth",G_TYPE_INT,24, "width",G_TYPE_INT,width, "height",G_TYPE_INT,height, NULL); }else{ caps=gst_caps_new_simple("video/x-raw-rgb", "bpp",G_TYPE_INT,24, "depth",G_TYPE_INT,24, NULL); } g_object_set(capsflt,"caps",caps,NULL); gst_caps_unref(caps); switch(format){ case 1: //jpeg enc=gst_element_factory_make("jpegenc","jpeg"); break; case 2: //bmp enc=gst_element_factory_make("ffenc_bmp","bmp"); break; case 3: //png enc=gst_element_factory_make("pngenc","png"); break; case 4: //tiff enc=gst_element_factory_make("ffenc_tiff","tiff"); break; } filesink=gst_element_factory_make("filesink","fsink"); g_object_set(filesink,"location",filename,NULL); gst_bin_add_many(GST_BIN(pipeline),GST_ELEMENT(appsrc),colorconv,videoscale,capsflt,enc,filesink,NULL); rb=gst_element_link_many(GST_ELEMENT(appsrc),colorconv,videoscale,capsflt,enc,filesink,NULL); gst_app_src_set_stream_type(appsrc,GST_APP_STREAM_TYPE_STREAM); GstAppSrcCallbacks callbacks; callbacks.need_data=need_data; // this callback pushes the buffer to the ipipeline and also the eos.
callbacks.enough_data=NULL; callbacks.seek_data=NULL; gst_app_src_set_callbacks(appsrc,&callbacks,this,NULL); if(rb){ gst_element_set_state(GST_ELEMENT(pipeline),GST_STATE_PLAYING); while(1){ //pop messages from the bus and process them without running the main loop msg=gst_bus_pop(bus); //process messages. } } And I can see in the debug output that colorspace always gives the error 'not negotiated'. If I leave only appsrc, colorspace and filesink in gst_bin_add_many and gst_bin_link_many, then I get a file on disk and no errors... -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Error-not-negotiated-from-ffmpegcolorspace-tp2528483p2528483.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From wl2776 at gmail.com Mon Sep 6 16:57:23 2010 From: wl2776 at gmail.com (wl2776) Date: Mon, 6 Sep 2010 07:57:23 -0700 (PDT) Subject: [gst-devel] Error 'not negotiated' from ffmpegcolorspace. In-Reply-To: <1283784841406-2528483.post@n4.nabble.com> References: <1283784841406-2528483.post@n4.nabble.com> Message-ID: <1283785043291-2528486.post@n4.nabble.com> Sorry, mistake: wl2776 wrote: > > If I leave only appsrc and filesink in gst_bin_add_many and > gst_bin_link_many, then I get a file on disk and no errors... > Only two elements in the pipeline give no errors. The pipeline of three elements: appsrc->colorspace->filesink does give the 'not negotiated' error. Colorspace doesn't negotiate. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Error-not-negotiated-from-ffmpegcolorspace-tp2528483p2528486.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From wl2776 at gmail.com Mon Sep 6 17:13:19 2010 From: wl2776 at gmail.com (wl2776) Date: Mon, 6 Sep 2010 08:13:19 -0700 (PDT) Subject: [gst-devel] Error 'not negotiated' from ffmpegcolorspace. 
In-Reply-To: <1283784841406-2528483.post@n4.nabble.com> References: <1283784841406-2528483.post@n4.nabble.com> Message-ID: <1283785999307-2528516.post@n4.nabble.com> Solved. I have just set the caps from the GstBuffer I already have on the appsrc. It looks like colorspace had too many variants and could not decide which one to choose. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Error-not-negotiated-from-ffmpegcolorspace-tp2528483p2528516.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From bilboed at gmail.com Mon Sep 6 17:51:38 2010 From: bilboed at gmail.com (Edward Hervey) Date: Mon, 06 Sep 2010 17:51:38 +0200 Subject: [gst-devel] RELEASE: GNonLin non-linear editing plugins 0.10.16 "I needed time to think to get the memories from my mind" Message-ID: <1283788298.14358.39.camel@localhost> This mail announces the release of the GnonLin non-linear editing plugins 0.10.16 "I needed time to think to get the memories from my mind" This module contains a set of plug-ins for GStreamer to ease the creation of multimedia editors, or any other application where a timeline-oriented use of GStreamer makes sense. For more information, see http://gstreamer.freedesktop.org/modules/gnonlin.html To file bugs, go to http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer&component=gnonlin -------------- next part -------------- Release notes for GNonLin Non-Linear Editing Plug-ins 0.10.16 "I needed time to think to get the memories from my mind" The GStreamer team is proud to announce a new release in the 0.10.x stable series of GNonLin. This module contains a set of plug-ins for GStreamer to ease the creation of multimedia editors, or any other application where a timeline-oriented use of GStreamer makes sense.
These elements include: gnlsource An element for using source elements/bins in a GnlComposition gnlfilesource A higher-level element for using a uri in a GnlComposition gnlcomposition A container element that handles GNonLin objects gnloperation An element for using filters in a GnlComposition Features of this release * More race fixes * gnlcomposition: propagate caps to childs * gnlurisource: Only use needed streams * gnlcomposition: Fix QoS handling * Bugs fixed in this release * 613283 : gst_element_class_set_details = > gst_element_class_set_details_simple * 626501 : Caps property of gnlfilesource works incorrectly * 626733 : Race in gnlcomposition between no_more_pads_object_cb and compare_relink_single_node Download You can find source releases of gnonlin in the download directory: http://gstreamer.freedesktop.org/src/gnonlin/ GStreamer Homepage More details can be found on the project's website: http://gstreamer.freedesktop.org/ Support and Bugs We use GNOME's bugzilla for bug reports and feature requests: http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer Developers GStreamer is stored in Git, hosted at git.freedesktop.org, and can be cloned from there. Interested developers of the core library, plug-ins, and applications should subscribe to the gstreamer-devel list. If there is sufficient interest we will create more lists as necessary. Applications GNonLin is primarily used by PiTiVi (http://www.pitivi.org/) and Jokosher (http://www.jokosher.org/). Contributors to this release * Alessandro Decina * Benjamin Otte * David Schleef * Edward Hervey * Sebastian Dr?ge * Stefan Kost * Tim-Philipp M?ller ? From mjoachimiak at gmail.com Mon Sep 6 18:38:28 2010 From: mjoachimiak at gmail.com (Michael Joachimiak) Date: Mon, 6 Sep 2010 19:38:28 +0300 Subject: [gst-devel] Cannot set caps. In-Reply-To: <1283781157023-2528411.post@n4.nabble.com> References: <1283781157023-2528411.post@n4.nabble.com> Message-ID: Maybe you are not giving full caps? 
What caps have you tried? 2010/9/6 wl2776 > > Hi all. > I am writing a small function, making screenshots. > I have studied several examples already, and now I'm trying to copy them to > my application. > > My function gets the last displayed buffer from the videosink, then builds > the pipeline: > appsrc -> ffmpegcolorspace -> videoscale -> encoder -> filesink > > I have to save the screenshot to one of several formats, so I use one of > several encoders: jpegenc, ffenc_bmp, pngenc, ffenc_tiff. > All elements are created and added to the pipeline. > > The problem is that my function cannot set caps on the encoder's sink pad. > Even following simple statements return false: > caps=gst_caps_from_string("video/x-raw-rgb" > ",bpp=24,depth=24" > ); > pad=gst_element_get_static_pad(enc,"sink"); > rb=gst_pad_set_caps(pad,caps); > > rb is always 0 (FALSE) > > However, I need this, because my function also has to scale the screenshot, > if new width and height are given. > -- > View this message in context: > http://gstreamer-devel.966125.n4.nabble.com/Cannot-set-caps-tp2528411p2528411.html > Sent from the GStreamer-devel mailing list archive at Nabble.com. > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -- Your Sincerely Michael Joachimiak -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ensonic at hora-obscura.de Mon Sep 6 21:02:48 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Mon, 06 Sep 2010 22:02:48 +0300 Subject: [gst-devel] Debugging a blocked GStreamer pipeline In-Reply-To: <1283724461.13098.2882.camel@sax-lx> References: <1283390127.20100.3195.camel@sax-lx> <4C83F085.8040801@hora-obscura.de> <1283724461.13098.2882.camel@sax-lx> Message-ID: <4C853AD8.5020506@hora-obscura.de> Am 06.09.2010 01:07, schrieb Todd Fischer: > Hi Stefan, > > Thanks for the information about gst-tracelib. We have used this library in the > past to produce some great performance graphs. > > I see the library supports a GSTTL_LOG_SIZE variable. Does that keep the first > log data until the buffer is full, or does it overwrite old data with the most > current information? If the buffer is full it counts and tells at the end how much it would have needed. Then you can run again with a bigger size. If you would need a different scheme let me know, it can probably be added easily. Stefan > > Todd > > On Sun, 2010-09-05 at 22:33 +0300, Stefan Kost wrote: >> Am 02.09.2010 04:15, schrieb Todd Fischer: >> > Hi, >> > >> > We are seeing a behavior where we run a GStreamer application (doing audio / >> > video decoding), that runs continuously for several days, then suddenly locks up >> > in the middle of an A/V stream. Our best guess is there is a defect in the ALSA >> > output driver. We believe this because if we exit the application and try >> > aplay, it doesn't work. >> > >> > I am wondering if there is a debug GStreamer logger element in existence or if >> > one is even possible or helpful. Such a logger element could be put anywhere in >> > the pipeline. The logger would have circular buffers to keep track of all >> > potentially interesting recent history, such as pad activity, bus activity, and >> > any other relevant information. The circular buffer entries would all be >> > timestamped. 
When some event occurs (a file exists, a message/signal is >> > received, etc.), the element would dump the history and continue capturing new data. >> > >> > The idea is that after the pipeline locks up, you could cause the history logger to >> > dump its data, and then get an idea of what is supposed to be happening that isn't >> > occurring. >> > >> > Does such a logging element exist? If not, does it make any sense to develop? >> >> You can use gst-tracelib. It is an LD_PRELOAD thing for Linux boxes. It can log >> all dataflow (buffers, events, messages and queries) as well as structural >> changes. Logging can be to a file, memory or a socket. >> >> Stefan >> >> > >> > Todd From wl2776 at gmail.com Tue Sep 7 10:26:25 2010 From: wl2776 at gmail.com (wl2776) Date: Tue, 7 Sep 2010 01:26:25 -0700 (PDT) Subject: [gst-devel] Cannot set caps. In-Reply-To: References: <1283781157023-2528411.post@n4.nabble.com> Message-ID: <1283847985348-2529332.post@n4.nabble.com> Michael Joachimiak wrote: > > Maybe you are not giving full caps? > What caps have you tried? > I tried much longer caps; anyway, the return was FALSE. I have the impression that I shouldn't give full caps to a pad, and that it fills in missing values from the defaults. Right? -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Cannot-set-caps-tp2528411p2529332.html Sent from the GStreamer-devel mailing list archive at Nabble.com.
From barthelemy at crans.org Tue Sep 7 10:43:48 2010 From: barthelemy at crans.org (=?ISO-8859-1?Q?S=E9bastien_Barth=E9lemy?=) Date: Tue, 7 Sep 2010 10:43:48 +0200 Subject: [gst-devel] passing through compilation and album artist metadata Message-ID: Hello list, I'm using gstreamer together with scons to convert my music library (mostly flac files) to mp3 I can listen to in iTunes. This is done through the following command, ran by scons: cmd = Command(tgt, src, 'gst-launch-0.10 filesrc location=$SOURCE ! decodebin ! ' + 'audioconvert ! lame vbr=new vbr-quality=4 ! id3v2mux ! ' + 'filesink location=$TARGET') I noticed that the mp3 files produced that way are missing the "TCMP=1" and "TPE2=Various Artists" tags, even if the source file has the corresponding "compilation=1" and "albumartist=Various Artists" flags. This is patent when you use iTunes, because it screws up cover flow. I use gstreamer 0.10.15 on ubuntu lucid lynx. Is this a known problem? I could not find a reference in the bug tracker nor in the archive. Maybe I am missing something? Regards, Sebastien From bilboed at gmail.com Tue Sep 7 11:56:43 2010 From: bilboed at gmail.com (Edward Hervey) Date: Tue, 07 Sep 2010 11:56:43 +0200 Subject: [gst-devel] Cannot set caps. In-Reply-To: <1283781157023-2528411.post@n4.nabble.com> References: <1283781157023-2528411.post@n4.nabble.com> Message-ID: <1283853403.14358.41.camel@localhost> On Mon, 2010-09-06 at 06:52 -0700, wl2776 wrote: > Hi all. > I am writing a small function, making screenshots. > I have studied several examples already, and now I'm trying to copy them to > my application. > > My function gets the last displayed buffer from the videosink, then builds > the pipeline: > appsrc -> ffmpegcolorspace -> videoscale -> encoder -> filesink > > I have to save the screenshot to one of several formats, so I use one of > several encoders: jpegenc, ffenc_bmp, pngenc, ffenc_tiff. > All elements are created and added to the pipeline. 
> > The problem is that my function cannot set caps on the encoder's sink pad. > Even the following simple statements return false: > caps=gst_caps_from_string("video/x-raw-rgb" > ",bpp=24,depth=24" > ); > pad=gst_element_get_static_pad(enc,"sink"); > rb=gst_pad_set_caps(pad,caps); gst_pad_set_caps() must not be used from applications. If you wish to restrict/specify the caps flowing between two elements, insert a capsfilter element. Edward > > rb is always 0 (FALSE) > > However, I need this, because my function also has to scale the screenshot, > if new width and height are given. From t.i.m at zen.co.uk Tue Sep 7 12:01:50 2010 From: t.i.m at zen.co.uk (Tim-Philipp Müller) Date: Tue, 07 Sep 2010 11:01:50 +0100 Subject: [gst-devel] passing through compilation and album artist metadata In-Reply-To: References: Message-ID: <1283853710.3561.39.camel@zingle> On Tue, 2010-09-07 at 10:43 +0200, Sébastien Barthélemy wrote: Hi,
What's the output of gst-inspect-0.10 id3v2mux | grep Version ? Cheers -Tim From nico at inattendu.org Tue Sep 7 12:27:38 2010 From: nico at inattendu.org (Nicolas Bertrand) Date: Tue, 07 Sep 2010 14:27:38 +0400 Subject: [gst-devel] Looking for some advices for playing sequence of images with a sound track Message-ID: <4C86139A.2090508@inattendu.org> Hi I develop a stop-motion tool: luciole. It takes snapshots from an external device and allows playback of the taken snapshots. I use gstreamer for making snapshots and playing the movie. I want to add the possibility of playing music or sound at the same time the video is playing. The input of the 'play' function is a sequence of images. To mix a sound track into the play function, is it better to use an audio-sink or to use gnonlin? I also expect to implement a 'seek' function, controlled by a scale bar which allows starting playback at any time/frame. For that function, what is, in your opinion, the best gstreamer choice? N.B : the app is developed with python Thanks in advance Nico From ykumar23 at gmail.com Tue Sep 7 12:48:00 2010 From: ykumar23 at gmail.com (kumar) Date: Tue, 7 Sep 2010 03:48:00 -0700 (PDT) Subject: [gst-devel] Try using Ossink Message-ID: <1283856480837-2529484.post@n4.nabble.com> I developed a simple music player app using "osssink", but when any other application like mozilla/firefox is using osssink, my application is unable to use it. Any solution to this problem? -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Try-using-Ossink-tp2529484p2529484.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From wl2776 at gmail.com Tue Sep 7 12:57:32 2010 From: wl2776 at gmail.com (wl2776) Date: Tue, 7 Sep 2010 03:57:32 -0700 (PDT) Subject: [gst-devel] Looking for some advices for playing sequence of images with a sound track In-Reply-To: <4C86139A.2090508@inattendu.org> References: <4C86139A.2090508@inattendu.org> Message-ID: <1283857052046-2529498.post@n4.nabble.com> Nicolas Bertrand-4 wrote: > > I use gstreamer for making snapshots and playing the movie. > I want to add the possibility of playing music or sound at the same > time the video is playing. > The input of the 'play' function is a sequence of images. > I'd suggest using the imagefreeze element to play images. Then, use filesrc, multifilesrc, or appsrc to feed the images in. Appsrc would help you to create a sophisticated image sequence, if you need one. Also, it can help you avoid storing images to disk, if your app generates them on the fly. The usual sound playback chain would suit, if you have sound files on disk. So, to put it all together, you create a pipeline and put two chains in it: 1. imagesrc ! decodebin2 ! imagefreeze ! video-sink 2. filesrc ! audio-decoder !
audio-sink This can be accomplished in 3 calls: gst_bin_add_many(pipeline, imagesrc, decodebin2, imagefreeze, videosink, audiofile_src, audio_decoder, audio_sink, NULL); gst_element_link_many(imagesrc, decodebin2, imagefreeze, videosink, NULL); gst_element_link_many(audiofile_src, audio_decoder, audio_sink, NULL); Probably, it would be useful to add a couple of queues, to create several threads. Nicolas Bertrand-4 wrote: > > To mix a sound track into the play function, is it better to use an > audio-sink or use gnonlin? > I doubt you can use gnonlin, because, AFAIR, it doesn't support single images. You must use an audio-sink in any case, whether you use gnonlin or not, because gnonlin gives you only decoded streams, and you're on your own when you want to play them or save them to a file. Nicolas Bertrand-4 wrote: > > I also expect to implement a 'seek' function, controlled by a scale bar > which allows starting playback at any time/frame. > For that function, what is the best gstreamer choice ? > As for filesrc and appsrc, they support seeking. For multifilesrc, consult the manual. Nicolas Bertrand-4 wrote: > > N.B : the app is developed with python > I don't see any obstacles. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Looking-for-some-advices-for-playing-sequence-of-images-with-a-sound-track-tp2529476p2529498.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From luiverco at gmail.com Tue Sep 7 13:12:45 2010 From: luiverco at gmail.com (Miguel Verdu) Date: Tue, 7 Sep 2010 14:12:45 +0300 Subject: [gst-devel] New EVENT : Video Fast Update Message-ID: Hi all We are planning to add support in our video codecs for the fast update commands defined in the H.245 recommendation. This is also functionally similar to what is required by the RTP/AVPF spec (RFC4585).
I would like to start the discussion in the mailing list on how to implement this, so that we can do something that eventually will be defined by GStreamer and not proprietary to the codecs in our platforms. My current proposal tries to be general enough to cover what is required by both recommendations: - New upstream event GST_EVENT_VIDEO_FAST_UPDATE: a feedback message from decoder to encoder. Contains information on the type of feedback the decoder is producing. The event travels up to rtpbin, where it is formatted and sent via the RTCP channel. - New structures to be attached to the event: Name: PictureLoss. Empty struct. Name: SliceLoss. Key: (type) value. First: (guint) address of the first MB (macroblock) where corruption was detected. Number: (guint) number of corrupted MBs. PictureID: (guint) codec-specific picture identifier. Name: ReferencePictureSelection. Key: (type) value. Native_info: (guint*) array with a codec-specific bitstring that defines the current reference picture/slices in use by the decoder. Length: (guint) length of the bitstring. Regards Miguel From olivier.crete at collabora.co.uk Tue Sep 7 13:33:50 2010 From: olivier.crete at collabora.co.uk (Olivier Crête) Date: Tue, 07 Sep 2010 14:33:50 +0300 Subject: [gst-devel] New EVENT : Video Fast Update In-Reply-To: References: Message-ID: <1283859230.2502.101.camel@TesterTop4> Hi, On Tue, 2010-09-07 at 14:12 +0300, Miguel Verdu wrote: > We are planning to add support in our video codecs for the fast update > commands defined in the H.245 recommendation. This is also functionally > similar to what is required by the RTP/AVPF spec (RFC4585).
I've already started to implement RTP/AVPF, code at: http://git.collabora.co.uk/?p=user/tester/gst-plugins-good.git;a=shortlog;h=refs/heads/avpf-timing > My current proposal tries to be general enough to cover what is required by > both recommendations > > - New upstream event > Name: PictureLoss > Empty struct There is already a "GstForceKeyUnit" event defined for that, and it is implemented by most of the relevant open source encoder elements (x264enc, gst-ffmpeg and theoraenc). My AVPF branch produces that GstForceKeyUnit event on the relevant AVPF RTCP messages. Documented at: http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/docs/design/draft-keyframe-force.txt > Name: SliceLoss > Name: ReferencePictureSelection As for the two other types, I agree that we should have new events for those. Maybe extend GstForceKeyUnit for the slice loss. But ReferencePictureSelection probably requires a new custom event type. And don't feel too bad, I just re-implemented something that had existed for like a year ;) -- Olivier Crête olivier.crete at collabora.co.uk From bilboed at gmail.com Tue Sep 7 13:54:05 2010 From: bilboed at gmail.com (Edward Hervey) Date: Tue, 07 Sep 2010 13:54:05 +0200 Subject: [gst-devel] Try using Ossink In-Reply-To: <1283856480837-2529484.post@n4.nabble.com> References: <1283856480837-2529484.post@n4.nabble.com> Message-ID: <1283860445.14358.42.camel@localhost> On Tue, 2010-09-07 at 03:48 -0700, kumar wrote: > I developed a simple music player app using "osssink", but when any other > application like mozilla/firefox is using osssink, > my application is unable to use it. Any solution to this problem? That's a problem with OSS. Either get a card that can do hardware mixing, or switch to a system that supports software mixing (like pulseaudio).
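To illustrate Edward's point about osssink: classic OSS exposes a device node that only one process can hold open for playback at a time, so a second open fails while, say, a browser holds the device. A minimal sketch of the situation (device path and error handling are illustrative, not GStreamer code):

```python
import errno
import os

def open_oss_device(path="/dev/dsp"):
    """Try to open an OSS playback device.

    Classic OSS has no software mixing: while another process holds the
    device, the open fails with EBUSY, which is exactly what a second
    osssink-based application runs into.
    """
    try:
        return os.open(path, os.O_WRONLY | os.O_NONBLOCK)
    except OSError as e:
        if e.errno == errno.EBUSY:
            raise RuntimeError("OSS device is held by another application") from e
        raise
```

This is why the fix is hardware mixing or a sound server such as pulseaudio, not anything the application itself can do.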
From ensonic at hora-obscura.de Tue Sep 7 14:12:41 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Tue, 07 Sep 2010 15:12:41 +0300 Subject: [gst-devel] upcoming removal of metadata plugin in gst-plugins-bad Message-ID: <4C862C39.500@hora-obscura.de> hello, I intend to remove the metadata plugin in gst-plugins-bad in this cycle. Exif and xmp support is now provided by gst-plugins-base utility libraries. It only takes a couple of lines to add e.g. xmp support to container formats. Here are some details: https://bugzilla.gnome.org/show_bug.cgi?id=486659 For jpeg files, the jpegformat plugin in gst-plugins-bad provides the container format handling. In case anyone other than Nokia on the N900 is using the metadata plugin, please migrate your code. Let me know if I missed something. Thanks, Stefan From ensonic at hora-obscura.de Tue Sep 7 14:21:48 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Tue, 07 Sep 2010 15:21:48 +0300 Subject: [gst-devel] New EVENT : Video Fast Update In-Reply-To: <1283859230.2502.101.camel@TesterTop4> References: <1283859230.2502.101.camel@TesterTop4> Message-ID: <4C862E5C.7000106@hora-obscura.de> On 07.09.2010 14:33, Olivier Crête wrote: > Hi, > > On Tue, 2010-09-07 at 14:12 +0300, Miguel Verdu wrote: > >> We are planning to add support in our video codecs for the fast update >> commands defined in the H.245 recommendation. This is also functionally >> similar to what is required by the RTP/AVPF spec (RFC4585). >> > I've already started to implement RTP/AVPF, code at: > http://git.collabora.co.uk/?p=user/tester/gst-plugins-good.git;a=shortlog;h=refs/heads/avpf-timing > > > >> My current proposal tries to be general enough to cover what is required by >> both recommendations >> >> - New upstream event >> Name: PictureLoss >> Empty struct >> > There is already a "GstForceKeyUnit" event defined for that, and it is > implemented by most of the relevant open source encoder (x264enc, > gst-ffmpeg and theoraenc) elements.
My AVPF branch produces that > GstForceKeyUnit event on the relevant AVPF RTCP messages. > > Documented at: > http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/docs/design/draft-keyframe-force.txt > > > >> Name: SliceLoss >> Name: ReferencePictureSelection >> > As for the two other types, I agree that we should have new events for > those. Maybe extend GstForceKeyUnit for the slice loss. But > ReferencePictureSelection probably requires a new custom event type. > > And don't feel too bad, I just re-implemented something that had existed > for like a year ;) > > And we should move the event to e.g. the video library to have it documented so that people find it :) https://bugzilla.gnome.org/show_bug.cgi?id=607742 Stefan From ensonic at hora-obscura.de Tue Sep 7 14:23:22 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Tue, 07 Sep 2010 15:23:22 +0300 Subject: [gst-devel] how to set an audiofirfilter? In-Reply-To: <1283627896731-2526929.post@n4.nabble.com> References: <1283627896731-2526929.post@n4.nabble.com> Message-ID: <4C862EBA.50603@hora-obscura.de> On 04.09.2010 22:18, victor wrote: > hello to all > > I'm brand new to programming and to gstreamer; I find it great, and it allows me > to manipulate sound. I've just added an audiofirfilter to a file > that I'm playing from disk. > But are there any explanations somewhere on how to set up the filter? I mean how > to define the cutoff frequencies and, if possible, how to define the slope of the cutoff. > I've seen the link to the documentation in which the FIR > filter is explained, but as a musician, I cannot get it at all... > > On the same subject, regarding the audioiirfilter, is there a documented example > of a high-pass filter? This seems a mathematical challenge to me... > > Thank you very much! > > victor > There are quite a few docs on the net about this (e.g. http://musicdsp.org/) and good textbooks that you can buy. It's definitely out of scope for this list, sorry.
Stefan From ensonic at hora-obscura.de Tue Sep 7 14:24:32 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Tue, 07 Sep 2010 15:24:32 +0300 Subject: [gst-devel] how to load aravis plugin with python-gstreamer In-Reply-To: References: Message-ID: <4C862F00.4040802@hora-obscura.de> On 01.09.2010 17:33, Bert Douglas wrote: > Hi All, > > I managed to use the aravis plugin from a shell script as shown below. > How can I do the same sort of thing from python? > > Thanks much, > Bert Douglas > > ----------------------------------- > > > LD_PRELOAD=/usr/lib/libaravis.so \ > gst-launch --gst-plugin-load=/usr/lib/gstreamer-0.10/libgstaravis.so \ > aravissrc camera-name="" \ > ! ffmpegcolorspace \ > ! autovideosink > Why do you have to LD_PRELOAD the lib? Stefan From ensonic at hora-obscura.de Tue Sep 7 14:26:36 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Tue, 07 Sep 2010 15:26:36 +0300 Subject: [gst-devel] GstBaseSink - getcaps In-Reply-To: References: <1283338442.6384.6.camel@zingle> Message-ID: <4C862F7C.2070206@hora-obscura.de> On 02.09.2010 10:53, Julien Isorce wrote: > > > 2010/9/1 Tim-Philipp Müller > > > On Wed, 2010-09-01 at 12:34 +0200, Julien Isorce wrote: > > > If my X11 settings are 32 bpp then an annoying thing is that running > > "gst-launch-0.10 videotestsrc ! "video/x-raw-rgb, bpp=16, > depth=16" ! > > ximagesink" gives me an error from videotestsrc but I think the error
> > > > In ximagesink::gst_ximagesink_getcaps, the xcontext->caps is > setup to > > bpp=32 and depth=24, so I think at this point it should check that > > this is not compatible with the required caps from my capfilter. > > > > The error is: > > "videotestsrc0 : Could not negotiate format" > > and I think it should be: > > "ximagesink0 : Could not negotiate format" > > > > Any comment ? > > Well, yes. We should find a way to report errors like this better, no > doubt. Currently there's no way to communicate error state to the > upstream element that drives the pipeline and eventually errors out > though, it just knows the flow return and that's that. I think > there's a > bug about this somewhere in bugzilla, but can't find it right now. > Feel > free to file a new one. > > Cheers > -Tim > > > Hi, > Could you suggest me a tittle for bugzilla ? So we can find it easier. > And in core or base ? > Julien Could be this one https://bugzilla.gnome.org/show_bug.cgi?id=350545 Stefan > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From bertd at tplogic.com Tue Sep 7 14:41:05 2010 From: bertd at tplogic.com (Bert Douglas) Date: Tue, 7 Sep 2010 07:41:05 -0500 Subject: [gst-devel] how to load aravis plugin with python-gstreamer In-Reply-To: <4C862F00.4040802@hora-obscura.de> References: <4C862F00.4040802@hora-obscura.de> Message-ID: Because the so file for the plugin is defective. It lacks a reference to libaravis. You can see this with ldd. 
Bert Douglas On Tue, Sep 7, 2010 at 7:24 AM, Stefan Kost wrote: > On 01.09.2010 17:33, Bert Douglas wrote: > > Hi All, > > > > I managed to use aravis plugin from shell script as shown below. > > How can I do same sort of thing from python? > > > > Thanks much, > > Bert Douglas > > > > ----------------------------------- > > > > > > LD_PRELOAD=/usr/lib/libaravis.so > \ > > gst-launch --gst-plugin-load=/usr/lib/gstreamer-0.10/libgstaravis.so > \ > > aravissrc camera-name="" \ > > ! ffmpegcolorspace \ > > ! autovideosink > > > Why do you have to LD_PRELOAD the lib? > > Stefan > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From vaisaari at gmail.com Tue Sep 7 15:23:05 2010 From: vaisaari at gmail.com (vaisaari) Date: Tue, 7 Sep 2010 06:23:05 -0700 (PDT) Subject: [gst-devel] Height and width for displayed video Message-ID: <1283865785250-2529701.post@n4.nabble.com> Hi all, I'm new in GStreamer programming and I'm stuck in the following problem. If I have arbitrary pipeline that decodes video and eventually displays it on screen, like filesource -> decodebin -> filter1 -> filter2 -> ... -> xvimagesink. How can I get height and width of the resulting video? Thanks.
-- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Height-and-width-for-displayed-video-tp2529701p2529701.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From 123sandy at gmail.com Tue Sep 7 16:02:16 2010 From: 123sandy at gmail.com (Sandeep Prakash) Date: Tue, 7 Sep 2010 07:02:16 -0700 (PDT) Subject: [gst-devel] Height and width for displayed video In-Reply-To: <1283865785250-2529701.post@n4.nabble.com> References: <1283865785250-2529701.post@n4.nabble.com> Message-ID: <1283868136991-2529770.post@n4.nabble.com> Hi, vaisaari wrote: > > How can I get height and width of the resulting video? > You can get the height, width, framerate, etc., from the caps. Caps will be present either as part of a pad, as part of a buffer, or both. You can get caps from a pad using gst_pad_get_caps, and from a buffer using the gst_buffer_get_caps function. Once you have the caps, use gst_caps_get_structure to access the individual fields. You have to use the gst_structure_get_* series of functions to fetch each individual field. E.g., for integer fields use the gst_structure_get_int function. Regards, Sandeep Prakash http://sandeepprakash.homeip.net -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Height-and-width-for-displayed-video-tp2529701p2529770.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From nico at inattendu.org Tue Sep 7 16:43:20 2010 From: nico at inattendu.org (Nicolas Bertrand) Date: Tue, 07 Sep 2010 18:43:20 +0400 Subject: [gst-devel] Looking for some advices for playing sequence of images with a sound track In-Reply-To: <1283857052046-2529498.post@n4.nabble.com> References: <4C8613B2.4060005@inattendu.org> <1283857052046-2529498.post@n4.nabble.com> Message-ID: <4C864F88.2000609@inattendu.org> > I'd suggest using the imagefreeze element to play images. > Then, use filesrc, multifilesrc, or appsrc to feed the images in.
Appsrc > would help you to create a sophisticated image sequence, if you need one. > Also, it can help you avoid storing images to disk, if your app > generates them on the fly. > The usual sound playback chain would suit, if you have sound files on disk. > > Okay thanks. I will try it From wmiller at sdr.com Tue Sep 7 21:19:39 2010 From: wmiller at sdr.com (Wes Miller) Date: Tue, 7 Sep 2010 12:19:39 -0700 (PDT) Subject: [gst-devel] Change Framerate of recorded stream In-Reply-To: <1283188714537-2400463.post@n4.nabble.com> References: <1282663765569-2336869.post@n4.nabble.com> <1282715182546-2337676.post@n4.nabble.com> <1282739031957-2338136.post@n4.nabble.com> <1282754317897-2338536.post@n4.nabble.com> <1282756968876-2338609.post@n4.nabble.com> <1283188714537-2400463.post@n4.nabble.com> Message-ID: <1283887179955-2530270.post@n4.nabble.com> Hey, I finally got this to work! It did require one change to the original pipeline: I added the ffmpegcolorspace element and voila, it worked. Actually, I did have one other change. My original mkv file was corrupt. I played it through a jpegdec ! jpegenc and re-stored it to a new mkv file, and that file worked for me. gst-launch-0.10 gnlfilesource duration=10000000000000 media-duration=10000000000000 \ location="file:///home/wmiller/2redo.mkv" ! ffmpegcolorspace ! autovideosink I do still have one problem. How do I change the playback rate? I tried adding rate=0.02 and a rate=75; neither changed the playback rate. (The original goal here is to take an mjpeg video of, say, 1000 frames, taken at framerate = 25/1, throw away 4 out of every 5 frames, then fix the timestamp info in the saved frames so the video plays back at 5x speed, i.e. the duration is divided by 5. Result is a highspeed, small-filesize "thumbnail" video.) Thanks, Wes -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Change-Framerate-of-recorded-stream-tp2336869p2530270.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From barthelemy at crans.org Tue Sep 7 21:22:42 2010 From: barthelemy at crans.org (Sébastien Barthélemy) Date: Tue, 7 Sep 2010 21:22:42 +0200 Subject: [gst-devel] passing through compilation and album artist metadata In-Reply-To: <1283853710.3561.39.camel@zingle> References: <1283853710.3561.39.camel@zingle> Message-ID: Hi Tim, On Tue, Sep 7, 2010 at 12:01 PM, Tim-Philipp Müller wrote: > On Tue, 2010-09-07 at 10:43 +0200, Sébastien Barthélemy wrote: >> Is this a known problem? I could not find a reference in the bug >> tracker nor in the archive. Maybe I am missing something? > > You could try id3mux from -bad as well. I just tried with id3mux 0.10.18. Thank you for the suggestion. Alas, I got the exact same result. > There are some known issues with > the taglib-based id3v2mux, but I would've expected it to handle unknown > text frames just fine. Well, I can't tell. Would you like me to file a bug report for this? >> I use gstreamer 0.10.15 on ubuntu lucid lynx. > > That doesn't sound right. I got it with 'aptitude show' on a random gstreamer package, assuming they all shared the same version number, sorry. > What's the output of > > gst-inspect-0.10 id3v2mux | grep Version > > ? 0.10.21 Regards -- Sebastien From t.i.m at zen.co.uk Tue Sep 7 21:39:37 2010 From: t.i.m at zen.co.uk (Tim-Philipp Müller) Date: Tue, 07 Sep 2010 20:39:37 +0100 Subject: [gst-devel] passing through compilation and album artist metadata In-Reply-To: References: <1283853710.3561.39.camel@zingle> Message-ID: <1283888377.3561.41.camel@zingle> On Tue, 2010-09-07 at 21:22 +0200, Sébastien Barthélemy wrote: > Would you like me to file a bug report for this? Yes, please!
If you could attach the beginning of the file in question there as well, that'd be great (head --bytes=999kB foo.mp3 > head.mp3 or so) Cheers -Tim From halley.zhao at intel.com Wed Sep 8 03:45:03 2010 From: halley.zhao at intel.com (Zhao, Halley) Date: Wed, 8 Sep 2010 09:45:03 +0800 Subject: [gst-devel] some issues when trying to save content to disk during http progressive download Message-ID: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> During playback of progressive content, I tried to save the content to disk as well. But the result is strange: Some contents are saved correctly, some contents are saved but can't be played back again; some contents can't even be played back during progressive download. ## most ogg contents work well, the saved contents can be played back again gst-launch-0.10 souphttpsrc location=http://10.238.37.11/share/media/video/test.ogv ! tee name=t ! decodebin ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.ogv ## some saved mp4 contents can't be played back again; the saved contents differ from the original one, and even the following test.mp4 and test2.mp4 are different gst-launch-0.10 souphttpsrc location=http://10.238.37.11/share/media/video/test.mp4 ! tee name=t ! decodebin ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 gst-launch-0.10 souphttpsrc location=http://10.238.37.11/share/media/video/test.mp4 ! filesink location=/home/halley/swap/streaming/test2.mp4 ## some wmv contents can't even be played back during progressive download (though some saved wmv contents can be played back again) gst-launch-0.10 -v -v souphttpsrc location=http://10.238.37.11/share/media/test.wmv ! tee name=t ! queue ! decodebin ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.wmv thanks in advance for your help.
ZHAO, Halley (Aihua) Email: halley.zhao at intel.com Tel: +86(21)61166476 iNet: 8821-6476 SSG/OTC/Moblin 3W038 Pole: F4 -------------- next part -------------- An HTML attachment was scrubbed... URL: From msmith at xiph.org Wed Sep 8 04:01:54 2010 From: msmith at xiph.org (Michael Smith) Date: Tue, 7 Sep 2010 19:01:54 -0700 Subject: [gst-devel] some issues when trying to save content to disk during http progressive download In-Reply-To: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> Message-ID: On Tue, Sep 7, 2010 at 6:45 PM, Zhao, Halley wrote: > During playback of progressive content, I tried to save the content to disk > as well. > > But the result is strange: > > Some contents are saved correctly, some contents are saved but can't be > played back again; some contents can't even be played back during progressive > download. What you describe sounds like what's expected. True streaming formats (like ogg) work fine. Formats that may have the headers at the start OR the end vary - some have the headers at the end, so they're not playable until you've downloaded the entire file. Some have them at the start, so progressive download/playback works. This doesn't have anything to do specifically with GStreamer, it's simply how the formats work. Mike From halley.zhao at intel.com Wed Sep 8 04:21:26 2010 From: halley.zhao at intel.com (Zhao, Halley) Date: Wed, 8 Sep 2010 10:21:26 +0800 Subject: [gst-devel] some issues when trying to save content to disk during http progressive download In-Reply-To: References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> Message-ID: <5D8008F58939784290FAB48F54975198278A379DC1@shsmsx502.ccr.corp.intel.com> Thanks. I understand that there are possibly headers at the START or the END; however, my question is about the saved contents, not playback itself during progressive download.
If I don't add a tee and filesink to save the content, all my mentioned contents can playback in progressive downloaded well. Especially for the mp4 contents mentioned, files are different (even file size) between save during playback and save directly from souphttpsrc: gst-launch-0.10 souphttpsrc location=http:// 10.238.37.11/share/media/video/test.mp4 ! tee name=t ! decodebin ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 gst-launch-0.10 souphttpsrc location=http:// 10.238.37.11/share/media/video/test.mp4 ! filesink location=/home/halley/swap/streaming/test2.mp4 I suspect soup request header at the END of the mp4 file when playback starts, but this header isn't save to file by filesink. -----Original Message----- From: Michael Smith [mailto:msmith at xiph.org] Sent: 2010?9?8? 10:02 To: Discussion of the development of GStreamer Subject: Re: [gst-devel] some issues when trying to save content to disk during http progressive downloaded On Tue, Sep 7, 2010 at 6:45 PM, Zhao, Halley wrote: > During playback of progressive content, I tried to save the content to disk > as well. > > But the result is strange: > > Some contents are saved correctly, some contents are saved but can?t > playback again; some contents even can?t playback during progressive > downloaded. What you describe sounds like what's expected. True streaming formats (like ogg) work fine. Formats that may have the headers at the start OR the end vary - some have the headers at the end, so they're not playable until you've downloaded the entire file. Some have them at the start, so progressive download/playback works. This doesn't have anything to do specifically with GStreamer, it's simply how the formats work. Mike ------------------------------------------------------------------------------ This SF.net Dev2Dev email is sponsored by: Show off your parallel programming skills. Enter the Intel(R) Threading Challenge 2010. 
http://p.sf.net/sfu/intel-thread-sfd _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From gabrbedd at gmail.com Wed Sep 8 04:50:24 2010 From: gabrbedd at gmail.com (Gabriel M. Beddingfield) Date: Tue, 7 Sep 2010 21:50:24 -0500 Subject: [gst-devel] Position info when doing time stretch In-Reply-To: <201009052059.48215.gabrbedd@gmail.com> References: <201009052059.48215.gabrbedd@gmail.com> Message-ID: <201009072150.25199.gabrbedd@gmail.com> Hello, On Sunday, September 05, 2010 08:59:47 pm Gabriel M. Beddingfield wrote: > > I'm doing time stretching app, and want the UI to show > position/length info in terms of the original media. > However, gst_element_query_position() and _duration() always > gives me playback times relative to the wall clock. This > time is affected by the playback speed. > > Does anyone know a way to handle this?[1] Anyone? :-) Just curious.... is my question hard to understand? Or is it a hard question to answer? Or... other? I don't really want to give up on gstreamer... but this looks like a dead end. Thanks, Gabriel From msmith at xiph.org Wed Sep 8 04:56:36 2010 From: msmith at xiph.org (Michael Smith) Date: Tue, 7 Sep 2010 19:56:36 -0700 Subject: [gst-devel] Position info when doing time stretch In-Reply-To: <201009072150.25199.gabrbedd@gmail.com> References: <201009052059.48215.gabrbedd@gmail.com> <201009072150.25199.gabrbedd@gmail.com> Message-ID: This works the way you want it to if you speed up/slow down playback using the seek API. You're instead using the pitch element, which doesn't do that. Since you're controlling the playback speed, you could track the corrections yourself. Mike On Tue, Sep 7, 2010 at 7:50 PM, Gabriel M. Beddingfield wrote: > > Hello, > > On Sunday, September 05, 2010 08:59:47 pm Gabriel M. 
Beddingfield wrote: >> >> I'm doing time stretching app, and want the UI to show >> position/length info in terms of the original media. >> However, gst_element_query_position() and _duration() always >> gives me playback times relative to the wall clock. This >> time is affected by the playback speed. >> >> Does anyone know a way to handle this?[1] > > Anyone? :-) > > Just curious.... is my question hard to understand? Or is it a hard question to answer? Or... other? > > I don't really want to give up on gstreamer... but this looks like a dead end. > > Thanks, > Gabriel > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From gabrbedd at gmail.com Wed Sep 8 05:09:33 2010 From: gabrbedd at gmail.com (Gabriel M. Beddingfield) Date: Tue, 7 Sep 2010 22:09:33 -0500 Subject: [gst-devel] Position info when doing time stretch In-Reply-To: References: <201009052059.48215.gabrbedd@gmail.com> <201009072150.25199.gabrbedd@gmail.com> Message-ID: <201009072209.33719.gabrbedd@gmail.com> Hi Michael, Thank you! On Tuesday, September 07, 2010 09:56:36 pm Michael Smith wrote: > This works the way you want it to if you speed up/slow down playback > using the seek API. > > You're instead using the pitch element, which doesn't do that. So, if I were to replace the pitch element with something similar... would I be able to gain control of that? > Since you're controlling the playback speed, you could track the > corrections yourself. ...except that the audio stream is buffered and passed through a vocoder, so I don't know at what time my change will take effect. Which is an OK estimation...
until I try to add A/B looping. -gabriel From 123sandy at gmail.com Wed Sep 8 05:37:05 2010 From: 123sandy at gmail.com (Sandeep Prakash) Date: Tue, 7 Sep 2010 20:37:05 -0700 (PDT) Subject: [gst-devel] Change Framerate of recorded stream In-Reply-To: <1283887179955-2530270.post@n4.nabble.com> References: <1282663765569-2336869.post@n4.nabble.com> <1282715182546-2337676.post@n4.nabble.com> <1282739031957-2338136.post@n4.nabble.com> <1282754317897-2338536.post@n4.nabble.com> <1282756968876-2338609.post@n4.nabble.com> <1283188714537-2400463.post@n4.nabble.com> <1283887179955-2530270.post@n4.nabble.com> Message-ID: <1283917025840-2530740.post@n4.nabble.com> Hi Wes, Wes Miller wrote: > > I do still have one problem. How do I change the playback rate? I tried > adding rate=0.02 and a rate=75, neither changed the playback rate. > The "rate" property of gnlfilesource is READONLY. I am assuming you are talking about the gnlfilesource element. You should be getting a "GLib-GObject-WARNING" for this. To actually change the rate you have to fiddle with the "duration" and "media-duration" properties. So for a faster playback media-duration should be greater than the duration and vice-versa for slow playback. Also you have to use identity element to adjust the timestamps. E.G.: Below pipeline is for faster playback: gst-launch gnlfilesource duration=10000000000000 media-duration=100000000000000 location=file:///home/user/test.mp4 ! identity single-segment=true ! ffmpegcolorspace ! queue ! autovideosink Regards, Sandeep Prakash http://sandeepprakash.homeip.net -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Change-Framerate-of-recorded-stream-tp2336869p2530740.html Sent from the GStreamer-devel mailing list archive at Nabble.com. 
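Sandeep's recipe above boils down to simple arithmetic: the effective playback rate of a gnlfilesource is media-duration divided by duration, so for a target rate you pick the two values accordingly (and set them on the element with g_object_set). A hedged sketch of that arithmetic in plain Python — the helper name is made up for illustration, only the duration/media-duration relationship comes from the thread:

```python
GST_SECOND = 1_000_000_000  # GStreamer times are expressed in nanoseconds


def gnl_durations(media_seconds, rate):
    """Return (duration, media_duration) in nanoseconds such that
    media_duration / duration == rate; rate > 1 means faster playback,
    because more media time is squeezed into a shorter timeline slot."""
    media_duration = int(media_seconds * GST_SECOND)
    duration = int(media_duration / rate)  # timeline slot shrinks as rate grows
    return duration, media_duration
```

For Sandeep's example values (duration=10000000000000, media-duration=100000000000000) this corresponds to a 100000-second source played back 10x faster.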
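Michael's suggestion in the time-stretch thread above ("you could track the corrections yourself") amounts to accumulating media time across rate changes: each wall-clock interval contributes interval times rate to the media position. A minimal sketch of that bookkeeping in plain Python — the function name and data layout are hypothetical, not GStreamer API; a real application would feed it positions obtained from gst_element_query_position():

```python
def media_position(rate_changes, playback_ns):
    """Map a wall-clock playback position to a position in the original
    media, given the history of tempo/rate changes the app has applied.

    rate_changes: [(playback_time_ns, rate), ...] sorted by time,
    with the first entry at playback time 0.
    """
    media = 0.0
    # Pair each rate segment with the start of the next one (or "now").
    bounds = rate_changes[1:] + [(playback_ns, None)]
    for (start, rate), (next_start, _) in zip(rate_changes, bounds):
        end = min(next_start, playback_ns)
        if end <= start:
            break
        media += (end - start) * rate  # media advances `rate` times faster
    return int(media)
```

This ignores the latency Gabriel mentions (the vocoder's internal buffering), which would appear as a roughly constant offset to subtract from the playback position before the mapping.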
From sukhbir.singh20 at gmail.com Wed Sep 8 09:08:37 2010 From: sukhbir.singh20 at gmail.com (newbie) Date: Wed, 8 Sep 2010 00:08:37 -0700 (PDT) Subject: [gst-devel] Receive DTMF using GStreamer Message-ID: <1283929717386-2530905.post@n4.nabble.com> Hi All, This is my first post in this mailing list. I have already searched the archive to find the answer to my question, but I did not find it. I need to know whether we can use GStreamer to receive DTMF from a peer. I have seen the plug-ins to send DTMF to a peer but was not able to find the plug-ins to receive DTMF. If it is possible to receive DTMF using GStreamer then please share the solution. Thanks -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Recieve-DTMF-using-GStreamer-tp2530905p2530905.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From wl2776 at gmail.com Wed Sep 8 09:09:29 2010 From: wl2776 at gmail.com (wl2776) Date: Wed, 8 Sep 2010 00:09:29 -0700 (PDT) Subject: [gst-devel] How can I know that playbin2 has created rtspsrc element, as soon as possible? Message-ID: <1283929769992-2530906.post@n4.nabble.com> I am developing a universal video player, which should be capable of playing RTSP streams as well. I use playbin2. I am also doing some coding on the RTSPSrc element to make it able to play Kasenna streams (it sends x-rtsp-mh messages instead of SDP). How can I know that playbin2 has created the rtspsrc element, as soon as possible? I need this because I want to set some properties on this element. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/How-can-I-know-that-playbin2-has-created-rtspsrc-element-as-soon-as-possible-tp2530906p2530906.html Sent from the GStreamer-devel mailing list archive at Nabble.com.
From olivier.crete at collabora.co.uk Wed Sep 8 09:26:47 2010 From: olivier.crete at collabora.co.uk (Olivier =?ISO-8859-1?Q?Cr=EAte?=) Date: Wed, 08 Sep 2010 10:26:47 +0300 Subject: [gst-devel] Receive DTMF using GStreamer In-Reply-To: <1283929717386-2530905.post@n4.nabble.com> References: <1283929717386-2530905.post@n4.nabble.com> Message-ID: <1283930807.2483.0.camel@TesterTop4> On Wed, 2010-09-08 at 00:08 -0700, newbie wrote: > This is my first post in this mailing list. I have already searched the > archive to find the answer to my question, but I did not find it. > I need to know whether we can use GStreamer to receive DTMF from a peer. I have > seen the plug-ins to send DTMF to a peer but was not able to find the plug-ins to > receive DTMF. > If it is possible to receive DTMF using GStreamer then please share the > solution. DTMF can arrive in two forms, sound or RTP events. You can receive RTP events with "rtpdtmfdepay". There is also "dtmfdetect" which will try to detect DTMF in the incoming sound (your mileage may vary). -- Olivier Crête olivier.crete at collabora.co.uk -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 198 bytes Desc: This is a digitally signed message part URL: From wim.taymans at gmail.com Wed Sep 8 09:29:10 2010 From: wim.taymans at gmail.com (Wim Taymans) Date: Wed, 08 Sep 2010 09:29:10 +0200 Subject: [gst-devel] Position info when doing time stretch In-Reply-To: <201009072150.25199.gabrbedd@gmail.com> References: <201009052059.48215.gabrbedd@gmail.com> <201009072150.25199.gabrbedd@gmail.com> Message-ID: <1283930950.6932.3.camel@metal> On Tue, 2010-09-07 at 21:50 -0500, Gabriel M. Beddingfield wrote: > Hello, > > On Sunday, September 05, 2010 08:59:47 pm Gabriel M. Beddingfield wrote: > > > > I'm doing time stretching app, and want the UI to show > > position/length info in terms of the original media.
> > However, gst_element_query_position() and _duration() always > > gives me playback times relative to the wall clock. This > > time is affected by the playback speed. > > > > Does anyone know a way to handle this?[1] > > Anyone? :-) > > Just curious.... is my question hard to understand? Or is it a hard question to answer? Or... other? > > I don't really want to give up on gstreamer... but this looks like a dead end. The element probably doesn't adjust the applied_rate in the segment events. This can be fixed if you file a bug. Wim > > Thanks, > Gabriel > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From t.i.m at zen.co.uk Wed Sep 8 10:46:26 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Wed, 08 Sep 2010 09:46:26 +0100 Subject: [gst-devel] How can I know that playbin2 has created rtspsrc element, as soon as possible? In-Reply-To: <1283929769992-2530906.post@n4.nabble.com> References: <1283929769992-2530906.post@n4.nabble.com> Message-ID: <1283935586.28578.1.camel@zingle> On Wed, 2010-09-08 at 00:09 -0700, wl2776 wrote: > I am developing a universal video player, which should be capable to play > RTSP streams also. > I use playbin2. > > I am also doing some coding to the RTSPSrc element to make it able to play > Kasenna streams (it sends x-rtsp-mh messages instead of SDP). > > How can I know that playbin2 has created rtspsrc element, as soon as > possible? > > I need this because I want to set some properties on this element. 
g_signal_connect (playbin2, "notify::source", G_CALLBACK (configure_source), NULL); will make sure your callback gets called as soon as the source element has been created. There you can then get the source element from playbin2 via g_object_get() and set properties on it as needed. Cheers -Tim From lfarkas at lfarkas.org Wed Sep 8 10:22:46 2010 From: lfarkas at lfarkas.org (Farkas Levente) Date: Wed, 08 Sep 2010 10:22:46 +0200 Subject: [gst-devel] videorate for encoded video Message-ID: <4C8747D6.5050505@lfarkas.org> hi, we'd like to make snapshots of an input encoded video stream. does anybody have any tip how can we do this? unfortunately videorate can only accept raw streams. so suppose we've got a h264 input source and like to save the stream and every 5 minutes a jpeg image, then this pipeline almost good: --------------------------- gst-launch -e rtspsrc location="rtsp://a-h264-url" ! rtph264depay access-unit=true ! h264parse ! tee name="t" ! queue ! matroskamux ! filesink location="test.mkv" t. ! queue ! ffdec_h264 ! videorate ! video/x-raw-yuv,framerate=1/300 ! ffmpegcolorspace ! jpegenc ! multifilesink location="frame%05d.jpg" --------------------------- but in this case we've to decode _all_ frames. and we'd like to save cpu and only decode the required frames. does anybody has any tips? thanks in advance. regards. -- Levente "Si vis pacem para bellum!" From oz9aec at gmail.com Wed Sep 8 11:13:47 2010 From: oz9aec at gmail.com (Alexandru Csete) Date: Wed, 8 Sep 2010 11:13:47 +0200 Subject: [gst-devel] videorate for encoded video In-Reply-To: <4C8747D6.5050505@lfarkas.org> References: <4C8747D6.5050505@lfarkas.org> Message-ID: On Wed, Sep 8, 2010 at 10:22 AM, Farkas Levente wrote: > hi, > we'd like to make snapshots of an input encoded video stream. > does anybody have any tip how can we do this? > unfortunately videorate can only accept raw streams. 
> so suppose we've got a h264 input source and like to save the stream and > every 5 minutes a jpeg image, then this pipeline almost good: > --------------------------- > gst-launch -e rtspsrc > location="rtsp://a-h264-url" ! rtph264depay access-unit=true ! h264parse > ! tee name="t" ! queue ! matroskamux ! filesink location="test.mkv" t. > ! queue ! ffdec_h264 ! videorate ! > video/x-raw-yuv,framerate=1/300 ! > ffmpegcolorspace ! jpegenc ! multifilesink location="frame%05d.jpg" > --------------------------- > but in this case we've to decode _all_ frames. and we'd like to save cpu > and only decode the required frames. > does anybody has any tips? Hi Levente, I don't think you can do that because h264 (and most other video codecs) rely heavily on inter frame compression. I guess one could do what you ask by only decoding the key frames, but that would probably require writing your own decoder. Alex From halley.zhao at intel.com Wed Sep 8 11:15:50 2010 From: halley.zhao at intel.com (Zhao, Halley) Date: Wed, 8 Sep 2010 17:15:50 +0800 Subject: [gst-devel] During playback (http progressive downloaded), is it possible to save media content to disk correctly ? In-Reply-To: <5D8008F58939784290FAB48F54975198278A379DC1@shsmsx502.ccr.corp.intel.com> References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> <5D8008F58939784290FAB48F54975198278A379DC1@shsmsx502.ccr.corp.intel.com> Message-ID: <5D8008F58939784290FAB48F54975198278A37A09C@shsmsx502.ccr.corp.intel.com> Based on some findings in the previous thread, I doubt the possibility to save the file on disk correctly during playback of progressive downloads. For the mp4 file mentioned below, some bytes differ from the original file at the beginning of test.mp4, and test.mp4 also loses hundreds of bytes at the end of the file. Since there are many cases where parsers don't read the file in serial order, the file saved from souphttpsrc in a gst pipeline often differs from the original file.
-----Original Message----- From: Zhao, Halley [mailto:halley.zhao at intel.com] Sent: 2010-09-08 10:21 To: Discussion of the development of GStreamer Subject: Re: [gst-devel] some issues when trying to save content to disk during http progressive downloaded Thanks. I understand that there are possible headers at the START or the END, however my question is about the saved contents, not playback itself during progressive downloaded. If I don't add a tee and filesink to save the content, all my mentioned contents can play back well during progressive download. Especially for the mp4 contents mentioned, files are different (even file size) between saving during playback and saving directly from souphttpsrc: gst-launch-0.10 souphttpsrc location=http:// 10.238.37.11/share/media/video/test.mp4 ! tee name=t ! decodebin ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 gst-launch-0.10 souphttpsrc location=http:// 10.238.37.11/share/media/video/test.mp4 ! filesink location=/home/halley/swap/streaming/test2.mp4 I suspect soup requests the header at the END of the mp4 file when playback starts, but this header isn't saved to file by filesink. -----Original Message----- From: Michael Smith [mailto:msmith at xiph.org] Sent: 2010-09-08 10:02 To: Discussion of the development of GStreamer Subject: Re: [gst-devel] some issues when trying to save content to disk during http progressive downloaded On Tue, Sep 7, 2010 at 6:45 PM, Zhao, Halley wrote: > During playback of progressive content, I tried to save the content to disk > as well. > > But the result is strange: > > Some contents are saved correctly, some contents are saved but can't > playback again; some contents even can't playback during progressive > downloaded. What you describe sounds like what's expected. True streaming formats (like ogg) work fine. Formats that may have the headers at the start OR the end vary - some have the headers at the end, so they're not playable until you've downloaded the entire file.
Some have them at the start, so progressive download/playback works. This doesn't have anything to do specifically with GStreamer, it's simply how the formats work. Mike ------------------------------------------------------------------------------ This SF.net Dev2Dev email is sponsored by: Show off your parallel programming skills. Enter the Intel(R) Threading Challenge 2010. http://p.sf.net/sfu/intel-thread-sfd _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel ------------------------------------------------------------------------------ This SF.net Dev2Dev email is sponsored by: Show off your parallel programming skills. Enter the Intel(R) Threading Challenge 2010. http://p.sf.net/sfu/intel-thread-sfd _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From lfarkas at lfarkas.org Wed Sep 8 11:24:18 2010 From: lfarkas at lfarkas.org (Farkas Levente) Date: Wed, 08 Sep 2010 11:24:18 +0200 Subject: [gst-devel] videorate for encoded video In-Reply-To: References: <4C8747D6.5050505@lfarkas.org> Message-ID: <4C875642.3000207@lfarkas.org> On 09/08/2010 11:13 AM, Alexandru Csete wrote: > On Wed, Sep 8, 2010 at 10:22 AM, Farkas Levente wrote: >> hi, >> we'd like to make snapshots of an input encoded video stream. >> does anybody have any tip how can we do this? >> unfortunately videorate can only accept raw streams. >> so suppose we've got a h264 input source and like to save the stream and >> every 5 minutes a jpeg image, then this pipeline almost good: >> --------------------------- >> gst-launch -e rtspsrc >> location="rtsp://a-h264-url" ! rtph264depay access-unit=true ! h264parse >> ! tee name="t" ! queue ! matroskamux ! filesink location="test.mkv" t. >> ! queue ! ffdec_h264 ! videorate ! 
>> video/x-raw-yuv,framerate=1/300 ! >> ffmpegcolorspace ! jpegenc ! multifilesink location="frame%05d.jpg" >> --------------------------- >> but in this case we've to decode _all_ frames. and we'd like to save cpu >> and only decode the required frames. >> does anybody has any tips? > > Hi Levente, > > I don't think you can do that because h264 (and most other video > codecs) rely heavily on inter frame compression. I guess one could do > what you ask by only decoding the key frames, but that would probably > require writing your own decoder. or simply decode only the required I-frame and a few P-frames instead of all I- and P-frames.... that's what I'm looking for... -- Levente "Si vis pacem para bellum!" From barthelemy at crans.org Wed Sep 8 11:35:48 2010 From: barthelemy at crans.org (=?ISO-8859-1?Q?S=E9bastien_Barth=E9lemy?=) Date: Wed, 8 Sep 2010 11:35:48 +0200 Subject: [gst-devel] passing through compilation and album artist metadata In-Reply-To: <1283888377.3561.41.camel@zingle> References: <1283853710.3561.39.camel@zingle> <1283888377.3561.41.camel@zingle> Message-ID: On Tue, Sep 7, 2010 at 9:39 PM, Tim-Philipp Müller wrote: > On Tue, 2010-09-07 at 21:22 +0200, Sébastien Barthélemy wrote: > >> Would you like me to file a bug report for this? > > Yes, please! This is now https://bugzilla.gnome.org/show_bug.cgi?id=629039 > If you could attach the beginning of the file in question > there as well, that'd be great > (head --bytes=999kB foo.mp3 > head.mp3 or so) I created a small audio file instead. Thank you for your help. Cheers Sebastien From wl2776 at gmail.com Wed Sep 8 11:52:03 2010 From: wl2776 at gmail.com (wl2776) Date: Wed, 8 Sep 2010 02:52:03 -0700 (PDT) Subject: [gst-devel] How can I know that playbin2 has created rtspsrc element, as soon as possible?
In-Reply-To: <1283935586.28578.1.camel@zingle> References: <1283929769992-2530906.post@n4.nabble.com> <1283935586.28578.1.camel@zingle> Message-ID: <1283939523908-2531059.post@n4.nabble.com> Tim-Philipp Müller-2 wrote: > >> How can I know that playbin2 has created rtspsrc element, as soon as >> possible? >> > g_signal_connect (playbin2, "notify::source", > G_CALLBACK (configure_source), NULL); > Great! Thanks! Till now I was experimenting with the element-added signal. Just for the information - how can I catch all "element-added" signals independent of their source? And more specifically - what is the GType of the GstBin? (wonder if this question is correct at all :)) ) I've found the g_signal_add_emission_hook() function. It requires signal-ID and signal-details Quark. I could get them from g_signal_parse_name, but its second parameter is "GType itype - The interface/instance type that introduced "signal-name". So, how should I fill this parameter? -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/How-can-I-know-that-playbin2-has-created-rtspsrc-element-as-soon-as-possible-tp2530906p2531059.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From t.i.m at zen.co.uk Wed Sep 8 12:25:42 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Wed, 08 Sep 2010 11:25:42 +0100 Subject: [gst-devel] How can I know that playbin2 has created rtspsrc element, as soon as possible? In-Reply-To: <1283939523908-2531059.post@n4.nabble.com> References: <1283929769992-2530906.post@n4.nabble.com> <1283935586.28578.1.camel@zingle> <1283939523908-2531059.post@n4.nabble.com> Message-ID: <1283941542.28578.6.camel@zingle> On Wed, 2010-09-08 at 02:52 -0700, wl2776 wrote: > Till now I was experimenting with the element-added signal. > Just for the information - how can I catch all "element-added" signals > independent of their source? What's wrong with g_signal_connect (obj, "element-added", ...) ?
Not sure if there's a good way to get element-added signals from grand-children though (maybe deep-notify::parent works, maybe not). > And more specifically - what is the GType of the GstBin? (wonder if this > question is correct at all :)) ) GST_TYPE_BIN is the GType of a generic bin. You shouldn't rely on the GType name of any element though, it sometimes changes. In the notify::source callback it's best to check if the source element has certain properties by name and then set them, and not check the GType. > I've found the g_signal_add_emission_hook() function. > It requires signal-ID and signal-details Quark. > I could get them from g_signal_parse_name, but its second parameter is > "GType itype - The interface/instance type that introduced "signal-name". > So, how should I fill this parameter? Not sure what you are asking here, or what the context is. Cheers -Tim From wl2776 at gmail.com Wed Sep 8 12:56:57 2010 From: wl2776 at gmail.com (wl2776) Date: Wed, 8 Sep 2010 03:56:57 -0700 (PDT) Subject: [gst-devel] How can I know that playbin2 has created rtspsrc element, as soon as possible? In-Reply-To: <1283941542.28578.6.camel@zingle> References: <1283929769992-2530906.post@n4.nabble.com> <1283935586.28578.1.camel@zingle> <1283939523908-2531059.post@n4.nabble.com> <1283941542.28578.6.camel@zingle> Message-ID: <1283943417517-2531112.post@n4.nabble.com> Tim-Philipp Müller-2 wrote: > >> Till now I was experimenting with the element-added signal. >> Just for the information - how can I catch all "element-added" signals >> independent of their source? > > What's wrong with g_signal_connect (obj, "element-added", ...) ? > > Not sure if there's a good way to get element-added signals from > grand-children though (maybe deep-notify::parent works, maybe not). > That was exactly that, rtspsrc is added to the bin somewhere in playbin2's depth. That said, I can only do g_signal_connect(m_player,"element-added",...) where m_player is my instance of the playbin2.
And all I'll receive is a notification, that the playbin2 has added uridecodebin, playsink and something else. I tried checking elements' names in the signal-added callback and connecting this callback to them, but it didn't work for some reason. BTW, is it correct? Can I connect the callback to the signals from the same callback? Is the code correct? static void element_added(GstBin *bin, GstElement *element, gpointer user_data) { .... if( < we have proper element >){ g_signal_connect(element,"element-added",GCallback(element_added),NULL); } ... } .... g_signal_connect(m_player,"element-added",GCallback(element_added),NULL); ... Tim-Philipp Müller-2 wrote: > >> I've found g_signal_add_emission_hook() function. >> It requires signal-ID and signal-details Quark. >> I could get them from g_signal_parse_name, but its second parameter is >> "GType itype - The interface/instance type that introduced "signal-name". >> So, how should I fill this parameter? > > Not sure what you are asking here, or what the context is. > This is a question about GLib and GObject, not GStreamer. I had problems making a call to g_signal_parse_name(), because I didn't know how to fill the second parameter. Not sure it is needed now, after I know about "notify::source". Just for information. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/How-can-I-know-that-playbin2-has-created-rtspsrc-element-as-soon-as-possible-tp2530906p2531112.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From lfarkas at lfarkas.org Wed Sep 8 14:12:49 2010 From: lfarkas at lfarkas.org (Farkas Levente) Date: Wed, 8 Sep 2010 14:12:49 +0200 Subject: [gst-devel] matroska seek was:Re: RELEASE: GStreamer Good Plug-ins 0.10.25 "Woe to You Oh Earth and Sea" Message-ID: On Fri, Sep 3, 2010 at 11:54, Tim-Philipp Müller wrote: > This mail announces the release of GStreamer Good Plug-ins 0.10.25 > "Woe to You Oh Earth and Sea".
> Highlights of this release: > > * streaming mode fixes for avi and matroska/webm > * seeking in matroska and webm files that don't have an index what does this exactly mean? suppose I'm currently writing a matroska file to filesink and during this I open it with a filesrc - then I can seek in it? unfortunately we can't do this when we test this feature. I see these tickets: https://bugzilla.gnome.org/show_bug.cgi?id=624455 https://bugzilla.gnome.org/show_bug.cgi?id=617368 but are these the relevant tickets? anyway I created a new ticket; when we test matroska seek it seg faults: https://bugzilla.gnome.org/show_bug.cgi?id=629047 -- Levente "Si vis pacem para bellum!" From wl2776 at gmail.com Wed Sep 8 17:31:38 2010 From: wl2776 at gmail.com (wl2776) Date: Wed, 8 Sep 2010 08:31:38 -0700 (PDT) Subject: [gst-devel] RTSPSrc. Please help me understand what's happening. Message-ID: <1283959898943-2531511.post@n4.nabble.com> I am developing a universal media player, capable also of playing RTSP streams. My player uses playbin2. The player has connected to the server, and the server streams media to it. Wireshark shows incoming RTP packets. They constitute a valid mpeg2 transport stream. rtspsrc also exchanges messages with the server (request GET_PARAMETER, response - OK) However, I don't see any pictures and cannot hear a sound in my headphones. The dot file with the pipeline shows that playbin2 has created a decodebin2 instance. It has found that incoming data have two elementary streams with the capabilities audio/mpeg, mpegversion: 2 and video/mpeg, mpegversion: 2 It has also created an mpeg2dec instance and connected its src pad with the proxy pad. Audio data processing has stopped on creation of the mpegaudioparse instance. And that's all.
The pipeline is the following: http://gstreamer-devel.966125.n4.nabble.com/file/n2531511/0.00.26.332028000-player.dot 0.00.26.332028000-player.dot (22kb) , produced png: http://gstreamer-devel.966125.n4.nabble.com/file/n2531511/0.00.26.332028000-player.png 0.00.26.332028000-player.png (450kb) -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/RTSPSrc-Please-help-me-understand-what-s-happening-tp2531511p2531511.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From gmane at colin.guthr.ie Wed Sep 8 19:02:40 2010 From: gmane at colin.guthr.ie (Colin Guthrie) Date: Wed, 08 Sep 2010 18:02:40 +0100 Subject: [gst-devel] Try using Ossink In-Reply-To: <1283856480837-2529484.post@n4.nabble.com> References: <1283856480837-2529484.post@n4.nabble.com> Message-ID: 'Twas brillig, and kumar at 07/09/10 11:48 did gyre and gimble: > > Developed an simple music player app using "osssink" but when any other > application like mozilla /firefox using osssink > my application is unable to use. Any solution to this problem You should probably not hard code your music player to use osssink, but rather a generic audio sink: autoaudiosink or gconfaudiosink so that osssink, alsasink or pulsesink will be picked automatically. Col -- Colin Guthrie gmane(at)colin.guthr.ie http://colin.guthr.ie/ Day Job: Tribalogic Limited [http://www.tribalogic.net/] Open Source: Mandriva Linux Contributor [http://www.mandriva.com/] PulseAudio Hacker [http://www.pulseaudio.org/] Trac Hacker [http://trac.edgewall.org/] From bilboed at gmail.com Wed Sep 8 21:23:27 2010 From: bilboed at gmail.com (Edward Hervey) Date: Wed, 08 Sep 2010 21:23:27 +0200 Subject: [gst-devel] ANNOUNCE: PiTiVi 0.13.4.2 pre-release Message-ID: <1283973807.4698.10.camel@deumeu> Hi all, The whole PiTiVi team is pleased to announce a pre-release of the PiTiVi video editor. This contains the past 6 months of usability improvements, speedups, and so forth (yes, you now have transitions). 
Unless anything critical or regressions pop up, expect the 0.13.5 release next wednesday (15th September). Please test it and abuse it and report bugs at: https://bugzilla.gnome.org/enter_bug.cgi?product=PiTiVi Tarballs are available here: http://ftp.gnome.org/pub/GNOME/sources/pitivi/0.13/ And expect updated ubuntu packages soon on the gstreamer developers PPA: https://edge.launchpad.net/~gstreamer-developers/+archive/ppa Thanks, Edward, on behalf of the PiTiVi team. From wl2776 at gmail.com Thu Sep 9 13:44:49 2010 From: wl2776 at gmail.com (wl2776) Date: Thu, 9 Sep 2010 04:44:49 -0700 (PDT) Subject: [gst-devel] ftp:// support in linux and in win32 Message-ID: <1284032689262-2532699.post@n4.nabble.com> I've studied gst-inspect giosrc both in linux and in windows (Latest OSSBuild). The comparison shows that linux giosrc supports much more uri schemes that windows one. Linux: $ gst-inspect giosrc ... URI handling capabilities: Element can act as source. Supported URI protocols: file ftp dns-sd afc smb network trash davs dav computer davs+sd dav+sd localtest burn obex archive gphoto2 sftp ssh ... Win32, using the latest OSSBuild: c:\> gst-inspect giosrc ... URI handling capabilities: Element can act as source. Supported URI protocols: file ... Where have all protocols gone? $GST_PLUGINS_BASE/ext/gio/gstgio.c has the function static gpointer _internal_get_supported_protocols (gpointer data) which calls g_vfs_get_supported_uri_schemes So, is such lack of schemes caused by the fact that this Glib instance supports only file:// ? -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/ftp-support-in-linux-and-in-win32-tp2532699p2532699.html Sent from the GStreamer-devel mailing list archive at Nabble.com. 
From otte at redhat.com Thu Sep 9 13:48:33 2010 From: otte at redhat.com (Benjamin Otte) Date: Thu, 09 Sep 2010 13:48:33 +0200 Subject: [gst-devel] ftp:// support in linux and in win32 In-Reply-To: <1284032689262-2532699.post@n4.nabble.com> References: <1284032689262-2532699.post@n4.nabble.com> Message-ID: <1284032913.6400.0.camel@localhost.localdomain> Yes. gio supports plugins, and on Linux, those plugins are usually provided by gvfs. Benjamin On Thu, 2010-09-09 at 04:44 -0700, wl2776 wrote: > I've studied gst-inspect giosrc both in linux and in windows (Latest > OSSBuild). > > The comparison shows that linux giosrc supports much more uri schemes that > windows one. > > Linux: > $ gst-inspect giosrc > ... > URI handling capabilities: > Element can act as source. > Supported URI protocols: > file > ftp > dns-sd > afc > smb > network > trash > davs > dav > computer > davs+sd > dav+sd > localtest > burn > obex > archive > gphoto2 > sftp > ssh > ... > > Win32, using the latest OSSBuild: > c:\> gst-inspect giosrc > ... > URI handling capabilities: > Element can act as source. > Supported URI protocols: > file > ... > > Where have all protocols gone? > > $GST_PLUGINS_BASE/ext/gio/gstgio.c has the function > static gpointer > _internal_get_supported_protocols (gpointer data) > > which calls g_vfs_get_supported_uri_schemes > > So, is such lack of schemes caused by the fact that this Glib instance > supports only file:// ? > From olivier.aubert at liris.cnrs.fr Thu Sep 9 14:12:15 2010 From: olivier.aubert at liris.cnrs.fr (Olivier Aubert) Date: Thu, 09 Sep 2010 14:12:15 +0200 Subject: [gst-devel] data-sink in rsvgoverlay Message-ID: <1284034335.3010.184.camel@abbeyroad.dnsalias.org> Hello I have implemented a data-sink to feed SVG data to the proposed new rsvgoverlay element. However, I still have one issue: the following pipeline: pygst-launch videotestsrc ! ffmpegcolorspace ! rsvgoverlay name=overlay ! ffmpegcolorspace ! 
xvimagesink filesrc blocksize=102400 location=/tmp/a.svg ! image/svg ! overlay.data_sink works only if I specify the blocksize= parameter to filesrc, so that the SVG file is sent as one chunk. It greatly simplifies the rsvgoverlay code to make this assumption (else I would have to gather chunks in the overlay code to build the whole data). However, the blocksize hack is not satisfactory. Is there any way to specify this (whole files wanted) in the pad definition? I am attaching the code to also get some feedback about problems that could still be there. Thanks for any hint. Olivier -------------- next part -------------- A non-text attachment was scrubbed... Name: gstrsvgoverlay.c Type: text/x-csrc Size: 10870 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: gstrsvgoverlay.h Type: text/x-chdr Size: 2247 bytes Desc: not available URL: From sebastian.droege at collabora.co.uk Thu Sep 9 14:37:39 2010 From: sebastian.droege at collabora.co.uk (Sebastian =?ISO-8859-1?Q?Dr=F6ge?=) Date: Thu, 09 Sep 2010 14:37:39 +0200 Subject: [gst-devel] data-sink in rsvgoverlay In-Reply-To: <1284034335.3010.184.camel@abbeyroad.dnsalias.org> References: <1284034335.3010.184.camel@abbeyroad.dnsalias.org> Message-ID: <1284035859.4080.418.camel@odin.lan> On Thu, 2010-09-09 at 14:12 +0200, Olivier Aubert wrote: > Hello > > I have implemented a data-sink to feed SVG data to the proposed new > rsvgoverlay element. However, I still have one issue: the following > pipeline: > > pygst-launch videotestsrc ! ffmpegcolorspace ! rsvgoverlay name=overlay ! ffmpegcolorspace ! xvimagesink filesrc blocksize=102400 location=/tmp/a.svg ! image/svg ! overlay.data_sink > > works only if I specify the blocksize= parameter to filesrc, so that the > SVG file is sent as one chunk. It greatly simplifies the rsvgoverlay > code to make this assumption (else I would have to gather chunks in the > overlay code to build the whole data). 
However, the blocksize hack is > not satisfactory. Is there any way to specify this (whole files wanted) > in the pad definition? I am attaching the code to also get some feedback > about problems that could still be there. You can simply wait and concatenate buffers until you get the EOS event and only then start the real processing. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 198 bytes Desc: This is a digitally signed message part URL: From olivier.aubert at liris.cnrs.fr Thu Sep 9 15:05:49 2010 From: olivier.aubert at liris.cnrs.fr (Olivier Aubert) Date: Thu, 09 Sep 2010 15:05:49 +0200 Subject: [gst-devel] data-sink in rsvgoverlay In-Reply-To: <1284035859.4080.418.camel@odin.lan> References: <1284034335.3010.184.camel@abbeyroad.dnsalias.org> <1284035859.4080.418.camel@odin.lan> Message-ID: <1284037549.3010.187.camel@abbeyroad.dnsalias.org> On Thu, 2010-09-09 at 14:37 +0200, Sebastian Dr?ge wrote: > On Thu, 2010-09-09 at 14:12 +0200, Olivier Aubert wrote: > > Hello > > > > I have implemented a data-sink to feed SVG data to the proposed new > > rsvgoverlay element. However, I still have one issue: the following > > pipeline: > > > > pygst-launch videotestsrc ! ffmpegcolorspace ! rsvgoverlay name=overlay ! ffmpegcolorspace ! xvimagesink filesrc blocksize=102400 location=/tmp/a.svg ! image/svg ! overlay.data_sink > > > > works only if I specify the blocksize= parameter to filesrc, so that the > > SVG file is sent as one chunk. It greatly simplifies the rsvgoverlay > > code to make this assumption (else I would have to gather chunks in the > > overlay code to build the whole data). However, the blocksize hack is > > not satisfactory. Is there any way to specify this (whole files wanted) > > in the pad definition? I am attaching the code to also get some feedback > > about problems that could still be there. 
> > You can simply wait and concatenate buffers until you get the EOS event > and only then start the real processing. Hello Sebastian. Thanks for the answer. Yes, it is not that complicated, but I would have hoped that the gstreamer infrastructure provided some infrastructure to do this, optimizing memory allocation and freeing, since it looks like a common thing in gstreamer elements. Is there no such thing? o. From sebastian.droege at collabora.co.uk Thu Sep 9 15:34:46 2010 From: sebastian.droege at collabora.co.uk (Sebastian =?ISO-8859-1?Q?Dr=F6ge?=) Date: Thu, 09 Sep 2010 15:34:46 +0200 Subject: [gst-devel] data-sink in rsvgoverlay In-Reply-To: <1284037549.3010.187.camel@abbeyroad.dnsalias.org> References: <1284034335.3010.184.camel@abbeyroad.dnsalias.org> <1284035859.4080.418.camel@odin.lan> <1284037549.3010.187.camel@abbeyroad.dnsalias.org> Message-ID: <1284039286.4080.419.camel@odin.lan> On Thu, 2010-09-09 at 15:05 +0200, Olivier Aubert wrote: > On Thu, 2010-09-09 at 14:37 +0200, Sebastian Dr?ge wrote: > > On Thu, 2010-09-09 at 14:12 +0200, Olivier Aubert wrote: > > > Hello > > > > > > I have implemented a data-sink to feed SVG data to the proposed new > > > rsvgoverlay element. However, I still have one issue: the following > > > pipeline: > > > > > > pygst-launch videotestsrc ! ffmpegcolorspace ! rsvgoverlay name=overlay ! ffmpegcolorspace ! xvimagesink filesrc blocksize=102400 location=/tmp/a.svg ! image/svg ! overlay.data_sink > > > > > > works only if I specify the blocksize= parameter to filesrc, so that the > > > SVG file is sent as one chunk. It greatly simplifies the rsvgoverlay > > > code to make this assumption (else I would have to gather chunks in the > > > overlay code to build the whole data). However, the blocksize hack is > > > not satisfactory. Is there any way to specify this (whole files wanted) > > > in the pad definition? I am attaching the code to also get some feedback > > > about problems that could still be there. 
> > > > You can simply wait and concatenate buffers until you get the EOS event > > and only then start the real processing. > Hello Sebastian. Thanks for the answer. Yes, it is not that complicated, > but I would have hoped that the gstreamer infrastructure provided some > infrastructure to do this, optimizing memory allocation and freeing, > since it looks like a common thing in gstreamer elements. Is there no > such thing? GstAdapter could help here. You push buffers into it and after EOS can get a single, large buffer from it http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-libs/html/GstAdapter.html -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 198 bytes Desc: This is a digitally signed message part URL: From gabrbedd at gmail.com Thu Sep 9 15:45:44 2010 From: gabrbedd at gmail.com (Gabriel M. Beddingfield) Date: Thu, 9 Sep 2010 08:45:44 -0500 (CDT) Subject: [gst-devel] Position info when doing time stretch In-Reply-To: <1283930950.6932.3.camel@metal> References: <201009052059.48215.gabrbedd@gmail.com> <201009072150.25199.gabrbedd@gmail.com> <1283930950.6932.3.camel@metal> Message-ID: Hi Wim, On Wed, 8 Sep 2010, Wim Taymans wrote: >>> I'm doing time stretching app, and want the UI to show >>> position/length info in terms of the original media. >>> However, gst_element_query_position() and _duration() always >>> gives me playback times relative to the wall clock. This >>> time is affected by the playback speed. [snip] > > The element probably doesn't adjust the applied_rate in the segment > events. This can be fixed if you file a bug. Thank you, that makes sense. I will file a bug. -gabriel From gabrbedd at gmail.com Thu Sep 9 15:46:22 2010 From: gabrbedd at gmail.com (Gabriel M. 
Beddingfield) Date: Thu, 9 Sep 2010 08:46:22 -0500 (CDT) Subject: [gst-devel] Position info when doing time stretch In-Reply-To: <1283930950.6932.3.camel@metal> References: <201009052059.48215.gabrbedd@gmail.com> <201009072150.25199.gabrbedd@gmail.com> <1283930950.6932.3.camel@metal> Message-ID: Hi Wim, On Wed, 8 Sep 2010, Wim Taymans wrote: >>> I'm doing time stretching app, and want the UI to show >>> position/length info in terms of the original media. >>> However, gst_element_query_position() and _duration() always >>> gives me playback times relative to the wall clock. This >>> time is affected by the playback speed. [snip] > > The element probably doesn't adjust the applied_rate in the segment > events. This can be fixed if you file a bug. Thank you, that makes sense. I will file a bug. -gabriel From olivier.aubert at liris.cnrs.fr Thu Sep 9 16:26:00 2010 From: olivier.aubert at liris.cnrs.fr (Olivier Aubert) Date: Thu, 09 Sep 2010 16:26:00 +0200 Subject: [gst-devel] data-sink in rsvgoverlay In-Reply-To: <1284039286.4080.419.camel@odin.lan> References: <1284034335.3010.184.camel@abbeyroad.dnsalias.org> <1284035859.4080.418.camel@odin.lan> <1284037549.3010.187.camel@abbeyroad.dnsalias.org> <1284039286.4080.419.camel@odin.lan> Message-ID: <1284042360.3010.195.camel@abbeyroad.dnsalias.org> On Thu, 2010-09-09 at 15:34 +0200, Sebastian Dr?ge wrote: > GstAdapter could help here. You push buffers into it and after EOS can > get a single, large buffer from it > > http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-libs/html/GstAdapter.html Yup, I found that out just after sending my e-mail by looking at some code. Sorry for the noise. Anyway, the data-sink property is now implemented, and the new code is attached to the bug https://bugzilla.gnome.org/show_bug.cgi?id=435120 Regards, Olivier From bertd at tplogic.com Thu Sep 9 17:15:48 2010 From: bertd at tplogic.com (Bert Douglas) Date: Thu, 9 Sep 2010 10:15:48 -0500 Subject: [gst-devel] videomixer2 ? 
Message-ID: I see a new element, videomixer2. How is this different from original videomixer? Is it faster? Thanks much, Bert Douglas -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastian.droege at collabora.co.uk Thu Sep 9 17:21:21 2010 From: sebastian.droege at collabora.co.uk (Sebastian =?ISO-8859-1?Q?Dr=F6ge?=) Date: Thu, 09 Sep 2010 17:21:21 +0200 Subject: [gst-devel] videomixer2 ? In-Reply-To: References: Message-ID: <1284045681.4080.421.camel@odin.lan> On Thu, 2010-09-09 at 10:15 -0500, Bert Douglas wrote: > I see a new element, videomixer2. > > How is this different from original videomixer? > Is it faster? See the commit message: > New features compared to old videomixer: > * Synchronizing frames on the running time > * Improved and simplified negotiation > * Full QoS support > * Variable framerate support Next comes special support for live streams. It might be a bit faster (in playback situations because of QoS) and because the negotiation is much simpler and faster now. -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 198 bytes Desc: This is a digitally signed message part URL: From mironoz at mail.ru Thu Sep 9 20:34:57 2010 From: mironoz at mail.ru (Miron Kunz) Date: Thu, 09 Sep 2010 22:34:57 +0400 Subject: [gst-devel] Release of ICEMission 1.2 with VP8 In-Reply-To: References: Message-ID: Dear Members, The 1.2 version of ICEMission communication tool is announced. Now with support of VP8 video compression format! By design it supports Global Directory based on Facebook/Gmail/Email users base enabling PC to PC calls using Facebook or Gmail accounts, XMPP or SIP. ICEMission is the first to take an environmental aspect into the focus while designing communication protocols so please visit http://www.icemission.com to challenge our point of view. Current release can be found at http://icemission.com/downloads. 
Release notes: # Media Engine: * GStreamer * Video Codec: VP8 (new!) * Audio Codec: Speex # Integration for automated call control: * Facebook * XMPP/Gmail/Jabber (new!) * SIP (new!) * Microsoft Outlook # Firewall/NAT traversal: * ICE with STUN/TURN Thank you, the ICEMission team From lfarkas at lfarkas.org Fri Sep 10 00:05:37 2010 From: lfarkas at lfarkas.org (Farkas Levente) Date: Fri, 10 Sep 2010 00:05:37 +0200 Subject: [gst-devel] videomixer2 ? In-Reply-To: <1284045681.4080.421.camel@odin.lan> References: <1284045681.4080.421.camel@odin.lan> Message-ID: my question, as always (e.g. queue2, playbin2 etc.): why not replace the old one? 2010/9/9 Sebastian Dröge : > On Thu, 2010-09-09 at 10:15 -0500, Bert Douglas wrote: >> I see a new element, videomixer2. >> >> How is this different from original videomixer? >> Is it faster? > > See the commit message: > >> New features compared to old videomixer: >> * Synchronizing frames on the running time >> * Improved and simplified negotiation >> * Full QoS support >> * Variable framerate support > > Next comes special support for live streams. > > It might be a bit faster (in playback situations because of QoS) > and because the negotiation is much simpler and faster now. > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Levente "Si vis pacem para bellum!" From t.i.m at zen.co.uk Fri Sep 10 00:28:18 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Thu, 09 Sep 2010 23:28:18 +0100 Subject: [gst-devel] videomixer2 ?
In-Reply-To: References: <1284045681.4080.421.camel@odin.lan> Message-ID: <1284071298.32175.5.camel@zingle> On Fri, 2010-09-10 at 00:05 +0200, Farkas Levente wrote: > my question as always (eq queue2, playbin2 etc) then why not replace > the old one? Because we'd like to keep the old elements working the way they used to, for backwards-compatibility and all that. This is often quite hard to do when you rewrite/redesign something from scratch. And even where it's possible, it might result in confusing API and/or confusing and hard to maintain code (some properties/signals/messages only applying to that mode, others only to that mode, others to both etc.). Much easier for everyone involved to just leave the old elements alone and do new ones with different behaviour and API. Cheers -Tim From bertd at tplogic.com Fri Sep 10 00:55:50 2010 From: bertd at tplogic.com (Bert Douglas) Date: Thu, 9 Sep 2010 17:55:50 -0500 Subject: [gst-devel] videomixer2 ? In-Reply-To: <1284071298.32175.5.camel@zingle> References: <1284045681.4080.421.camel@odin.lan> <1284071298.32175.5.camel@zingle> Message-ID: That sounds like good software engineering to me. Do not break things that are working. -- Bert Douglas On Thu, Sep 9, 2010 at 5:28 PM, Tim-Philipp M?ller wrote: > On Fri, 2010-09-10 at 00:05 +0200, Farkas Levente wrote: > > > my question as always (eq queue2, playbin2 etc) then why not replace > > the old one? > > Because we'd like to keep the old elements working the way they used to, > for backwards-compatibility and all that. This is often quite hard to do > when you rewrite/redesign something from scratch. And even where it's > possible, it might result in confusing API and/or confusing and hard to > maintain code (some properties/signals/messages only applying to that > mode, others only to that mode, others to both etc.). Much easier for > everyone involved to just leave the old elements alone and do new ones > with different behaviour and API. 
> > Cheers > -Tim > > > > > ------------------------------------------------------------------------------ > Automate Storage Tiering Simply > Optimize IT performance and efficiency through flexible, powerful, > automated storage tiering capabilities. View this brief to learn how > you can reduce costs and improve performance. > http://p.sf.net/sfu/dell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rmcouat at smartt.com Fri Sep 10 00:46:22 2010 From: rmcouat at smartt.com (Ron McOuat) Date: Thu, 09 Sep 2010 15:46:22 -0700 Subject: [gst-devel] videomixer2 ? In-Reply-To: References: <1284045681.4080.421.camel@odin.lan> Message-ID: <4C8963BE.50802@smartt.com> Last time I looked queue2 is missing the feature that allows the queue to maintain N seconds of buffer and then leak buffers as they pass that N second age. On 10-09-09 3:05 PM, Farkas Levente wrote: > my question as always (eq queue2, playbin2 etc) then why not replace > the old one? > > > 2010/9/9 Sebastian Dr?ge: >> On Thu, 2010-09-09 at 10:15 -0500, Bert Douglas wrote: >>> I see a new element, videomixer2. >>> >>> How is this different from original videomixer? >>> Is it faster? >> See the commit message: >> >>> New features compared to old videomixer: >>> * Synchronizing frames on the running time >>> * Improved and simplified negotiation >>> * Full QoS support >>> * Variable framerate support >> Next comes special support for live streams. >> >> It might be a bit faster (in playback situations because of QoS) >> and because the negotiation is much simpler and faster now. >> >> ------------------------------------------------------------------------------ >> This SF.net Dev2Dev email is sponsored by: >> >> Show off your parallel programming skills. 
>> Enter the Intel(R) Threading Challenge 2010. >> http://p.sf.net/sfu/intel-thread-sfd >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> >> > > From sebastian.droege at collabora.co.uk Fri Sep 10 08:17:20 2010 From: sebastian.droege at collabora.co.uk (Sebastian =?ISO-8859-1?Q?Dr=F6ge?=) Date: Fri, 10 Sep 2010 08:17:20 +0200 Subject: [gst-devel] videomixer2 ? In-Reply-To: <4C8963BE.50802@smartt.com> References: <1284045681.4080.421.camel@odin.lan> <4C8963BE.50802@smartt.com> Message-ID: <1284099440.4080.425.camel@odin.lan> On Thu, 2010-09-09 at 15:46 -0700, Ron McOuat wrote: > Last time I looked queue2 is missing the feature that allows the queue > to maintain N seconds of buffer and then leak buffers as they pass that > N second age. You don't really want queue to leak buffers anyway. That's just a bad hack for broken QoS handling IMHO -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 198 bytes Desc: This is a digitally signed message part URL: From rmcouat at smartt.com Fri Sep 10 08:40:34 2010 From: rmcouat at smartt.com (Ron McOuat) Date: Thu, 09 Sep 2010 23:40:34 -0700 Subject: [gst-devel] videomixer2 ? In-Reply-To: <1284099440.4080.425.camel@odin.lan> References: <1284045681.4080.421.camel@odin.lan> <4C8963BE.50802@smartt.com> <1284099440.4080.425.camel@odin.lan> Message-ID: <4C89D2E2.2080109@smartt.com> On 10-09-09 11:17 PM, Sebastian Dr?ge wrote: > On Thu, 2010-09-09 at 15:46 -0700, Ron McOuat wrote: >> Last time I looked queue2 is missing the feature that allows the queue >> to maintain N seconds of buffer and then leak buffers as they pass that >> N second age. > You don't really want queue to leak buffers anyway. 
That's just a bad > hack for broken QoS handling IMHO My use has nothing to do with QoS, I am using queue in buffer leak mode to implement a pre-trigger buffer downstream of a live source. When an event occurs I have N sec of video from before the event stored in the queue so when I unblock the src pad I record the video buffers stored in the queue followed by the real time data from the live source until the event ends at which time I again block the src pad. I don't want any more than N sec worth of pre-trigger, just a short period of time up to 10 seconds before the trigger point. If nothing interesting is happening possibly for hours the video can fall off the end of the queue. The queue element is perfect for this purpose. > > ------------------------------------------------------------------------------ > Automate Storage Tiering Simply > Optimize IT performance and efficiency through flexible, powerful, > automated storage tiering capabilities. View this brief to learn how > you can reduce costs and improve performance. > http://p.sf.net/sfu/dell-sfdev2dev > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From lfarkas at lfarkas.org Fri Sep 10 10:41:36 2010 From: lfarkas at lfarkas.org (Farkas Levente) Date: Fri, 10 Sep 2010 10:41:36 +0200 Subject: [gst-devel] videomixer2 ? In-Reply-To: <4C89D2E2.2080109@smartt.com> References: <1284045681.4080.421.camel@odin.lan> <4C8963BE.50802@smartt.com> <1284099440.4080.425.camel@odin.lan> <4C89D2E2.2080109@smartt.com> Message-ID: it'd be a nice feature to add to queue2 too.
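[Ron's pre-trigger scheme above can be modelled outside GStreamer as well. A minimal sketch (plain Python, hypothetical PreTriggerBuffer class, timestamps in seconds) of a time-limited leaky buffer that keeps only the last N seconds of data, which is essentially what queue's leaky mode combined with a max-size-time limit provides:]

```python
from collections import deque

class PreTriggerBuffer:
    """Keep only the last `window` seconds of (timestamp, data) pairs,
    leaking anything older from the front, like a leaky queue."""

    def __init__(self, window):
        self.window = window
        self.items = deque()

    def push(self, ts, data):
        self.items.append((ts, data))
        # Leak from the front once entries age past the window.
        while self.items and ts - self.items[0][0] > self.window:
            self.items.popleft()

    def flush(self):
        """On trigger: hand back the buffered pre-roll and start empty."""
        out = list(self.items)
        self.items.clear()
        return out
```

[When the trigger fires, flush() returns up to N seconds of pre-roll, after which live buffers can be recorded directly; until then old data simply falls off the end, no matter how long the pipeline runs.]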
On Fri, Sep 10, 2010 at 08:40, Ron McOuat wrote: > On 10-09-09 11:17 PM, Sebastian Dr?ge wrote: > > On Thu, 2010-09-09 at 15:46 -0700, Ron McOuat wrote: > > Last time I looked queue2 is missing the feature that allows the queue > to maintain N seconds of buffer and then leak buffers as they pass that > N second age. > > You don't really want queue to leak buffers anyway. That's just a bad > hack for broken QoS handling IMHO > > My use has nothing to do with QoS, I am using queue in buffer leak mode to > implement a pre-trigger buffer downstream of a live source. When an event > occurs I have N sec of video from before the event stored in the queue so > when I unblock the src pad I record the video buffers stored in the queue > followed by the real time data from the live source until the event ends at > which time I again block the src pad. I don't want any more than N sec worth > of pre-trigger, just a short period of time up to 10 seconds before the > trigger point. If nothing interesting is happening possibly for hours the > video can fall off the end of the queue. The queue element is perfect for > this purpose. > > > ------------------------------------------------------------------------------ > Automate Storage Tiering Simply > Optimize IT performance and efficiency through flexible, powerful, > automated storage tiering capabilities. View this brief to learn how > you can reduce costs and improve performance. > http://p.sf.net/sfu/dell-sfdev2dev > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > ------------------------------------------------------------------------------ > Automate Storage Tiering Simply > Optimize IT performance and efficiency through flexible, powerful, > automated storage tiering capabilities. View this brief to learn how > you can reduce costs and improve performance. 
> http://p.sf.net/sfu/dell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Levente "Si vis pacem para bellum!" From lgabriellp at gmail.com Fri Sep 10 16:57:47 2010 From: lgabriellp at gmail.com (Luiz Gabriel Lima Pinheiro) Date: Fri, 10 Sep 2010 11:57:47 -0300 Subject: [gst-devel] H.264 Video Conferencing Application Message-ID: Hi, I am looking for some video conferencing application made with gstreamer that supports H.264. Can anyone suggest one? Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From baldur at medizza.com Fri Sep 10 18:52:15 2010 From: baldur at medizza.com (Baldur Gislason) Date: Fri, 10 Sep 2010 16:52:15 +0000 Subject: [gst-devel] Transcoding and otherwise dealing with streams. Message-ID: I am trying to construct a C application that can either pick up audio/video from a file (mpeg transport stream) or receive mpeg transport stream on a UDP socket. Input format is MPEG2 A/V and output is H.264 with MPEG audio or AAC, transport stream multiplexing on output.
File input, network output would be the other scenario. Baldur Gislason From ensonic at hora-obscura.de Fri Sep 10 19:26:58 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Fri, 10 Sep 2010 20:26:58 +0300 Subject: [gst-devel] some issues when trying to save content to disk during http progressive downloaded In-Reply-To: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> Message-ID: <4C8A6A62.5000808@hora-obscura.de> Am 08.09.2010 04:45, schrieb Zhao, Halley: > During playback of progressive content, I tried to save the content to disk as well. > > But the result is strange: > > Some contents are saved correctly, some contents are saved but can?t playback > again; some contents even can?t playback during progressive downloaded. > > > > ## most ogg contents work well, the saved contents can playback again > > gst-launch-0.10 souphttpsrc > location=http://10.238.37.11/share/media/video/test.ogv ! tee name=t ! decodebin > ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.ogv > > > > ## some mp4 saved contents can?t playback again, the saved contents differ from > the original one; even the following test.mp4 and test2.mp4 are different > > gst-launch-0.10 souphttpsrc location=http:// > 10.238.37.11/share/media/video/test.mp4 ! tee name=t ! decodebin ! > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 > > gst-launch-0.10 souphttpsrc location=http:// > 10.238.37.11/share/media/video/test.mp4 ! filesink > location=/home/halley/swap/streaming/test2.mp4 > At first use decodebin2! If the http source is seekable, the muxer in decodebin will do pull. You could try: gst-launch-0.10 souphttpsrc location=http://10.238.37.11/share/media/video/test.mp4 ! queue ! tee name=t ! decodebin2 ! ffmpegcolorspace ! xvimagesink t. ! queue ! 
filesink location=test.mp4 Stefan > > > ## some wmv contents even can?t playback during progressive downloaded (though > some saved wmv contents can playback again) > > gst-launch-0.10 -v -v souphttpsrc location=http:// > 10.238.37.11/share/media/test.wmv ! tee name=t ! queue ! decodebin ! > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.wmv > > > > thanks in advance for your help. > > > > > > *ZHAO, Halley (Aihua)* > > Email: halley.zhao at intel.com > > Tel: +86(21)61166476 iNet: 8821-6476 > > SSG/OTC/Moblin 3W038 Pole: F4 > > > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From steffen.wulf at tuhh.de Fri Sep 10 20:55:58 2010 From: steffen.wulf at tuhh.de (Stwulf) Date: Fri, 10 Sep 2010 11:55:58 -0700 (PDT) Subject: [gst-devel] multiplex aac config data into resulting data stream (rtpmp4apay) Message-ID: <1284144958346-2534921.post@n4.nabble.com> Hi everybody, I use the rtpmp4apay element in order to stream AAC audio over the network. With the help of the according depay element I am able to listen to the music. But this works only if I set the config parameter within the caps of the udpsrc element. The special thing about my transmission is that it is supposed to be a unidirectional one. Due to different music input files it's hard to make any assumptions about the resulting string for the config parameter on the receiver side. According to RFC 3016 it is supposed to be possible to multiplex the config parameter (StreamMuxConfig element) into the resulting data stream. 
Now, my question is whether it is possible (and already supported by the backend) to set the config parameter to "1" within the rtpmp4apay element to induce the desired result? Is there any other way to achieve the multiplexing? If this should not be the case, does anybody have experience with the mapping between the config parameter and the StreamMuxConfig element? I'd very much appreciate a response. Thanks in advance! Steffen. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/multiplex-aac-config-data-into-resulting-data-stream-rtpmp4apay-tp2534921p2534921.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From nvineeth at gmail.com Sat Sep 11 09:13:20 2010 From: nvineeth at gmail.com (vineeth) Date: Sat, 11 Sep 2010 12:43:20 +0530 Subject: [gst-devel] qtdemux nal slice size Message-ID: Hi all, I was using the qtdemux to get H.264 NAL units. For a few mp4 files, the first 4 bytes correspond to NumBytesInNalUnit, and for a few others the first 2 bytes give the NAL unit size. Is there a way to determine whether the first 4 or the first 2 bytes give the size of the NAL unit? Also, is it possible to configure qtdemux to give NAL units with a NAL start-code prefix (Annex B of the 14496-10 standard)? Thanks. --vineeth -------------- next part -------------- An HTML attachment was scrubbed... URL: From bilboed at gmail.com Sat Sep 11 18:26:15 2010 From: bilboed at gmail.com (Edward Hervey) Date: Sat, 11 Sep 2010 18:26:15 +0200 Subject: [gst-devel] ANNOUNCE: PiTiVi 0.13.4.3 pre-release In-Reply-To: <1283973807.4698.10.camel@deumeu> References: <1283973807.4698.10.camel@deumeu> Message-ID: <1284222375.2463.2.camel@deumeu> Hi all, I just pushed a new pre-release which fixes some regressions and brings more translation updates. Tarballs/packages available at the usual location. We are still scheduled to do the release on Wednesday.
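[On vineeth's qtdemux question above: the length-field size is not guessable from the payload itself; it is declared in the MP4 file's avcC box (lengthSizeMinusOne + 1), which qtdemux exposes as the codec_data buffer on its caps. Converting the length-prefixed form to Annex B byte-stream form is then mechanical. A hedged sketch (plain Python, hypothetical helper, big-endian lengths as the container uses):]

```python
def to_annex_b(data, length_size):
    """Convert length-prefixed NAL units (length_size bytes per length
    field, big-endian, as qtdemux emits for MP4 files) into Annex B
    byte-stream form with 00 00 00 01 start codes.  length_size comes
    from the avcC box: lengthSizeMinusOne + 1, typically 2 or 4."""
    out = bytearray()
    pos = 0
    while pos + length_size <= len(data):
        n = int.from_bytes(data[pos:pos + length_size], "big")
        pos += length_size
        # Replace the length field with an Annex B start code.
        out += b"\x00\x00\x00\x01" + data[pos:pos + n]
        pos += n
    return bytes(out)
```

[For example, a 2-byte-length buffer 00 02 ab cd becomes 00 00 00 01 ab cd. The same loop handles the 4-byte case once length_size is read from codec_data instead of being guessed.]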
Edward On Wed, 2010-09-08 at 21:23 +0200, Edward Hervey wrote: > Hi all, > > The whole PiTiVi team is pleased to announce a pre-release of the > PiTiVi video editor. > > This contains the past 6 months of usability improvements, speedups, > and so forth (yes, you now have transitions). > > Unless anything critical or regressions pop up, expect the 0.13.5 > release next wednesday (15th September). > > Please test it and abuse it and report bugs at: > https://bugzilla.gnome.org/enter_bug.cgi?product=PiTiVi > > Tarballs are available here: > http://ftp.gnome.org/pub/GNOME/sources/pitivi/0.13/ > > And expect updated ubuntu packages soon on the gstreamer developers > PPA: > https://edge.launchpad.net/~gstreamer-developers/+archive/ppa > > Thanks, > > Edward, on behalf of the PiTiVi team. > From lichandler116 at gmail.com Sun Sep 12 08:01:22 2010 From: lichandler116 at gmail.com (Chandler Li) Date: Sun, 12 Sep 2010 14:01:22 +0800 Subject: [gst-devel] dynamic change element's parameter Message-ID: Hi all, I'm new to using gstreamer, and I have a problem I haven't known how to solve for a long time. Is it possible to dynamically change the parameters of elements or pads? For example, in a streaming pipeline in the playing state, could I dynamically change the size of the video without stopping the streaming? I hope that's not a stupid question, and I'm very hopeful for a reply. Thank you Chandler Lee From lists at svrinformatica.it Sun Sep 12 09:17:13 2010 From: lists at svrinformatica.it (Mailing List SVR) Date: Sun, 12 Sep 2010 09:17:13 +0200 Subject: [gst-devel] dynamic change element's parameter In-Reply-To: References: Message-ID: <1284275833.3133.17.camel@localhost.localdomain> On Sun, 12/09/2010 at 14.01 +0800, Chandler Li wrote: > Hi all, > I'm new to using gstreamer, > and I have a problem I haven't known how to solve for a long time, > > Is it possible to dynamically change the parameters of elements or pads?
> > For example, > in a streaming pipeline in the playing state, > could I dynamically change the size of the video without stopping the streaming? I think you can, following the directions here: http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-block.txt For your use case I'd try to insert a videoscale element in the running pipeline; be warned that if you are saving the stream to a file and you dynamically change the file destination, you'll have timestamp problems: https://bugzilla.gnome.org/show_bug.cgi?id=561224 hope this helps, Nicola > > I hope that's not a stupid question, > and I'm very hopeful for a reply, > Thank you > > Chandler Lee > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From t.i.m at zen.co.uk Sun Sep 12 12:00:24 2010 From: t.i.m at zen.co.uk (Tim-Philipp =?ISO-8859-1?Q?M=FCller?=) Date: Sun, 12 Sep 2010 11:00:24 +0100 Subject: [gst-devel] multiplex aac config data into resulting data stream (rtpmp4apay) In-Reply-To: <1284144958346-2534921.post@n4.nabble.com> References: <1284144958346-2534921.post@n4.nabble.com> Message-ID: <1284285624.16934.3.camel@zingle> On Fri, 2010-09-10 at 11:55 -0700, Stwulf wrote: Hi Steffen, > I use the rtpmp4apay element in order to stream AAC audio over the network. > With the help of the corresponding depay element I am able to listen to the > music. But this works only if I set the config parameter within the caps of > the udpsrc element. The special thing about my transmission is that it is > supposed to be a unidirectional one.
Due to different music input files it's > hard to make any assumptions about the resulting string for the config > parameter on the receiver side. > According to RFC 3016 it is supposed to be possible to multiplex the config > parameter (StreamMuxConfig element) into the resulting data stream. Now, my > question is wheter it is possible (and already supported by the backend) to > set the config parameter to "1" within the rtpmp4apay element to induce the > desired result? Is there any other way to achieve the multiplexing? This sounds like a good idea. Some other payloaders (e.g. rtpmp4vpay) have had a "config-interval" property added to them to achieve this. You could look at the relevant changesets and implement the same for rtpmp4apay. Patches are always very welcome (in bugzilla). If you don't have time to implement this yourself, feel free to file a feature request in bugzilla about this. Cheers -Tim From 123sandy at gmail.com Sun Sep 12 14:08:01 2010 From: 123sandy at gmail.com (Sandeep Prakash) Date: Sun, 12 Sep 2010 05:08:01 -0700 (PDT) Subject: [gst-devel] qtdemux nal slice size In-Reply-To: References: Message-ID: <1284293281669-2536285.post@n4.nabble.com> Hi Vineeth, vineeth wrote: > > For few mp4 files, the first 4 bytes correspond to NumBytesInNalUnit and > for few other the first 2 bytes tell the Nal Unit size. Is there a way to > determine if the first 4 or the first 2 bytes give the size on Nal unit? > In case of h264 streams qtdemux will give the decoder config info in the caps as "codec_data", as a GstBuffer. Note that this is not the bytestream format. This comprises of: <6 bytes avcC atom> + + <1 byte indicating Number of PPS present> + + + .... In the avcC atom the 2 Least Significant Bits of the 5th Byte corresponds to the (NumBytesInNalUnit -1). 
So to get the NumBytesInNalUnit (Length of Length of NAL): length_of_length = (<5th Byte of "codec_data"> & 0x3) + 1 vineeth wrote: > > Also, is it possible to configure qtdemux to give Nal units with NAL > prefix ( Annex-B of *14496-10* standard) > AFAIK configuration is not possible. Regards, Sandeep Prakash http://sandeepprakash.homeip.net -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/qtdemux-nal-slice-size-tp2535442p2536285.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From halley.zhao at intel.com Mon Sep 13 03:21:31 2010 From: halley.zhao at intel.com (Zhao, Halley) Date: Mon, 13 Sep 2010 09:21:31 +0800 Subject: [gst-devel] some issues when trying to save content to disk during http progressive downloaded In-Reply-To: <4C8A6A62.5000808@hora-obscura.de> References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> <4C8A6A62.5000808@hora-obscura.de> Message-ID: <5D8008F58939784290FAB48F549751982C55A6F5FF@shsmsx502.ccr.corp.intel.com> Thanks Stefan. After adding a 'queue' after 'souphttpsrc' and using 'decodebin2', I still got the same result. I think the possible solution is to enhance souphttpsrc to save content to disk after some refactoring, because souphttpsrc does some seeking following the commands of the parser. The attached mp4.log is the log of souphttpsrc; it seeks to the end of the mp4 file at the beginning of playback. Finally, the tail of the original mp4 file is missing in the downloaded mp4 file. halley at halley-lucid:~/swap/streaming/mp4$ ls -l total 5216 -rwxr--r-- 1 halley halley 1776915 2010-09-08 23:08 download.mp4 -rw-r--r-- 1 halley halley 1773281 2010-09-08 18:15 original.mp4 -----Original Message----- From: Stefan Kost [mailto:ensonic at hora-obscura.de] Sent: 2010-09-11
1:27 To: Discussion of the development of GStreamer Cc: Zhao, Halley Subject: Re: [gst-devel] some issues when trying to save content to disk during http progressive downloaded On 08.09.2010 04:45, Zhao, Halley wrote: > During playback of progressive content, I tried to save the content to disk as well. > > But the result is strange: > > Some contents are saved correctly, some contents are saved but can't play back > again; some contents even can't play back during progressive download. > > > > ## most ogg contents work well, the saved contents can play back again > > gst-launch-0.10 souphttpsrc > location=http://10.238.37.11/share/media/video/test.ogv ! tee name=t ! decodebin > ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.ogv > > > > ## some mp4 saved contents can't play back again, the saved contents differ from > the original one; even the following test.mp4 and test2.mp4 are different > > gst-launch-0.10 souphttpsrc location=http:// > 10.238.37.11/share/media/video/test.mp4 ! tee name=t ! decodebin ! > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 > > gst-launch-0.10 souphttpsrc location=http:// > 10.238.37.11/share/media/video/test.mp4 ! filesink > location=/home/halley/swap/streaming/test2.mp4 > At first use decodebin2! If the http source is seekable, the demuxer in decodebin will do pull. You could try: gst-launch-0.10 souphttpsrc location=http://10.238.37.11/share/media/video/test.mp4 ! queue ! tee name=t ! decodebin2 ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 Stefan > > > ## some wmv contents even can't play back during progressive download (though > some saved wmv contents can play back again) > > gst-launch-0.10 -v -v souphttpsrc location=http:// > 10.238.37.11/share/media/test.wmv ! tee name=t ! queue ! decodebin ! > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.wmv > > > > thanks in advance for your help.
> > > > > > *ZHAO, Halley (Aihua)* > > Email: halley.zhao at intel.com > > Tel: +86(21)61166476 iNet: 8821-6476 > > SSG/OTC/Moblin 3W038 Pole: F4 > > > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel -------------- next part -------------- A non-text attachment was scrubbed... Name: mp4.log Type: application/octet-stream Size: 193764 bytes Desc: mp4.log URL: From rpavithra.87 at gmail.com Mon Sep 13 07:17:30 2010 From: rpavithra.87 at gmail.com (rpavithra.87) Date: Sun, 12 Sep 2010 22:17:30 -0700 (PDT) Subject: [gst-devel] AV Sync for demuxed data Message-ID: <1284355050614-2536913.post@n4.nabble.com> I am trying to test a decoder element. I am dumping the demuxed data (data, size and timestamps). After that I am feeding the demuxed data to the following pipeline containing two bins as described below. Appsrc - Video dec - Video sink Appsrc - Audio dec - Audio sink It plays well for two or three seconds with audio and video in sync.
But after that it throws the following error messages: 0:00:05.834803752 29674 0x8b9c20 WARN baseaudiosink gstbaseaudiosink.c:1518:gst_base_audio_sink_render: warning: Compensating for audio synchronisation problems 0:00:05.834846589 29674 0x8b9c20 WARN baseaudiosink gstbaseaudiosink.c:1518:gst_base_audio_sink_render: warning: Unexpected discontinuity in audio timestamps of more than half a second (0:00:04.294979166), resyncing and the actual timestamp is (0:00:00.001032704) 0:00:05.836275768 29674 0x8b9c20 WARN baseaudiosink gstbaseaudiosink.c:1518:gst_base_audio_sink_render: warning: Compensating for audio synchronisation problems 0:00:05.836289732 29674 0x8b9c20 WARN baseaudiosink gstbaseaudiosink.c:1518:gst_base_audio_sink_render: warning: Unexpected discontinuity in audio timestamps of more than half a second (0:00:04.294958333), resyncing and the actual timestamp is (0:00:00.002065408) and the video decoder also starts dropping the frames. ffmpeg gstffmpegdec.c:1966:gst_ffmpegdec_video_frame: Dropping non-keyframe (seek/init) Without timestamps all the video frames and audio samples are played out, but without sync. Do I need to do something extra in order to achieve AV sync? Please help me. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/AV-Sync-for-demuxed-data-tp2536913p2536913.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From 123sandy at gmail.com Mon Sep 13 08:30:53 2010 From: 123sandy at gmail.com (Sandeep Prakash) Date: Sun, 12 Sep 2010 23:30:53 -0700 (PDT) Subject: [gst-devel] AV Sync for demuxed data In-Reply-To: <1284355050614-2536913.post@n4.nabble.com> References: <1284355050614-2536913.post@n4.nabble.com> Message-ID: <1284359453004-2536950.post@n4.nabble.com> Hi, Make sure that: 1. The timestamps from the demuxed data are properly injected into the pipeline.
Also, the framerate and sampling rates have to be set on the caps of the respective appsrc so that they're propagated properly to the respective sinks. 2. You are feeding full frames into appsrc. If you are not sure, it's better to use the respective elementary stream parsers before each decoder so that the decoder will have a frame of data to decode: "h264parse" in case of h264 streams, "aacparse" to parse aac streams. Your pipeline will look like: Appsrc - Elementary Stream Parser - Video dec - Video sink Appsrc - Elementary Stream Parser - Audio dec - Audio sink Regards, Sandeep Prakash http://sandeepprakash.homeip.net -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/AV-Sync-for-demuxed-data-tp2536913p2536950.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From rpavithra.87 at gmail.com Mon Sep 13 08:42:07 2010 From: rpavithra.87 at gmail.com (rpavithra.87) Date: Sun, 12 Sep 2010 23:42:07 -0700 (PDT) Subject: [gst-devel] AV Sync for demuxed data In-Reply-To: <1284359453004-2536950.post@n4.nabble.com> References: <1284355050614-2536913.post@n4.nabble.com> <1284359453004-2536950.post@n4.nabble.com> Message-ID: <1284360127869-2536964.post@n4.nabble.com> I compared the timestamps with the timestamps passed in a standard gstreamer pipeline (playbin). The timestamps are correct. And I am setting the caps using a capsfilter which is present in between appsrc and the decoders. Actually, I dumped the buffer size, buffer data, codec private data for h264 and timestamps in separate files. Whenever the need-data callback comes, I read the size from the size file and, depending on the size, I read the data from the data file and feed it to the appsrc. When played without timestamps, all the frames and samples are played out. But only when I feed the timestamps does this problem occur. And the same file plays well when I use a playbin, so I suspect the demuxer does something extra.
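As an aside on Sandeep's first point: once a framerate is on the caps, the timestamp and duration each appsrc buffer needs are simple arithmetic in GStreamer's nanosecond clock. A minimal sketch of that bookkeeping (the helper name is invented for illustration; real code would set these values on each GstBuffer before pushing it into appsrc):

```python
GST_SECOND = 1_000_000_000  # GStreamer timestamps are in nanoseconds


def frame_timestamps(n_frames, fps_num, fps_den):
    """(timestamp, duration) pairs in ns for frames fed into an appsrc.

    Each frame lasts fps_den/fps_num seconds; frame i starts at i times
    that duration. Integer nanosecond arithmetic, like GStreamer itself.
    """
    dur = GST_SECOND * fps_den // fps_num
    return [(i * dur, dur) for i in range(n_frames)]


# 25 fps -> 40 ms per frame
assert frame_timestamps(3, 25, 1) == [
    (0, 40_000_000),
    (40_000_000, 40_000_000),
    (80_000_000, 40_000_000),
]
```

If the values actually pushed into appsrc drift away from this grid, baseaudiosink reports exactly the kind of "Unexpected discontinuity in audio timestamps" warnings quoted earlier in the thread.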
-- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/AV-Sync-for-demuxed-data-tp2536913p2536964.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From nvineeth at gmail.com Mon Sep 13 08:58:55 2010 From: nvineeth at gmail.com (vineeth) Date: Mon, 13 Sep 2010 12:28:55 +0530 Subject: [gst-devel] qtdemux nal slice size In-Reply-To: <1284293281669-2536285.post@n4.nabble.com> References: <1284293281669-2536285.post@n4.nabble.com> Message-ID: Hi , Thanks for the kind reply, I could more info from class AVCDecoderConfigurationRecord of 14496-part 15, thanks to your pointers. --vineeth On Sun, Sep 12, 2010 at 5:38 PM, Sandeep Prakash <123sandy at gmail.com> wrote: > > Hi Vineeth, > > > vineeth wrote: > > > > For few mp4 files, the first 4 bytes correspond to NumBytesInNalUnit > and > > for few other the first 2 bytes tell the Nal Unit size. Is there a way to > > determine if the first 4 or the first 2 bytes give the size on Nal unit? > > > In case of h264 streams qtdemux will give the decoder config info in the > caps as > "codec_data", as a GstBuffer. Note that this is not the bytestream format. > This comprises of: > <6 bytes avcC atom> + + <1 byte indicating Number of PPS present> + > + + .... > > In the avcC atom the 2 Least Significant Bits of the 5th Byte corresponds > to > the > (NumBytesInNalUnit -1). So to get the NumBytesInNalUnit (Length of Length > of > NAL): > length_of_length = (<5th Byte of "codec_data"> & 0x3) + 1 > > > vineeth wrote: > > > > Also, is it possible to configure qtdemux to give Nal units with NAL > > prefix ( Annex-B of *14496-10* standard) > > > AFAIK configuration is not possible. > > > Regards, > Sandeep Prakash > http://sandeepprakash.homeip.net > > -- > View this message in context: > http://gstreamer-devel.966125.n4.nabble.com/qtdemux-nal-slice-size-tp2535442p2536285.html > Sent from the GStreamer-devel mailing list archive at Nabble.com. 
> > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rpavithra.87 at gmail.com Mon Sep 13 14:47:54 2010 From: rpavithra.87 at gmail.com (rpavithra.87) Date: Mon, 13 Sep 2010 05:47:54 -0700 (PDT) Subject: [gst-devel] AV Sync for demuxed data In-Reply-To: <1284360127869-2536964.post@n4.nabble.com> References: <1284355050614-2536913.post@n4.nabble.com> <1284359453004-2536950.post@n4.nabble.com> <1284360127869-2536964.post@n4.nabble.com> Message-ID: <1284382074038-2537355.post@n4.nabble.com> Hi sandeep, Thanks for the information. I tried using the parsers infront of the decoders. Actually in case of audio some more samples were played out(i hope this is because video gets struck in the middle. otherwise all the samples would have been played out). But if i use the h264parse element the application hangs. How to feed the codec private data to h264parse? Currently i am having a caps filter in between decoder and appsrc. And i am setting the codec private data as part of the caps. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/AV-Sync-for-demuxed-data-tp2536913p2537355.html Sent from the GStreamer-devel mailing list archive at Nabble.com. 
From 123sandy at gmail.com Mon Sep 13 17:02:55 2010 From: 123sandy at gmail.com (Sandeep Prakash) Date: Mon, 13 Sep 2010 08:02:55 -0700 (PDT) Subject: [gst-devel] AV Sync for demuxed data In-Reply-To: <1284382074038-2537355.post@n4.nabble.com> References: <1284355050614-2536913.post@n4.nabble.com> <1284359453004-2536950.post@n4.nabble.com> <1284360127869-2536964.post@n4.nabble.com> <1284382074038-2537355.post@n4.nabble.com> Message-ID: <1284390175812-2537595.post@n4.nabble.com> Hi, rpavithra.87 wrote: > > How to feed the codec private data to h264parse? Currently I am having a > caps filter in between decoder and appsrc. And I am setting the codec > private data as part of the caps. > h264parse expects the codec private data as "codec_data", which should be part of the caps. Unless the stream is a bytestream, in which case the codec_data might be part of the 1st frame, codec_data is compulsory as part of the caps. Please refer to any of the standard demuxers/parsers on how they construct this codec_data. There is a format that needs to be followed for codec_data (in the h264 case). You can start off here: http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/gst/h264parse/gsth264parse.c#n1115 Regards, Sandeep Prakash http://sandeepprakash.homeip.net -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/AV-Sync-for-demuxed-data-tp2536913p2537595.html Sent from the GStreamer-devel mailing list archive at Nabble.com.
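The avcC byte-twiddling Sandeep describes earlier in this thread (length_of_length = (<5th Byte of "codec_data"> & 0x3) + 1) can be sketched as a few lines of parsing; the sample byte strings below are illustrative values, not taken from a real stream:

```python
def nal_length_size(codec_data):
    """NAL length-field size (1-4 bytes) from an avcC "codec_data" blob.

    The two least-significant bits of the 5th byte (index 4) hold
    lengthSizeMinusOne, so the NAL length field size is that value + 1.
    """
    if len(codec_data) < 5:
        raise ValueError("codec_data too short for an avcC header")
    return (codec_data[4] & 0x3) + 1


# Illustrative avcC prefix: configurationVersion=1, profile/compat/level,
# then 0xFF = reserved bits set, lengthSizeMinusOne=3 -> 4-byte lengths.
assert nal_length_size(bytes([0x01, 0x64, 0x00, 0x1F, 0xFF])) == 4
# 0xFD -> lengthSizeMinusOne=1 -> 2-byte lengths.
assert nal_length_size(bytes([0x01, 0x42, 0x00, 0x1E, 0xFD])) == 2
```

This is also the answer to vineeth's earlier question in the qtdemux thread: the length field is 4 bytes for some files and 2 for others precisely because this avcC field varies per file.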
From nico at inattendu.org Mon Sep 13 17:09:09 2010 From: nico at inattendu.org (Nicolas Bertrand) Date: Mon, 13 Sep 2010 19:09:09 +0400 Subject: [gst-devel] Looking for some advices for playing sequence of images with a sound track In-Reply-To: <1283857052046-2529498.post@n4.nabble.com> References: <4C8613B2.4060005@inattendu.org> <1283857052046-2529498.post@n4.nabble.com> Message-ID: <4C8E3E95.7070306@inattendu.org> On 07/09/2010 14:57, wl2776 wrote: > Nicolas Bertrand-4 wrote: > >> > >> > I use gstreamer for making snapshot and play the movie. >> > I want to add the possibility of playing a music or sound in the same >> > time the video is playing. >> > The input of the 'play' function is a sequence of images. >> > >> > I'd suggest using imagefreeze element to play images. > Hi I have some problems using imagefreeze with a sequence of images. Is it possible with imagefreeze to play a sequence of images at a given framerate? For example, play a sequence of images as a movie: image_001.png image_002.png ... image_010.png at a framerate of 1/25 s per frame, for example? I'm looking at turning the image sequence into a video to allow implementation of seek/position functions. Cheers Nico From ensonic at hora-obscura.de Mon Sep 13 23:14:31 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Tue, 14 Sep 2010 00:14:31 +0300 Subject: [gst-devel] dynamic change element's parameter In-Reply-To: References: Message-ID: <4C8E9437.9050309@hora-obscura.de> On 12.09.2010 09:01, Chandler Li wrote: > Hi all, > I'm new to using gstreamer, > and I have a problem I haven't known how to solve for a long time, > > Is it possible to dynamically change the parameters of elements or pads? gobject parameters -> g_object_set or use GstController > > For example, > in a streaming pipeline in the playing state, > could I dynamically change the size of the video without stopping the streaming? Yes you can. If you resize the video window for xvimagesink the video adjusts if it can. So if you run gst-launch videotestsrc !
xvimagesink, the videotest is sending videoframes in the native resolution instead of xvimagesink scaling them. Stefan > > hopes that's not a stupid question, > and very hopefully for your reply, > Thank you > > Chandler Lee > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From halley.zhao at intel.com Tue Sep 14 04:40:54 2010 From: halley.zhao at intel.com (Zhao, Halley) Date: Tue, 14 Sep 2010 10:40:54 +0800 Subject: [gst-devel] H.264 Video Conferencing Application In-Reply-To: References: Message-ID: <5D8008F58939784290FAB48F549751982C55A6FBA0@shsmsx502.ccr.corp.intel.com> Empathy + Farsight2 ________________________________ From: Luiz Gabriel Lima Pinheiro [mailto:lgabriellp at gmail.com] Sent: 2010?9?10? 22:58 To: gstreamer-devel at lists.sourceforge.net Subject: [gst-devel] H.264 Video Conferencing Application Hi, I am looking for some video conferencing application made with gstreamer that supports H.264. Can anyone suggest me one? Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From lichandler116 at gmail.com Tue Sep 14 09:48:14 2010 From: lichandler116 at gmail.com (Chandler Li) Date: Tue, 14 Sep 2010 15:48:14 +0800 Subject: [gst-devel] dynamic change element's parameter In-Reply-To: <4C8E9437.9050309@hora-obscura.de> References: <4C8E9437.9050309@hora-obscura.de> Message-ID: I have read the mail Nicola and Stefan given, and that really inspired me a lot. Thank you! In the first, I try to replace some elements and encounter some problem, the following I will show my target first. 
Because I want to stream webcam frame on the internet, and dynamically change the frame size without stopping the pipeline. According to my experience, if the v4l2src element opens webcam device, it only captures one size at one time. if I want to change the size of webcam, I need to restart it. So I try another architecture of gstreamer, v4l2src -> capsfilter1 -> videoscale -> capsfilter2 -> ffmpegcolorspace -> .... (to internet) videoscale links two filters, the first filter (capsfilter1) negotiates with v4l2src in a static frame size, I want to change the size I assigned in the second filter (capsfilter2) , BUT I get an error message after I change the capsfilter2 to the new filter, videoscale can't link to new capsfilter, I don't know what's happened? does anyone know that? Thank you! Best regards, Chandler Lee. 2010/9/14 Stefan Kost : > Am 12.09.2010 09:01, schrieb Chandler Li: >> Hi all, >> ? ? I'm new to use gstreamer, >> ? ? and I have a problem don't know how to solve for a long time, >> >> ? ? Is there possible to dynamic change the parameter in elements or pads? > > gobject parameters -> g_object_set or use GstController > > >> >> ? ? For example, >> ? ? In a playing state streaming, >> ? ? could I dynamic change the size of video without stop the streaming? > > yes you can. if you resize the video window for xvimagesink the video adjusts if > it can. So if you run gst-launch videotestsrc ! xvimagesink, the videotest is > sending videoframes in the native resolution instead of xvimagesink scaling them. > > Stefan > >> >> ? ? hopes that's not a stupid question, >> ? ? and very hopefully for your reply, >> ? ? Thank you >> >> ? ? 
Chandler Lee >> >> ------------------------------------------------------------------------------ >> Start uncovering the many advantages of virtual appliances >> and start using them to simplify application deployment and >> accelerate your shift to cloud computing >> http://p.sf.net/sfu/novell-sfdev2dev >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > From wl2776 at gmail.com Tue Sep 14 11:02:52 2010 From: wl2776 at gmail.com (wl2776) Date: Tue, 14 Sep 2010 02:02:52 -0700 (PDT) Subject: [gst-devel] How can I add RTSP media mapping to the gst-rtsp-server in Python? Message-ID: <1284454972893-2538670.post@n4.nabble.com> I've created a server instance, but cannot add media-mapping to it. Are there any examples of using gst-rtsp-server in Python? import gobject gobject.threads_init() import sys import pygst pygst.require('0.10') import os from gst import rtspserver class myrtspserver: def __init__(self): self.server=rtspserver.Server() self.server.props.media_mapping.set_property('/test', '( videotestsrc is-live=1 ! x264enc ! rtph264pay name=pay0 pt=96 )') The last line of the code above gives the error TypeError: object of type `GstRTSPMediaMapping' does not have property `/test' If I replace set_property with set_data, it doesn't give errors, but logs show that the server cannot find the media mapping when processing connection. Tried also set_data -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/How-can-I-add-RTSP-media-mapping-to-the-gst-rtsp-server-in-Python-tp2538670p2538670.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From ensonic at hora-obscura.de Tue Sep 14 11:20:39 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Tue, 14 Sep 2010 12:20:39 +0300 Subject: [gst-devel] videomixer2 ? 
In-Reply-To: <1284099440.4080.425.camel@odin.lan> References: <1284045681.4080.421.camel@odin.lan> <4C8963BE.50802@smartt.com> <1284099440.4080.425.camel@odin.lan> Message-ID: <4C8F3E67.6050301@hora-obscura.de> On 10.09.2010 09:17, Sebastian Dr?ge wrote: > On Thu, 2010-09-09 at 15:46 -0700, Ron McOuat wrote: > >> Last time I looked queue2 is missing the feature that allows the queue >> to maintain N seconds of buffer and then leak buffers as they pass that >> N second age. >> > You don't really want queue to leak buffers anyway. That's just a bad > hack for broken QoS handling IMHO > No, its also quite useful for e.g. using it as a valve. Or e.g. in this scenario: .. ! tee ! queue ! audiosink t. ! queue leaky ! analyser ! fakesink Stefan > > ------------------------------------------------------------------------------ > Automate Storage Tiering Simply > Optimize IT performance and efficiency through flexible, powerful, > automated storage tiering capabilities. View this brief to learn how > you can reduce costs and improve performance. > http://p.sf.net/sfu/dell-sfdev2dev > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From ensonic at hora-obscura.de Tue Sep 14 11:25:41 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Tue, 14 Sep 2010 12:25:41 +0300 Subject: [gst-devel] dynamic change element's parameter In-Reply-To: References: <4C8E9437.9050309@hora-obscura.de> Message-ID: <4C8F3F95.1050403@hora-obscura.de> On 14.09.2010 10:48, Chandler Li wrote: > I have read the mail Nicola and Stefan given, and that really inspired > me a lot. > Thank you! > In the first, I try to replace some elements and encounter some > problem, the following I will show my target first. > Because I want to stream webcam frame on the internet, > and dynamically change the frame size without stopping the pipeline. 
> > According to my experience, if the v4l2src element opens webcam > device, it only captures one size at one time. > if I want to change the size of webcam, I need to restart it. > v4l2src does not support changing the resolution on the fly. This is a v4l2 interface limitation right now - one needs to stop streaming, set new format and restart streaming. So your observation is correct. > So I try another architecture of gstreamer, > > v4l2src -> capsfilter1 -> videoscale -> capsfilter2 -> > ffmpegcolorspace -> .... (to internet) > > videoscale links two filters, the first filter (capsfilter1) > negotiates with v4l2src in a static frame size, > I want to change the size I assigned in the second filter (capsfilter2) , > > BUT I get an error message after I change the capsfilter2 to the new filter, > > videoscale can't link to new capsfilter, > that should work. Can you paste a few lines of the code here. You should create new caps and just set the new format on capsfilter2. Stefan > I don't know what's happened? does anyone know that? > Thank you! > > Best regards, > Chandler Lee. > > 2010/9/14 Stefan Kost : > >> Am 12.09.2010 09:01, schrieb Chandler Li: >> >>> Hi all, >>> I'm new to use gstreamer, >>> and I have a problem don't know how to solve for a long time, >>> >>> Is there possible to dynamic change the parameter in elements or pads? >>> >> gobject parameters -> g_object_set or use GstController >> >> >> >>> For example, >>> In a playing state streaming, >>> could I dynamic change the size of video without stop the streaming? >>> >> yes you can. if you resize the video window for xvimagesink the video adjusts if >> it can. So if you run gst-launch videotestsrc ! xvimagesink, the videotest is >> sending videoframes in the native resolution instead of xvimagesink scaling them. 
>> >> Stefan >> >> >>> I hope that's not a stupid question, >>> and I'm very hopeful for a reply, >>> Thank you >>> >>> Chandler Lee >>> >>> ------------------------------------------------------------------------------ >>> Start uncovering the many advantages of virtual appliances >>> and start using them to simplify application deployment and >>> accelerate your shift to cloud computing >>> http://p.sf.net/sfu/novell-sfdev2dev >>> _______________________________________________ >>> gstreamer-devel mailing list >>> gstreamer-devel at lists.sourceforge.net >>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> >> From ykumar23 at gmail.com Tue Sep 14 11:54:13 2010 From: ykumar23 at gmail.com (mohan) Date: Tue, 14 Sep 2010 02:54:13 -0700 (PDT) Subject: [gst-devel] how to update gstreamer registry Message-ID: <1284458053376-2538726.post@n4.nabble.com> Hi, I need to install some plugins from gst-plugins-good. I added them to the configure arguments, but after installing, the plugin is missing from gst-inspect; later I found out that I need to update the gstreamer registry, like usr/...../gstreamer-0.10/registry....bin Can anyone suggest how to update the registry. Thanks in advance -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/how-to-update-gstreamer-registry-tp2538726p2538726.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From henrik.hedberg at innologies.fi Tue Sep 14 12:02:55 2010 From: henrik.hedberg at innologies.fi (Henrik Hedberg) Date: Tue, 14 Sep 2010 13:02:55 +0300 Subject: [gst-devel] Pipeline with live and non-live sources and sinks Message-ID: <4C8F484F.5050209@innologies.fi> Hi, We have a problem constructing a pipeline with live and non-live sources and sinks. The following pipeline results in audible scratches or "jumps" during playback: gst-launch-0.10 filesrc location=/tmp/test.mp3 ! decodebin ! audioconvert ! autoaudiosink autoaudiosrc ! audioconvert ! wavenc !
filesink location=/tmp/recording.wav It may be a buffer under-run or latency issue. It does not happen every time, but it usually does, and it occurs especially at the beginning of the stream. The recorded wav is perfect. According to the design documentation, the above pipeline should work and even preroll: http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-latency.txt#n167 However, gst-launch handles the pipeline purely as a live source and skips prerolling. That also happens in an application that constructs the same pipeline. Any idea what should be changed? Thanks a lot in advance! BR, Henrik -- Henrik Hedberg - http://www.henrikhedberg.net/ From 123sandy at gmail.com Tue Sep 14 12:58:50 2010 From: 123sandy at gmail.com (Sandeep Prakash) Date: Tue, 14 Sep 2010 03:58:50 -0700 (PDT) Subject: [gst-devel] how to update gstreamer registry In-Reply-To: <1284458053376-2538726.post@n4.nabble.com> References: <1284458053376-2538726.post@n4.nabble.com> Message-ID: <1284461930291-2538808.post@n4.nabble.com> Hi, GStreamer should automatically update the registry once you install new plugins. If that is not happening, try deleting the registry.bin from your home folder (/home//.gstreamer-0.10/registry.i486.bin). The next time you run gst-inspect, the registry will be rebuilt. Regards, Sandeep Prakash http://sandeepprakash.homeip.net -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/how-to-update-gstreamer-registry-tp2538726p2538808.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From andre.dieb at gmail.com Tue Sep 14 13:19:39 2010 From: andre.dieb at gmail.com (=?UTF-8?Q?Andr=C3=A9_Dieb?=) Date: Tue, 14 Sep 2010 08:19:39 -0300 Subject: [gst-devel] How can I add RTSP media mapping to the gst-rtsp-server in Python? In-Reply-To: <1284454972893-2538670.post@n4.nabble.com> References: <1284454972893-2538670.post@n4.nabble.com> Message-ID: Hello, Currently the gst-rtsp-server (master) Python bindings can't add media mappings. 
I'm currently in the process of cleaning up my python-gst-rtsp-server patches, which include MediaMapping and MediaFactory bindings. It's likely to be submitted to bugzilla this week (at least an initial version). If you're interested, I can send you the dirty patches :) passing a launch string to the factory already works. On Tue, Sep 14, 2010 at 6:02 AM, wl2776 wrote: > > I've created a server instance, but cannot add media-mapping to it. > Are there any examples of using gst-rtsp-server in Python? > > import gobject > gobject.threads_init() > > import sys > import pygst > pygst.require('0.10') > > import os > > from gst import rtspserver > > class myrtspserver: > def __init__(self): > self.server=rtspserver.Server() > self.server.props.media_mapping.set_property('/test', > '( videotestsrc is-live=1 ! x264enc ! > rtph264pay name=pay0 pt=96 )') > > > The last line of the code above gives the error > TypeError: object of type `GstRTSPMediaMapping' does not have property > `/test' > > If I replace set_property with set_data, it doesn't give errors, but logs > show that the server cannot find the media mapping when processing > connection. > Tried also set_data > -- > View this message in context: > http://gstreamer-devel.966125.n4.nabble.com/How-can-I-add-RTSP-media-mapping-to-the-gst-rtsp-server-in-Python-tp2538670p2538670.html > Sent from the GStreamer-devel mailing list archive at Nabble.com. > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -- Andr? Dieb Martins -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ensonic at hora-obscura.de Tue Sep 14 14:59:21 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Tue, 14 Sep 2010 15:59:21 +0300 Subject: [gst-devel] [ANN] working on multimedia for MeeGo at Nokia Message-ID: <4C8F71A9.2020505@hora-obscura.de> *working on multimedia for MeeGo at Nokia* If you would like to work on free software and like snow, we have a few open positions: * GStreamer Developer: http://bit.ly/dhIb9b * GStreamer Developer/Architect: http://bit.ly/c21g7W * Specialist, GStreamer Camera: http://bit.ly/bkCFkG * PulseAudio Developer: http://bit.ly/bjMQRF Ping me on irc (ensonic @ freenode/gimpnet) or reply to me if you have questions. Stefan From lichandler116 at gmail.com Tue Sep 14 15:37:02 2010 From: lichandler116 at gmail.com (Chandler Li) Date: Tue, 14 Sep 2010 21:37:02 +0800 Subject: [gst-devel] dynamic change element's parameter In-Reply-To: <4C8F3F95.1050403@hora-obscura.de> References: <4C8E9437.9050309@hora-obscura.de> <4C8F3F95.1050403@hora-obscura.de> Message-ID: Thank you, I've pasted the key part of the code here. The old caps: caps1 = gst_caps_new_simple ("video/x-raw-yuv", "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'), "width", G_TYPE_INT, 123, "height", G_TYPE_INT, 456, "framerate", GST_TYPE_FRACTION, 30, 1, NULL); The new caps: caps2 = gst_caps_new_simple ("video/x-raw-yuv", "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'), "width", G_TYPE_INT, 456, "height", G_TYPE_INT, 789, "framerate", GST_TYPE_FRACTION, 30, 1, NULL); And the following code changes the caps: gst_pad_set_blocked_async(pad,TRUE,my_blocked_callback,NULL); gst_element_set_state(GST_ELEMENT(capsfilter2), GST_STATE_NULL); g_object_set (G_OBJECT (capsfilter2), "caps",caps2,NULL); gst_element_set_state(GST_ELEMENT(capsfilter2), GST_STATE_PLAYING); gst_pad_set_blocked_async(pad,FALSE,my_blocked_callback,NULL); First, I block the videoscale's src pad. 
Then I set the element capsfilter2 to NULL and give it new caps with a different width and height. After that, I set the element capsfilter2 to PLAYING, and unblock the videoscale's src pad. After doing this, I get an error message: ** ERROR **: Internal data flow error. aborting... But if I make the caps2 format the same as caps1: caps2 = gst_caps_new_simple ("video/x-raw-yuv", "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'), "width", G_TYPE_INT, 123, "height", G_TYPE_INT, 456, "framerate", GST_TYPE_FRACTION, 30, 1, NULL); The streaming still works. It's really strange. I think it's because of a videoscale problem. Did I miss any detail in the code? Thank you! Best regards, Chandler Lee. 2010/9/14 Stefan Kost : > On 14.09.2010 10:48, Chandler Li wrote: >> I have read the mail Nicola and Stefan given, and that really inspired >> me a lot. >> Thank you! >> In the first, I try to replace some elements and encounter some >> problem, the following I will show my target first. >> Because I want to stream webcam frame on the internet, >> and dynamically change the frame size without stopping the pipeline. >> >> According to my experience, if the v4l2src element opens webcam >> device, it only captures one size at one time. >> if I want to change the size of webcam, I need to restart it. >> > > v4l2src does not support changing the resolution on the fly. This is a > v4l2 interface limitation right now - one needs to stop streaming, set > new format and restart streaming. So your observation is correct. > >> So I try another architecture of gstreamer, >> >> v4l2src -> capsfilter1 -> videoscale -> capsfilter2 -> >> ffmpegcolorspace -> .... (to internet) >> >> videoscale links two filters, the first filter (capsfilter1) >> negotiates with v4l2src in a static frame size, >> I want to change the size I assigned in the second filter (capsfilter2) , >> >> BUT I get an error message after I change the capsfilter2 to the new filter, >> >> 
videoscale can't link to new capsfilter, >> > > that should work. Can you paste a few lines of the code here. You should > create new caps and just set the new format on capsfilter2. > > Stefan >> I don't know what's happened? does anyone know that? >> Thank you! >> >> Best regards, >> Chandler Lee. >> >> 2010/9/14 Stefan Kost : >> >>> Am 12.09.2010 09:01, schrieb Chandler Li: >>> >>>> Hi all, >>>> I'm new to use gstreamer, >>>> and I have a problem don't know how to solve for a long time, >>>> >>>> Is there possible to dynamic change the parameter in elements or pads? >>>> >>> gobject parameters -> g_object_set or use GstController >>> >>> >>> >>>> For example, >>>> In a playing state streaming, >>>> could I dynamic change the size of video without stop the streaming? >>>> >>> yes you can. if you resize the video window for xvimagesink the video adjusts if >>> it can. So if you run gst-launch videotestsrc ! xvimagesink, the videotest is >>> sending videoframes in the native resolution instead of xvimagesink scaling them. >>> >>> Stefan >>> >>> >>>> hopes that's not a stupid question, >>>> and very hopefully for your reply, >>>> Thank you >>>> >>>> 
Chandler Lee >>>> >>>> ------------------------------------------------------------------------------ >>>> Start uncovering the many advantages of virtual appliances >>>> and start using them to simplify application deployment and >>>> accelerate your shift to cloud computing >>>> http://p.sf.net/sfu/novell-sfdev2dev >>>> _______________________________________________ >>>> gstreamer-devel mailing list >>>> gstreamer-devel at lists.sourceforge.net >>>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>>> >>> >>> > > From henrik.hedberg at innologies.fi Tue Sep 14 15:41:31 2010 From: henrik.hedberg at innologies.fi (Henrik Hedberg) Date: Tue, 14 Sep 2010 16:41:31 +0300 Subject: [gst-devel] Pipeline with live and non-live sources and sinks In-Reply-To: <4C8F484F.5050209@innologies.fi> References: <4C8F484F.5050209@innologies.fi> Message-ID: <4C8F7B8B.1000706@innologies.fi> On 14.09.2010 13:02, Henrik Hedberg wrote: > We have a problem constructing a pipeline with live and non-live sources > and sinks. The following pipeline results in audible scratches or "jumps" > during playback: > > gst-launch-0.10 filesrc location=/tmp/test.mp3 ! decodebin ! > audioconvert ! autoaudiosink autoaudiosrc ! audioconvert ! wavenc ! > filesink location=/tmp/recording.wav > > It may be a buffer under-run or latency issue. It does not happen every > time but usually and it occurs especially at the beginning of the > stream. The recorded wav is perfect. I have tested the same pipeline with different versions of GStreamer. It seems that 0.10.21 and 0.10.23 were working as expected, but this problem (bug?) appears in 0.10.25 and 0.10.28. Does anybody have an idea what has been changed between 0.10.23 and 0.10.25 related to this issue? 
BR, Henrik -- Henrik Hedberg - http://www.henrikhedberg.net/ From wim.taymans at gmail.com Tue Sep 14 15:50:22 2010 From: wim.taymans at gmail.com (Wim Taymans) Date: Tue, 14 Sep 2010 15:50:22 +0200 Subject: [gst-devel] Pipeline with live and non-live sources and sinks In-Reply-To: <4C8F7B8B.1000706@innologies.fi> References: <4C8F484F.5050209@innologies.fi> <4C8F7B8B.1000706@innologies.fi> Message-ID: <1284472222.2407.9.camel@metal> On Tue, 2010-09-14 at 16:41 +0300, Henrik Hedberg wrote: > On 14.09.2010 13:02, Henrik Hedberg wrote: > > > We have a problem constructing a pipeline with live and non-live sources > > and sinks. The following pipeline results audible scratches or "jumps" > > during playback: > > > > gst-launch-0.10 filesrc location=/tmp/test.mp3 ! decodebin ! > > audioconvert ! autoaudiosink autoaudiosrc ! audioconvert ! wavenc ! > > filesink location=/tmp/recording.wav > > > > It may be buffer under-run or latency issue. It does not happen every > > time but usually and it occurs especially at the beginning of the > > stream. The recorded wav is perfect. > > I have tested the same pipeline with different versions of > GStreamer. It seems that 0.10.21 and 0.10.23 were working as expected, > but this problem (bug?) appears in 0.10.25 and 0.0.28. Does anybody have > an idea, what has been changed between 0.10.23 and 0.10.25 related to > this issue? It's because the sink has to slave its clock to the pipeline clock, which is the one provided by the source. Usually you get little glitches when the clocks try to match rates and or when resync happens because the clocks drift too much. If you don't need synchronization between the playback and the capture, you can set slave-method=none on the sink or the source. 
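For example (untested sketch; slave-method is a property of the base audio sink/source classes in 0.10, so set it on a concrete element such as pulsesink rather than on the autoaudiosink bin):

```
gst-launch-0.10 filesrc location=/tmp/test.mp3 ! decodebin ! \
    audioconvert ! pulsesink slave-method=none \
    pulsesrc ! audioconvert ! wavenc ! \
    filesink location=/tmp/recording.wav
```

With slave-method=none the sink stops trying to rate-match its clock against the pipeline clock, so you lose sync between playback and capture but also the resync glitches.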
Wim > > BR, > > Henrik > From bilboed at gmail.com Wed Sep 15 08:18:14 2010 From: bilboed at gmail.com (Edward Hervey) Date: Wed, 15 Sep 2010 08:18:14 +0200 Subject: [gst-devel] [gst-cvs] gstreamer: gstpad: Fix flush-stop event handling In-Reply-To: <20100913235628.2DC9E10057@kemper.freedesktop.org> References: <20100913235628.2DC9E10057@kemper.freedesktop.org> Message-ID: <1284531494.5201.6.camel@joder> On Mon, 2010-09-13 at 16:56 -0700, Thiago Sousa Santos wrote: > Module: gstreamer > Branch: master > Commit: 60fba4df8b53226b019a1cc72405afbf8b708d06 > URL: http://cgit.freedesktop.org/gstreamer/gstreamer/commit/?id=60fba4df8b53226b019a1cc72405afbf8b708d06 > > Author: Thiago Santos > Date: Mon Sep 13 20:39:50 2010 -0300 > > gstpad: Fix flush-stop event handling > > A flush-stop event would make a pad unflushing, causing it > to start acting as an activated pad. This, for example, > could lead to the chain function being called when stuff > isn't initialized. > > This could happend when setting qtdemux to NULL while a seek > was being handled in the upstream filesrc (in push mode). > > This patch makes it check if it is activated before setting > it to unflushing. > > --- > > gst/gstpad.c | 6 ++++-- > 1 files changed, 4 insertions(+), 2 deletions(-) > > diff --git a/gst/gstpad.c b/gst/gstpad.c > index c331684..9639f1d 100644 > --- a/gst/gstpad.c > +++ b/gst/gstpad.c > @@ -5074,8 +5074,10 @@ gst_pad_send_event (GstPad * pad, GstEvent * event) > GST_CAT_DEBUG_OBJECT (GST_CAT_EVENT, pad, "set flush flag"); > break; > case GST_EVENT_FLUSH_STOP: > - GST_PAD_UNSET_FLUSHING (pad); > - GST_CAT_DEBUG_OBJECT (GST_CAT_EVENT, pad, "cleared flush flag"); > + if (G_LIKELY (GST_PAD_ACTIVATE_MODE (pad) != GST_ACTIVATE_NONE)) { > + GST_PAD_UNSET_FLUSHING (pad); > + GST_CAT_DEBUG_OBJECT (GST_CAT_EVENT, pad, "cleared flush flag"); > + } This causes a regression when used with the proxypads of ghostpads. 
It seems to be related to the special flushing handling of those pads (they are created with the flushing flag unset). > GST_OBJECT_UNLOCK (pad); > /* grab stream lock */ > GST_PAD_STREAM_LOCK (pad); > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-cvs mailing list > gstreamer-cvs at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-cvs From thiagossantos at gmail.com Wed Sep 15 13:59:23 2010 From: thiagossantos at gmail.com (thiagossantos at gmail.com) Date: Wed, 15 Sep 2010 08:59:23 -0300 Subject: [gst-devel] Problem in ffmux_mov In-Reply-To: <4C7DD6EF.5020701@tataelxsi.co.in> References: <4C7DD6EF.5020701@tataelxsi.co.in> Message-ID: On Wed, Sep 1, 2010 at 1:30 AM, Gurpreet wrote: > Hi All.. > > I'm encoding a PCM file with faac and then muxing it with ffmux_mov. I am > using this pipeline; the file plays in VLC and the QT player. > > gst-launch filesrc blocksize=2048 > location=/home/Gurpreet/inputfilesmp4/Sample_Files/New_Folder/CH1_48000_mono_16bit.pcm > ! audio/x-raw-int, channels=1, rate=48000, width=16, depth=16, > endianness=1234, signed=true ! faac ! aacparse ! audio/mpeg,rate=48000, > channels=1, mpegversion=4, layer=3 ! ffmux_mov ! filesink > location=/home/Gurpreet/outputfilesmp4/exp_mov.mov > > > But whenever I set the faac property "outputformat=1" (for an ADTS header), as in > > gst-launch filesrc blocksize=2048 > location=/home/Gurpreet/inputfilesmp4/Sample_Files/New_Folder/CH1_48000_mono_16bit.pcm > ! audio/x-raw-int, channels=1, rate=48000, width=16, depth=16, > endianness=1234, signed=true ! faac outputformat=1 ! aacparse ! > audio/mpeg,rate=48000, channels=1, mpegversion=4, layer=3 ! ffmux_mov ! 
> filesink location=/home/Gurpreet/outputfilesmp4/exp_mov.mov > > then the generated file plays in VLC, but in QT the progress bar moves and > no audio comes out. What could the error be? > outputformat=0 is for raw AAC > outputformat=1 is for ADTS > > Is this a bug in ffmux_mov? What could the reason be? > First, you should be using qtmux for this. I *think* (big emphasis on think) quicktime doesn't support ADTS in .mov files. I remember bumping into this some months ago. Bugs about this: https://bugzilla.gnome.org/show_bug.cgi?id=598350 and https://bugzilla.gnome.org/show_bug.cgi?id=604925 > > Thanks > Gurpreet > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Thiago Sousa Santos -------------- next part -------------- An HTML attachment was scrubbed... URL: From parveen.jain at one97.net Wed Sep 15 14:43:06 2010 From: parveen.jain at one97.net (Parveen Kumar Jain) Date: Wed, 15 Sep 2010 18:13:06 +0530 Subject: [gst-devel] can't send the video stream from a given rtp port Message-ID: Hi All, I am using GStreamer for one of my basic video streaming applications. I am facing an issue where I want GStreamer to send the video RTP stream from a given IP address and port (my system has more than 2 ethernet cards), as GStreamer's udpsink picks any "free port and IP address" of its own choice. Can anyone from the group help me to achieve this? Regards, Parveen Jain -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ensonic at hora-obscura.de Wed Sep 15 17:40:24 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Wed, 15 Sep 2010 18:40:24 +0300 Subject: [gst-devel] upcoming removal of metadata plugin in gst-plugin-bad In-Reply-To: <4C862C39.500@hora-obscura.de> References: <4C862C39.500@hora-obscura.de> Message-ID: <4C90E8E8.4000505@hora-obscura.de> On 07.09.2010 15:12, Stefan Kost wrote: > hello, > > I intend to remove the metadata plugin in gst-plugin-bad in this cycle. > Exif and xmp support is now provided by gst-plugins-base utility > libraries. It only takes a couple of lines to add e.g. xmp support to > container formats. Here are some details: > https://bugzilla.gnome.org/show_bug.cgi?id=486659 > > For jpeg files, the jpegformat plugin in gst-plugins-bad provides the > container format handling. In case anyone other than Nokia on the N900 is > using the metadata plugin, please migrate your code. > okay. It is gone now. Stefan > Let me know if I miss something. > > Thanks, > Stefan > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. 
> http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > From ensonic at hora-obscura.de Wed Sep 15 17:41:55 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Wed, 15 Sep 2010 18:41:55 +0300 Subject: [gst-devel] dynamic change element's parameter In-Reply-To: References: <4C8E9437.9050309@hora-obscura.de> <4C8F3F95.1050403@hora-obscura.de> Message-ID: <4C90E943.404@hora-obscura.de> On 14.09.2010 16:37, Chandler Li wrote: > Thank you, > I paste the key point code to there, > > the old one caps: > caps1 = gst_caps_new_simple ("video/x-raw-yuv", > "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'), > "width", G_TYPE_INT, 123, > "height", G_TYPE_INT, 456, > "framerate", GST_TYPE_FRACTION, 30, 1, > NULL); > the new one caps: > caps2 = gst_caps_new_simple ("video/x-raw-yuv", > "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'), > "width", G_TYPE_INT, 456, > "height", G_TYPE_INT, 789, > "framerate", GST_TYPE_FRACTION, 30, 1, > NULL); > and the folowing code change caps: > gst_pad_set_blocked_async(pad,TRUE,my_blocked_callback,NULL); > gst_element_set_state(GST_ELEMENT(capsfilter2), GST_STATE_NULL); > g_object_set (G_OBJECT (capsfilter2), "caps",caps2,NULL); > gst_element_set_state(GST_ELEMENT(capsfilter2), GST_STATE_PLAYING); > gst_pad_set_blocked_async(pad,FALSE,my_blocked_callback,NULL); > ohh, you can just set the caps. I mean just do: g_object_set (G_OBJECT (capsfilter2), "caps",caps2,NULL); Stefan > First, I block the videoscale's src pad. Then turn the the element > capsfilter2 to NULL , give it new caps with different width and > height. > After that, I turn the element capsfilter2 to PLAYING, and unblock the > videoscale's src pad. > > after doing these, I get wrong message > ** ERROR **: Internal data flow error. > aborting... 
> > But if I let the caps2's format same with caps1: > caps2 = gst_caps_new_simple ("video/x-raw-yuv", > "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'), > "width", G_TYPE_INT, 123, > "height", G_TYPE_INT, 456, > "framerate", GST_TYPE_FRACTION, 30, 1, > NULL); > > The streaming still works. It's really strange. > I think it's because of the videoscale problem. > Did I lose any detail in the code? > > Thank you! > Best regards, > Chandler Lee. > > 2010/9/14 Stefan Kost : > >> On 14.09.2010 10:48, Chandler Li wrote: >> >>> I have read the mail Nicola and Stefan given, and that really inspired >>> me a lot. >>> Thank you! >>> In the first, I try to replace some elements and encounter some >>> problem, the following I will show my target first. >>> Because I want to stream webcam frame on the internet, >>> and dynamically change the frame size without stopping the pipeline. >>> >>> According to my experience, if the v4l2src element opens webcam >>> device, it only captures one size at one time. >>> if I want to change the size of webcam, I need to restart it. >>> >>> >> v4l2src does not support changing the resolution on the fly. This is a >> v4l2 interface limitation right now - one needs to stop streaming, set >> new format and restart streaming. So your observation is correct. >> >> >>> So I try another architecture of gstreamer, >>> >>> v4l2src -> capsfilter1 -> videoscale -> capsfilter2 -> >>> ffmpegcolorspace -> .... (to internet) >>> >>> videoscale links two filters, the first filter (capsfilter1) >>> negotiates with v4l2src in a static frame size, >>> I want to change the size I assigned in the second filter (capsfilter2) , >>> >>> BUT I get an error message after I change the capsfilter2 to the new filter, >>> >>> videoscale can't link to new capsfilter, >>> >>> >> that should work. Can you paste a few lines of the code here. You should >> create new caps and just set the new format on capsfilter2. 
>> >> Stefan >> >>> I don't know what's happened? does anyone know that? >>> Thank you! >>> >>> Best regards, >>> Chandler Lee. >>> >>> 2010/9/14 Stefan Kost : >>> >>> >>>> Am 12.09.2010 09:01, schrieb Chandler Li: >>>> >>>> >>>>> Hi all, >>>>> I'm new to use gstreamer, >>>>> and I have a problem don't know how to solve for a long time, >>>>> >>>>> Is there possible to dynamic change the parameter in elements or pads? >>>>> >>>>> >>>> gobject parameters -> g_object_set or use GstController >>>> >>>> >>>> >>>> >>>>> For example, >>>>> In a playing state streaming, >>>>> could I dynamic change the size of video without stop the streaming? >>>>> >>>>> >>>> yes you can. if you resize the video window for xvimagesink the video adjusts if >>>> it can. So if you run gst-launch videotestsrc ! xvimagesink, the videotest is >>>> sending videoframes in the native resolution instead of xvimagesink scaling them. >>>> >>>> Stefan >>>> >>>> >>>> >>>>> hopes that's not a stupid question, >>>>> and very hopefully for your reply, >>>>> Thank you >>>>> >>>>> Chandler Lee >>>>> >>>>> ------------------------------------------------------------------------------ >>>>> Start uncovering the many advantages of virtual appliances >>>>> and start using them to simplify application deployment and >>>>> accelerate your shift to cloud computing >>>>> http://p.sf.net/sfu/novell-sfdev2dev >>>>> _______________________________________________ >>>>> gstreamer-devel mailing list >>>>> gstreamer-devel at lists.sourceforge.net >>>>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>>>> >>>>> >>>> >>>> >> >> From henrik.hedberg at innologies.fi Wed Sep 15 21:37:16 2010 From: henrik.hedberg at innologies.fi (Henrik Hedberg) Date: Wed, 15 Sep 2010 22:37:16 +0300 Subject: [gst-devel] Pipeline with live and non-live sources and sinks In-Reply-To: <1284472222.2407.9.camel@metal> References: <4C8F484F.5050209@innologies.fi> <4C8F7B8B.1000706@innologies.fi> <1284472222.2407.9.camel@metal> 
Message-ID: <4C91206C.3010207@innologies.fi> On 14.09.2010 16:50, Wim Taymans wrote: > On Tue, 2010-09-14 at 16:41 +0300, Henrik Hedberg wrote: >> On 14.09.2010 13:02, Henrik Hedberg wrote: >> >>> We have a problem constructing a pipeline with live and non-live sources >>> and sinks. The following pipeline results audible scratches or "jumps" >>> during playback: >>> >>> gst-launch-0.10 filesrc location=/tmp/test.mp3 ! decodebin ! >>> audioconvert ! autoaudiosink autoaudiosrc ! audioconvert ! wavenc ! >>> filesink location=/tmp/recording.wav >>> >>> It may be buffer under-run or latency issue. It does not happen every >>> time but usually and it occurs especially at the beginning of the >>> stream. The recorded wav is perfect. >> >> I have tested the same pipeline with different versions of >> GStreamer. It seems that 0.10.21 and 0.10.23 were working as expected, >> but this problem (bug?) appears in 0.10.25 and 0.0.28. Does anybody have >> an idea, what has been changed between 0.10.23 and 0.10.25 related to >> this issue? > > It's because the sink has to slave its clock to the pipeline clock, > which is the one provided by the source. Usually you get little glitches > when the clocks try to match rates and or when resync happens because > the clocks drift too much. Thank you for your answer. It was something I was thinking myself too. However, the same pipeline used to work decently before 0.10.25 (or 0.10.24), so the situation has gone worse recently. > If you don't need synchronization between the playback and the > capture, you can set slave-method=none on the sink or the source. Unfortunately that is not possible, because we are implementing a karaoke-like application (for children :). Any idea, what other parameters could be fine-tuned to achieve better behavior? I already tried different buffering and latency values as well as forced the pipeline to use the sink clock. The result was either missing synchronization or no significant improvement. 
Actually, we are using PulseAudio, so the minimized pipeline is: gst-launch-0.10 filesrc location=/tmp/test.mp3 ! decodebin ! pulsesink pulsesrc ! wavenc ! filesink location=/tmp/recording.wav Is it possible that PulseAudio is affecting things here somehow? BR, Henrik -- Henrik Hedberg - http://www.henrikhedberg.net/ From bertd at tplogic.com Thu Sep 16 00:39:34 2010 From: bertd at tplogic.com (Bert Douglas) Date: Wed, 15 Sep 2010 17:39:34 -0500 Subject: [gst-devel] bayer format video Message-ID: How does one go about adding a new family of raw video formats to gstreamer? Is this even possible or practicable? video/x-bayer There are various bit depths per pixel: 8, 12, 16. The 12-bit form can be packed in several ways. The components have several ordering permutations: BGGR GBRG RGGB GRBG -------------- next part -------------- An HTML attachment was scrubbed... URL: From ds at entropywave.com Thu Sep 16 00:47:06 2010 From: ds at entropywave.com (David Schleef) Date: Wed, 15 Sep 2010 18:47:06 -0400 Subject: [gst-devel] can't send the video stream from a given rtp port In-Reply-To: References: Message-ID: <20100915224706.GA30275@cooker.entropywave.com> On Wed, Sep 15, 2010 at 06:13:06PM +0530, Parveen Kumar Jain wrote: > HI All, > I am using GStreamer for one of my basic video streaming application.I am > facing an issue where I want GStreamer to send the video RTP stream from a > given IP address and port(my system has more than 2 ethernet cards),as > GStreamer's udpsink picks any "free port and ip address" of his own choice. > Can anyone from the group can help me to achieve this ? Set up the socket yourself and pass the file descriptor using the sockfd property. 
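Roughly like this in Python (an untested sketch; the socket part is standard, while the GStreamer part is only indicated in comments, with "sockfd" and "closefd" being udpsink's 0.10 property names and the address/port being placeholders):

```python
import socket

def make_bound_udp_socket(ip, port):
    # Bind explicitly so outgoing RTP packets leave from this local
    # address/port instead of whatever the kernel picks.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind((ip, port))
    return s

# Then hand the descriptor to udpsink (sketch; "192.168.1.10" and the
# udpsink variable are placeholders for your own setup):
#   sock = make_bound_udp_socket("192.168.1.10", 5004)
#   udpsink.set_property("sockfd", sock.fileno())
#   udpsink.set_property("closefd", False)  # keep ownership of the fd
```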
David From ds at entropywave.com Thu Sep 16 01:03:53 2010 From: ds at entropywave.com (David Schleef) Date: Wed, 15 Sep 2010 19:03:53 -0400 Subject: [gst-devel] bayer format video In-Reply-To: References: Message-ID: <20100915230353.GB30275@cooker.entropywave.com> On Wed, Sep 15, 2010 at 05:39:34PM -0500, Bert Douglas wrote: > How does one go about adding a new family of raw video formats to gstreamer? First, you check the documentation to see if it's already there. David From bertd at tplogic.com Thu Sep 16 01:13:39 2010 From: bertd at tplogic.com (Bert Douglas) Date: Wed, 15 Sep 2010 18:13:39 -0500 Subject: [gst-devel] bayer format video In-Reply-To: <20100915230353.GB30275@cooker.entropywave.com> References: <20100915230353.GB30275@cooker.entropywave.com> Message-ID: I did. http://www.gstreamer.net/data/doc/gstreamer/head/pwg/html/section-types-definitions.html And I did gst-inspect on every base element. Where else should I look? Thanks, Bert Douglas On Wed, Sep 15, 2010 at 6:03 PM, David Schleef wrote: > On Wed, Sep 15, 2010 at 05:39:34PM -0500, Bert Douglas wrote: > > How does one go about adding a new family of raw video formats to > gstreamer? > > First, you check the documentation to see if it's already there. > > > > David > > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ds at entropywave.com Thu Sep 16 02:16:21 2010 From: ds at entropywave.com (David Schleef) Date: Wed, 15 Sep 2010 20:16:21 -0400 Subject: [gst-devel] bayer format video In-Reply-To: References: <20100915230353.GB30275@cooker.entropywave.com> Message-ID: <20100916001621.GA32353@cooker.entropywave.com> On Wed, Sep 15, 2010 at 06:13:39PM -0500, Bert Douglas wrote: > I did. > http://www.gstreamer.net/data/doc/gstreamer/head/pwg/html/section-types-definitions.html > > And I did gst-inspect on every base element. > > Where else should I look? srsly? http://www.google.com/search?hl=en&q=gstreamer+bayer David From gang.a.hu at intel.com Thu Sep 16 04:23:26 2010 From: gang.a.hu at intel.com (Hu, Gang A) Date: Thu, 16 Sep 2010 10:23:26 +0800 Subject: [gst-devel] bayer format video In-Reply-To: References: Message-ID: <1A42CE6F5F474C41B63392A5F80372B22A90CC57@shsmsx501.ccr.corp.intel.com> V4l2src already support video/x-raw-bayer (V4L2_PIX_FMT_SBGGR8) Gst-inspect v4l2src | grep bayer From: Bert Douglas [mailto:bertd at tplogic.com] Sent: Thursday, September 16, 2010 6:40 AM To: Discussion of the development of GStreamer Subject: [gst-devel] bayer format video How does one go about adding a new family of raw video formats to gstreamer? Is this even possible or practicable? video/x-bayer There are various bits per pixel, 8,12,16 The 12 bits form can be packed in several ways The components have several ordering permutations BGGR GBRG RGGB GRBG -------------- next part -------------- An HTML attachment was scrubbed... URL: From nico at inattendu.org Thu Sep 16 14:22:39 2010 From: nico at inattendu.org (Nicolas Bertrand) Date: Thu, 16 Sep 2010 16:22:39 +0400 Subject: [gst-devel] Gnonlin. Example of video and sound mix with seek and query Message-ID: <4C920C0F.9060400@inattendu.org> Hi, I spent some time to write a python script based on gnonlin. This script play 2 videos sequentially and play an audio track at the same time. 
The overall video duration is displayed in a scale bar. It is also possible to seek on the scale bar. I had a little hard time writing it, mainly with gnonlin and understanding that a gnl composition is needed for each type of stream. So far, I haven't found another way to do that. It looks like gnonlin fits perfectly for that kind of function. After hating gnonlin these past 2 days, I think I'm loving it now ;-) So, in case it can be useful for anybody, I attached the script to this mail. The goal of this function is to add a very simple timeline to luciole, a stop-motion tool. The timeline is used to play the taken snapshots with a sound file, and it also allows seeking on the timeline. Cheers Nico -------------- next part -------------- A non-text attachment was scrubbed... Name: play_seeker_with_gnonlin.py Type: text/x-python Size: 15361 bytes Desc: not available URL: From alessandro.d at gmail.com Thu Sep 16 18:07:07 2010 From: alessandro.d at gmail.com (Alessandro Decina) Date: Thu, 16 Sep 2010 18:07:07 +0200 Subject: [gst-devel] [PiTiVi] RELEASE: PiTiVi open-source video editor 0.13.5 "I Missed My Lunch" Message-ID: This mail announces the release of PiTiVi video editor 0.13.5 "I Missed My Lunch". PiTiVi is an open source video editor, written in Python and based on GStreamer and GTK+. More information in the attached release notes, as well as on http://www.pitivi.org/ To file bugs, please go to http://bugzilla.gnome.org/enter_bug.cgi?product=pitivi -------------- next part -------------- A non-text attachment was scrubbed...
Name: RELEASE Type: application/octet-stream Size: 5680 bytes Desc: not available URL: From thomazavila at gmail.com Thu Sep 16 21:13:14 2010 From: thomazavila at gmail.com (Thomaz Barros) Date: Thu, 16 Sep 2010 16:13:14 -0300 Subject: [gst-devel] Ubuntu and Gstreamer In-Reply-To: References: Message-ID: Hi, I've already tried two alternatives but could not see the x264enc options you told me about before: 1) I compiled all the GStreamer packages (always the latest versions) and the latest daily tarball of libx264 from VideoLAN. 2) I compiled libx264 and tested it with the latest GStreamer packages from the PPA. As I said before, neither of these options worked for me. Do you know if I'm making a mistake? I'm using Ubuntu 10.04 32 bits. 2010/9/3 Rob > On 3 September 2010 13:37, Thomaz Barros wrote: > > Hi Rob, thanks for your response. I just tried your suggestion today because > I > > had other problems during the week. I built the latest gstreamer core > and > > plugins versions and the latest libx264 release, but it didn't work and > > your options aren't displayed to me. > > Which packages did you build? How did you build them? Did you install > them or are you running them uninstalled or using JHBuild? > > What do you mean by the latest libx264 release? The latest release as > far as the x264 developers are concerned is always the current tip of > the x264 git repository. With current x264 (from git) and current core, > -base and -ugly (also from git repositories) I see all the options I > mentioned. > > Regards, > Rob > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010.
> http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From wl2776 at gmail.com Fri Sep 17 08:44:02 2010 From: wl2776 at gmail.com (wl2776) Date: Thu, 16 Sep 2010 23:44:02 -0700 (PDT) Subject: [gst-devel] Where can I read about internals of pushsrc element? Message-ID: <1284705842728-2543424.post@n4.nabble.com> Where can I read about internals of pushsrc element and how to derive classes from it? Plugin Writers Guide introduces chain and event handling functions, but this plugin seems to be missing the first. But it has its own g_main_loop instance and some other functions (start, stop, finalize, etc) instead. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Where-can-I-read-about-internals-of-pushsrc-element-tp2543424p2543424.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From bilboed at gmail.com Fri Sep 17 09:56:38 2010 From: bilboed at gmail.com (Edward Hervey) Date: Fri, 17 Sep 2010 09:56:38 +0200 Subject: [gst-devel] Where can I read about internals of pushsrc element? In-Reply-To: <1284705842728-2543424.post@n4.nabble.com> References: <1284705842728-2543424.post@n4.nabble.com> Message-ID: <1284710198.5201.17.camel@joder> On Thu, 2010-09-16 at 23:44 -0700, wl2776 wrote: > Where can I read about internals of pushsrc element Have you considered reading the code of GstBaseSrc and GstPushSrc ? Have you considered reading the docs about those base classes ? > and how to derive classes > from it? Have you considered reading the code to the most basic elements derived from those ? like fakesrc ? > Plugin Writers Guide introduces chain and event handling functions, but this > plugin seems to be missing the first. The first what ? 
> But it has its own g_main_loop > instance Plugins have nothing to do with GMainLoop > and some other functions (start, stop, finalize, etc) instead. Those virtual methods and their usage are properly documented in the basesrc/pushsrc API docs. Edward P.S. And don't forget that since GstPushSrc is a subclass of GstBaseSrc you should read the docs of GstBaseSrc in addition to GstPushSrc From ikt011 at gmail.com Fri Sep 17 13:42:02 2010 From: ikt011 at gmail.com (Kocsis Tibor) Date: Fri, 17 Sep 2010 13:42:02 +0200 Subject: [gst-devel] cannot interpolate rtp time Message-ID: Hi, can somebody explain to me what this warning message means and what I can do to avoid it: rtpsource rtpsource.c:1391:rtp_source_get_new_sr: no clock-rate, cannot interpolate rtp time It shows up when I connect an rtp sink to a running pipeline, and sometimes these criticals come with it: (unknown:13994): GStreamer-CRITICAL **: gst_mini_object_unref: assertion `mini_object->refcount > 0' failed (unknown:13994): GStreamer-CRITICAL **: gst_mini_object_unref: assertion `mini_object->refcount > 0' failed Thanks Tibor From wim.taymans at gmail.com Fri Sep 17 13:55:32 2010 From: wim.taymans at gmail.com (Wim Taymans) Date: Fri, 17 Sep 2010 13:55:32 +0200 Subject: [gst-devel] cannot interpolate rtp time In-Reply-To: References: Message-ID: <1284724532.2439.4.camel@metal> On Fri, 2010-09-17 at 13:42 +0200, Kocsis Tibor wrote: > Hi, > > can somebody explain to me what this warning message means and what > I can do to avoid it: > > rtpsource rtpsource.c:1391:rtp_source_get_new_sr: no clock-rate, > cannot interpolate rtp time This usually means that the application didn't tell gstrtpbin what the clock-rate is for a source. You need to provide complete RTP caps on the rtpbin or connect to the request-pt-map signal to provide this info to gstrtpbin.
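Conceptually, a request-pt-map handler is just a lookup from RTP payload type to the caps gstrtpbin needs. Here is a plain-Python model of that lookup (illustrative only, not GStreamer API; the table is a hypothetical application-side map, though payload 33 with clock-rate 90000 matches the MP2T-ES caps seen elsewhere in this digest):

```python
# Hypothetical payload-type map an application might keep; in real code
# this data would be returned as GstCaps from the request-pt-map signal.
PT_MAP = {
    33: {"media": "video", "clock-rate": 90000, "encoding-name": "MP2T-ES"},
    8: {"media": "audio", "clock-rate": 8000, "encoding-name": "PCMA"},
}


def request_pt_map(payload_type):
    """Return the caps fields for a payload type, or None when unknown.
    An unknown type is exactly when rtpsource warns 'no clock-rate,
    cannot interpolate rtp time'."""
    return PT_MAP.get(payload_type)


print(request_pt_map(33))
# → {'media': 'video', 'clock-rate': 90000, 'encoding-name': 'MP2T-ES'}
```

The point is that every source the session sees must resolve to a clock-rate this way, either through complete caps set up front or through the signal.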
> > It shows up when I connect an rtp sink to a running pipeline, and > sometimes these criticals come with it: You probably also add a new sender source, which explains the warning above. > > (unknown:13994): GStreamer-CRITICAL **: gst_mini_object_unref: > assertion `mini_object->refcount > 0' failed > (unknown:13994): GStreamer-CRITICAL **: gst_mini_object_unref: > assertion `mini_object->refcount > 0' failed > That suggests a serious refcounting bug somewhere. Running the application with --gst-fatal-warnings inside gdb can give a backtrace that hints at what's going on. Wim > > Thanks > Tibor From nico at inattendu.org Fri Sep 17 15:22:26 2010 From: nico at inattendu.org (Nicolas Bertrand) Date: Fri, 17 Sep 2010 17:22:26 +0400 Subject: [gst-devel] Pitivi and tests Message-ID: <4C936B92.9020207@inattendu.org> Hi I got the latest version of PiTiVi from Git. I ran: ./autogen.sh make bin/pitivi works like a charm But I would like to run the tests under the directory tests. It doesn't work due to the dependence on pitivi. So how can I run them? Do I have to install pitivi (make install)?
Thanks Nico From gary at mlbassoc.com Fri Sep 17 21:58:37 2010 From: gary at mlbassoc.com (Gary Thomas) Date: Fri, 17 Sep 2010 13:58:37 -0600 Subject: [gst-devel] H264 network streaming Message-ID: <4C93C86D.1040100@mlbassoc.com> I'm trying to run gstrtpbin over a network, in particular, the examples server-v4l2-H264-alsasrc-PCMA.sh client-H264-PCMA.sh I'm using gst-plugins-good-0.10.20 (latest version I think) I get widely varying results. * video (or audio) data freezes. * the pipeline spontaneously breaks (ERROR: from element /GstPipeline:pipeline0/GstAutoAudioSink:autoaudiosink0/GstPulseSink:autoaudiosink0-actual-sink-pulse: pa_stream_writable_size() failed: Connection terminated) * no video (or audio) at all What's the best way to debug this? In particular the last case where I never get any video data displayed, I can see lots of it being shipped to my client (rtp + rtcp packets all look good). How can I discover why there is no video? Any ideas or pointers? Thanks -- ------------------------------------------------------------ Gary Thomas | Consulting for the MLB Associates | Embedded world ------------------------------------------------------------ From gary at mlbassoc.com Fri Sep 17 22:02:49 2010 From: gary at mlbassoc.com (Gary Thomas) Date: Fri, 17 Sep 2010 14:02:49 -0600 Subject: [gst-devel] H264 network streaming In-Reply-To: <4C93C86D.1040100@mlbassoc.com> References: <4C93C86D.1040100@mlbassoc.com> Message-ID: <4C93C969.7050800@mlbassoc.com> On 09/17/2010 01:58 PM, Gary Thomas wrote: > I'm trying to run gstrtpbin over a network, in particular, > the examples > server-v4l2-H264-alsasrc-PCMA.sh > client-H264-PCMA.sh > I'm using gst-plugins-good-0.10.20 (latest version I think) Actually, I'm using gst-plugins-good-0.10.25 > > I get widely varying results. > * video (or audio) data freezes. 
> * the pipeline spontaneously breaks > (ERROR: from element > /GstPipeline:pipeline0/GstAutoAudioSink:autoaudiosink0/GstPulseSink:autoaudiosink0-actual-sink-pulse: > pa_stream_writable_size() failed: Connection terminated) > * no video (or audio) at all > > What's the best way to debug this? In particular the last > case where I never get any video data displayed, I can see > lots of it being shipped to my client (rtp + rtcp packets > all look good). How can I discover why there is no video? > > Any ideas or pointers? > > Thanks > -- ------------------------------------------------------------ Gary Thomas | Consulting for the MLB Associates | Embedded world ------------------------------------------------------------ From gary at mlbassoc.com Sat Sep 18 01:01:53 2010 From: gary at mlbassoc.com (Gary Thomas) Date: Fri, 17 Sep 2010 17:01:53 -0600 Subject: [gst-devel] rtpbin + mpegtsmux Message-ID: <4C93F361.8080206@mlbassoc.com> I'm trying to stream MPEG-TS data via RTP/RTCP using these pipelines: Server: gst-launch -v gstrtpbin name=rtpbin v4l2src ! video/x-raw-yuv,width=720,height=480 ! x264enc ! mpegtsmux ! rtpmp2tpay ! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101 ts-offset=0 name=vrtpsink rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=192.168.1.101 sync=false async=false name=vrtcpsink udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0 Client: gst-launch -v gstrtpbin name=rtpbin latency=200 udpsrc caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES port=5000 ! rtpbin.recv_rtp_sink_0 rtpbin. ! rtpmp2tdepay ! mpegtsdemux name=demux ! ffdec_h264 ! xvimagesink demux. udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 rtpbin.send_rtcp_src_0 ! 
udpsink port=5005 host=192.168.1.101 sync=false async=false When the server starts up, I get these notices: /GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)480, fr amerate=(fraction)30/1 Pipeline is live and does not need PREROLL ... Setting pipeline to PLAYING ... New clock: GstSystemClock /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)8 0, framerate=(fraction)30/1 /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int) 480, framerate=(fraction)30/1 /GstPipeline:pipeline0/GstTIVidenc1:tividenc10.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)48 0, framerate=(fraction)30/1 /GstPipeline:pipeline0/GstTIVidenc1:tividenc10.GstPad:src: caps = video/x-h264, framerate=(fraction)30/1, width=(int)720, height=(int)4 80 /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:sink_64: caps = video/x-h264, framerate=(fraction)30/1, width=(int)720, height=(int) 480 /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:src: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188 /GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, enco ding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726 /GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0.GstPad:sink: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188 /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:src: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188, streamhea der=(buffer)< 47400030a600fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff 
fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000b00d0001c100000001e020a2c32941, 474020308b00f fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff fffff0002b0280001c10000e040f00c050448444d5688040ffffcfc1be040f00a050848444d56ff1b443ffba2e249 > /GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0.GstPad:sink: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188, str eamheader=(buffer)< 47400030a600fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000b00d0001c100000001e020a2c32941, 4740203 08b00ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff fffffffffff0002b0280001c10000e040f00c050448444d5688040ffffcfc1be040f00a050848444d56ff1b443ffba2e249 > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtp_sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base= (uint)41726 /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)900 00, 
encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726 /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_sink_0.GstProxyPad:proxypad1: caps = application/x-rtp, media=(string)vide o, clock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-ba se=(uint)41726 /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_src_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)9000 0, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726 /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtp_src: caps = application/x-rtp, media=(string)video, c lock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=( uint)41726 /GstPipeline:pipeline0/GstUDPSink:vrtpsink.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding- name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726 /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_src_0.GstProxyPad:proxypad2: caps = application/x-rtp, media=(string)video , clock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-bae =(uint)41726 /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp /GstPipeline:pipeline0/GstUDPSink:vrtcpsink.GstPad:sink: caps = application/x-rtcp /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad3: caps = application/x-rtcp Similarly for the client: Setting pipeline to PAUSED ... 
Pipeline is live and does not need PREROLL ... Setting pipeline to PLAYING ... /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0: ntp-ns-base = 3493752949569988000 New clock: GstSystemClock /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:recv_rtp_sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:recv_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:recv_rtp_sink_0.GstProxyPad:proxypad0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:recv_rtp_src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpPtDemux:rtpptdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES /GstPipeline:pipeline0/GstRtpMP2TDepay:rtpmp2tdepay0.GstPad:src: caps = video/mpegts, packetsize=(int)188, systemstream=(boolean)true /GstPipeline:pipeline0/GstRtpMP2TDepay:rtpmp2tdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES, payload=(int)33 
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:recv_rtp_src_0_4255958994_33.GstProxyPad:proxypad3: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES, payload=(int)33 /GstPipeline:pipeline0/GstMpegTSDemux:demux.GstPad:sink: caps = video/mpegts, packetsize=(int)188, systemstream=(boolean)true /GstPipeline:pipeline0/GstMpegTSDemux:demux: pat-info = ((GValueArray*) 0xb3f02c80) /GstPipeline:pipeline0/GstMpegTSDemux:demux: pmt-info = ((MpegTsPmtInfo*) 0xb3f02520) /GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264 /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp /GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtcp /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad2: caps = application/x-rtcp /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:sync_src: caps = application/x-rtcp /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_sink: caps = application/x-rtcp /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_src_-39008302: caps = application/x-rtcp /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink_rtcp: caps = application/x-rtcp Sadly, even though the server is pumping out data, I don't see anything at the client (my xvimagesink window never opens up) Any ideas what I'm doing wrong or how to diagnose this? Thanks Note: I'm also a bit unsure how to write these pipelines if I want to put audio data into the .TS container as well. Any pointers on this would be most helpful. 
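When a receiver stays silent like this, one way to narrow it down is to diff the negotiated caps printed by gst-launch -v on both ends. A small plain-Python helper sketch for that (an assumption on my part: it handles only the flat key=(type)value form and ignores nested fields such as the streamheader buffers above):

```python
# Sketch: parse a flat gst-launch -v caps string into (name, fields)
# so server and client dumps can be compared field by field.
# Not a full GstCaps parser; nested buffers/arrays are skipped.
import re

FIELD = re.compile(r"(?P<key>[\w-]+)=\((?P<type>\w+)\)(?P<value>[^,]+)")


def parse_caps(caps):
    name, _, rest = caps.partition(",")
    fields = {}
    for m in FIELD.finditer(rest):
        value = m.group("value").strip()
        if m.group("type") in ("int", "uint"):
            value = int(value)
        fields[m.group("key")] = value
    return name.strip(), fields


name, fields = parse_caps(
    "application/x-rtp, media=(string)video, clock-rate=(int)90000, "
    "encoding-name=(string)MP2T-ES, payload=(int)33")
print(name, fields["clock-rate"], fields["payload"])
# → application/x-rtp 90000 33
```

Parsing both sides' dumps and comparing the resulting dicts makes mismatches in clock-rate, payload, or packetsize easy to spot.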
-- ------------------------------------------------------------ Gary Thomas | Consulting for the MLB Associates | Embedded world ------------------------------------------------------------ From gary at mlbassoc.com Sat Sep 18 01:14:05 2010 From: gary at mlbassoc.com (Gary Thomas) Date: Fri, 17 Sep 2010 17:14:05 -0600 Subject: [gst-devel] rtpbin + mpegtsmux In-Reply-To: <4C93F361.8080206@mlbassoc.com> References: <4C93F361.8080206@mlbassoc.com> Message-ID: <4C93F63D.9020105@mlbassoc.com> On 09/17/2010 05:01 PM, Gary Thomas wrote: > I'm trying to stream MPEG-TS data via RTP/RTCP using these pipelines: > > Server: > gst-launch -v gstrtpbin name=rtpbin v4l2src ! video/x-raw-yuv,width=720,height=480 ! > x264enc ! mpegtsmux ! rtpmp2tpay ! rtpbin.send_rtp_sink_0 > rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101 ts-offset=0 name=vrtpsink > rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=192.168.1.101 sync=false async=false name=vrtcpsink > udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0 > > Client: > gst-launch -v gstrtpbin name=rtpbin latency=200 > udpsrc caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES port=5000 ! > rtpbin.recv_rtp_sink_0 rtpbin. ! rtpmp2tdepay ! mpegtsdemux name=demux ! ffdec_h264 ! xvimagesink demux. > udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 rtpbin.send_rtcp_src_0 ! > udpsink port=5005 host=192.168.1.101 sync=false async=false > > When the server starts up, I get these notices: > /GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)480, fr > amerate=(fraction)30/1 > Pipeline is live and does not need PREROLL ... > Setting pipeline to PLAYING ... 
> New clock: GstSystemClock > /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)8 > 0, framerate=(fraction)30/1 > /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int) > 480, framerate=(fraction)30/1 > /GstPipeline:pipeline0/GstTIVidenc1:tividenc10.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)48 > 0, framerate=(fraction)30/1 > /GstPipeline:pipeline0/GstTIVidenc1:tividenc10.GstPad:src: caps = video/x-h264, framerate=(fraction)30/1, width=(int)720, height=(int)4 > 80 > /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:sink_64: caps = video/x-h264, framerate=(fraction)30/1, width=(int)720, height=(int) > 480 > /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:src: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188 > /GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, enco > ding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726 > /GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0.GstPad:sink: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188 > /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:src: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188, streamhea > der=(buffer)< 47400030a600fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff > fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff > ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000b00d0001c100000001e020a2c32941, 474020308b00f > 
fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff > fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff > fffff0002b0280001c10000e040f00c050448444d5688040ffffcfc1be040f00a050848444d56ff1b443ffba2e249> > /GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0.GstPad:sink: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188, str > eamheader=(buffer)< 47400030a600fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff > fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff > ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000b00d0001c100000001e020a2c32941, 4740203 > 08b00ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff > fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff > fffffffffff0002b0280001c10000e040f00c050448444d5688040ffffcfc1be040f00a050848444d56ff1b443ffba2e249> > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtp_sink: caps = application/x-rtp, media=(string)video, > clock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base= > (uint)41726 > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)900 > 00, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726 > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_sink_0.GstProxyPad:proxypad1: caps = application/x-rtp, media=(string)vide > 
o, clock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-ba > se=(uint)41726 > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_src_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)9000 > 0, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726 > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtp_src: caps = application/x-rtp, media=(string)video, c > lock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=( > uint)41726 > /GstPipeline:pipeline0/GstUDPSink:vrtpsink.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding- > name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726 > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_src_0.GstProxyPad:proxypad2: caps = application/x-rtp, media=(string)video > , clock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-bae > =(uint)41726 > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp > /GstPipeline:pipeline0/GstUDPSink:vrtcpsink.GstPad:sink: caps = application/x-rtcp > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad3: caps = application/x-rtcp > > Similarly for the client: > Setting pipeline to PAUSED ... > Pipeline is live and does not need PREROLL ... > Setting pipeline to PLAYING ... 
> /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0: ntp-ns-base = 3493752949569988000 > New clock: GstSystemClock > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:recv_rtp_sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, > encoding-name=(string)MP2T-ES > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:recv_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:recv_rtp_sink_0.GstProxyPad:proxypad0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, > encoding-name=(string)MP2T-ES > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:recv_rtp_src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, > encoding-name=(string)MP2T-ES > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, > encoding-name=(string)MP2T-ES > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, > encoding-name=(string)MP2T-ES > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpPtDemux:rtpptdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES > /GstPipeline:pipeline0/GstRtpMP2TDepay:rtpmp2tdepay0.GstPad:src: caps = video/mpegts, packetsize=(int)188, systemstream=(boolean)true > /GstPipeline:pipeline0/GstRtpMP2TDepay:rtpmp2tdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES, payload=(int)33 > 
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:recv_rtp_src_0_4255958994_33.GstProxyPad:proxypad3: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, > encoding-name=(string)MP2T-ES, payload=(int)33 > /GstPipeline:pipeline0/GstMpegTSDemux:demux.GstPad:sink: caps = video/mpegts, packetsize=(int)188, systemstream=(boolean)true > /GstPipeline:pipeline0/GstMpegTSDemux:demux: pat-info = ((GValueArray*) 0xb3f02c80) > /GstPipeline:pipeline0/GstMpegTSDemux:demux: pmt-info = ((MpegTsPmtInfo*) 0xb3f02520) > /GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264 > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp > /GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtcp > /GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad2: caps = application/x-rtcp > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:sync_src: caps = application/x-rtcp > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_sink: caps = application/x-rtcp > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_src_-39008302: caps = application/x-rtcp > /GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink_rtcp: caps = application/x-rtcp > > Sadly, even though the server is pumping out data, I don't see > anything at the client (my xvimagesink window never opens up) > > Any ideas what I'm doing wrong or how to diagnose this? > > Thanks > > Note: I'm also a bit unsure how to write these pipelines if I want > to put audio data into the .TS container as well. Any pointers > on this would be most helpful. > Followup - I tried this between two [similar] x86 desktop systems and it worked! 
It seems to only fail when my server machine is my embedded OMAP board (running a recent kernel and the same gstreamer modules as on the x86 systems). How can I figure out where in the process (pipeline) it's failing? -- ------------------------------------------------------------ Gary Thomas | Consulting for the MLB Associates | Embedded world ------------------------------------------------------------ From jjinfo at nudt.edu.cn Sat Sep 18 03:58:59 2010 From: jjinfo at nudt.edu.cn (Jie Jiang) Date: Sat, 18 Sep 2010 09:58:59 +0800 Subject: [gst-devel] Question about stream media server based on gstreamer Message-ID: <1284775139.7327.9.camel@UT43> Hi, I'm new to gstreamer. Now I'm considering writing a streaming video/audio server with RTP support. I have some questions: 1. Is the client (stream receiver) required to use gstreamer as a backend in order to correctly receive and decode the video stream from the server, which is written on the basis of gstreamer? Or can any client that implements the RTP protocol and the corresponding decoder work with the server? 2. If the audio stream and video stream go through two separate gstreamer streams, is it possible to synchronize them at the receiving client? Regards, Jie From bilboed at gmail.com Sat Sep 18 13:22:26 2010 From: bilboed at gmail.com (Edward Hervey) Date: Sat, 18 Sep 2010 13:22:26 +0200 Subject: [gst-devel] Pitivi and tests In-Reply-To: <4C936B92.9020207@inattendu.org> References: <4C936B92.9020207@inattendu.org> Message-ID: <1284808946.2746.0.camel@deumeu> On Fri, 2010-09-17 at 17:22 +0400, Nicolas Bertrand wrote: > Hi > I got the latest version of PiTiVi from > > Git and ran: > ./autogen.sh > make > > bin/pitivi works like a charm > > But I would like to run the tests under the directory tests. But they > don't work due to a dependency on pitivi. > > So how can I run them? Do I have to install pitivi (make install)? Just run "make check".
Also, you might want to use the pitivi ml in the future :) Edward > > Thanks > Nico From nico at inattendu.org Sat Sep 18 14:24:51 2010 From: nico at inattendu.org (Nicolas Bertrand) Date: Sat, 18 Sep 2010 16:24:51 +0400 Subject: [gst-devel] Pitivi and tests In-Reply-To: <1284808946.2746.0.camel@deumeu> References: <4C936B92.9020207@inattendu.org> <1284808946.2746.0.camel@deumeu> Message-ID: <4C94AF93.1050702@inattendu.org> > Just run "make check". Also, you might want to use the pitivi ml in > the future :) > > Thanks for the answer! And I'm going to subscribe to the pitivi ML. From gibrovacco at gmail.com Sat Sep 18 16:16:11 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Sat, 18 Sep 2010 17:16:11 +0300 Subject: [gst-devel] H264 network streaming In-Reply-To: <4C93C86D.1040100@mlbassoc.com> References: <4C93C86D.1040100@mlbassoc.com> Message-ID: Hi, On Fri, Sep 17, 2010 at 10:58 PM, Gary Thomas wrote: > I'm trying to run gstrtpbin over a network, in particular, > the examples > server-v4l2-H264-alsasrc-PCMA.sh > client-H264-PCMA.sh > I'm using gst-plugins-good-0.10.20 (latest version I think) > > I get widely varying results. > * video (or audio) data freezes. > * the pipeline spontaneously breaks > (ERROR: from element > /GstPipeline:pipeline0/GstAutoAudioSink:autoaudiosink0/GstPulseSink:autoaudiosink0-actual-sink-pulse: > pa_stream_writable_size() failed: Connection terminated) > * no video (or audio) at all > > What's the best way to debug this?
Running your pipeline after setting GST_DEBUG to a proper value (I suggest beginning with 2) will give you some hints about what's going wrong. Increasing the number increases the verbosity. > In particular the last case where I never get any video data displayed, I > can see lots of it being shipped to my client (rtp + rtcp packets all look > good). How can I discover why there is no video? > If you're dealing with the network, checking for lost packets is another good starting point. Even if more than 80% of the video packets are properly received, you may not see anything because no key frame is ever completely received. Regards > Any ideas or pointers? > > Thanks > > -- > ------------------------------------------------------------ > Gary Thomas | Consulting for the > MLB Associates | Embedded world > ------------------------------------------------------------ From gibrovacco at gmail.com Sat Sep 18 16:31:11 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Sat, 18 Sep 2010 17:31:11 +0300 Subject: [gst-devel] Transcoding and otherwise dealing with streams. In-Reply-To: References: Message-ID: Hi, sorry for the late reply. I hope it will help. On Fri, Sep 10, 2010 at 7:52 PM, Baldur Gislason wrote: > I am trying to construct a C application that can either pick up > audio/video from a file (mpeg transport stream) or receive mpeg transport > stream on a UDP socket.
> Input format is MPEG2 A/V and output is H.264 with MPEG audio or AAC, > transport stream multiplexing on output. > So far I have managed to transcode video from network but am getting buffer > underruns on the queue. If I add audio everything just stops, the pipeline > does nothing and I can't quite figure out why. > I don't have many details about how your pipelines are built, but in 90% of the cases the behaviour you're observing is due to a missing queue at the right point. The audio thread should be straightforward from source to sink, while one queue element should separate the video thread before the muxer. > If I read the data from a file, I get buffer overruns. So clearly this is a > clocking thing. I have searched for documentation regarding clocking in > gstreamer but found nothing useful. You can find something here: http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-clocks.txt and here: http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-scheduling.txt http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-synchronisation.txt > The app developers manual mentions clocking but nothing about how it > applies to code, and gst-inspect says none of the elements I have looked at > have any clocking capabilities?! > > Anyway, I was wondering if anyone had an example for building an MPEG > transcoding pipeline in C, for working with a live stream and not file > input, file output. File input, network output would be the other > scenario.
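To illustrate the queue placement, here is a rough gst-launch equivalent of such a transcoder (an untested sketch; element choices like mpeg2dec, mad and lamemp3enc are just plausible 0.10 options, adjust to whatever you have installed and to your real bitrates):

```shell
# Untested sketch: take an MPEG-TS from UDP, transcode video to H.264 and
# audio to MPEG audio, then remux into a new transport stream. Note one
# queue per branch right after the demuxer, so the video and audio threads
# are decoupled before they meet again at mpegtsmux.
gst-launch-0.10 \
  udpsrc port=5000 ! mpegtsdemux name=demux \
  demux. ! queue ! mpeg2dec ! x264enc ! queue ! mux. \
  demux. ! queue ! mad ! lamemp3enc ! queue ! mux. \
  mpegtsmux name=mux ! filesink location=out.ts
```

The same topology holds when you build the pipeline in C: request one pad per stream from the demuxer and insert the queue immediately after it.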
> See here the excellent guide (should be from Mark Nauwelaerts): http://gentrans.sourceforge.net/docs/head/manual/html/howto.html Regards > > Baldur Gislason From gibrovacco at gmail.com Sat Sep 18 17:01:51 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Sat, 18 Sep 2010 18:01:51 +0300 Subject: [gst-devel] rtpbin + mpegtsmux In-Reply-To: <4C93F361.8080206@mlbassoc.com> References: <4C93F361.8080206@mlbassoc.com> Message-ID: Hi, On Sat, Sep 18, 2010 at 2:01 AM, Gary Thomas wrote: > I'm trying to stream MPEG-TS data via RTP/RTCP using these pipelines: > > Server: > gst-launch -v gstrtpbin name=rtpbin v4l2src ! > video/x-raw-yuv,width=720,height=480 ! > x264enc ! mpegtsmux ! rtpmp2tpay ! rtpbin.send_rtp_sink_0 > rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101 > ts-offset=0 name=vrtpsink > rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=192.168.1.101 > sync=false async=false name=vrtcpsink > udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0 > My feeling is that the muxing/payloading is wrong; try replacing mpegtsmux ! rtpmp2tpay with rtph264pay (and a similar operation in the receiver). > Client: > gst-launch -v gstrtpbin name=rtpbin latency=200 > udpsrc > caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES > port=5000 ! > rtpbin.recv_rtp_sink_0 rtpbin. ! rtpmp2tdepay ! mpegtsdemux > name=demux ! ffdec_h264 ! xvimagesink demux. > udpsrc port=5001 ! 
rtpbin.recv_rtcp_sink_0 rtpbin.send_rtcp_src_0 ! > udpsink port=5005 host=192.168.1.101 sync=false async=false > > > ..snip.. if you have more than a few lines to report in a thread, sending them as an attachment (or even using tools like pastebin) will improve readability - and will avoid my poor old Eee PC 701 going crazy with the web client ;). Regards. > Sadly, even though the server is pumping out data, I don't see > anything at the client (my xvimagesink window never opens up) > > Any ideas what I'm doing wrong or how to diagnose this? > > Thanks > > Note: I'm also a bit unsure how to write these pipelines if I want > to put audio data into the .TS container as well. Any pointers > on this would be most helpful. > > -- > ------------------------------------------------------------ > Gary Thomas | Consulting for the > MLB Associates | Embedded world > ------------------------------------------------------------ From lists at svrinformatica.it Sat Sep 18 17:46:03 2010 From: lists at svrinformatica.it (Mailing List SVR) Date: Sat, 18 Sep 2010 17:46:03 +0200 Subject: [gst-devel] Question about stream media server based on gstreamer In-Reply-To: <1284775139.7327.9.camel@UT43> References: <1284775139.7327.9.camel@UT43> Message-ID: <1284824763.3130.26.camel@localhost.localdomain> On Sat, 18 Sep 2010 at 09:58 +0800, Jie Jiang wrote: > Hi, > > I'm new to gstreamer.
Now I'm considering writing a stream video/audio > server with RTP support. I have some questions: > > 1. Is the client (stream receiver) required to use gstreamer as backend > in order to correctly receive and decode the video stream from the > server, which is written on the basis of gstreamer? Or any client that > implements RTP protocol and the corresponding decoder can work with the > server? Any RTP client will work; you can generate an SDP for clients such as vlc. Take a look at the samples here: http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/tests/examples/rtp Nicola > > 2. If the audio stream and video stream go through two separate > gstreamer streams, is it possible to synchronize them at the receiver > client? > > > Regards, > Jie From gibrovacco at gmail.com Sat Sep 18 18:42:41 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Sat, 18 Sep 2010 19:42:41 +0300 Subject: [gst-devel] Question about stream media server based on gstreamer In-Reply-To: <1284775139.7327.9.camel@UT43> References: <1284775139.7327.9.camel@UT43> Message-ID: Hi, On Sat, Sep 18, 2010 at 4:58 AM, Jie Jiang wrote: > Hi, > > I'm new to gstreamer. Now I'm considering writing a stream video/audio > server with RTP support. I have some questions: > > my 0.05€ 
here: I warmly suggest that, instead of starting a new project, you evaluate the possibility of contributing to an already running one: http://cgit.freedesktop.org/gstreamer/gst-rtsp-server/ It has the double advantage of saving you the time needed to "invent" a session initiation protocol and of enhancing an already pretty good project (imho). Regards 1. Is the client (stream receiver) required to use gstreamer as backend > in order to correctly receive and decode the video stream from the > server, which is written on the basis of gstreamer? Or any client that > implements RTP protocol and the corresponding decoder can work with the > server? > > 2. If the audio stream and video stream go through two separate > gstreamer streams, is it possible to synchronize them at the receiver > client? > > > Regards, > Jie From gibrovacco at gmail.com Sat Sep 18 18:53:14 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Sat, 18 Sep 2010 19:53:14 +0300 Subject: [gst-devel] some issues when trying to save content to disk during http progressive downloaded In-Reply-To: <5D8008F58939784290FAB48F549751982C55A6F5FF@shsmsx502.ccr.corp.intel.com> References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> <4C8A6A62.5000808@hora-obscura.de> <5D8008F58939784290FAB48F549751982C55A6F5FF@shsmsx502.ccr.corp.intel.com> Message-ID: Hi, 2010/9/13 Zhao, Halley > Thanks Stefan. 
> After adding a 'queue' after 'souphttpsrc' and using 'decodebin2', I still got the > same result. > > I think the possible solution is to enhance souphttpsrc to save content to > disk after some refactoring, because souphttpsrc does some seeking following the > commands of the parser. > > Attached mp4.log is the log of souphttpsrc; it seeks to the end of the mp4 > file at the beginning of playback. Finally, the tail of the original mp4 file is > missing in the downloaded mp4 file. > It looks like you're not re-muxing the content. Are you simply storing the raw mp4 data to a file or are you using a muxer before the filesink? What does mp4info tell about your output file? You may try and recover the saved files with mp4mux using the option "moov-recovery-file". What happens if you transmux the files using it? Regards > halley at halley-lucid:~/swap/streaming/mp4$ ls -l > total 5216 > -rwxr--r-- 1 halley halley 1776915 2010-09-08 23:08 download.mp4 > -rw-r--r-- 1 halley halley 1773281 2010-09-08 18:15 original.mp4 > > -----Original Message----- > From: Stefan Kost [mailto:ensonic at hora-obscura.de] > Sent: 2010-09-11 1:27 > To: Discussion of the development of GStreamer > Cc: Zhao, Halley > Subject: Re: [gst-devel] some issues when trying to save content to disk > during http progressive downloaded > > On 08.09.2010 04:45, Zhao, Halley wrote: > > During playback of progressive content, I tried to save the content to > disk as well. > > > > But the result is strange: > > > > Some contents are saved correctly; some contents are saved but can't play back > > again; some contents even can't play back during progressive download. > > > > > > > > ## most ogg contents work well, the saved contents can play back again > > > > gst-launch-0.10 souphttpsrc > > location=http://10.238.37.11/share/media/video/test.ogv ! tee name=t ! decodebin > > ! ffmpegcolorspace ! xvimagesink t. ! queue ! 
filesink location=test.ogv > > > > ## some mp4 saved contents can't play back again, the saved contents differ from > the original one; even the following test.mp4 and test2.mp4 are different > > gst-launch-0.10 souphttpsrc location=http:// > 10.238.37.11/share/media/video/test.mp4 ! tee name=t ! decodebin ! > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 > > gst-launch-0.10 souphttpsrc location=http:// > 10.238.37.11/share/media/video/test.mp4 ! filesink > location=/home/halley/swap/streaming/test2.mp4 > > At first use decodebin2! > > If the http source is seekable, the muxer in decodebin will do pull. You > could try: > > gst-launch-0.10 souphttpsrc location=http://10.238.37.11/share/media/video/test.mp4 ! queue ! tee name=t ! > decodebin2 ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink > location=test.mp4 > > Stefan > > > > > ## some wmv contents even can't play back during progressive download (though > some saved wmv contents can play back again) > > gst-launch-0.10 -v -v souphttpsrc location=http:// > 10.238.37.11/share/media/test.wmv ! tee name=t ! queue ! decodebin ! > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.wmv > > > > thanks in advance for your help. > > > > > > *ZHAO, Halley (Aihua)* > > Email: halley.zhao at intel.com > > Tel: +86(21)61166476 iNet: 8821-6476 > > SSG/OTC/Moblin 3W038 Pole: F4 
From gibrovacco at gmail.com Sat Sep 18 18:56:22 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Sat, 18 Sep 2010 19:56:22 +0300 Subject: [gst-devel] [gst-embedded] gstreamer dma buffer & management In-Reply-To: References: <9383297B60843144AC17878B479520F1127143AF53@SJEXCHCCR01.corp.ad.broadcom.com> <9383297B60843144AC17878B479520F1127143BAF4@SJEXCHCCR01.corp.ad.broadcom.com> Message-ID: Hi, yes, I DO hate the default gmail settings.. ---------- Forwarded message ---------- From: Marco Ballesio Date: Sat, Sep 18, 2010 at 4:52 PM Subject: Re: [gst-embedded] gstreamer dma buffer & management To: Feng Ye Hi, On Fri, Sep 17, 2010 at 2:49 AM, Feng Ye wrote: > Hello there, > > I am working on a video dec plugin (based on a hardware decoder). I plan to use > filesrc to read a raw H264 file and feed it to my plugin. > It's generally better to use a container format rather than a raw h264 byte-stream. If a container is not an option for your case, maybe you could use at least an annex-B compliant stream. This makes it easier to identify NAL boundaries. It should anyway be possible to use even an unformatted NAL sequence (see below).
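As a concrete (untested) sketch of the parsed-input case: with an annex-B byte-stream, putting a parser in front of the decoder hands it one NAL-aligned buffer at a time. Here "mydec" is only a placeholder for your hardware decoder element:

```shell
# Untested sketch: read a raw annex-B H.264 byte-stream, let h264parse cut it
# at NAL boundaries, and push the aligned buffers into a hypothetical hardware
# decoder element named "mydec".
gst-launch-0.10 filesrc location=stream.h264 ! h264parse ! mydec ! xvimagesink
```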
> The buffers from the plugin's sink pad are the source data for decoding so they > need to be DMA-able. Is there a way to control how the buffers are allocated? > Since they are allocated in filesrc, is that not possible? Looks like I > will need to allocate my own buffer in my plugin and then copy them over? > If you're not using any container format, the source has no way to pass you buffers aligned with NAL boundaries, so you definitely need an adapter somewhere (see below). > > Also, a video frame may contain fewer bytes than what's in the buffer; what > do I do with the remaining bytes? > From the ffmpeg example (they are software based so the first dma question > does not exist), they create a sub-buffer of the remaining bytes and then > do a buffer join. These are time consuming I think. Yes, but I guess you could afford it for compressed data. The overhead depends, of course, on the bandwidth you're using. > I wonder if there is another, faster solution? One thing I think might help > is to have another plugin in between, which parses the file and only gives one > frame's worth of data to my plugin. But I am not sure if this kind of plugin > exists. > You can use h264parse for this. It will identify NAL boundaries and pass them to your plugin in separate buffers. Btw everything has a price, and in this case it's the penalty described a few rows above. Regards. > > > Thanks, > Feng -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From marc.leeman at gmail.com Sat Sep 18 18:51:33 2010 From: marc.leeman at gmail.com (Marc Leeman) Date: Sat, 18 Sep 2010 18:51:33 +0200 Subject: [gst-devel] rtpbin + mpegtsmux In-Reply-To: <4C93F361.8080206@mlbassoc.com> References: <4C93F361.8080206@mlbassoc.com> Message-ID: <20100918165133.GU25114@crichton.homelinux.org> > Sadly, even though the server is pumping out data, I don't see > anything at the client (my xvimagesink window never opens up) > > Any ideas what I'm doing wrong or how to diagnose this? You're probably not sending the data the decoder needs to start decoding (NAL 7/8). Try starting the decoder before the sender and see if you get video decoded. If so, add a recent h264parse in the chain and it should remultiplex the correct data into the stream for the decoder to start decoding. Why are you first putting your data into TS and then again in RTP? -- greetz, marc Measure twice, cut once. crichton 2.6.26 #1 PREEMPT Tue Jul 29 21:17:59 CDT 2008 GNU/Linux From emmanuel at gnome.org Sun Sep 19 22:54:30 2010 From: emmanuel at gnome.org (Emmanuel Pacaud) Date: Sun, 19 Sep 2010 22:54:30 +0200 Subject: [gst-devel] Looking for gstreamer based multi-stream video display and recorder Message-ID: <1284929670.1941.20.camel@lappc-p348> Hi, I'm looking for software that can display several video streams on one screen, with recording capabilities.
From gustavo.orrillo at gmail.com Sun Sep 19 23:19:31 2010 From: gustavo.orrillo at gmail.com (Gustavo Orrillo) Date: Sun, 19 Sep 2010 18:19:31 -0300 Subject: [gst-devel] Looking for gstreamer based multi-stream video display and recorder In-Reply-To: <1284929670.1941.20.camel@lappc-p348> References: <1284929670.1941.20.camel@lappc-p348> Message-ID: Hi Emmanuel, Moldeo could do it, and it is based on gstreamer. As of this moment it does not support recording capabilities. Check the website (it's in Spanish): http://www.moldeo.org/ Cheers, 2010/9/19 Emmanuel Pacaud > Hi, > > I'm looking for a software which could display several video streams on > one screen, with recording capabilities. > > Something similar to multieye-net or streampix5, but free software and > based on gstreamer. > > http://www.artec.de/en/products/multieye/products/multieye-software.html > http://www.norpix.com/products/streampix5/streampix5.php > > Does such a project exist ? > > Regards, > > Emmanuel. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From halley.zhao at intel.com Mon Sep 20 03:30:57 2010 From: halley.zhao at intel.com (Zhao, Halley) Date: Mon, 20 Sep 2010 09:30:57 +0800 Subject: [gst-devel] some issues when trying to save content to disk during http progressive downloaded In-Reply-To: References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> <4C8A6A62.5000808@hora-obscura.de> <5D8008F58939784290FAB48F549751982C55A6F5FF@shsmsx502.ccr.corp.intel.com> Message-ID: <5D8008F58939784290FAB48F549751982C55D64720@shsmsx502.ccr.corp.intel.com> Your suggestion may be helpful, but I expect a solution that needn't care about demux/mux: all this data passes through souphttpsrc, so saving the data from souphttpsrc shouldn't need to care about mux/demux. From: Marco Ballesio [mailto:gibrovacco at gmail.com] Sent: Sunday, September 19, 2010 12:53 AM To: Discussion of the development of GStreamer Subject: Re: [gst-devel] some issues when trying to save content to disk during http progressive downloaded Hi, 2010/9/13 Zhao, Halley > Thanks Stefan. After adding a 'queue' after 'souphttpsrc' and using 'decodebin2', I still got the same result. I think the possible solution is to enhance souphttpsrc to save content to disk after some refactoring, because souphttpsrc does some seeking following the commands of the parser. Attached mp4.log is the log of souphttpsrc; it seeks to the end of the mp4 file at the beginning of playback. Finally, the tail of the original mp4 file is missing in the downloaded mp4 file. It looks like you're not re-muxing the content. Are you simply storing the raw mp4 data to a file or are you using a muxer before the filesink? What does mp4info tell about your output file? You may try and recover the saved files with mp4mux using the option "moov-recovery-file". What happens if you transmux the files using it? 
Regards From andy at bluewire.net.nz Mon Sep 20 06:10:27 2010 From: andy at bluewire.net.nz (Andy Savage) Date: Mon, 20 Sep 2010 12:10:27 +0800 Subject: [gst-devel] Help with complicated RTPBin pipeline Message-ID: Hi there, I need a little bit of help with a long gstreamer pipeline and sending/receiving video/audio. I have two pipelines which work successfully... 
*Sending Video* gst-launch gstrtpbin name=rtpbin latency=0 ksvideosrc device-index=0 typefind=true ! typefind ! ffmpegcolorspace ! videoscale ! video/x-raw-yuv, width=640, height=480 ! videorate ! video/x-raw-yuv, framerate=15/1 ! ffenc_mpeg4 ! rtpmp4vpay send-config=true ! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! udpsink port=5502 host=192.168.10.175 rtpbin.send_rtcp_src_0 ! udpsink port=5510 host=192.168.10.175 sync=false async=false udpsrc port=5510 ! rtpbin.recv_rtcp_sink_0 autoaudiosrc samplesperbuffer=1000 ! alawenc ! rtppcmapay ! rtpbin.send_rtp_sink_1 rtpbin.send_rtp_src_1 ! udpsink port=5504 host=192.168.10.175 rtpbin.send_rtcp_src_1 ! udpsink port=5512 host=192.168.10.175 sync=false async=false udpsrc port=5512 ! rtpbin.recv_rtcp_sink_1 *Receiving Video* gst-launch gstrtpbin name=rtpbin2 latency=0 udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1" port=5502 ! rtpbin2.recv_rtp_sink_0 rtpbin2. ! rtpmp4vdepay ! ffdec_mpeg4 ! ffmpegcolorspace ! autovideosink udpsrc port=5510 ! rtpbin2.recv_rtcp_sink_0 rtpbin2.send_rtcp_src_0 ! udpsink host=192.168.10.175 port=5510 sync=false async=false udpsrc caps=application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)PCMA port=5504 ! rtpbin2.recv_rtp_sink_1 rtpbin2. ! rtppcmadepay ! alawdec ! autoaudiosink buffer-time=10000 udpsrc port=5512 ! rtpbin2.recv_rtcp_sink_1 rtpbin2.send_rtcp_src_1 ! udpsink host=192.168.10.175 port=5512 sync=false async=false I wanted to combine them so that I could have sending/receiving in one pipeline. Is this possible? Right now I simply launch two gstreamer instances (one for sending, one for receiving). The problem is that in the receiving instance I would like to access the local camera and place it as a local preview window into the receiving video. This works fine but the problem is, you can't have two gstreamer instances accessing the camera. 
I thought if I could combine the above into one pipeline I could use tee and split them then there would only ever be one instance of the camera used at once. *If I simply combine the above two into something like:* gst-launch \ gstrtpbin name=rtpbin latency=0 ksvideosrc device-index=0 typefind=true ! typefind ! ffmpegcolorspace ! videoscale ! video/x-raw-yuv, width=640, height=480 ! videorate ! video/x-raw-yuv, framerate=15/1 ! ffenc_mpeg4 ! rtpmp4vpay send-config=true ! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! udpsink port=5502 host=192.168.10.175 rtpbin.send_rtcp_src_0 ! udpsink port=5510 host=192.168.10.175 sync=false async=false udpsrc port=5510 ! rtpbin.recv_rtcp_sink_0 autoaudiosrc samplesperbuffer=1000 ! alawenc ! rtppcmapay ! rtpbin.send_rtp_sink_1 rtpbin.send_rtp_src_1 ! udpsink port=5504 host=192.168.10.175 rtpbin.send_rtcp_src_1 ! udpsink port=5512 host=192.168.10.175 sync=false async=false udpsrc port=5512 ! rtpbin.recv_rtcp_sink_1 \ gstrtpbin name=rtpbin2 latency=0 udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1" port=5502 ! rtpbin2.recv_rtp_sink_0 rtpbin2. ! rtpmp4vdepay ! ffdec_mpeg4 ! ffmpegcolorspace ! autovideosink udpsrc port=5510 ! rtpbin2.recv_rtcp_sink_0 rtpbin2.send_rtcp_src_0 ! udpsink host=192.168.10.175 port=5510 sync=false async=false udpsrc caps=application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)PCMA port=5504 ! rtpbin2.recv_rtp_sink_1 rtpbin2. ! rtppcmadepay ! alawdec ! autoaudiosink buffer-time=10000 udpsrc port=5512 ! rtpbin2.recv_rtcp_sink_1 rtpbin2.send_rtcp_src_1 ! udpsink host=192.168.10.175 port=5512 sync=false async=false It doesn't appear to work... gstreamer starts successfully but I can't recieve video on either end Has this been attempted before? Is there a best way to do it? (I presume people normally use two pipelines?) 
Kind regards, Andy Savage From gibrovacco at gmail.com Mon Sep 20 07:54:09 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Mon, 20 Sep 2010 08:54:09 +0300 Subject: [gst-devel] some issues when trying to save content to disk during http progressive downloaded In-Reply-To: <5D8008F58939784290FAB48F549751982C55D64720@shsmsx502.ccr.corp.intel.com> References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> <4C8A6A62.5000808@hora-obscura.de> <5D8008F58939784290FAB48F549751982C55A6F5FF@shsmsx502.ccr.corp.intel.com> <5D8008F58939784290FAB48F549751982C55D64720@shsmsx502.ccr.corp.intel.com> Message-ID: Hi, 2010/9/20 Zhao, Halley > Your suggestion may be helpful, > > But I expect a solution that needn't care about demux/mux, because all these > data are passed through souphttpsrc; saving the data from souphttpsrc > shouldn't care about mux/demux. > As you wrote: "it seek to the end of the mp4 file at the beginning of playback." the seek operation is performed by the demuxer (qtdemux), which identifies essential meta-data present at the end of the file. This data will not be transferred again at the end of the playback, so *in this case* you can't consider souphttpsrc as just a mere data pipe through which you get the complete clip. That said, you have many ways to address this: - You can (try and) use the "moov-recovery-file" transmuxing the file after having saved it. It will restore the missing meta-info. - You can re-mux on-the-fly the file while you're getting it from souphttpsrc. Again, it will rebuild the lost meta-infos. - You can use only progressive-download compliant files: they will have all the meta-information stored at the beginning and no seek will be needed.
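As an illustration of the third option above ("progressive-download compliant" files), a correctly downloaded clip can be re-muxed locally so the moov atom lands at the front. This is only an untested sketch using the 0.10-era elements from the thread; the faststart property on mp4mux may not exist in older gst-plugins-good builds:

```shell
# Untested sketch: re-mux a complete local file (filesrc is seekable,
# so qtdemux can operate in pull mode) and ask mp4mux to write the
# moov at the beginning, making the result progressive-download friendly.
gst-launch-0.10 filesrc location=download.mp4 \
  ! qtdemux name=d d. ! queue ! mp4mux faststart=true \
  ! filesink location=download-fixed.mp4
```

Note this only helps files whose meta-data is intact; a truncated download with no moov at all is the case the moov-recovery-file option is meant for.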
Regards > > > > > *From:* Marco Ballesio [mailto:gibrovacco at gmail.com] > *Sent:* Sunday, September 19, 2010 12:53 AM > > *To:* Discussion of the development of GStreamer > *Subject:* Re: [gst-devel] some issues when trying to save content to disk > during http progressive downloaded > > > > Hi, > > 2010/9/13 Zhao, Halley > > Thanks Stefan. > After add a 'queue' after 'souphttpsrc' and use 'decodebin2'; I still got > same result. > > I think the possible solution is to enhance souphttpsrc to save content to > disk after some refactor, because souphttpsrc does some seek following the > command of parser. > > Attached mp4.log is the log of souphttpsrc, it seek to the end of the mp4 > file at the beginning of playback. Finally, tail of the original mp4 file is > missing in downloaded mp4 file. > > > It looks like you're not re-muxing the content. Are you simply storing the > raw mp4 data to a file or are you using a muxer before the filesink? What > does mp4info tell about your output file? > > You may try and recover the saved files with mp4mux using the option > "moov-recovery-file". What happens if you transmux the files using it? > > Regards > > > halley at halley-lucid:~/swap/streaming/mp4$ ls -l > total 5216 > -rwxr--r-- 1 halley halley 1776915 2010-09-08 23:08 download.mp4 > -rw-r--r-- 1 halley halley 1773281 2010-09-08 18:15 original.mp4 > > > -----Original Message----- > From: Stefan Kost [mailto:ensonic at hora-obscura.de] > Sent: September 11, 2010 1:27 > To: Discussion of the development of GStreamer > > Cc: Zhao, Halley > Subject: Re: [gst-devel] some issues when trying to save content to disk > during http progressive downloaded > > On 08.09.2010 04:45, Zhao, Halley wrote: > > During playback of progressive content, I tried to save the content to > disk as well. > > > > But the result is strange: > > > > Some contents are saved correctly, some contents are saved but can't > playback > > again; some contents even can't playback during progressive downloaded. > > > > > > > > ## most ogg contents work well, the saved contents can playback again > > > > gst-launch-0.10 souphttpsrc > > location=http://10.238.37.11/share/media/video/test.ogv ! tee name=t ! > decodebin > > ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.ogv > > > > > > > > ## some mp4 saved contents can't playback again, the saved contents > differ from > > the original one; even the following test.mp4 and test2.mp4 are different > > > > gst-launch-0.10 souphttpsrc location=http:// > > 10.238.37.11/share/media/video/test.mp4 ! tee name=t ! decodebin ! > > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 > > > > gst-launch-0.10 souphttpsrc location=http:// > > 10.238.37.11/share/media/video/test.mp4 ! filesink > > location=/home/halley/swap/streaming/test2.mp4 > > > > At first use decodebin2! > > If the http source is seekable, the demuxer in decodebin will do pull. You > could try: > > gst-launch-0.10 souphttpsrc > location=http://10.238.37.11/share/media/video/test.mp4 ! queue ! tee > name=t ! > decodebin2 ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink > location=test.mp4 > > Stefan > > > > > > > ## some wmv contents even can't playback during progressive downloaded > (though > > some saved wmv contents can playback again) > > > > gst-launch-0.10 -v -v souphttpsrc location=http:// > > 10.238.37.11/share/media/test.wmv ! tee name=t ! queue ! decodebin ! > > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.wmv > > > > > > > > thanks in advance for your help. > > > > > > > > > > > > *ZHAO, Halley (Aihua)* > > > > Email: halley.zhao at intel.com > > > > Tel: +86(21)61166476 iNet: 8821-6476 > > > > SSG/OTC/Moblin 3W038 Pole: F4 > > > > > > > > > > > > > ------------------------------------------------------------------------------ > > This SF.net Dev2Dev email is sponsored by: > > > > Show off your parallel programming skills. > > Enter the Intel(R) Threading Challenge 2010.
> > http://p.sf.net/sfu/intel-thread-sfd > > > > > > > > _______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.sourceforge.net > > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing > http://p.sf.net/sfu/novell-sfdev2dev > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gibrovacco at gmail.com Mon Sep 20 09:47:16 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Mon, 20 Sep 2010 10:47:16 +0300 Subject: [gst-devel] Help with complicated RTPBin pipeline In-Reply-To: References: Message-ID: Hi, On Mon, Sep 20, 2010 at 7:10 AM, Andy Savage wrote: > Hi there, > > I need a little bit of help with a long gstreamer pipeline and > sending/receiving video/audio. I have two pipelines which work > successfully... > > *Sending Video* > gst-launch gstrtpbin name=rtpbin latency=0 ksvideosrc device-index=0 > typefind=true ! typefind ! ffmpegcolorspace ! videoscale ! video/x-raw-yuv, > width=640, height=480 ! videorate ! video/x-raw-yuv, framerate=15/1 ! 
> ffenc_mpeg4 ! rtpmp4vpay send-config=true ! rtpbin.send_rtp_sink_0 > rtpbin.send_rtp_src_0 ! udpsink port=5502 host=192.168.10.175 > rtpbin.send_rtcp_src_0 ! udpsink port=5510 host=192.168.10.175 sync=false > async=false udpsrc port=5510 ! rtpbin.recv_rtcp_sink_0 autoaudiosrc > samplesperbuffer=1000 ! alawenc ! rtppcmapay ! rtpbin.send_rtp_sink_1 > rtpbin.send_rtp_src_1 ! udpsink port=5504 host=192.168.10.175 > rtpbin.send_rtcp_src_1 ! udpsink port=5512 host=192.168.10.175 sync=false > async=false udpsrc port=5512 ! rtpbin.recv_rtcp_sink_1 > > *Receiving Video* > gst-launch gstrtpbin name=rtpbin2 latency=0 udpsrc caps="application/x-rtp, > media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, > profile-level-id=(string)1" port=5502 ! rtpbin2.recv_rtp_sink_0 rtpbin2. ! > rtpmp4vdepay ! ffdec_mpeg4 ! ffmpegcolorspace ! autovideosink udpsrc > port=5510 ! rtpbin2.recv_rtcp_sink_0 rtpbin2.send_rtcp_src_0 ! udpsink > host=192.168.10.175 port=5510 sync=false async=false udpsrc > caps=application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)PCMA > port=5504 ! rtpbin2.recv_rtp_sink_1 rtpbin2. ! rtppcmadepay ! alawdec ! > autoaudiosink buffer-time=10000 udpsrc port=5512 ! rtpbin2.recv_rtcp_sink_1 > rtpbin2.send_rtcp_src_1 ! udpsink host=192.168.10.175 port=5512 sync=false > async=false > > I wanted to combine them so that I could have sending/receiving in one > pipeline. Is this possible? > I've tried something like that a few months ago and it didn't work (even if using an application instead of gst-launch). As I didn't have much time to dig I just continued using two separate pipelines. > > Right now I simply launch two gstreamer instances (one for sending, one for > receiving). The problem is that in the receiving instance I would like to > access the local camera and place it as a local preview window into the > receiving video. 
This works fine but the problem is, you can't have two > gstreamer instances accessing the camera. I thought if I could combine the > above into one pipeline I could use tee and split them then there would only > ever be one instance of the camera used at once. > Can't you put a tee in the sending pipeline and attach for instance an xvimagesink to it? In case you're not sending data (no rtp involved) it's even simpler, as you just need a kind-of v4lsrc ! xvimagesink pipe. Regards > > *If I simply combine the above two into something like:* > gst-launch \ > gstrtpbin name=rtpbin latency=0 ksvideosrc device-index=0 typefind=true ! > typefind ! ffmpegcolorspace ! videoscale ! video/x-raw-yuv, width=640, > height=480 ! videorate ! video/x-raw-yuv, framerate=15/1 ! ffenc_mpeg4 ! > rtpmp4vpay send-config=true ! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! > udpsink port=5502 host=192.168.10.175 rtpbin.send_rtcp_src_0 ! udpsink > port=5510 host=192.168.10.175 sync=false async=false udpsrc port=5510 ! > rtpbin.recv_rtcp_sink_0 autoaudiosrc samplesperbuffer=1000 ! alawenc ! > rtppcmapay ! rtpbin.send_rtp_sink_1 rtpbin.send_rtp_src_1 ! udpsink > port=5504 host=192.168.10.175 rtpbin.send_rtcp_src_1 ! udpsink port=5512 > host=192.168.10.175 sync=false async=false udpsrc port=5512 ! > rtpbin.recv_rtcp_sink_1 \ > gstrtpbin name=rtpbin2 latency=0 udpsrc caps="application/x-rtp, > media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, > profile-level-id=(string)1" port=5502 ! rtpbin2.recv_rtp_sink_0 rtpbin2. ! > rtpmp4vdepay ! ffdec_mpeg4 ! ffmpegcolorspace ! autovideosink udpsrc > port=5510 ! rtpbin2.recv_rtcp_sink_0 rtpbin2.send_rtcp_src_0 ! udpsink > host=192.168.10.175 port=5510 sync=false async=false udpsrc > caps=application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)PCMA > port=5504 ! rtpbin2.recv_rtp_sink_1 rtpbin2. ! rtppcmadepay ! alawdec ! > autoaudiosink buffer-time=10000 udpsrc port=5512 ! 
rtpbin2.recv_rtcp_sink_1 > rtpbin2.send_rtcp_src_1 ! udpsink host=192.168.10.175 port=5512 sync=false > async=false > > It doesn't appear to work... gstreamer starts successfully but I can't > recieve video on either end > > Has this been attempted before? Is there a best way to do it? (I presume > people normally use two pipelines?) > > Kind regards, > Andy Savage > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From wl2776 at gmail.com Mon Sep 20 13:00:08 2010 From: wl2776 at gmail.com (wl2776) Date: Mon, 20 Sep 2010 04:00:08 -0700 (PDT) Subject: [gst-devel] Curl-based source element. Message-ID: <1284980408467-2546784.post@n4.nabble.com> Hi all. I've seen several posts, related to this work, containing a link to the sources, but it link seemed to be generated using dyndns, and is invalid now. I don't have much time to search the author and study his code, so, I'm going to create one more CURL-based source element, primarily, because I need an FTP file source, and Windows is missing one. So, is there any global contradictions or technically insuperable obstacles? -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Curl-based-source-element-tp2546784p2546784.html Sent from the GStreamer-devel mailing list archive at Nabble.com. 
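Marco's tee suggestion above (a single camera instance feeding both a local preview and the RTP sender) might look roughly like the following. This is an untested sketch that reuses the element names, host and ports from Andy's pipelines, with the mandatory queue on each tee branch:

```shell
# Untested sketch: tee the camera before scaling/encoding so one
# pipeline previews locally AND sends RTP. Every tee branch needs its
# own queue; host/ports and elements are taken from the thread.
gst-launch gstrtpbin name=rtpbin latency=0 \
  ksvideosrc device-index=0 ! ffmpegcolorspace ! tee name=camtee \
  camtee. ! queue ! autovideosink sync=false \
  camtee. ! queue ! videoscale ! video/x-raw-yuv, width=640, height=480 \
    ! videorate ! video/x-raw-yuv, framerate=15/1 \
    ! ffenc_mpeg4 ! rtpmp4vpay send-config=true ! rtpbin.send_rtp_sink_0 \
  rtpbin.send_rtp_src_0 ! udpsink port=5502 host=192.168.10.175
```

The receiving half could then stay in its own process, since only the sender needs the camera.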
From gary at mlbassoc.com Mon Sep 20 13:36:14 2010 From: gary at mlbassoc.com (Gary Thomas) Date: Mon, 20 Sep 2010 05:36:14 -0600 Subject: [gst-devel] rtpbin + mpegtsmux In-Reply-To: <20100918165133.GU25114@crichton.homelinux.org> References: <4C93F361.8080206@mlbassoc.com> <20100918165133.GU25114@crichton.homelinux.org> Message-ID: <4C97472E.9060205@mlbassoc.com> On 09/18/2010 10:51 AM, Marc Leeman wrote: >> Sadly, even though the server is pumping out data, I don't see >> anything at the client (my xvimagesink window never opens up) >> >> Any ideas what I'm doing wrong or how to diagnose this? > > You're probably not sending the data for the decoder to start decoding > (NAL 7/8). Try starting the decoder before the sender and see if you get > video decoded. > > If so, add a recent h264parser in the chain and it should remultiplex > the correct data into the stream for the decoder to start decoding. I am already using the very latest release of everything. gstreamer 0.10.30 gst-plugins-good 0.10.25 gst-plugins-base 0.10.30 gst-plugins-bad 0.10.20 gst-plugins-ugly 0.10.16 > Why are you first putting your data into TS and then again in RTP? Because that's what the customer wants :-) My understanding is that TS is a container that will eventually contain both video and audio and is not network worthy by itself, hence the RTP (RealTime [network] protocol) -- ------------------------------------------------------------ Gary Thomas | Consulting for the MLB Associates | Embedded world ------------------------------------------------------------ From gary at mlbassoc.com Mon Sep 20 15:51:17 2010 From: gary at mlbassoc.com (Gary Thomas) Date: Mon, 20 Sep 2010 07:51:17 -0600 Subject: [gst-devel] rtpbin + mpegtsmux In-Reply-To: <4C97472E.9060205@mlbassoc.com> References: <4C93F361.8080206@mlbassoc.com> <20100918165133.GU25114@crichton.homelinux.org> <4C97472E.9060205@mlbassoc.com> Message-ID: <4C9766D5.9020908@mlbassoc.com> On 09/20/2010 05:36 AM, Gary Thomas wrote: > On 09/18/2010 10:51 AM, Marc Leeman wrote: >>> Sadly, even though the server is pumping out data, I don't see >>> anything at the client (my xvimagesink window never opens up) >>> >>> Any ideas what I'm doing wrong or how to diagnose this? >> >> You're probably not sending the data for the decoder to start decoding >> (NAL 7/8). Try starting the decoder before the sender and see if you get >> video decoded. >> >> If so, add a recent h264parser in the chain and it should remultiplex >> the correct data into the stream for the decoder to start decoding. > > I am already using the very latest release of everything. > gstreamer 0.10.30 > gst-plugins-good 0.10.25 > gst-plugins-base 0.10.30 > gst-plugins-bad 0.10.20 > gst-plugins-ugly 0.10.16 > >> Why are you first putting your data into TS and then again in RTP? > > Because that's what the customer wants :-) > > My understanding is that TS is a container that will eventually contain > both video and audio and is not network worthy by itself, hence the RTP > (RealTime [network] protocol) > That said, I've also tried this with a raw H264 stream and the same thing happens.
As I've pointed out, these pipelines do not even work reliably on my desktop system all the time. Using just the raw H264 stream, I stream out and in on my desktop, using the local network (127.0.0.1) While it may work, even for a while, after some time the receiver no longer gets new frames (motion stops). Is there some way to get useful debug information on this? I don't see any messages about the RTP stream until level 4 and then it's too low level to interpret easily. I'd like to know when packets come in, how they are parsed, passed on, etc, where the keyframes are, etc. This sort of data doesn't seem to show up in the debug data. -- ------------------------------------------------------------ Gary Thomas | Consulting for the MLB Associates | Embedded world ------------------------------------------------------------ From marc.leeman at gmail.com Mon Sep 20 16:07:22 2010 From: marc.leeman at gmail.com (Marc Leeman) Date: Mon, 20 Sep 2010 16:07:22 +0200 Subject: [gst-devel] rtpbin + mpegtsmux In-Reply-To: <4C9766D5.9020908@mlbassoc.com> References: <4C93F361.8080206@mlbassoc.com> <20100918165133.GU25114@crichton.homelinux.org> <4C97472E.9060205@mlbassoc.com> <4C9766D5.9020908@mlbassoc.com> Message-ID: <20100920140722.GD2176@crichton.homelinux.org> > > Because that's what the customer wants :-) Is this what the customer really wants (getting video reliably over the network) or is it what you've been told the customer wants :-) > > My understanding is that TS is a container that will eventually contain > > both video and audio and is not network worthy by itself, hence the RTP > > (RealTime [network] protocol) You might have some problems with the timestamps that are in the RTP header and those that are in TS. If only one is slightly off; you'll run into problems. > That said, I've also tried this with a raw H264 stream and the same > thing happens. > > As I've pointed out, these pipelines do not even work reliably on > my desktop system all the time. 
Using just the raw H264 stream, I > stream out and in on my desktop, using the local network (127.0.0.1) > While it may work, even for a while, after some time the receiver no > longer gets new frames (motion stops). > > Is there some way to get useful debug information on this? I don't > see any messages about the RTP stream until level 4 and then it's > too low level to interpret easily. I'd like to know when packets > come in, how they are parsed, passed on, etc, where the keyframes > are, etc. This sort of data doesn't seem to show up in the debug > data. We've been doing quite some h.264 streaming ourselves; and I can't say we've had many problems. There are a number of things you need to take into account. There are a number of encoders that only send NAL 7/8 once. You can configure the rtp payloader to re-multipex those on a regular interval in your stream. In your case; that will not work since it's not h264 you're sending, but h264/ts. That's why you can also instruct the h264 parser to re-include those settings into the stream (before you add the mpeg ts layer). Our focus is mainly on rtp/h264; but AFAIK; streaming is stable, both from hardware sources as from x264 sources (and file based). -- greetz, marc E = MC ** 2 +- 3db crichton 2.6.26 #1 PREEMPT Tue Jul 29 21:17:59 CDT 2008 GNU/Linux -------------- next part -------------- A non-text attachment was scrubbed... 
Name: not available Type: application/pgp-signature Size: 197 bytes Desc: Digital signature URL: From gary at mlbassoc.com Mon Sep 20 16:55:03 2010 From: gary at mlbassoc.com (Gary Thomas) Date: Mon, 20 Sep 2010 08:55:03 -0600 Subject: [gst-devel] rtpbin + mpegtsmux In-Reply-To: <20100920140722.GD2176@crichton.homelinux.org> References: <4C93F361.8080206@mlbassoc.com> <20100918165133.GU25114@crichton.homelinux.org> <4C97472E.9060205@mlbassoc.com> <4C9766D5.9020908@mlbassoc.com> <20100920140722.GD2176@crichton.homelinux.org> Message-ID: <4C9775C7.7030609@mlbassoc.com> On 09/20/2010 08:07 AM, Marc Leeman wrote: >>> Because that's what the customer wants :-) > > Is this what the customer really wants (getting video reliably over the > network) or is it what you've been told the customer wants :-) That's always the $64,000 question! >>> My understanding is that TS is a container that will eventually contain >>> both video and audio and is not network worthy by itself, hence the RTP >>> (RealTime [network] protocol) > > You might have some problems with the timestamps that are in the RTP > header and those that are in TS. If only one is slightly off; you'll run > into problems. Any hints on how to diagnose this? >> That said, I've also tried this with a raw H264 stream and the same >> thing happens. >> >> As I've pointed out, these pipelines do not even work reliably on >> my desktop system all the time. Using just the raw H264 stream, I >> stream out and in on my desktop, using the local network (127.0.0.1) >> While it may work, even for a while, after some time the receiver no >> longer gets new frames (motion stops). >> >> Is there some way to get useful debug information on this? I don't >> see any messages about the RTP stream until level 4 and then it's >> too low level to interpret easily. I'd like to know when packets >> come in, how they are parsed, passed on, etc, where the keyframes >> are, etc. This sort of data doesn't seem to show up in the debug >> data. 
> > We've been doing quite some h.264 streaming ourselves; and I can't say > we've had many problems. > > There are a number of things you need to take into account. > > There are a number of encoders that only send NAL 7/8 once. You can > configure the rtp payloader to re-multipex those on a regular interval > in your stream. > > In your case; that will not work since it's not h264 you're sending, but > h264/ts. That's why you can also instruct the h264 parser to re-include > those settings into the stream (before you add the mpeg ts layer). > > Our focus is mainly on rtp/h264; but AFAIK; streaming is stable, both > from hardware sources as from x264 sources (and file based). In this light, I'm going to concentrate a bit more on the pure H264 streaming. I have had a little success today, but it's still not great. I have to have the client started before the server starts, so I'm guessing that I have the "only one NAL" issue you mention above. How can I change the behaviour as you mention above? When it does run, I see messages like this on the client/receiver: WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: A lot of buffers are being dropped. Additional debug info: gstbasesink.c(2686): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: There may be a timestamping problem, or this computer is too slow. I tried adjusting the latency/jitterbuffer on the receiver, but it didn't seem to change much. Thanks for the help -- ------------------------------------------------------------ Gary Thomas | Consulting for the MLB Associates | Embedded world ------------------------------------------------------------ From marc.leeman at gmail.com Mon Sep 20 17:52:19 2010 From: marc.leeman at gmail.com (Marc Leeman) Date: Mon, 20 Sep 2010 17:52:19 +0200 Subject: [gst-devel] rtpbin + mpegtsmux Message-ID: <20100920155219.GF2176@crichton.homelinux.org> -- greetz, marc Chemistry professors never die, they just fail to react. 
crichton 2.6.26 #1 PREEMPT Tue Jul 29 21:17:59 CDT 2008 GNU/Linux -------------- next part -------------- An embedded message was scrubbed... From: Marc Leeman Subject: Re: [gst-devel] rtpbin + mpegtsmux Date: Mon, 20 Sep 2010 17:51:09 +0200 Size: 2811 URL: From ivan_zoli at fastwebnet.it Fri Sep 3 18:57:25 2010 From: ivan_zoli at fastwebnet.it (ivan zoli) Date: Fri, 03 Sep 2010 16:57:25 -0000 Subject: [gst-devel] Stream on iPad, iPhone, Android... Message-ID: > Stefan is correct that gcc auto-vectorization will not give any significant gain > in acceleration of the video encoders. Moreover, if you are thinking of encoding > on mobile devices, you need to spend several months writing/optimizing encoder > using NEON, and end of day you might end up with something which is not realtime > encoding. FFMPEG's usefulness is limited to Desktop PCs. You may contact some > codec company which provides encoder on iPhone or Android. That should help. But iPhone has built in its own AV stream client, for example it can connect to YouTube and watch a video... So it would be nice to make a server that can stream live video to those devices, without the need to cross-compile Gstreamer or build my own codec. I was not able to build a working pipeline to do this, only AV stream to VLC... Ivan From pedro.faria at grupofox.com.br Mon Sep 13 16:59:17 2010 From: pedro.faria at grupofox.com.br (Pedro.henrique) Date: Mon, 13 Sep 2010 07:59:17 -0700 (PDT) Subject: [gst-devel] STREAMING, CONVERT H264 Message-ID: <1284389957324-2537582.post@n4.nabble.com> Is it possible to send a video stream from the DNS address and convert it to H264 in the same pipeline? The image format of the camera is MJPEG. I tried something like this gst-launch souphttpsrc location = "192.9.0.154:9545: XCAM: 2"! jpecdec! ffmpegcolorspace! h264dec! autovideosink Something like that; if someone can help me, I'll be very grateful! -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/STREAMING-CONVERT-H264-tp2537582p2537582.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From pedro.faria at grupofox.com.br Mon Sep 13 23:17:46 2010 From: pedro.faria at grupofox.com.br (Pedro.henrique) Date: Mon, 13 Sep 2010 14:17:46 -0700 (PDT) Subject: [gst-devel] STREAMING, CONVERT H264 In-Reply-To: <1284389957324-2537582.post@n4.nabble.com> References: <1284389957324-2537582.post@n4.nabble.com> Message-ID: <1284412666836-2538090.post@n4.nabble.com> Someone Please ;) -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/STREAMING-CONVERT-H264-tp2537582p2538090.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From pedro.faria at grupofox.com.br Tue Sep 14 22:41:08 2010 From: pedro.faria at grupofox.com.br (Pedro.henrique) Date: Tue, 14 Sep 2010 13:41:08 -0700 (PDT) Subject: [gst-devel] VIDEO Internal Data Flow Error Message-ID: <1284496868018-2539637.post@n4.nabble.com> Hello! I'm trying to run two video cameras at the same time, but it is showing an error "Internal data flow error" reason not-linked (-1). Can anyone help me? I used these pipelines: gst-launch souphttpsrc location = "http://videotest:999/CAM:2! decodebin! jpegdec! ffmpegcolorspace souphttpsrc location =" http://videotest:999/CAM:2! decodebin! jpegdec! ffmpegcolorspace also tried ... souphttpsrc location = "http://videotest:999/CAM:2! decodebin! jpegdec! ffmpegcolorspace! dshowvideosink Building on the topic: the image format of the video cameras is MJPEG; is it possible to capture these images via a URL and convert to h264 in a single PIPELINE? Thanks in advance Sorry for any errors of English, I am Brazilian and I have no fluency in the language!
Hugs -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/VIDEO-Internal-Data-Flow-Error-tp2539637p2539637.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From pedro.faria at grupofox.com.br Tue Sep 14 22:43:54 2010 From: pedro.faria at grupofox.com.br (Pedro.henrique) Date: Tue, 14 Sep 2010 13:43:54 -0700 (PDT) Subject: [gst-devel] GStreamer + C++ Message-ID: <1284497034642-2539641.post@n4.nabble.com> Hi All ! I'm using GStreamer with Visual C + + 2010 Can anyone help me, I need to hide the command prompt, because when the User clicks the button to open the camera prompt opens on top of the image need to hide or minimize the prompt. Thanks in advance -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/GStreamer-C-tp2539641p2539641.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From pedro.faria at grupofox.com.br Wed Sep 15 19:49:51 2010 From: pedro.faria at grupofox.com.br (Pedro.henrique) Date: Wed, 15 Sep 2010 10:49:51 -0700 (PDT) Subject: [gst-devel] STREAMING, CONVERT H264 In-Reply-To: <1284389957324-2537582.post@n4.nabble.com> References: <1284389957324-2537582.post@n4.nabble.com> Message-ID: <1284572991658-2540925.post@n4.nabble.com> i tried also gst-launch location="http://dns.dvrdns.org:654/CAM:1 ! multipartdemux ! image/jpeg, width=640, height=480, framerate=25/1 ! jpegdec ! x264enc ! filesink location=c:/teste.h264 the error ERROR: From element / GstPipeline:pipeline0/GstX264Enc:x2640: Could not encode stream. Additional debug info: ..\..\..\..\..\Source\gst-plugins-ugly\ext\x264\gstx264enc.c(737) : gst_x264_enc_header_buf (): /GstPipeline:pipeline0/GstX264Enc:x264enc0: Unexpected x264 header. ERROR: pipeline doens't want to preroll. Seeting Pipeline to Null . . . Freeing Pipeline . . . Can help me? 
Thanks' -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/STREAMING-CONVERT-H264-tp2537582p2540925.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From pedro.faria at grupofox.com.br Wed Sep 15 21:52:37 2010 From: pedro.faria at grupofox.com.br (Pedro.henrique) Date: Wed, 15 Sep 2010 12:52:37 -0700 (PDT) Subject: [gst-devel] STREAMING, CONVERT H264 In-Reply-To: <1284389957324-2537582.post@n4.nabble.com> References: <1284389957324-2537582.post@n4.nabble.com> Message-ID: <1284580357905-2541116.post@n4.nabble.com> I also tried: gst-launch souphttpsrc location="http://" ! multipartdemux ! imagen/jpegdec, width=640, height=480, framerate=25/1 ! jpegdec ! x264enc ! fakesink and: gst-launch souphttpsrc location="http://" ! mpegtsdemux name=dmx dmx. ! queue2 max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! ffdec_h264 ! autovideosink ________________________________________________________________________ and: gst-launch souphttpsrc location="http://" ! mpegtsmux ! ffdec_h264 ! ffmpegcolorspace ! fakesink _________________________________________________ and: gst-launch souphttpsrc location="http://cybelar.dvrdns.org:9538/XCAM:3 ! mpegtsmux ! queue ! ffdec_h264 ! queue ! ffmpegcolorspace ! queue ! autovideosink ______________________________________________ which reports: Setting pipeline to PAUSED . . . Pipeline is PREROLLING . . . but the video window doesn't open :S -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/STREAMING-CONVERT-H264-tp2537582p2541116.html Sent from the GStreamer-devel mailing list archive at Nabble.com.
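The attempts in this thread fail for mostly mundane reasons: some pipelines omit the source element name entirely, the location quote is never closed, `imagen/jpegdec` is not a valid caps string, and mpegtsmux (a muxer) is used where mpegtsdemux (a demuxer) would be needed. A sketch only, with a placeholder URL, of an MJPEG-over-HTTP to H.264 conversion pipeline along the lines being attempted:

```shell
# Hypothetical sketch: fetch an MJPEG stream over HTTP and write it out as H.264.
# jpegdec produces raw YUV; ffmpegcolorspace converts it to the I420 layout
# that x264enc accepts.
gst-launch souphttpsrc location="http://example.com/CAM:1" ! \
  multipartdemux ! jpegdec ! ffmpegcolorspace ! \
  x264enc ! filesink location=teste.h264
```

Whether multipartdemux is the right demuxer depends on how the camera serves its MJPEG stream.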
From faruk.namli at gmail.com Thu Sep 16 10:22:26 2010 From: faruk.namli at gmail.com (frknml) Date: Thu, 16 Sep 2010 01:22:26 -0700 (PDT) Subject: [gst-devel] simultaneously showing and recording MPEG-2 video Message-ID: <1284625346611-2541735.post@n4.nabble.com> Hi everyone; I'm very new to GStreamer and I'm developing a multimedia project. My first aim is to show a video from my local file system and at the same time record this video as a second copy of the original. I can show the video and I can record it individually, but not simultaneously. I couldn't find enough information in the GStreamer Application Development Manual from gstreamer.net to solve my problem. If you have any document, ebook or example, please share it with me :) because I couldn't find any useful resource about GStreamer. Faruk -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/simultaneously-showing-and-recording-MPEG-2-video-tp2541735p2541735.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From dotsony at triplehelix.org Fri Sep 17 22:51:32 2010 From: dotsony at triplehelix.org (Brandon Lewis) Date: Fri, 17 Sep 2010 13:51:32 -0700 Subject: [gst-devel] Pitivi and tests In-Reply-To: <4C936B92.9020207@inattendu.org> References: <4C936B92.9020207@inattendu.org> Message-ID: <1284756692.2233.0.camel@kermit> On Fri, 2010-09-17 at 17:22 +0400, Nicolas Bertrand wrote: > Hi > I got the latest version of PiTiVi from Git and did: > ./autogen.sh > make > > bin/pitivi works like a charm > > But I would like to run the tests under the tests directory. It > doesn't work due to a dependency on pitivi. > > So how can I run them? Do I have to install pitivi (make install)?
just run "make check" after running ./autogen.sh && make > > Thanks > Nico > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From umakantgoyal1 at gmail.com Sat Sep 18 07:53:49 2010 From: umakantgoyal1 at gmail.com (qwerty-1) Date: Fri, 17 Sep 2010 22:53:49 -0700 (PDT) Subject: [gst-devel] Some basic queries Message-ID: <1284789229601-2544760.post@n4.nabble.com> Hi All, I have a couple of queries: 1. Can I have a single pipeline that receives data from the network (RTP packets), decodes it, re-encodes it, and sends the encoded data back over the network (RTP packets)? 2. Can I have an element/bin that can send and receive data to/from the network using the same ports? Thanks in advance -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Some-basic-queries-tp2544760p2544760.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From baldur at medizza.com Mon Sep 20 18:20:25 2010 From: baldur at medizza.com (Baldur Gislason) Date: Mon, 20 Sep 2010 16:20:25 +0000 Subject: [gst-devel] Pipeline performance assessment Message-ID: I have been doing some experiments with transcoding using gstreamer and I've had some pretty terrible performance. VLC has no problem performing the same task with, as far as I can tell, identical encoder profiles, leaving plenty of resources behind, while with gstreamer I am maxing out one core and going about 40-50% on the remaining three, and not quite keeping up with real-time transcoding. The machine is a 64-bit Linux system on an Intel Xeon E5507 quad-core processor. I am decoding MPEG2 and encoding H.264.
I was wondering if there was a recommended method of watching the CPU time used by an element in a pipe. My current pipeline goes like this: udpsrc -> mpegtsdemux demux -> decodebin2 decoder -> x264enc -> queue -> mpegtsmux decoder -> audioconvert -> audioresample -> ffenc_aac -> queue -> mpegtsmux mux -> filesink (for testing purposes, would be udpsink in production) Is there any inherent problem with this configuration? I tried placing queues between the decoder and encoders but that did not make any difference. I have a hunch that the handling of the uncompressed video is inefficient, but I'm not quite sure what the best tools are for finding bottlenecks in streaming applications. Baldur From ensonic at hora-obscura.de Mon Sep 20 20:20:19 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Mon, 20 Sep 2010 21:20:19 +0300 Subject: [gst-devel] Curl-based source element. In-Reply-To: <1284980408467-2546784.post@n4.nabble.com> References: <1284980408467-2546784.post@n4.nabble.com> Message-ID: <4C97A5E3.3070005@hora-obscura.de> On 20.09.2010 14:00, wl2776 wrote: > > Hi all. > > I've seen several posts related to this work, containing a link to the > sources, but the link seems to have been generated using dyndns and is > invalid now. > > I don't have much time to search for the author and study his code, so I'm > going to create one more CURL-based source element, primarily because I > need an FTP file source, and Windows is missing one. > > So, are there any fundamental objections or technically insuperable obstacles? > https://bugzilla.gnome.org/show_bug.cgi?id=558450 Stefan From josh at joshdoe.com Mon Sep 20 20:41:24 2010 From: josh at joshdoe.com (Josh Doe) Date: Mon, 20 Sep 2010 14:41:24 -0400 Subject: [gst-devel] Looking for gstreamer based muti-stream video display and recorder In-Reply-To: <1284929670.1941.20.camel@lappc-p348> References: <1284929670.1941.20.camel@lappc-p348> Message-ID: I'd be interested to know if you find anything.
We would like to develop an application that does much of what Streampix5 does. Right now we are in the planning stages. If you want to send me any suggestions it'd be great to get any input. -Josh On Sun, Sep 19, 2010 at 4:54 PM, Emmanuel Pacaud wrote: > Hi, > > I'm looking for a software which could display several video streams on > one screen, with recording capabilities. > > Something similar to multieye-net or streampix5, but free software and > based on gstreamer. > > http://www.artec.de/en/products/multieye/products/multieye-software.html > http://www.norpix.com/products/streampix5/streampix5.php > > Does such a project exist ? > > Regards, > > Emmanuel. > > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josh at joshdoe.com Mon Sep 20 21:00:09 2010 From: josh at joshdoe.com (Josh Doe) Date: Mon, 20 Sep 2010 15:00:09 -0400 Subject: [gst-devel] Interest in Matlab plugin loader Message-ID: Just throwing this out there, but is anyone interested in a plugin loader for Matlab? I'm thinking something along the lines of the frei0r plugin, but able to support a wider variety of video formats, particularly higher bitdepths like 16-bit and floating point. Matlab code would need to be compiled obviously, and bumps the barrier to entry way up ($5,000 for a commercial license), so it's pretty much limited to academic and commercial use. -Josh -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From andreynech at googlemail.com Mon Sep 20 21:05:18 2010 From: andreynech at googlemail.com (Andrey Nechypurenko) Date: Mon, 20 Sep 2010 21:05:18 +0200 Subject: [gst-devel] Variable frame size and theora decoder Message-ID: Hi, I am using the following setup (two processes) to experiment with different adaptation strategies for changing network conditions in a real-time streaming application (some elements are omitted for simplicity): v4l2src->videoscale->caps->theoraenc->rtptheorapay->appsink ... network ... appsrc->rtptheoradepay->theoradec->videoscale(to fixed size)->fakesink Fakesink's handoff mechanism is used to visualize frames. To react to reduced network bandwidth, I am adjusting the caps between videoscale and theoraenc to reduce the frame size. It works more or less as expected if h264 is used. However, with theora, as soon as the frame size is reduced at the encoding side, the decoding side's visualization starts producing a funny mosaic of color squares. I suspect that theoradec somehow does not understand the new frame size, but I am not sure, so I would kindly appreciate it if someone more familiar with these formats could explain this behavior. My assumption is that h264 (using x264enc) somehow includes the frame size with encoded frames, whereas theora does not, but I am not sure this assumption is correct. In addition, I would appreciate any suggestions and tips on how to reconfigure the decoding pipeline to handle the new frame size. Thank you very much, Andrey. From padhi.chinmaya at gmail.com Mon Sep 20 21:13:12 2010 From: padhi.chinmaya at gmail.com (Chinmaya Padhi) Date: Tue, 21 Sep 2010 00:43:12 +0530 Subject: [gst-devel] Error compiling gstreamer Message-ID: Hi, I am compiling gstreamer from git and got the error message below. Any idea how to solve this? gtk-doc: Compiling scanner libtool: compile: gcc -I../../libs -I../.. -I../../libs -I../..
-pthread -I/usr/local/include/glib-2.0 -I/usr/local/lib/glib-2.0/include -DG_THREADS_MANDATORY -DG_DISABLE_CAST_CHECKS -I/usr/include/libxml2 -Wall -Wdeclaration-after-statement -Wvla -Wpointer-arith -Wmissing-declarations -Wmissing-prototypes -Wredundant-decls -Wundef -Wwrite-strings -Wformat-nonliteral -Wformat-security -Wold-style-definition -Winit-self -Wmissing-include-dirs -Waddress -Waggregate-return -Wno-multichar -Wnested-externs -g -g -O2 -c gstreamer-scan.c -fPIC -DPIC -o .libs/gstreamer-scan.o libtool: compile: gcc -I../../libs -I../.. -I../../libs -I../.. -pthread -I/usr/local/include/glib-2.0 -I/usr/local/lib/glib-2.0/include -DG_THREADS_MANDATORY -DG_DISABLE_CAST_CHECKS -I/usr/include/libxml2 -Wall -Wdeclaration-after-statement -Wvla -Wpointer-arith -Wmissing-declarations -Wmissing-prototypes -Wredundant-decls -Wundef -Wwrite-strings -Wformat-nonliteral -Wformat-security -Wold-style-definition -Winit-self -Wmissing-include-dirs -Waddress -Waggregate-return -Wno-multichar -Wnested-externs -g -g -O2 -c gstreamer-scan.c -o gstreamer-scan.o >/dev/null 2>&1 gtk-doc: Linking scanner libtool: link: gcc -o .libs/gstreamer-scan .libs/gstreamer-scan.o -pthread ../../gst/.libs/libgstreamer-0.10.so -L/usr/local/lib /usr/local/lib/ libgobject-2.0.so /usr/local/lib/libgthread-2.0.so /usr/local/lib/ libgmodule-2.0.so -lrt /usr/local/lib/libglib-2.0.so -pthread gtk-doc: Running scanner gstreamer-scan /home/chinmaya/Desktop/gstreamer/gstreamer/docs/gst/.libs/lt-gstreamer-scan: symbol lookup error: /home/chinmaya/Desktop/gstreamer/gstreamer/gst/.libs/libgstreamer-0.10.so.0: undefined symbol: g_error_get_type Scan failed: make[5]: *** [scan-build.stamp] Error 127 make[5]: Leaving directory `/home/chinmaya/Desktop/gstreamer/gstreamer/docs/gst' make[4]: *** [all] Error 2 Any inputs will be helpful. ~Chinmaya -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From emmanuel at gnome.org Mon Sep 20 21:16:40 2010 From: emmanuel at gnome.org (Emmanuel Pacaud) Date: Mon, 20 Sep 2010 21:16:40 +0200 Subject: [gst-devel] Looking for gstreamer based muti-stream video display and recorder In-Reply-To: References: <1284929670.1941.20.camel@lappc-p348> Message-ID: <1285010200.2728.0.camel@lappc-p348> On Sunday 19 September 2010 at 23:19 +0200, Gustavo Orrillo wrote: > Moldeo could do it and it is based on gstreamer. As of this moment it > does not support recording capabilities. > > Check the website (it's in Spanish): > > http://www.moldeo.org/ Thanks for the pointer, Gustavo. Emmanuel. From ylatuya at gmail.com Mon Sep 20 21:45:27 2010 From: ylatuya at gmail.com (Andoni Morales) Date: Mon, 20 Sep 2010 21:45:27 +0200 Subject: [gst-devel] Pipeline performance assessment In-Reply-To: References: Message-ID: 2010/9/20 Baldur Gislason : > I have been doing some experiments with transcoding using gstreamer > and I've had some pretty terrible performance. > VLC has no problem performing the same task with as far as I can tell, > identical encoder profiles, leaving plenty of resources behind, while > with gstreamer I am maxing out one core and going about 40-50% on the > remaining three, and not quite keeping up with real time transcoding. > Machine is a 64 bit Linux system on an Intel Xeon E5507 quad core processor. > I am decoding MPEG2 and encoding H.264. > I was wondering if there was a recommended method of watching the CPU > time used by an element in a pipe. To benchmark a pipeline/element in gstreamer you can use the 'cpureport' element, although you will need to apply some patches from bugzilla to improve the precision of the results (https://bugzilla.gnome.org/show_bug.cgi?id=627274). You have to set the 'cpu-clock' property to CLOCK_THREAD_CPUTIME_ID to get the time spent in the cpu by the encoder in the encoding thread, which is created by surrounding the encoder with queues.
Otherwise you will get the cpu time of the whole process instead of the cpu time of the encoding thread. This element posts messages on the bus for each frame, and the CPU % can be obtained as 'cpu-time' / 'actual-time' * 100. > My current pipeline goes like this: > udpsrc -> mpegtsdemux > demux -> decodebin2 > decoder -> x264enc -> queue -> mpegtsmux > decoder -> audioconvert -> audioresample -> ffenc_aac -> queue -> mpegtsmux > mux -> filesink (for testing purposes, would be udpsink in production) > Is there any inherent problem with this configuration? I tried placing > queues between decoder and encoders but that did not make any > difference > I have a hunch that the handling of the uncompressed video is > inefficient but I'm not quite sure what the best tools are for finding > bottlenecks of streaming applications. You can try using queues with an unlimited size, in the unlikely case that the video encoding thread is being blocked because one of the queues is full. Also, if you are looking to do real time, the default settings of the x264 encoder might not be the best ones. A good reference is the ffmpeg presets, which can be mapped to x264 settings using the following guide: http://rob.opendot.cl/index.php/useful-stuff/x264-to-ffmpeg-option-mapping/ Andoni > > Baldur > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing.
> http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -- Andoni Morales Alastruey LongoMatch:The Digital Coach http://www.longomatch.ylatuya.es From gibrovacco at gmail.com Mon Sep 20 21:52:12 2010 From: gibrovacco at gmail.com (Gibro Vacco) Date: Mon, 20 Sep 2010 22:52:12 +0300 Subject: [gst-devel] Interest in Matlab plugin loader In-Reply-To: References: Message-ID: <1285012332.6292.5.camel@Nokia-N900> Hi, ----- Original message ----- > Just throwing this out there, but is anyone interested in a plugin loader > for Matlab? I'm thinking something along the lines of the frei0r plugin, > but able to support a wider variety of video formats, particularly higher > bitdepths like 16-bit and floating point. Matlab code would need to be > compiled obviously, and bumps the barrier to entry way up ($5,000 for a > commercial license), so it's pretty much limited to academic and > commercial use. Why not use Octave instead of Matlab? I've been comparing them for closed-loop control systems design (both discrete and continuous) and I found, at the end of the day, very few differences. Another project, known as scilab, should even be able to execute matlab programs (afair). OT: I've always been fascinated by the possibility of implementing a closed-loop control system with GStreamer, but never had time. Maybe after retirement ;)... Regards > > -Josh -------------- next part -------------- An HTML attachment was scrubbed...
URL: From josh at joshdoe.com Mon Sep 20 22:12:58 2010 From: josh at joshdoe.com (Josh Doe) Date: Mon, 20 Sep 2010 16:12:58 -0400 Subject: [gst-devel] Interest in Matlab plugin loader In-Reply-To: <1285012332.6292.5.camel@Nokia-N900> References: <1285012332.6292.5.camel@Nokia-N900> Message-ID: It would be great to use Octave, however there doesn't appear to be a compiler for Octave scripts. One could use liboctave directly, but what I'm thinking about is allowing non-programmers the ability to create processing elements from a Matlab (or Octave) language. -Josh On Mon, Sep 20, 2010 at 3:52 PM, Gibro Vacco wrote: > Hi, > > ----- Messaggio originale ----- > > Just throwing this out there, but is anyone interested in a plugin loader > > > for Matlab? I'm thinking something along the lines of the frei0r plugin, > > but able to support a wider variety of video formats, particularly higher > > > bitdepths like 16-bit and floating point. Matlab code would need to be > > compiled obviously, and bumps the barrier to entry way up ($5,000 for a > > commercial license), so it's pretty much limited to academic and > > commercial use. > > Why not to use octave instead of Matlab? I've been comparing them for > closed loop control systems design (both discrete and continuous) and I > found, at the end of the day, very few differences. Another project, known > as scilab, should even be able to execute matlab programs (afair). > > OT: I've always been fascinated about the possibility to implement a > closed-loop control system with GStreamer, but never had time. Maybe after > retirement ;)... > > Regards > > > > > -Josh > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ensonic at hora-obscura.de Mon Sep 20 22:18:54 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Mon, 20 Sep 2010 23:18:54 +0300 Subject: [gst-devel] Error compiling gstreamer In-Reply-To: References: Message-ID: <4C97C1AE.2030907@hora-obscura.de> On 20.09.2010 22:13, Chinmaya Padhi wrote: > Hi , > > I am compiling gstreamer from git. Got the below error message : Any idea how to > solve this : try running ./autoregen.sh - something is borked in your build setup. Also, do you intend to link against libs in /usr/local ? Stefan > > gtk-doc: Compiling scanner > libtool: compile: gcc -I../../libs -I../.. -I../../libs -I../..
-pthread > -I/usr/local/include/glib-2.0 -I/usr/local/lib/glib-2.0/include > -DG_THREADS_MANDATORY -DG_DISABLE_CAST_CHECKS -I/usr/include/libxml2 -Wall > -Wdeclaration-after-statement -Wvla -Wpointer-arith -Wmissing-declarations > -Wmissing-prototypes -Wredundant-decls -Wundef -Wwrite-strings > -Wformat-nonliteral -Wformat-security -Wold-style-definition -Winit-self > -Wmissing-include-dirs -Waddress -Waggregate-return -Wno-multichar > -Wnested-externs -g -g -O2 -c gstreamer-scan.c -o gstreamer-scan.o >/dev/null 2>&1 > gtk-doc: Linking scanner > libtool: link: gcc -o .libs/gstreamer-scan .libs/gstreamer-scan.o -pthread > ../../gst/.libs/libgstreamer-0.10.so > -L/usr/local/lib /usr/local/lib/libgobject-2.0.so > /usr/local/lib/libgthread-2.0.so > /usr/local/lib/libgmodule-2.0.so -lrt > /usr/local/lib/libglib-2.0.so -pthread > gtk-doc: Running scanner gstreamer-scan > /home/chinmaya/Desktop/gstreamer/gstreamer/docs/gst/.libs/lt-gstreamer-scan: > symbol lookup error: > /home/chinmaya/Desktop/gstreamer/gstreamer/gst/.libs/libgstreamer-0.10.so.0: > undefined symbol: g_error_get_type > Scan failed: > make[5]: *** [scan-build.stamp] Error 127 > make[5]: Leaving directory `/home/chinmaya/Desktop/gstreamer/gstreamer/docs/gst' > make[4]: *** [all] Error 2 > > > > Any inputs will be helpful. > > ~Chinmaya > > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. 
> http://p.sf.net/sfu/novell-sfdev2dev > > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From marc.leeman at gmail.com Mon Sep 20 22:20:14 2010 From: marc.leeman at gmail.com (Marc Leeman) Date: Mon, 20 Sep 2010 22:20:14 +0200 Subject: [gst-devel] rtpbin + mpegtsmux In-Reply-To: <4C978C0A.3050506@mlbassoc.com> References: <4C93F361.8080206@mlbassoc.com> <20100918165133.GU25114@crichton.homelinux.org> <4C97472E.9060205@mlbassoc.com> <4C9766D5.9020908@mlbassoc.com> <20100920140722.GD2176@crichton.homelinux.org> <4C9775C7.7030609@mlbassoc.com> <20100920155109.GE2176@crichton.homelinux.org> <4C978C0A.3050506@mlbassoc.com> Message-ID: <20100920202014.GH2176@crichton.homelinux.org> > Do I need this on both ends? My version of h264parse (0.10.19) doesn't support > that option. No, we put the option first in rtph264pay and then afterwards in h264parse. If you use it in the parser with an interval of 1s, adding it in the payloader will do nothing. It was added in the parser for exactly the kind of 'strange' combinations that you are trying; sending h264 in TS, in TS/RTP or even in ES over the network. You need this NAL insertion on the sending side only; since this is essential data for the decoder to start decoding. >> Is this pure RTP or RTP/TS? > > These results are RTP/TS. When I do just RTP, I never get anything other than the initial [incomplete] frame. This is probably what I was talking about; try to send it in pure RTP first and see if you get the sender/receiver working correctly. IIRC, when receiving this combination; the timestamps in the buffers are set with RTP and then afterwards adjusted with the TS timing info. I've also been warned several times that the mpegtsmux/mpegtsdemux is not completely correct with timestamps. 
> Note: I'm using a special H264 encoder (based on the DSP in my processor, not x264enc) That does not really matter as long as the timestamps from your encoder are correct; we're doing basically the same thing with all kinds of boards. > I tried to use x264enc, but my sensor only does UYVY: > gst-launch -vv v4l2src ! video/x-raw-yuv,width=720,height=480 ! ffmpegcolorspace ! x264enc ! filesink location=/tmp/xx > Pipeline:pipeline0/GstV4l2Src:v4l2src0: Device '/dev/video0' cannot capture in the specified format > Additional debug info: > gstv4l2object.c(1971): gst_v4l2_object_set_format (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: > Tried to capture in YUYV, but device returned format UYVY UYVY does not seem to be supported in ffmpegcolorspace > I tried to recode it, but I get an error I don't understand: > gst-launch -vv v4l2src ! video/x-raw-yuv,width=720,height=480 ! ffmpegcolorspace ! 'video/x-raw-yuv,format=(fourcc)YUY2' ! x264enc ! filesink location=/tmp/xx > WARNING: erroneous pipeline: could not link ffmpegcsp0 to x264enc0 It can't match the caps, so it can't link ffmpegcolorspace to x264enc; x264enc only supports I420 (gst-inspect x264enc) > Sorry for all the questions, but my exploration of the documentation has not > been very fruitful (I'm not always this challenged...) nah, there is a lot of information; you just need to find out where. IRC has always been helpful for me. -- greetz, marc One man's "magic" is another man's engineering. "Supernatural" is a null word. -- Robert Heinlein crichton 2.6.26 #1 PREEMPT Tue Jul 29 21:17:59 CDT 2008 GNU/Linux -------------- next part -------------- A non-text attachment was scrubbed...
Name: not available Type: application/pgp-signature Size: 197 bytes Desc: Digital signature URL: From gary at mlbassoc.com Tue Sep 21 00:11:08 2010 From: gary at mlbassoc.com (Gary Thomas) Date: Mon, 20 Sep 2010 16:11:08 -0600 Subject: [gst-devel] H264 problems Message-ID: <4C97DBFC.5000008@mlbassoc.com> Based on the previous discussions, I'm trying to get the H264 encoding to work in the simplest of cases. In this light, I'm trying to just grab some data and encode it. I have two pipeline setups with different encoders. One works and the other just stalls. Why would this one work: gst-launch -v v4l2src num-buffers=200 ! 'video/x-raw-yuv,width=720,height=480' ! \ TIVidenc1 codecName=h264enc engineName=codecServer ! filesink location=/tmp/test.h264 and this one stall? gst-launch -v v4l2src num-buffers=200 ! 'video/x-raw-yuv,width=720,height=480,format=(fourcc)UYVY' ! \ ffmpegcolorspace ! 'video/x-raw-yuv,width=720,height=480,format=(fourcc)I420' ! x264enc ! filesink location=/tmp/hold.h264 I checked and there is definitely [plenty of] data going into x264enc. Note: the extra ffmpegcolorspace step is there to massage the data format, since my sensor can only do UYVY but x264enc can only handle I420. These components are all from the latest released versions. n.b. I was told to ask on IRC, but that's not really an option for me, so this mailing list is my access. Thanks -- ------------------------------------------------------------ Gary Thomas | Consulting for the MLB Associates | Embedded world ------------------------------------------------------------ From bbock at digitalforcetech.com Tue Sep 21 00:23:35 2010 From: bbock at digitalforcetech.com (BillBock) Date: Mon, 20 Sep 2010 15:23:35 -0700 (PDT) Subject: [gst-devel] Stopping and restarting complicated pipeline with muxing Message-ID: <1285021415812-2547757.post@n4.nabble.com> All: I'm hoping my issue is some kind of basic misunderstanding of GStreamer design philosophy.
I have an application with a fairly complicated pipeline. The pipeline has two sources, v4l2src and alsasrc. It can do simultaneous video capture (without muxing audio), video streaming (using tcpsink), individual frame capture, audio capture, and audio streaming. The various branches of the pipeline all end at fakesinks. If I want to capture video, I block the src pads of v4l2src and alsasrc, unlink stuff, insert an encoder element, a muxer element, and a filesink element, link them to the pipeline, set them to playing, and unblock the src pads. When I'm ready to stop capturing, I again block the src pads, unlink and relink elements so that the encoder, muxer, and filesink are linked to each other but not to the rest of the elements in the pipeline, send an EOS to the encoder, unlink and set the elements to null, remove them from the pipeline, link stuff back to a fakesink, and unblock the src pads. I do similar things with the other functions. Everything works great (simultaneously and successively) without general stream errors or internal data flow errors. It's when I try to do muxing that I have issues. It will usually work the first time, except that when unblocking the src pads after reconfiguring the pipeline to end in fakesinks again when done with capture, no more buffers pass through the video source. If I strip down the pipeline to simplify it and only do muxing of video and audio (so, no tee elements at all in the pipeline, for example) then successive capturing of muxed video and audio works. I've done up to 30 captures in a row. Whenever I stop capturing in this case, I have to send the EOS events on the video and audio encoders before blocking the video and audio src pads (so I also don't unlink/relink stuff so that section of the pipeline is off by itself). I always get a couple of general stream errors in this case, but everything continues to work afterwards and I can start and stop again, over and over.
I'm hoping for suggestions as to what I'm doing wrong. I'd be happy to post code if that would help. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Stopping-and-restarting-complicated-pipeline-with-muxing-tp2547757p2547757.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From andreynech at googlemail.com Tue Sep 21 10:43:57 2010 From: andreynech at googlemail.com (Andrey Nechypurenko) Date: Tue, 21 Sep 2010 10:43:57 +0200 Subject: [gst-devel] H264 problems In-Reply-To: <4C97DBFC.5000008@mlbassoc.com> References: <4C97DBFC.5000008@mlbassoc.com> Message-ID: > and this one stall? > >gst-launch -v v4l2src num-buffers=200 ! > 'video/x-raw-yuv,width=720,height=480,format=(fourcc)UYVY' ! > \ ffmpegcolorspace ! > 'video/x-raw-yuv,width=720,height=480,format=(fourcc)I420' ! > x264enc ! filesink location=/tmp/hold.h264 I would suggest trying the following. Run your pipeline with an increased debug level, i.e. GST_DEBUG=3 gst-launch ... and search for any relevant messages which might give you a hint about what is going wrong. In addition, I remember some strange behavior got fixed by explicitly mentioning the framerate. In your case, for example, in the caps filter right after v4l2src add framerate=30/1 or whatever the appropriate frame rate for your camera is. In addition, since you are using gstreamer on a TI platform, you can consider asking the question also here: https://gstreamer.ti.com/gf/project/gstreamer_ti/forum/?action=ForumBrowse&forum_id=187 Regards, Andrey. From prabhulingaswamy.bs at globaledgesoft.com Tue Sep 21 11:06:55 2010 From: prabhulingaswamy.bs at globaledgesoft.com (Prabhulinga Swamy B S) Date: Tue, 21 Sep 2010 14:36:55 +0530 Subject: [gst-devel] avimux not muxing complete Audio stream Message-ID: <4C9875AF.9010306@globaledgesoft.com> Hi, I'm using the following pipeline to read h.264 frames from the network and trying to multiplex the sample audio from 'audiotestsrc'. The video is playing fine.
appsrc name=videosrc ! h264parse ! taginject tags="artist=tom" ! mux. \
  audiotestsrc name=audiosrc ! ffenc_ac3 ! queue ! mux. \
  avimux name=mux ! filesink location=testvideo.avi

The problem is, the audio is not multiplexing properly. Audio is found in the avi file only for the initial 1 sec or so. The following output is from the 'aviindex -i testvideo.avi' command, which is used to read the avi file.

TAG  TYPE CHUNK CHUNK/TYPE POS  LEN   KEY MS
[avilib] V: 30.000 fps, codec=H264, frames=133, width=640, height=480
[avilib] A: 44100 Hz, format=0x2000, bits=16, channels=1, bitrate=0 kbps,
[avilib] 15 chunks, 8358 bytes, CBR
01wb 2  0  0  1490   556   0 0.00
01wb 2  1  1  2054   558   0 0.00
01wb 2  2  2  2620   558   0 0.00
01wb 2  3  3  3186   556   0 0.00
01wb 2  4  4  3750   558   0 0.00
01wb 2  5  5  4316   558   0 0.00
01wb 2  6  6  4882   556   0 0.00
01wb 2  7  7  5446   558   0 0.00
01wb 2  8  8  6012   558   0 0.00
01wb 2  9  9  6578   556   0 0.00
01wb 2 10 10  7142   558   0 0.00
01wb 2 11 11  7708   558   0 0.00
01wb 2 12 12  8274   556   0 0.00
01wb 2 13 13  8838   558   0 0.00
01wb 2 14 14  9404   556   0 0.00
00db 1 15  0  9968    14   0 0.00
00db 1 16  1  9990     8   0 0.00
00db 1 17  2 10006 40948   0 0.00
00db 1 18  3 50962 17864   0 0.00
00db 1 19  4 68834 14584   0 0.00
00db 1 20  5 83426 15680   0 0.00
00db 1 21  6 99114 15453   0 0.00
.............

The TAG identifier *01wb* is of the audio chunks, and *00db* is of the video chunks. Thereafter, no audio chunks are added into the file. I'm going through the debug output with GST_DEBUG=4. If anybody has faced this problem, please guide me. Thank you. -- Regards, Prabhulinga Swamy B S

From mjoachimiak at gmail.com Tue Sep 21 13:52:53 2010 From: mjoachimiak at gmail.com (Michael Joachimiak) Date: Tue, 21 Sep 2010 14:52:53 +0300 Subject: [gst-devel] simultaneously showing and recording MPEG-2 video In-Reply-To: <1284625346611-2541735.post@n4.nabble.com> References: <1284625346611-2541735.post@n4.nabble.com> Message-ID: You could take a look at the tee element. It might be suitable for you.
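For the use case in question (play a local file while writing a second copy of it), a minimal gst-launch sketch of the tee approach might look like the following. The input file name and the decoder chain are assumptions, and, as noted earlier in this archive, each tee branch needs its own queue:

```shell
# Show an MPEG-2 file and simultaneously record an unmodified copy
# (GStreamer 0.10 syntax; input.mpg / copy.mpg are placeholder names).
gst-launch-0.10 filesrc location=input.mpg ! tee name=t \
    t. ! queue ! decodebin ! ffmpegcolorspace ! xvimagesink \
    t. ! queue ! filesink location=copy.mpg
```

Tee-ing the still-encoded stream keeps the recorded copy bit-identical to the source; only the display branch pays the decoding cost.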
2010/9/16 frknml > > > > Hi everyone; > > I'm very new for gstreamer and i'm developing multimedi project.My first > aim > is showing video which is in my local file system and at the same time i > want to record this video as a second copy of my original video.I can show > video and i can record this video individually but not simultaneously.I > couldn't find enough information in Gstreamer Application Development > Manual > from gstreamer.net to solve my problem. > If you have any document,ebook or example please share me :) because i > couldn't find any useful resource about gstreamer. > > Faruk > -- > View this message in context: > http://gstreamer-devel.966125.n4.nabble.com/simultaneously-showing-and-recording-MPEG-2-video-tp2541735p2541735.html > Sent from the GStreamer-devel mailing list archive at Nabble.com. > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -- Your Sincerely Michael Joachimiak -------------- next part -------------- An HTML attachment was scrubbed... URL: From gary at mlbassoc.com Tue Sep 21 14:13:01 2010 From: gary at mlbassoc.com (Gary Thomas) Date: Tue, 21 Sep 2010 06:13:01 -0600 Subject: [gst-devel] H264 problems In-Reply-To: References: <4C97DBFC.5000008@mlbassoc.com> Message-ID: <4C98A14D.8030508@mlbassoc.com> On 09/21/2010 02:43 AM, Andrey Nechypurenko wrote: >> and this one stall? >> >> gst-launch -v v4l2src num-buffers=200 ! >> 'video/x-raw-yuv,width=720,height=480,format=(fourcc)UYVY' ! >> \ ffmpegcolorspace ! >> 'video/x-raw-yuv,width=720,height=480,format=(fourcc)I420' ! >> x264enc ! 
filesink location=/tmp/hold.h264
> > I would suggest to try the following. Run your pipeline with > increased debug level, i.e. GST_DEBUG=3 gst-launch ... and search > for any relevant messages which might give you the hint about > what is going wrong. In addition, I remember some strange > behavior got fixed by explicitly mentioning the framerate. In > your case, for example, in caps filter right after v4l2src add > framerate=30/1 or whatever is appropriate frame rate for your > camera.

I sure don't see anything; perhaps someone who understands this better can. I put a level 4 dump of this at http://pastebin.com/iDDVuHgv

> > In addition, since you are using gstreamer on TI platform, you > can consider asking the question also here: > https://gstreamer.ti.com/gf/project/gstreamer_ti/forum/?action=ForumBrowse&forum_id=187

Except that my problem is not with any of the TI components - it's only the off-the-shelf encoder that stalls.

-- ------------------------------------------------------------ Gary Thomas | Consulting for the MLB Associates | Embedded world ------------------------------------------------------------

From jieke.wu at gmail.com Tue Sep 21 03:09:45 2010 From: jieke.wu at gmail.com (wu jieke) Date: Tue, 21 Sep 2010 09:09:45 +0800 Subject: [gst-devel] question on playbin2 for RTP streaming. Message-ID:

Hi all, I am setting up an RTP streaming environment between an x86 server and an embedded-system client. The host app is VLC and the target/client is gst-launch. The commands are as follows:

HOST: # vlc -vvv big_buck_bunny_480p_h264.mov --sout '#rtp{dst=, port=5004,sdp=rtsp://:8080/test.sdp}'

Target/client: # gst-launch udpsrc multicast-group= caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" port=5004 ! rtph264depay ! my-codec-hw ! my-render-hw"

This command works well. Then I hoped playbin2 would create the hardware pipeline automatically, with this command line:

# gst-launch playbin2 uri=rtsp://:8080/test.sdp

It fails to play.
BTW: playbin2 works well with my optimized codec and render; I tested it with the command (gst-launch playbin2 uri=file:///big_buck.mov), and it can find the right elements, such as "my-codec-hw" and "my-render-hw". Then I dumped the log of gst-launch and found that playbin2 does not preroll live streams, which means the full (real) pipeline is not ready before it obtains the GstSystemClock. In fact, my optimized render can only use the specific clock provided via *_sink_provide_clock(), not GstSystemClock, so the pipeline hangs even though it links the optimized elements.

My question is: how can I tell playbin2 to use my provided clock for a live pipeline? If I have misunderstood anything, please correct me.

-- It's not the things you do in life that you regret, but the things that you do not do -------------- next part -------------- An HTML attachment was scrubbed... URL:

From rsbultje at gmail.com Tue Sep 21 14:34:40 2010 From: rsbultje at gmail.com (Ronald S. Bultje) Date: Tue, 21 Sep 2010 08:34:40 -0400 Subject: [gst-devel] Hello from Germany In-Reply-To: <065539429916A2498BAB44D79888BF69063B6B64@DEEXECLUSP20.computacenter.de> References: <065539429916A2498BAB44D79888BF69063B6B64@DEEXECLUSP20.computacenter.de> Message-ID:

Hi Carsten, these were created back when GStreamer 0.6 or 0.8 was the current version, so they are most certainly not up to date anymore. I don't even know what the current GStreamer version is. Also, I unfortunately don't have sources for either application on my harddisk anymore; they were lost when I bought a new computer 5 years ago.

@gst-devel: should these links be removed or replaced with more current examples?
Ronald

On Tue, Sep 21, 2010 at 8:02 AM, Thielepape, Carsten wrote:
> Hello Ronald,
>
> I need to start with gstreamer, and at the official gstreamer web page there
> are links to two example applications from you which seem to be a good
> starting point (if they are coded in C/C++):
>
> Aldegonde
> Aldegonde is a simple media player made to ease debugging of GStreamer
> playback issues.
>
> Kiss
> Kiss is a simple KDE-based media player meant as an example application for
> KDE developers who want to use GStreamer.
>
> Unfortunately, the links are dead, so are you able to send them to me by
> email? (if still available)
>
> Greetings!
>
> Yours sincerely
>
> Carsten Thielepape
> International Senior Service Manager
> Global Operations
>
> Computacenter AG & Co. oHG
> Services & Solutions
>
> Kokkolastrasse 1, 40882 Ratingen, Germany
> Tel.: +49 2102 169 2304
> Mobile: +49 172 8458065
> Short-Dial: 62304
> E-Mail: carsten.thielepape at computacenter.com
> WWW: www.computacenter.de
>
> -----------------------------------
> Computacenter AG & Co. oHG, mit Sitz in Kerpen (Amtsgericht Köln HRA 18096)
> Vertretungsberechtigte Gesellschafter:
> Computacenter Aktiengesellschaft, mit Sitz in Köln (Amtsgericht Köln HRB 28384)
> Vorstand: Oliver Tuszik (Vorsitzender), Dr. Karsten Freihube, Hans-Georg Freitag, Frank Kottmann, Reiner Louis
> Aufsichtsrat: Michael Norris (Vorsitzender)
> Computacenter Management GmbH, mit Sitz in Köln (Amtsgericht Köln HRB 28284)
> Geschäftsführer: Ulrich Irnich, Jürgen Stauber
> Visit us on the Internet: http://www.computacenter.de
> Visit our Online-Shop: https://shop.computacenter.de
>
> This email is confidential. If you are not the intended recipient, you must
> not disclose or use the information contained in it. If you have received
> this mail in error, please tell us immediately by return email and delete
> the document.
> -----------------------------------

From ebclark2 at gmail.com Tue Sep 21 20:51:44 2010 From: ebclark2 at gmail.com (Edward Clark) Date: Tue, 21 Sep 2010 14:51:44 -0400 Subject: [gst-devel] MPEG private data streams Message-ID:

Hello, I'm currently using gstreamer to encode h264 video and send it out in an mpeg transport stream. I now have the need to include some metadata in a private data stream. From what I have seen, there isn't a plugin that handles private data streams. Can anyone verify that this is the case? Thanks, -Ed

From lijinsyam at gmail.com Wed Sep 22 13:35:42 2010 From: lijinsyam at gmail.com (liJin) Date: Wed, 22 Sep 2010 17:05:42 +0530 Subject: [gst-devel] ErroR: Message-ID:

Hi, I just developed a piece of software that records multiple streams into different files simultaneously. It was all written in C, and now I am getting errors like:

* (:21724): GStreamer-WARNING **: failed to create thread: Error creating thread: Resource temporarily unavailable*
* gst_adapter_take: assertion `nbytes > 0' failed*

I just checked the number of threads used by my program: it takes only 15 threads, and it was recording 3 different streams. I have no idea what's happening... Any solutions...?

Regards Lijin -------------- next part -------------- An HTML attachment was scrubbed... URL:

From t.i.m at zen.co.uk Wed Sep 22 16:20:44 2010 From: t.i.m at zen.co.uk (Tim-Philipp Müller) Date: Wed, 22 Sep 2010 15:20:44 +0100 Subject: [gst-devel] ErroR: In-Reply-To: References: Message-ID: <1285165244.8978.19.camel@zingle>

On Wed, 2010-09-22 at 17:05 +0530, liJin wrote: Hi, > I just developed a software record multiple streams and record it in > to different files simultaneously ...
> It was all written in C and now
> i am getting some error like
>
> (:21724): GStreamer-WARNING **: failed to create thread:
> Error creating thread: Resource temporarily unavailable
>
> gst_adapter_take: assertion `nbytes > 0' failed
>
> I just checked number of threads used by my program and it take only
> 15 threads and it was recording 3 different streams. I have no idea
> whats happening...
>
> Any solutions...?

Without more information, it's hard to say anything useful here. We don't even know the pipeline(s) you are using or the elements involved. Or your code.

You can run your code in gdb with the G_DEBUG=fatal_warnings environment variable set; it will then break at the first warning, and you can investigate with 'bt' where it comes from. Alternatively, set a breakpoint on g_logv() or so.

Cheers -Tim

PS: a more descriptive subject line would be nice next time

From rahlf at fs.tum.de Wed Sep 22 16:39:43 2010 From: rahlf at fs.tum.de (Matthias Rahlf) Date: Wed, 22 Sep 2010 16:39:43 +0200 (CEST) Subject: [gst-devel] Dynamically activate links? Message-ID:

Hi, I want to be able to start/stop playback and start/stop recording. I can do both if I set it all up right from the start:

| /* setup */
| GstElement *m_pipeline;
| GMainLoop *m_gloop;
| boost::thread* m_playerThread;
| gst_init(NULL, NULL);
| m_gloop = g_main_loop_new(NULL, FALSE);
| m_pipeline = gst_parse_launch("alsasrc device=hw:2 ! tee name=tee "
|     "! queue name=playqueue ! alsasink sync=false tee. "
|     "! queue name=recordqueue ! audioconvert ! audioresample ! lame "
|     "! filesink name=filesink location=recording.mp3", NULL);
| m_playerThread = new boost::thread(
|     boost::bind(g_main_loop_run, m_gloop));
|
| /* my stuff here */
| /* here i want to activate playback or recording dynamically */
| gst_element_set_state(m_pipeline, GST_STATE_PLAYING);
| sleep(10);
| gst_element_set_state(m_pipeline, GST_STATE_NULL);
|
| /* clean up */
| g_main_loop_quit(m_gloop);
| m_playerThread->join();
| delete m_playerThread;
| gst_object_unref(GST_OBJECT(m_pipeline));

But I did not find a way to attach the queues dynamically to the tee. I tried:
- gst_element_unlink(tee, recordqueue); but then there was no more playback.
- not linking the queues to the tee during setup, but using gst_element_link(tee, playqueue) later. No playback; gst_element_set_state reported an error.
- getting the request pad from tee and using gst_element_link_pads(). No playback; gst_element_set_state reported an error.

I did not find any online resources for my use case. Do you have a hint or a useful link for me?

BTW: I could not set up this pipeline without gst_parse_launch(). How do you do this?

TIA, Matthias

From baldur at medizza.com Wed Sep 22 17:25:20 2010 From: baldur at medizza.com (Baldur Gislason) Date: Wed, 22 Sep 2010 15:25:20 +0000 Subject: [gst-devel] Transcoding and otherwise dealing with streams. In-Reply-To: References: Message-ID:

I went back to basics, building some test pipelines by hand with gst-launch. Same result as before: if I create the pipeline with decodebin or decodebin2, I get output, but the performance is terrible; it maxes one core at 100%, and real-time transcoding becomes impossible. If I construct a pipeline by hand using mpegtsdemux, mpeg2dec, x264enc and mpegtsmux, transcoding video only works great, and performance is as expected, using about 20% CPU time on this quad-core Xeon E5507 machine. However, when I try to pass audio through that pipeline, I get a deadlock somewhere and nothing comes out of the TS mux.
Decodebin pipeline that works but performs very badly:

gst-launch-0.10 -v mpegtsmux name="mux" ! filesink location="/home/baldur/launch.ts" \
  udpsrc multicast-group="239.192.192.1" port=1234 ! decodebin2 name="d" \
  { d. ! queue ! x264enc key-int-max=50 ip-factor=0.71 analyse=0x113 me=0 subme=1 bframes=3 trellis=0 interlaced=1 ! queue max-size-buffers=16384 ! mux. } \
  { d. ! audioconvert ! audioresample ! ffenc_aac ! queue max-size-time=10000000000 ! mux. }

Discrete pipeline that works and performs great for video only:

gst-launch-0.10 -v mpegtsmux name="mux" ! filesink location="/home/baldur/launch.ts" \
  udpsrc multicast-group="239.192.192.1" port=1234 ! mpegtsdemux name="d" \
  { d. ! queue ! mpeg2dec ! x264enc key-int-max=50 ip-factor=0.71 analyse=0x113 me=0 subme=1 bframes=3 trellis=0 interlaced=1 ! queue ! mux. }

Discrete pipeline that does not feed anything through it; CPU usage goes up for a while, until it fills some queue I guess, and then it stops. Nothing is ever written to the filesink:

gst-launch-0.10 -v mpegtsmux name="mux" ! filesink location="/home/baldur/launch.ts" \
  udpsrc multicast-group="239.192.192.1" port=1234 ! mpegtsdemux name="d" \
  { d. ! queue ! mpegvideoparse ! mpeg2dec ! x264enc key-int-max=50 ip-factor=0.71 analyse=0x113 me=0 subme=1 bframes=3 trellis=0 interlaced=1 ! queue ! mux. } \
  { d.audio_%04x ! queue max-size-time=10000000000 max-size-buffers=16384 ! mp3parse ! mux. }

Another discrete pipeline that feeds nothing:

gst-launch-0.10 -v mpegtsmux name="mux" ! filesink location="/home/baldur/launch.ts" \
  udpsrc multicast-group="239.192.192.1" port=1234 ! mpegtsdemux name="d" \
  { d. ! queue ! mpegvideoparse ! mpeg2dec ! x264enc key-int-max=50 ip-factor=0.71 analyse=0x113 me=0 subme=1 bframes=3 trellis=0 interlaced=1 ! queue ! mux. } \
  { d. ! queue ! ffdec_mp3 ! audioconvert ! audioresample ! ffenc_aac ! queue ! mux. }

I have tried a bunch of configurations on the audio side, but can't seem to get any audio through.
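Applying the queueing advice given elsewhere in this archive (each muxer input isolated behind its own queue, with a parser before the decoder), one more variant worth trying is sketched below. It is untested against this particular stream, and the mp3parse choice is an assumption about the audio codec:

```shell
# Sketch (GStreamer 0.10): transcode TS video to H.264 and MPEG audio
# to AAC. Every branch into mpegtsmux gets its own queue so neither
# input can starve the other while the muxer waits for data on both
# pads; x264enc options are omitted for brevity.
gst-launch-0.10 -v mpegtsmux name=mux ! filesink location=out.ts \
  udpsrc multicast-group="239.192.192.1" port=1234 ! mpegtsdemux name=d \
  d. ! queue ! mpegvideoparse ! mpeg2dec ! x264enc ! queue ! mux. \
  d. ! queue ! mp3parse ! ffdec_mp3 ! audioconvert ! audioresample \
     ! ffenc_aac ! queue ! mux.
```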
Output from gst-launch for the decodebin-based pipeline: http://pastebin.com/LRUnZhEf

I don't see anything in the output that I am missing from the other pipeline.

Baldur

On Sat, Sep 18, 2010 at 2:31 PM, Marco Ballesio wrote:
> Hi,
>
> sorry for the late reply. I hope it will help.
>
> On Fri, Sep 10, 2010 at 7:52 PM, Baldur Gislason wrote:
>>
>> I am trying to construct a C application that can either pick up
>> audio/video from a file (mpeg transport stream) or receive mpeg transport
>> stream on a UDP socket.
>> Input format is MPEG2 A/V and output is H.264 with MPEG audio or AAC,
>> transport stream multiplexing on output.
>> So far I have managed to transcode video from network but am getting
>> buffer underruns on the queue. If I add audio everything just stops, the
>> pipeline does nothing and I can't quite figure out why.
>
> I don't have many details about how your pipelines are built, but in 90% of
> the cases the behaviour you're observing is due to a missing queue at the
> right point. The audio thread should be straightforward from source to sink,
> while one queue element should be separating the video thread before the
> muxer.
>
>>
>> If I read the data from a file, I get buffer overruns. So clearly this is
>> a clocking thing. I have searched for documentation regarding clocking in
>> gstreamer but found nothing useful.
> > You can find something here: > > http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-clocks.txt > > and here: > > http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-scheduling.txt > http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-synchronisation.txt > >> >> The app developer's manual mentions clocking but nothing about how it >> applies to code, and gst-inspect says none of the elements I have looked at >> have any clocking capabilities?! >> >> Anyway, I was wondering if anyone had an example for building an MPEG >> transcoding pipeline in C, for working with a live stream and not file >> input, file output. File input, network output would be the other >> scenario. > > See here the excellent guide (should be from Mark Nauwelaerts): > > http://gentrans.sourceforge.net/docs/head/manual/html/howto.html > > Regards > >> >> Baldur Gislason >> >> >> ------------------------------------------------------------------------------ >> Start uncovering the many advantages of virtual appliances >> and start using them to simplify application deployment and >> accelerate your shift to cloud computing >> http://p.sf.net/sfu/novell-sfdev2dev >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
From ensonic at hora-obscura.de Wed Sep 22 20:35:36 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Wed, 22 Sep 2010 21:35:36 +0300 Subject: [gst-devel] Stopping and restarting complicated pipeline with muxing In-Reply-To: <1285021415812-2547757.post@n4.nabble.com> References: <1285021415812-2547757.post@n4.nabble.com> Message-ID: <4C9A4C78.805@hora-obscura.de> On 21.09.2010 01:23, BillBock wrote: > > All: > > I'm hoping my issue is some kind of basic misunderstanding of GStreamer > design philosophy. I have an application with a fairly complicated > pipeline. The pipeline has two sources, v4l2src and alsasrc. It can do > simultaneous video capture (without muxing audio), video streaming (using > tcpsink), individual frame capture, audio capture, and audio streaming. The > various branches of the pipeline all end at fakesinks. > > If I want to capture video, I block the src pads of v4l2src and alsasrc, > unlink stuff, insert an encoder element, a muxer element, and a filesink > element, link them to the pipeline, set them to playing, and unblock the src > pads. When I'm ready to stop capturing, I again block the src pads, unlink > and relink elements so that the encoder, muxer, and filesink are linked to > each other, but not to the rest of the elements in the pipeline, send an EOS > to the encoder, unlink and set the elements to null, remove them from the > pipeline, link stuff back to a fakesink, and unblock the src pads. I do > similar things with the other functions. Everything works great > (simultaneously and successively) without general stream errors or internal > data flow errors. I see nothing obviously wrong.
Just as a suggestion to avoid the pad-blocking and relinking business, would using output-selector work for you? Stefan > > It's when I try to do muxing that I have issues. It will usually work the > first time, except that when unblocking the src pads after reconfiguring the > pipeline to end in fakesinks again when done with capture, no more buffers > pass through the video source. > > If I strip down the pipeline to simplify it and only do muxing of video and > audio (so, no tee elements at all in the pipeline, for example) then > successive capturing of muxed video and audio works. I've done up to 30 > captures in a row. Whenever I stop capturing in this case, I have to send > the EOS events on the video and audio encoders before blocking the video > and audio src pads (so I also don't unlink/relink stuff so that section of > the pipeline is off by itself). I always get a couple of general stream errors > in this case, but everything continues to work afterwards and I can start > and stop again, over and over. > > I'm hoping for suggestions as to what I'm doing wrong. I'd be happy to > post code if that would help. From padhi.chinmaya at gmail.com Wed Sep 22 23:09:04 2010 From: padhi.chinmaya at gmail.com (Chinmaya Padhi) Date: Thu, 23 Sep 2010 02:39:04 +0530 Subject: [gst-devel] Contribute to gstreamer project Message-ID: Hi, I wanted to contribute to the gstreamer project, initially by starting with some bug fixes. I have taken the gstreamer module of the GStreamer project from git and could build it on my Linux workstation. Now, since gstreamer is a framework, do we have any example applications to test? How should one approach bug fixing or feature enhancements? ~Chinmaya -------------- next part -------------- An HTML attachment was scrubbed...
URL: From julien.isorce at gmail.com Thu Sep 23 00:29:22 2010 From: julien.isorce at gmail.com (Julien Isorce) Date: Thu, 23 Sep 2010 00:29:22 +0200 Subject: [gst-devel] udpsrc - multicast - IGMP version Message-ID: Hi, According to the code of udpsrc, IGMP V2 is used. At least the part of the code that uses winsock. Not sure on Linux. I mean when joining a multicast group, an IGMP packet is sent and its IGMP version is 2 because of the way winsock is used (structures for V2 are used). I think udpsrc should also support IGMP V3. Newer TV broadcast platforms use it (such as MediaRoom). And for example it would be activated through a property 'igmp-version'. Should I create a bug, or was it already discussed and rejected? Sincerely Julien -------------- next part -------------- An HTML attachment was scrubbed... URL: From ensonic at hora-obscura.de Thu Sep 23 11:17:38 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Thu, 23 Sep 2010 12:17:38 +0300 Subject: [gst-devel] Contribute to gstreamer project In-Reply-To: References: Message-ID: <4C9B1B32.7040904@hora-obscura.de> On 23.09.2010 00:09, Chinmaya Padhi wrote: > Hi, > > I wanted to contribute to the gstreamer project, initially by starting > with some bug fixes. I have taken the gstreamer module of the > GStreamer project from git. Just for your information, the project consists of the gstreamer (core) module, several gst-plugins-xxx modules (base, good, ugly, bad, opengl), gst-ffmpeg, gst-python and others. > Could build it on my Linux workstation. Now, since gstreamer is a > framework, do we have any example applications to test? How should one > approach bug fixing or feature enhancements? Each module runs the module testsuite when doing "make check". Also http://gstreamer.freedesktop.org/apps/ lists applications that use gstreamer. If you are running a GNOME desktop, e.g. Totem and Rhythmbox use it.
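For the record, a typical first-build flow for a module looks roughly like this (a sketch; the anongit URL and autotools steps are as I remember them, and each gst-plugins-xxx module works the same way):

```
git clone git://anongit.freedesktop.org/gstreamer/gstreamer
cd gstreamer
./autogen.sh
make
make check    # runs the module's unit testsuite
```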
Bugs and feature requests are tracked in https://bugzilla.gnome.org/buglist.cgi?quicksearch=product%3A%22GStreamer%22+ Stefan > > ~Chinmaya From lijinsyam at gmail.com Thu Sep 23 17:16:58 2010 From: lijinsyam at gmail.com (liJin) Date: Thu, 23 Sep 2010 20:46:58 +0530 Subject: [gst-devel] Using gst_object_unref Message-ID: Hi.. I'm a bit confused about the "Hello World" example in the gstreamer manual. I can see a pipeline and some elements are created in code. In the end the pipeline is unrefed, using the method gst_object_unref(pipeline). But what about the elements we created...? They are still there, and the reference count of the elements doesn't seem to be decremented.... In chapter 5.2, it says we need to use gst_object_unref for elements also. Regards, Lijin -------------- next part -------------- An HTML attachment was scrubbed... URL: From t.i.m at zen.co.uk Thu Sep 23 17:36:14 2010 From: t.i.m at zen.co.uk (Tim-Philipp Müller) Date: Thu, 23 Sep 2010 16:36:14 +0100 Subject: [gst-devel] Using gst_object_unref In-Reply-To: References: Message-ID: <1285256174.5374.2.camel@zingle> On Thu, 2010-09-23 at 20:46 +0530, liJin wrote: > I'm a bit confused about the "Hello World" example in the gstreamer > manual. I can see a pipeline and some elements are created in code. > In the end the pipeline is unrefed, using the method > gst_object_unref(pipeline). But what about the elements we > created...? They are still there, and the reference count of the > elements doesn't seem to be decremented....
> > In chapter 5.2, it says we need to use gst_object_unref for > elements also. if you use gst_bin_add (bin_or_pipeline, element); then the bin/pipeline will take ownership of the element (if it is a newly-created element) and will take care of freeing it when the pipeline is freed. Also see http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstObject.html#GstObject.description Cheers -Tim From lijinsyam at gmail.com Thu Sep 23 17:56:29 2010 From: lijinsyam at gmail.com (liJin) Date: Thu, 23 Sep 2010 21:26:29 +0530 Subject: [gst-devel] Using gst_object_unref In-Reply-To: References: <1285256174.5374.2.camel@zingle> Message-ID: Great ...ThankS... On Thu, Sep 23, 2010 at 9:23 PM, liJin wrote: > Great ...ThankS...[?] > > > On Thu, Sep 23, 2010 at 9:06 PM, Tim-Philipp Müller wrote: > >> On Thu, 2010-09-23 at 20:46 +0530, liJin wrote: >> >> > I'm a bit confused about the "Hello World" example in the gstreamer >> > manual. I can see a pipeline and some elements are created in code. >> > In the end the pipeline is unrefed, using the method >> > gst_object_unref(pipeline). But what about the elements we >> > created...? They are still there, and the reference count of the >> > elements doesn't seem to be decremented.... >> > >> > In chapter 5.2, it says we need to use gst_object_unref for >> > elements also. >> >> if you use >> >> gst_bin_add (bin_or_pipeline, element); >> >> then the bin/pipeline will take ownership of the element (if it is a >> newly-created element) and will take care of freeing it when the >> pipeline is freed. Also see >> >> >> http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstObject.html#GstObject.description >> >> >> Cheers >> -Tim >> >> >> >> ------------------------------------------------------------------------------ >> Nokia and AT&T present the 2010 Calling All Innovators-North America >> contest >> Create new apps & games for the Nokia N8 for consumers in U.S.
and Canada >> $10 million total in prizes - $4M cash, 500 devices, nearly $6M in >> marketing >> Develop with Nokia Qt SDK, Web Runtime, or Java and Publish to Ovi Store >> http://p.sf.net/sfu/nokia-dev2dev >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/png Size: 569 bytes Desc: not available URL: From chmario at hotmail.com Thu Sep 23 20:20:56 2010 From: chmario at hotmail.com (chmario) Date: Thu, 23 Sep 2010 11:20:56 -0700 (PDT) Subject: [gst-devel] How to change window title of a autovideosink plugin? Message-ID: <1285266056120-2552492.post@n4.nabble.com> Does anyone know how to change the window title of a window created with the autovideosink plugin, or any other plugin that allows this? Regards Mario ----- MChC -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/How-to-change-window-title-of-a-autovideosink-plugin-tp2552492p2552492.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From rpavithra.87 at gmail.com Fri Sep 24 06:10:32 2010 From: rpavithra.87 at gmail.com (rpavithra.87) Date: Thu, 23 Sep 2010 21:10:32 -0700 (PDT) Subject: [gst-devel] Download mode in Gstreamer Message-ID: <1285301432671-2553020.post@n4.nabble.com> I am a little bit confused about the streaming and download modes in GStreamer. In case of streaming mode, the buffering percent is calculated with respect to the temporary buffer. 1) In case of download mode, is the buffering percent calculated with respect to the temp file or the temporary buffer? 2) If there is a temporary buffer in this case, how does the data go from the buffer to the file? Actually, in part-buffering.txt they mention the temp buffer.
Please clarify. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Download-mode-in-Gstreamer-tp2553020p2553020.html Sent from the GStreamer-devel mailing list archive at Nabble.com. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ensonic at hora-obscura.de Fri Sep 24 08:55:37 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Fri, 24 Sep 2010 09:55:37 +0300 Subject: [gst-devel] How to change window title of a autovideosink plugin? In-Reply-To: <1285266056120-2552492.post@n4.nabble.com> References: <1285266056120-2552492.post@n4.nabble.com> Message-ID: <4C9C4B69.9030903@hora-obscura.de> hi, On 23.09.2010 21:20, chmario wrote: > > Does anyone know how to change the window title of a window created with > the autovideosink plugin, or any other plugin that allows this? The window shows the title from the video, if any. Just embed the video output in your own UI (see the GstXOverlay docs) and set whatever window title you like. Stefan > > Regards > Mario > > ----- > MChC From rpavithra.87 at gmail.com Fri Sep 24 09:36:31 2010 From: rpavithra.87 at gmail.com (rpavithra.87) Date: Fri, 24 Sep 2010 00:36:31 -0700 (PDT) Subject: [gst-devel] how to flush a gstreamer pipeline? Message-ID: <1285313791321-2553146.post@n4.nabble.com> Please tell me how to flush a gstreamer pipeline. I am using appsrc in my pipeline. -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/how-to-flush-a-gstreamer-pipeline-tp2553146p2553146.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From cneerf2m at inf.h-brs.de Fri Sep 24 15:49:15 2010 From: cneerf2m at inf.h-brs.de (cneerf2m at inf.h-brs.de) Date: Fri, 24 Sep 2010 15:49:15 +0200 Subject: [gst-devel] rtsp server MP2T-ES, vlc and mplayer Message-ID: <201009241549.15179.cneerf2m@inf.h-brs.de> Hi, I tried to use the gst-rtsp-server to stream the video of a camera to vlc, mplayer and my vdr.
I succeeded in streaming the video to the totem player using the gstreamer backend, but not to any other players. Those players all reported: "video/MP2T-ES failed, RTP payload format unknown" BTW all the players use the mediaLive library, so this might be an issue of this library and not gstreamer. I'm not an expert in media streaming. The 'rtpmp2tpay' plugin reports MP2T-ES as the encoding type for the rtp payload. This information is used by the gst-rtsp-server to create the sdp information. So the sdp contains the line 'a=rtpmap:33 MP2T-ES/90000' which is parsed by mediaLive. But mediaLive does not know how to handle MP2T-ES and closes the connection. I changed the 'rtpmp2tpay' plugin to report MP2T as encoding type and not MP2T-ES, and now everything works just fine. mplayer, vlc and the freeboxtv plugin for vdr can show the stream. Regards, Chris From wmiller at sdr.com Fri Sep 24 17:44:18 2010 From: wmiller at sdr.com (Wes Miller) Date: Fri, 24 Sep 2010 08:44:18 -0700 (PDT) Subject: [gst-devel] Where can I get x264enc, x264dec, rtph264pay and depay? Message-ID: <1285343058605-2553663.post@n4.nabble.com> I would like to play with x264enc and x264dec and rtph264pay and --depay. What package are these in? Is there an Ubuntu prebuilt apt-get-able package? I have the gstreamer development PPA in my apt-get sources but no joy finding them. Or do I just need to build something? Wes -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Where-can-I-get-x264enc-x264dec-rtph264pay-and-depay-tp2553663p2553663.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From t.i.m at zen.co.uk Fri Sep 24 18:00:08 2010 From: t.i.m at zen.co.uk (Tim-Philipp Müller) Date: Fri, 24 Sep 2010 17:00:08 +0100 Subject: [gst-devel] Where can I get x264enc, x264dec, rtph264pay and depay?
In-Reply-To: <1285343058605-2553663.post@n4.nabble.com> References: <1285343058605-2553663.post@n4.nabble.com> Message-ID: <1285344008.27127.5.camel@zingle> On Fri, 2010-09-24 at 08:44 -0700, Wes Miller wrote: > I would like to play with x264enc and x264dec and rtph264pay and --depay. > > What package are these in? Honestly, have you even *tried* looking for them? You could start with http://gstreamer.freedesktop.org -> Documentation -> "Overview of all Plug-ins" There's no x264dec, but there's an ffdec_h264. Cheers -Tim From chmario at hotmail.com Fri Sep 24 19:17:20 2010 From: chmario at hotmail.com (chmario) Date: Fri, 24 Sep 2010 10:17:20 -0700 (PDT) Subject: [gst-devel] How to change window title of a autovideosink plugin? In-Reply-To: <4C9C4B69.9030903@hora-obscura.de> References: <1285266056120-2552492.post@n4.nabble.com> <4C9C4B69.9030903@hora-obscura.de> Message-ID: <1285348640366-2554373.post@n4.nabble.com> Where can I find the GstXOverlay docs? ----- MChC -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/How-to-change-window-title-of-a-autovideosink-plugin-tp2552492p2554373.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From chmario at hotmail.com Fri Sep 24 19:27:36 2010 From: chmario at hotmail.com (chmario) Date: Fri, 24 Sep 2010 10:27:36 -0700 (PDT) Subject: [gst-devel] how to get width and height with uridecodebin Message-ID: <1285349256460-2556729.post@n4.nabble.com> We are using uridecodebin to get and process video from an IP camera. The problem is that we are unable to obtain the width and height from the capabilities. Does anyone know how to obtain the width and height using uridecodebin, or can suggest another plugin for IP cameras? ----- MChC -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/how-to-get-width-and-height-with-uridecodebin-tp2556729p2556729.html Sent from the GStreamer-devel mailing list archive at Nabble.com.
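One way to read the size is from the negotiated caps of the pad uridecodebin creates — a sketch against the 0.10 C API (untested; the callback wiring and names are illustrative):

```c
/* Sketch: read width/height from a pad produced by uridecodebin (GStreamer 0.10).
 * Note: at "pad-added" time the caps may not be negotiated yet; if
 * gst_pad_get_negotiated_caps() returns NULL, watch "notify::caps" on the pad. */
static void
on_pad_added (GstElement *dec, GstPad *pad, gpointer user_data)
{
  GstCaps *caps = gst_pad_get_negotiated_caps (pad);

  if (caps != NULL) {
    const GstStructure *s = gst_caps_get_structure (caps, 0);
    gint width = 0, height = 0;

    /* only video caps carry width/height; audio pads will just fail here */
    if (gst_structure_get_int (s, "width", &width) &&
        gst_structure_get_int (s, "height", &height))
      g_print ("video size: %dx%d\n", width, height);

    gst_caps_unref (caps);
  }
}

/* after creating the element:
 *   g_signal_connect (uridecodebin, "pad-added",
 *                     G_CALLBACK (on_pad_added), NULL);
 */
```

For a quick check without writing code, `gst-launch-0.10 -v` also prints the negotiated caps, including width and height, as the pipeline prerolls.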
From baldur at medizza.com Fri Sep 24 19:39:36 2010 From: baldur at medizza.com (Baldur Gislason) Date: Fri, 24 Sep 2010 17:39:36 +0000 Subject: [gst-devel] Transcoding and otherwise dealing with streams. In-Reply-To: References: Message-ID: Ok. I have reached a point where the pipeline is functional and performs as expected. gst-launch-0.10 -v -m mpegtsmux name="mux" ! filesink location="/home/baldur/launch.ts" \ udpsrc multicast-group="239.192.192.1" port=1234 ! mpegtsdemux name="d" \ { d. ! mpeg2dec ! x264enc key-int-max=50 ip-factor=0.71 analyse=0x113 me=0 subme=1 bframes=3 trellis=0 interlaced=1 ! queue max-size-time=0 max-size-buffers=0 ! mux. } \ { d. ! queue max-size-time=0 max-size-buffers=0 ! mux. } It seems I was just running out of queue space to synchronise the outputs, which did not happen in the same manner with decodebin. Baldur -------------- next part -------------- An HTML attachment was scrubbed... URL: From 123sandy at gmail.com Sat Sep 25 04:57:19 2010 From: 123sandy at gmail.com (Sandeep Prakash) Date: Fri, 24 Sep 2010 19:57:19 -0700 (PDT) Subject: [gst-devel] JPEG, x264enc -> UDP In-Reply-To: <1285368996352-2614089.post@n4.nabble.com> References: <1285368996352-2614089.post@n4.nabble.com> Message-ID: <1285383439603-2659164.post@n4.nabble.com> Hi Pedro, Pedro.henrique wrote: > > gst-launch souphttpsrc location ="http://teste.dvrdns.org:999/CAM:2" ! > jpegdec ! x264dec ! udpsink host = . . . port = . . . > > Unexpected x264 header > Pipeline is wrong. You need to use an x264enc and not an x264dec (which is non-existent, FYI). Your pipeline will not even link. Are you really trying out these pipelines? Pedro.henrique wrote: > > gst-launch souphttpsrc location ="http://teste.dvrdns.org:999/CAM:2" ! > jpegdec ! x264dec ! rtph264pay ! udpsink host = . . . port = . . . > > Unexpected x264 header > Same here. Correct the pipeline.
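A corrected shape would be something like the following sketch (untested; host and port are placeholders, and whether you need ffmpegcolorspace depends on what jpegdec outputs for your camera):

```
gst-launch souphttpsrc location="http://teste.dvrdns.org:999/CAM:2" \
    ! jpegdec ! ffmpegcolorspace \
    ! x264enc ! rtph264pay \
    ! udpsink host=<receiver-ip> port=<port>
```

The point is the direction of the elements: jpegdec decodes the camera's JPEG frames to raw video, x264enc re-encodes them to H.264, and rtph264pay packetizes for UDP.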
Regards, Sandeep Prakash http://sandeepprakash.homeip.net -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/JPEG-x264enc-UDP-tp2614089p2659164.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From halley.zhao at intel.com Sat Sep 25 09:33:21 2010 From: halley.zhao at intel.com (Zhao, Halley) Date: Sat, 25 Sep 2010 15:33:21 +0800 Subject: [gst-devel] some issues when trying to save content to disk during http progressive downloaded In-Reply-To: References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> <4C8A6A62.5000808@hora-obscura.de> <5D8008F58939784290FAB48F549751982C55A6F5FF@shsmsx502.ccr.corp.intel.com> <5D8008F58939784290FAB48F549751982C55D64720@shsmsx502.ccr.corp.intel.com> Message-ID: <5D8008F58939784290FAB48F549751982CA9F73B56@shsmsx502.ccr.corp.intel.com> However, I think of it in another way, the way most bittorrent tools do. At the start of the progressive download, a dummy file is created on disk; when there is a seek operation requested by the parser, souphttpsrc will write that chunk of data to the file at the corresponding offset as well. Then I will get the entire file when playback finishes. From: Marco Ballesio [mailto:gibrovacco at gmail.com] Sent: Monday, September 20, 2010 1:54 PM To: Discussion of the development of GStreamer Subject: Re: [gst-devel] some issues when trying to save content to disk during http progressive downloaded Hi, 2010/9/20 Zhao, Halley > Your suggestion may be helpful, But I expect a solution that needn't care about demux/mux, because all these data are passed through souphttpsrc; saving the data from souphttpsrc shouldn't care about mux/demux.
This data will not be transferred again at the end of the playback, so *in this case* you can't consider souphttpsrc as just a mere data pipe through which you get the complete clip. That said, you have many ways to address this: - You can (try and) use the "moov-recovery-file" option, transmuxing the file after having saved it. It will restore the missing meta-info. - You can re-mux the file on the fly while you're getting it from souphttpsrc. Again, it will rebuild the lost meta-info. - You can use only progressive-download compliant files: they will have all the meta-information stored at the beginning and no seek will be needed. Regards From: Marco Ballesio [mailto:gibrovacco at gmail.com] Sent: Sunday, September 19, 2010 12:53 AM To: Discussion of the development of GStreamer Subject: Re: [gst-devel] some issues when trying to save content to disk during http progressive downloaded Hi, 2010/9/13 Zhao, Halley > Thanks Stefan. After adding a 'queue' after 'souphttpsrc' and using 'decodebin2', I still got the same result. I think the possible solution is to enhance souphttpsrc to save content to disk after some refactoring, because souphttpsrc does some seeking following the commands of the parser. Attached mp4.log is the log of souphttpsrc; it seeks to the end of the mp4 file at the beginning of playback. Finally, the tail of the original mp4 file is missing from the downloaded mp4 file. It looks like you're not re-muxing the content. Are you simply storing the raw mp4 data to a file or are you using a muxer before the filesink? What does mp4info tell you about your output file? You may try and recover the saved files with mp4mux using the option "moov-recovery-file". What happens if you transmux the files using it?
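As a sketch of the recovery-file idea (the moov-recovery-file property does exist on mp4mux/qtmux; the rest of the pipeline and the paths are illustrative, untested):

```
# while muxing, keep moov recovery data on the side:
gst-launch-0.10 videotestsrc num-buffers=300 ! x264enc \
    ! mp4mux moov-recovery-file=/tmp/moov.rec ! filesink location=out.mp4
```

If muxing is interrupted before the moov atom is written, the data in /tmp/moov.rec can later be used to reconstruct it for the truncated out.mp4.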
Regards halley at halley-lucid:~/swap/streaming/mp4$ ls -l total 5216 -rwxr--r-- 1 halley halley 1776915 2010-09-08 23:08 download.mp4 -rw-r--r-- 1 halley halley 1773281 2010-09-08 18:15 original.mp4 -----Original Message----- From: Stefan Kost [mailto:ensonic at hora-obscura.de] Sent: 2010-09-11 1:27 To: Discussion of the development of GStreamer Cc: Zhao, Halley Subject: Re: [gst-devel] some issues when trying to save content to disk during http progressive downloaded On 08.09.2010 04:45, Zhao, Halley wrote: > During playback of progressive content, I tried to save the content to disk as well. > > But the result is strange: > > Some contents are saved correctly, some contents are saved but can't playback > again; some contents even can't playback during progressive download. > > > > ## most ogg contents work well, the saved contents can playback again > > gst-launch-0.10 souphttpsrc > location=http://10.238.37.11/share/media/video/test.ogv ! tee name=t ! decodebin > ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.ogv > > > > ## some mp4 saved contents can't playback again, the saved contents differ from > the original one; even the following test.mp4 and test2.mp4 are different > > gst-launch-0.10 souphttpsrc location=http:// > 10.238.37.11/share/media/video/test.mp4 ! tee name=t ! decodebin ! > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 > > gst-launch-0.10 souphttpsrc location=http:// > 10.238.37.11/share/media/video/test.mp4 ! filesink > location=/home/halley/swap/streaming/test2.mp4 > At first use decodebin2! If the http source is seekable, the demuxer in decodebin will do pull. You could try: gst-launch-0.10 souphttpsrc location=http://10.238.37.11/share/media/video/test.mp4 ! queue ! tee name=t ! decodebin2 ! ffmpegcolorspace ! xvimagesink t. ! queue !
filesink location=test.mp4 Stefan > > > ## some wmv contents even can't playback during progressive download (though > some saved wmv contents can playback again) > > gst-launch-0.10 -v -v souphttpsrc location=http:// > 10.238.37.11/share/media/test.wmv ! tee name=t ! queue ! decodebin ! > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.wmv > > > > thanks in advance for your help. > > > > > > *ZHAO, Halley (Aihua)* > > Email: halley.zhao at intel.com > > > Tel: +86(21)61166476 iNet: 8821-6476 > > SSG/OTC/Moblin 3W038 Pole: F4 > > > > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > > > > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
-------------- next part -------------- An HTML attachment was scrubbed... URL: From gibrovacco at gmail.com Sat Sep 25 12:27:35 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Sat, 25 Sep 2010 13:27:35 +0300 Subject: [gst-devel] some issues when trying to save content to disk during http progressive downloaded In-Reply-To: <5D8008F58939784290FAB48F549751982CA9F73B56@shsmsx502.ccr.corp.intel.com> References: <5D8008F58939784290FAB48F54975198278A379D62@shsmsx502.ccr.corp.intel.com> <4C8A6A62.5000808@hora-obscura.de> <5D8008F58939784290FAB48F549751982C55A6F5FF@shsmsx502.ccr.corp.intel.com> <5D8008F58939784290FAB48F549751982C55D64720@shsmsx502.ccr.corp.intel.com> <5D8008F58939784290FAB48F549751982CA9F73B56@shsmsx502.ccr.corp.intel.com> Message-ID: Hi, 2010/9/25 Zhao, Halley > However, I think of it in another way, the way most bittorrent tools > do. > > At the start of the progressive download, a dummy file is created on disk; > when there is a seek operation requested by the parser, souphttpsrc will write > that chunk of data to the file at the corresponding offset as well. > > Then I will get the entire file when playback finishes. > > I agree such an approach may be useful for a few cases where you want to save the file while playing it, but it can bring really bad performance in some others: think e.g. about an embedded device using a slow flash as mass storage device. Caching everything on a file may either not be possible because of the small storage these devices usually have or, if possible, it would be really slow, at the risk of saturating the flash chip bandwidth when e.g. both reading the file and swapping. Btw I'd still like to have somebody prototyping an (optional) behaviour like the one you're suggesting.
Adding a way to store to file all the contents httpsrc reads, the way you point out, may be really interesting at least for debugging purposes. Unfortunately, I have the feeling that getting it accepted in the "official" souphttpsrc would be a completely separate beast. Writing a custom element (such as an evolved tee) may help to work around the issue. Regards > > > > *From:* Marco Ballesio [mailto:gibrovacco at gmail.com] > *Sent:* Monday, September 20, 2010 1:54 PM > > *To:* Discussion of the development of GStreamer > *Subject:* Re: [gst-devel] some issues when trying to save content to disk > during http progressive downloaded > > > > Hi, > > 2010/9/20 Zhao, Halley > > Your suggestion may be helpful, > > But I expect a solution that needn't care about demux/mux, because all these > data are passed through souphttpsrc; saving the data from souphttpsrc > shouldn't care about mux/demux. > > > As you wrote: > > "it seeks to the end of the mp4 file at the beginning of playback." > > the seek operation is performed from the demuxer (qtdemux), which > identifies essential meta-data present at the end of the file. This data > will not be transferred again at the end of the playback, so *in this case* > you can't consider souphttpsrc as just a mere data pipe through which you > get the complete clip. > > That said, you have many ways to address this: > > - You can (try and) use the "moov-recovery-file" option, transmuxing the file after > having saved it. It will restore the missing meta-info. > - You can re-mux the file on the fly while you're getting it from > souphttpsrc. Again, it will rebuild the lost meta-info. > - You can use only progressive-download compliant files: they will have all > the meta-information stored at the beginning and no seek will be needed.
> > Regards > > > > > > > > *From:* Marco Ballesio [mailto:gibrovacco at gmail.com] > *Sent:* Sunday, September 19, 2010 12:53 AM > > > *To:* Discussion of the development of GStreamer > > *Subject:* Re: [gst-devel] some issues when trying to save content to disk > during http progressive downloaded > > > > Hi, > > 2010/9/13 Zhao, Halley > > Thanks Stefan. > After adding a 'queue' after 'souphttpsrc' and using 'decodebin2', I still got > the same result. > > I think the possible solution is to enhance souphttpsrc to save content to > disk after some refactoring, because souphttpsrc does some seeking following the > commands of the parser. > > Attached mp4.log is the log of souphttpsrc; it seeks to the end of the mp4 > file at the beginning of playback. Finally, the tail of the original mp4 file is > missing from the downloaded mp4 file. > > > It looks like you're not re-muxing the content. Are you simply storing the > raw mp4 data to a file or are you using a muxer before the filesink? What > does mp4info tell you about your output file? > > You may try and recover the saved files with mp4mux using the option > "moov-recovery-file". What happens if you transmux the files using it? > > Regards > > > halley at halley-lucid:~/swap/streaming/mp4$ ls -l > total 5216 > -rwxr--r-- 1 halley halley 1776915 2010-09-08 23:08 download.mp4 > -rw-r--r-- 1 halley halley 1773281 2010-09-08 18:15 original.mp4 > > > -----Original Message----- > From: Stefan Kost [mailto:ensonic at hora-obscura.de] > Sent: 2010-09-11 1:27 > To: Discussion of the development of GStreamer > > Cc: Zhao, Halley > Subject: Re: [gst-devel] some issues when trying to save content to disk > during http progressive downloaded > > On 08.09.2010 04:45, Zhao, Halley wrote: > > During playback of progressive content, I tried to save the content to > disk as well.
> > > > But the result is strange: > > > > Some contents are saved correctly, some contents are saved but can?t > playback > > again; some contents even can?t playback during progressive downloaded. > > > > > > > > ## most ogg contents work well, the saved contents can playback again > > > > gst-launch-0.10 souphttpsrc > > location=http://10.238.37.11/share/media/video/test.ogv ! tee name=t ! > decodebin > > ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.ogv > > > > > > > > ## some mp4 saved contents can?t playback again, the saved contents > differ from > > the original one; even the following test.mp4 and test2.mp4 are different > > > > gst-launch-0.10 souphttpsrc location=http:// > > 10.238.37.11/share/media/video/test.mp4 ! tee name=t ! decodebin ! > > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.mp4 > > > > gst-launch-0.10 souphttpsrc location=http:// > > 10.238.37.11/share/media/video/test.mp4 ! filesink > > location=/home/halley/swap/streaming/test2.mp4 > > > > At first use decodebin2! > > If the http source is seekable, the muxer in decodebin will do pull. You > could try: > > gst-launch-0.10 souphttpsrc > location=http://10.238.37.11/share/media/video/test.mp4 ! queue ! tee > name=t ! > decodebin2 ! ffmpegcolorspace ! xvimagesink t. ! queue ! filesink > location=test.mp4 > > Stefan > > > > > > > ## some wmv contents even can?t playback during progressive downloaded > (though > > some saved wmv contents can playback again) > > > > gst-launch-0.10 -v -v souphttpsrc location=http:// > > 10.238.37.11/share/media/test.wmv ! tee name=t ! queue ! decodebin ! > > ffmpegcolorspace ! xvimagesink t. ! queue ! filesink location=test.wmv > > > > > > > > thanks in advance for your help. 
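When a saved copy comes out different from the original, a quick debugging sketch along these lines can narrow things down (paths are placeholders, matching the download.mp4/original.mp4 listing above):

```shell
# Compare the raw bytes first; a size/checksum mismatch already explains a
# broken copy (the mp4.log case showed the tail of the file went missing).
md5sum original.mp4 download.mp4

# Then let qtdemux try to parse the saved copy locally; a parse failure
# points at missing or incomplete moov metadata rather than a transfer bug.
gst-launch-0.10 -v filesrc location=download.mp4 ! qtdemux ! fakesink
```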
> > *ZHAO, Halley (Aihua)*
> > Email: halley.zhao at intel.com
> > Tel: +86(21)61166476 iNet: 8821-6476
> > SSG/OTC/Moblin 3W038 Pole: F4
>
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

From sledgehammer_999 at hotmail.com Sun Sep 26 13:21:49 2010
From: sledgehammer_999 at hotmail.com (sledge hammer)
Date: Sun, 26 Sep 2010 14:21:49 +0300
Subject: [gst-devel] Pipeline with multiple filesources is stalled.
Message-ID:

Do I need to do something special to "play" a pipeline with multiple filesrc's elements? I have a filesrc that loads an AVI file and another one that loads an SRT file (external subtitle). Let's say I decode the AVI file, plug a textoverlay before the videosink and then decode and plug the SRT file into the textoverlay. Should this be enough for the pipeline to start playing or do I need something that syncs the 2 (or more) sources?

Note: I insert the appropriate 'queue' elements after the 'avidemux' element and after the 'subparse' element.

From parveen.jain at one97.net Mon Sep 27 09:34:40 2010
From: parveen.jain at one97.net (Parveen Kumar Jain)
Date: Mon, 27 Sep 2010 13:04:40 +0530
Subject: [gst-devel] Recieve DTMF using GStreamer
In-Reply-To: <1283930807.2483.0.camel@TesterTop4>
References: <1283929717386-2530905.post@n4.nabble.com> <1283930807.2483.0.camel@TesterTop4>
Message-ID:

Hi Oliver/Anyone,

Can you please provide me any sample application or "pipeline example" for using "rtpdtmfdepay". I was trying to create a pipeline using "udpsrc" and then linked it to "rtpdtmfdepay". But somehow it is not working. The call back function is not able to receive any event from given Bus. Can anyone provide any clue what wrong with this ?
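For reference, an untested sketch of an rtpdtmfdepay pipeline: udpsrc cannot guess RTP caps on its own, so they must be set explicitly — without them the depayloader never gets usable data, which matches the "no events arrive" symptom. The port, payload number (101) and clock-rate here are assumptions and must match whatever the sender emits.

```shell
# -m prints bus messages on the console, so if RTP DTMF packets are flowing
# the depayloader's bus messages should become visible there, in addition to
# the synthesised tones reaching the audio sink.
gst-launch-0.10 -m udpsrc port=5004 \
    caps="application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)TELEPHONE-EVENT,payload=(int)101" \
    ! rtpdtmfdepay ! audioconvert ! autoaudiosink
```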
Regards, Parveen Jain 2010/9/8 Olivier Cr?te > On Wed, 2010-09-08 at 00:08 -0700, newbie wrote: > > This is my first post in this mailing list. i have already searched the > > archive to find out the answer of my question. But i did not find it. > > I need to know that can we use GStreamer to receive DTMF from peer. i > have > > seen the plug ins to send DTMF to peer but not able to find the plug ins > to > > receive DTMF. > > If this is possible to receive DTMF using GStreamer then please share the > > solution for the same. > > DTMF can arrive in two forms, sound or RTP events. You can receive RTP > events with "rtpdtmfdepay". There is also "dtmfdetect" which will try to > detect DTMF in the incoming sound (your mileage may vary). > > -- > Olivier Cr?te > olivier.crete at collabora.co.uk > > > ------------------------------------------------------------------------------ > This SF.net Dev2Dev email is sponsored by: > > Show off your parallel programming skills. > Enter the Intel(R) Threading Challenge 2010. > http://p.sf.net/sfu/intel-thread-sfd > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vz-gstreamer at zeitlins.org Mon Sep 27 15:30:54 2010 From: vz-gstreamer at zeitlins.org (Vadim Zeitlin) Date: Mon, 27 Sep 2010 15:30:54 +0200 Subject: [gst-devel] Using playbin to read a file being constantly updated? Message-ID: Hello, Can I use playbin to play back the contents of a file which is being updated while it's being played? I didn't find any mention of this in the documentation but the following simple test doesn't work for me: % gst-launch-0.10 videotestsrc ! theoraenc ! oggmux ! 
\ filesink location=/tmp/videotest.ogg & sleep 5; \ gst-launch-0.10 -v playbin uri=file:///tmp/videotest.ogg It starts playing well enough but sooner or later (after about 40 seconds of playback, i.e. much longer than 5 second initial delay) it stops with Got EOS from element "playbin0". message. Does anybody know why does this happen and, more importantly, if there is any way to avoid it? And if this can't be made to work, I'd like to know if anybody can advise me about a better way to achieve my goal, which is to play the output of v4l2src using playbin while also saving the stream to a file. I thought that just saving the web cam stream to a file and playing it would be the simplest way to do it but if there is a better solution I'd like to hear about it. Thanks in advance, VZ -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 196 bytes Desc: not available URL: From thiagossantos at gmail.com Mon Sep 27 16:17:33 2010 From: thiagossantos at gmail.com (thiagossantos at gmail.com) Date: Mon, 27 Sep 2010 11:17:33 -0300 Subject: [gst-devel] Pipeline with multiple filesources is stalled. In-Reply-To: References: Message-ID: 2010/9/26 sledge hammer > Do I need to do something special to "play" a pipeline with multiple > filesrc's elements? I have a filesrc that loads an AVI file and another one > that loads an SRT file (external subtitle). Let's say I decode the AVI file, > plug a textoverlay before the videosink and then decode and plug the SRT > file into the textoverlay. Should this be enough for the pipeline to start > playing or do I need something that syncs the 2(or more) sources? > Seems like it should work. What's your exact pipeline? Can you post a launch line of it? > > Note: I insert the appropriate 'queue' elements after the 'avidemux' > element and after the 'subparse' element. 
> > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Thiago Sousa Santos -------------- next part -------------- An HTML attachment was scrubbed... URL: From emmanuel at gnome.org Mon Sep 27 16:22:36 2010 From: emmanuel at gnome.org (Emmanuel Pacaud) Date: Mon, 27 Sep 2010 16:22:36 +0200 Subject: [gst-devel] Release of Aravis 0.1.2 Message-ID: <1285597357.2100.12.camel@lappc-p348> Hi, I've just released the third unstable version of Aravis, a LGPLv2+ gobject based library for the acquisition of digital camera video streams. http://ftp.gnome.org/pub/GNOME/sources/aravis/0.1/ http://blogs.gnome.org/emmanuel/category/aravis/ It includes: * Support for gigabit ethernet cameras * Support for a large subset of the Genicam interface * A simple API for easy camera control * A work-in-progress documentation * A simple gstreamer source element * Gobject-introspection support * An ethernet camera simulator The changes since the last release are: * Add exposure and gain settings to the gstreamer source element * fix exposure setting in ArvCamera for Basler cameras * gather running statistics for the GV devices * fix GV stream fixed buffer size * add a new arv-show-devices utility * make API more consistent with the Genicam standard As Aravis don?t have a bug report facility yet (it should happen soon, hopefully on bugzilla.gnome.org), please report any bug to me using this address: emmanuel at gnome org. Emmanuel. 
From bbock at digitalforcetech.com Mon Sep 27 16:47:29 2010 From: bbock at digitalforcetech.com (BillBock) Date: Mon, 27 Sep 2010 07:47:29 -0700 (PDT) Subject: [gst-devel] Stopping and restarting complicated pipeline with muxing In-Reply-To: <4C9A4C78.805@hora-obscura.de> References: <1285021415812-2547757.post@n4.nabble.com> <4C9A4C78.805@hora-obscura.de> Message-ID: <7344F642D7793040B6FB04A881A0D174012CA118@pigeon.dft.local> Stefan: Thanks for the response. I took a look at output-selector and input-selector. I haven?t tried using them yet, but they did point me in some directions I hadn?t considered before. For example, I?m now being much more careful to ensure that only the elements (especially sinks) that I want to see EOS signals get to see them. After making several changes, I?ve gotten video streaming, video capturing (with muxed audio), and image capturing all working simultaneously and successively. The next few steps are to get audio streaming and audio capturing working alongside the above. However, since I?m using an H.264 encoder for the video stuff, I found that I need to make sure that I store the first SPS/PPS frame so that I can apply it to future IDR frames when doing streaming after video capturing has already begun. This isn?t a GStreamer problem, although I will be solving it with GStreamer by modifying a plugin to do the frame modification work for me. Thanks, Bill Bock Software Engineer Digital Force Technologies 9455 Waples Street, Suite #100 San Diego, CA 92121 Phone: (858) 546-1244 Fax: (858) 597-1750 bbock at digitalforcetech.com www.digitalforcetech.com ****************************************************** CONFIDENTIALITY NOTICE: This E-Mail is intended only for the use of the individual or entity to whom it is addressed and may contain information that is privileged, confidential, and exempt from disclosure under applicable law. 
If you have received this communication in error, please do not distribute, and delete the original message. Please notify the sender by E-Mail at the address shown. Thank you for your compliance. From: Stefan Kost [via GStreamer-devel] [mailto:ml-node+2550857-789378914-161056 at n4.nabble.com] Sent: Wednesday, September 22, 2010 11:35 AM To: Bill Bock Subject: Re: Stopping and restarting complicated pipeline with muxing Am 21.09.2010 01:23, schrieb BillBock: > > All: > > I?m hoping my issue is some kind of basic misunderstanding of GStreamer > design philosophy. I have an application with a fairly complicated > pipeline. The pipeline has two sources, v4l2src and alsasrc. It can do > simultaneous video capture (without muxing audio), video streaming (using > tcpsink), individual frame capture, audio capture, and audio streaming. The > various branches of the pipeline all end at fakesinks. > > If I want to capture video, I block the src pads of v4l2src and alsasrc, > unlink stuff, insert an encoder element, a muxer element, and a filesink > element, link them to the pipeline, set them to playing, and unblock the src > pads. When I?m ready to stop capturing, I again block the src pads, unlink > and relink elements so that the encoder, muxer, and filesink are linked to > each other, but not to the rest of the elements in the pipeline, send an EOS > to the encoder, unlink and set the elements to null, remove them from the > pipeline, link stuff back to a fakesink, and unblock the src pads. I do > similar things with the other functions. Everything works great > (simultaneously and successively) without general stream errors or internal > data flow errors. I see nothing obviosly wrong. Just as a suggestion to avoid the pad-blocking and relinking bussiness, would using output-selector work for you? Stefan > > It?s when I try to do muxing that I have issues. 
It will usually work the > first time, except that when unblocking the src pads after reconfiguring the > pipeline to end in fakesinks again when done with capture, no more buffers > pass through the video source. > > If I strip down the pileline to simplify it and only do muxing of video and > audio (so, no tee elements at all in the pipeline, for example) then > successive capturing of muxed video and audio works. I?ve done up to 30 > captures in a row. Whenever I stop capturing in this case, I have to send > the EOS events on the video and and audio encoders before blocking the video > and audio src pads (so I also don?t unlink/relink stuff so that section of > the pipeline is off by itself). I always get a couple general stream errors > in this case, but everything continues to work afterwards and I can start > and stop again, over and over. > > I?m hoping for suggestions as to what I?m doing wrong. I?d be happy to > post code if that would help. ------------------------------------------------------------------------------ Start uncovering the many advantages of virtual appliances and start using them to simplify application deployment and accelerate your shift to cloud computing. http://p.sf.net/sfu/novell-sfdev2dev _______________________________________________ gstreamer-devel mailing list [hidden email] https://lists.sourceforge.net/lists/listinfo/gstreamer-devel ________________________________ View message @ http://gstreamer-devel.966125.n4.nabble.com/Stopping-and-restarting-complicated-pipeline-with-muxing-tp2547757p2550857.html To unsubscribe from Stopping and restarting complicated pipeline with muxing, click here . -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Stopping-and-restarting-complicated-pipeline-with-muxing-tp2547757p2715527.html Sent from the GStreamer-devel mailing list archive at Nabble.com. -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From ensonic at hora-obscura.de Mon Sep 27 19:14:22 2010
From: ensonic at hora-obscura.de (Stefan Kost)
Date: Mon, 27 Sep 2010 20:14:22 +0300
Subject: [gst-devel] How to change window title of a autovideosink plugin?
In-Reply-To: <1285348640366-2554373.post@n4.nabble.com>
References: <1285266056120-2552492.post@n4.nabble.com> <4C9C4B69.9030903@hora-obscura.de> <1285348640366-2554373.post@n4.nabble.com>
Message-ID: <4CA0D0EE.3090503@hora-obscura.de>

On 24.09.2010 20:17, chmario wrote:
> where can i find GstXOverlay documents?
> -----
> MChC

http://gstreamer.freedesktop.org/documentation/
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-libs/html/gst-plugins-base-libs-gstxoverlay.html

At least the first one is not so hard to find, right? I'd recommend installing devhelp and the doc packages under Linux.

Stefan

From jcai at tju.edu.cn Tue Sep 28 05:40:02 2010
From: jcai at tju.edu.cn (Cai Jing)
Date: Tue, 28 Sep 2010 11:40:02 +0800
Subject: [gst-devel] How to register gstreamer plugin on windows
Message-ID: <485645290.04364@tju.edu.cn>

Hi,

I am trying to develop a GStreamer plugin on Windows, and the new plugin does not seem to get registered successfully (it cannot be found via gst_registry_find_feature()), even after I copy the plugin into the plugin directory. So:

1. Does that mean my plugin creation is wrong (it seems not, since the code is the sample code (myelement) from the GStreamer release)? If so, how do I resolve it?
2. Does that mean its registration went wrong? If so, how do I resolve it? Is copying alone not enough? I remember that on Linux everything works just by "make install".

Many thanks
Jimmy

From sledgehammer_999 at hotmail.com Tue Sep 28 21:18:49 2010
From: sledgehammer_999 at hotmail.com (sledge hammer)
Date: Tue, 28 Sep 2010 22:18:49 +0300
Subject: [gst-devel] Pipeline with multiple filesources is stalled.
In-Reply-To: References: , Message-ID: I was building an example program that did the bare minimum to test the situation and to post it here if it showed the same behavior as my other code. My other code is part of a much larger project(a media player) which is yet to be released and it would be confusing for someone else to go through it in a few moments. Well, I was creating a **really** basic avi player to demonstrate my problem but for some reason I keep getting "Error: Internal data stream error.". I think I do everything right but I can't find the problem. Please take a look yourself. The code is here-> http://pastebin.com/YZs5garS After your answer, I thought that probably me and my code is at fault. I probably do something wrong somewhere. I thought that I forgot some element in the pipeline but no. I dumped the list of the elements in the pipeline and it looks right. Do you have any suggestions/directions for debugging? What should I look for? What could cause a stall like this? Notes: 1. Build it as c++ source(I use std::strings) 2. Update the 'avipath' global var to point to an AVI file with Xvid/Divx video and mp3 audio. 3. I didn't implement the loading of external subs yet, since I can't get it to load an avi file. 4. I didn't give much emphasis on memory management intentionally. To make it simpler. From: thiagossantos at gmail.com Date: Mon, 27 Sep 2010 11:17:33 -0300 To: gstreamer-devel at lists.sourceforge.net Subject: Re: [gst-devel] Pipeline with multiple filesources is stalled. 2010/9/26 sledge hammer Do I need to do something special to "play" a pipeline with multiple filesrc's elements? I have a filesrc that loads an AVI file and another one that loads an SRT file (external subtitle). Let's say I decode the AVI file, plug a textoverlay before the videosink and then decode and plug the SRT file into the textoverlay. Should this be enough for the pipeline to start playing or do I need something that syncs the 2(or more) sources? 
Seems like it should work. What's your exact pipeline? Can you post a launch line of it? Note: I insert the appropriate 'queue' elements after the 'avidemux' element and after the 'subparse' element. ------------------------------------------------------------------------------ Start uncovering the many advantages of virtual appliances and start using them to simplify application deployment and accelerate your shift to cloud computing. http://p.sf.net/sfu/novell-sfdev2dev _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel -- Thiago Sousa Santos ------------------------------------------------------------------------------ Start uncovering the many advantages of virtual appliances and start using them to simplify application deployment and accelerate your shift to cloud computing. http://p.sf.net/sfu/novell-sfdev2dev _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From thiagossantos at gmail.com Tue Sep 28 22:22:39 2010 From: thiagossantos at gmail.com (thiagossantos at gmail.com) Date: Tue, 28 Sep 2010 17:22:39 -0300 Subject: [gst-devel] Pipeline with multiple filesources is stalled. In-Reply-To: References: Message-ID: 2010/9/28 sledge hammer > I was building an example program that did the bare minimum to test the > situation and to post it here if it showed the same behavior as my other > code. > My other code is part of a much larger project(a media player) which is yet > to be released and it would be confusing for someone else to go through it > in a few moments. > Well, I was creating a **really** basic avi player to demonstrate my > problem but for some reason I keep getting "Error: Internal data stream > error.". 
I think I do everything right but I can't find the problem. Please > take a look yourself. The code is here-> http://pastebin.com/YZs5garS > > After your answer, I thought that probably me and my code is at fault. I > probably do something wrong somewhere. I thought that I forgot some element > in the pipeline but no. I dumped the list of the elements in the pipeline > and it looks right. Do you have any suggestions/directions for debugging? > What should I look for? What could cause a stall like this? > When receiving the error on the bus callback, print the debug part of the error too, it is likely that it will help. Also try running the application with GST_DEBUG=2 (at a minimum), to have a little more information about the problem. If that doesn't solve, a full log (GST_DEBUG=5) will lead you to the cause. Search for the error and go up on the log until you find the reason for it to happen. > > Notes: > 1. Build it as c++ source(I use std::strings) > 2. Update the 'avipath' global var to point to an AVI file with Xvid/Divx > video and mp3 audio. > 3. I didn't implement the loading of external subs yet, since I can't get > it to load an avi file. > 4. I didn't give much emphasis on memory management intentionally. To make > it simpler. > > > ------------------------------ > From: thiagossantos at gmail.com > Date: Mon, 27 Sep 2010 11:17:33 -0300 > To: gstreamer-devel at lists.sourceforge.net > Subject: Re: [gst-devel] Pipeline with multiple filesources is stalled. > > > > 2010/9/26 sledge hammer > > Do I need to do something special to "play" a pipeline with multiple > filesrc's elements? I have a filesrc that loads an AVI file and another one > that loads an SRT file (external subtitle). Let's say I decode the AVI file, > plug a textoverlay before the videosink and then decode and plug the SRT > file into the textoverlay. Should this be enough for the pipeline to start > playing or do I need something that syncs the 2(or more) sources? 
> > > Seems like it should work. What's your exact pipeline? Can you post a > launch line of it? > > > > Note: I insert the appropriate 'queue' elements after the 'avidemux' > element and after the 'subparse' element. > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > -- > Thiago Sousa Santos > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances and start using > them to simplify application deployment and accelerate your shift to cloud > computing. http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ gstreamer-devel mailing > list gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -- Thiago Sousa Santos -------------- next part -------------- An HTML attachment was scrubbed... URL: From 123sandy at gmail.com Wed Sep 29 06:49:53 2010 From: 123sandy at gmail.com (Sandeep Prakash) Date: Tue, 28 Sep 2010 21:49:53 -0700 (PDT) Subject: [gst-devel] Pipeline with multiple filesources is stalled. 
In-Reply-To: References: Message-ID: <1285735793580-2718318.post@n4.nabble.com> Hi One problem I found in the pad-added callback is: /*plug a queue in the pad and decided what to do according to the mimetype*/ queue = gst_element_factory_make("queue", NULL); sinkpad = gst_element_get_static_pad(queue, "sink"); gst_pad_link(pad, sinkpad); gst_object_unref(sinkpad); queue element has to be part of the bin for you to do a gst_pad_link. Here gst_pad_link will fail. /*plug a queue in the pad and decided what to do according to the mimetype*/ queue = gst_element_factory_make("queue", NULL); gst_bin_add(GST_BIN(pipeline), queue); sinkpad = gst_element_get_static_pad(queue, "sink"); gst_pad_link(pad, sinkpad); gst_object_unref(sinkpad); Next time better check all the return values. Regards, Sandeep Prakash http://sandeepprakash.homeip.net -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Pipeline-with-multiple-filesources-is-stalled-tp2714279p2718318.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From inaky at linux.intel.com Wed Sep 29 07:10:41 2010 From: inaky at linux.intel.com (Inaky Perez-Gonzalez) Date: Tue, 28 Sep 2010 22:10:41 -0700 Subject: [gst-devel] Stopping and restarting tcpclientsrc, what am I doing wrong? Message-ID: <1285737041.3136.344.camel@localhost.localdomain> Hi All I am working on a demo concept where I have a video streaming to a tcpserversink and a client reading with a tcpclientsrc. The idea is that an external agent can drive the tcpclientsrc connection from being playing to stopped and viceversa. The final intention is that if network connectivity changes, for example, it is lost, network manager can tell it to stop without killing the pipeline and when it comes back, it can re-starts. As well, if we switch from say wifi to ethernet, it just closes the socket and reopens it, with a "minimal" glitch in the middle. 
The attached [crude] example does more or less that, but using the standard input to feed the events (online/offline/reroute [which means online->offline->online]). The issue is that when I run it (type 'online'), the video starts streaming ok. When I stop it (type 'offline'), it stops as expected. However, if you type 'online' again, the pipeline doesn't restart. I added some tracing to the tcpclientsrc functions and I could see how the 'gst_tcp_client_src_create()' function was being called, but nothing is displayed and at some point it stopped being called. What am I doing wrong?

Thanks in advance!

-------------- next part --------------
A non-text attachment was scrubbed...
Name: demo.c
Type: text/x-csrc
Size: 6512 bytes
Desc: not available
URL:

From Arnab.Samanta at symphonysv.com Wed Sep 29 12:35:02 2010
From: Arnab.Samanta at symphonysv.com (Arnab Samanta)
Date: Wed, 29 Sep 2010 16:05:02 +0530
Subject: [gst-devel] Unable to build x64enc plugin from gst-plugins-ugly
In-Reply-To: References:
Message-ID: <0F5FCFC9A064004DA46F28981BB330C4012BEA611AEF@BLR2K7EXCL01.symphonysv.com>

Hello All,

I am unable to build the x264enc plugin from package gst-plugins-ugly-0.10.16. I have libx264.so present in /usr/local/lib. I am configuring the package as follows

./configure LDFLAGS=-L/usr/local/lib LIBS=-lx264 LIBS=-lm

Am I doing anything wrong ? Or are there any other dependencies for this plugin ? Please suggest. The log from the configure is as follows
---------------------------------------------
configure: *** checking feature: x264 plug-in ***
configure: *** for plug-ins: x264 ***
checking for x264_encoder_encode in -lx264... yes
checking x264.h usability... yes
checking x264.h presence... yes
checking for x264.h... yes
checking for uptodate x264 API version... no
configure: *** These plugins will not be built: x264
-------------------------------------------------

Regards,
Arnab

"This email and any files transmitted with it contain confidential, proprietary, privileged information of Symphony Services Corp (India) Pvt. Ltd. and are intended solely for the use of the recipient/s to whom it is addressed. Any unauthorized notifying, copying or distributing of this e-mail, directly or indirectly, and the contents therein in full or part is prohibited by any entity who is not a recipient. Any email received inadvertently or by mistake should be deleted by the entity who is not a recipient thereof.
You may be pleased to notify the sender immediately by email and the email should be deleted from your system". From tiagomatos at gmail.com Wed Sep 29 12:42:50 2010 From: tiagomatos at gmail.com (Rui Tiago Cação Matos) Date: Wed, 29 Sep 2010 11:42:50 +0100 Subject: [gst-devel] Unable to build x64enc plugin from gst-plugins-ugly In-Reply-To: <0F5FCFC9A064004DA46F28981BB330C4012BEA611AEF@BLR2K7EXCL01.symphonysv.com> References: <0F5FCFC9A064004DA46F28981BB330C4012BEA611AEF@BLR2K7EXCL01.symphonysv.com> Message-ID: On 29 September 2010 11:35, Arnab Samanta wrote: > The log from the configure is as follows > --------------------------------------------- > configure: *** checking feature: x264 plug-in *** > configure: *** for plug-ins: x264 *** > checking for x264_encoder_encode in -lx264... yes > checking x264.h usability... yes > checking x264.h presence... yes > checking for x264.h... yes > checking for uptodate x264 API version... no Pretty obvious, no? You need a newer x264 version. Rui From sledgehammer_999 at hotmail.com Wed Sep 29 15:06:24 2010 From: sledgehammer_999 at hotmail.com (sledge hammer) Date: Wed, 29 Sep 2010 16:06:24 +0300 Subject: [gst-devel] Pipeline with multiple filesources is stalled. In-Reply-To: <1285735793580-2718318.post@n4.nabble.com> References: , , , <1285735793580-2718318.post@n4.nabble.com> Message-ID: @Sandeep Prakash Yes, that was the problem. And it was pretty stupid too. @thiago You were right, it works fine in my example code. So I definitely do something wrong in my code. I will have a closer look into it. For anyone else who's interested, here is a link to example code that plays a DivX/XviD-mp3 AVI file with external SRT subs (compile as C++ source) --> http://pastebin.com/vTJh7FcM > Date: Tue, 28 Sep 2010 21:49:53 -0700 > From: 123sandy at gmail.com > To: gstreamer-devel at lists.sourceforge.net > Subject: Re: [gst-devel] Pipeline with multiple filesources is stalled. 
> > > Hi > > One problem I found in the pad-added callback is: > > /*plug a queue in the pad and decided what to do according to the mimetype*/ > queue = gst_element_factory_make("queue", NULL); > sinkpad = gst_element_get_static_pad(queue, "sink"); > gst_pad_link(pad, sinkpad); > gst_object_unref(sinkpad); > > queue element has to be part of the bin for you to do a gst_pad_link. Here > gst_pad_link will > fail. > > /*plug a queue in the pad and decided what to do according to the mimetype*/ > queue = gst_element_factory_make("queue", NULL); > gst_bin_add(GST_BIN(pipeline), queue); > sinkpad = gst_element_get_static_pad(queue, "sink"); > gst_pad_link(pad, sinkpad); > gst_object_unref(sinkpad); > > Next time better check all the return values. > > Regards, > Sandeep Prakash > http://sandeepprakash.homeip.net > -- > View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Pipeline-with-multiple-filesources-is-stalled-tp2714279p2718318.html > Sent from the GStreamer-devel mailing list archive at Nabble.com. > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From antoni.silvestre at gmail.com Wed Sep 29 16:16:47 2010 From: antoni.silvestre at gmail.com (=?ISO-8859-1?Q?Antoni_Silvestre_Padr=F3s?=) Date: Wed, 29 Sep 2010 16:16:47 +0200 Subject: [gst-devel] G722 in gstreamer Message-ID: Hello, I've been looking for it and I guess there isn't one but just in case I've missed it somehow I'll ask it here, is there any G.722 codec and payloader for gstreamer? 
Thanks, Toni Silvestre -------------- next part -------------- An HTML attachment was scrubbed... URL: From prybski at cs.cmu.edu Wed Sep 29 16:21:32 2010 From: prybski at cs.cmu.edu (Paul E. Rybski) Date: Wed, 29 Sep 2010 10:21:32 -0400 Subject: [gst-devel] receiving and displaying video streams in a non-GTK application Message-ID: <4CA34B6C.2010700@cs.cmu.edu> Hi, I've just recently discovered gstreamer and have been exploring its use for transmitting audio and video from one linux machine to another. So far I've only been exploring the use of gst-launch but now want to try to receive the video stream in a separate application rather than using xvimagesink. This is complicated by the fact that I'm using FLTK (http://www.fltk.org/) rather than GTK for the GUI. (The reasons for using FLTK are essentially historical and it's currently impractical for me to consider throwing away my old legacy GUI code and rewriting it from scratch in GTK.) I can see two different paths that I can try to follow to achieve my goal: 1) My first option is to try to integrate the gstreamer API directly into my fltk application. I've started to look at the documentation for how to encode gstreamer pipelines in a C application but one thing that currently escapes me is how I get access to the raw uncompressed frames of video at the end of the pipeline. The way I understand it, I should be able to encode my pipeline so that the application receives the video stream from a socket and decodes it (I'm using smokeenc) but then I'm completely unclear as to how I might copy the image into a buffer that I can feed into an FLTK widget for drawing. I'm also completely unclear how easy or difficult it would be to integrate the GTK main event loop with the FLTK main event loop as the gstreamer API seems to be heavily wedded to GTK. I have no experience programming with GTK at the moment either. 
2) My second option is to keep the client gst-launch command as it stands now but instead of piping the video to xvimagesink, I create a new local socket (or pipe) and shove the frames of video into those (perhaps encoded as jpegs) and then have my FLTK application receive the data from this pipe, decode each jpeg, and display it. This seems somewhat easier to achieve because then all I need to do is to figure out how the data is encoded into the socket so I can write the code to decode it. Any thoughts, advice, or experiences that people could share with this? I'd kind of like to do the first option because it's conceptually simpler for the end-user of my system but I'm concerned that I might end up needing to rewrite my entire GUI in GTK which I'd rather not have to do at this time. Here are the gst-launch commands that I'm using right now. Server: gst-launch-0.10 -vm oggmux name=mux ! filesink location=movie.ogg v4lsrc ! video/x-raw-yuv,width=320,height=240 ! tee name=t_vnet ! queue ! ffmpegcolorspace ! smokeenc qmin=1 qmax=50 ! udpsink port=5000 host=localhost sync=false t_vnet. ! queue ! videorate ! 'video/x-raw-yuv' ! theoraenc ! mux. alsasrc device=hw:0,0 ! audio/x-raw-int,rate=48000,channels=2,depth=16 ! tee name=t_anet ! queue ! audioconvert ! flacenc ! udpsink port=4000 host=localhost sync=false t_anet. !queue ! audioconvert ! vorbisenc ! mux. Client: gst-launch-0.10 -vm tee name=vid -vm udpsrc port=5000 ! smokedec ! xvimagesink vid. !tee name=aud udpsrc port=4000 ! flacdec ! audioconvert ! audioresample ! alsasink sync=false aud. I'm on Ubuntu 8.04 LTS 64-bit using the gstreamer packages that come with that distro. I've found that these commands also work for me on Ubuntu 10.4 LTS 64-bit. Thanks, -Paul -- Paul E. 
Rybski, Ph.D., Systems Scientist The Robotics Institute, Carnegie Mellon University Phone: 412-268-7417, Fax: 412-268-7350 Web: http://www.cs.cmu.edu/~prybski From sledgehammer_999 at hotmail.com Wed Sep 29 16:28:34 2010 From: sledgehammer_999 at hotmail.com (sledge hammer) Date: Wed, 29 Sep 2010 17:28:34 +0300 Subject: [gst-devel] Pipeline with multiple filesources is stalled. In-Reply-To: References: , , , , , , <1285735793580-2718318.post@n4.nabble.com>, Message-ID: I actually made the code reproduce my problem. I don't add the textoverlay at the decoding of the video stream. I add it later on the decoding of the sub file and shove it between the ffmpegcolorspace and xvimagesink. I suppose this forceful unlinking has something to do with it. New code-> http://pastebin.com/ZPmj810S What should I do? (I don't want to put unnecessary textoverlays if there's no subtitle file). From: sledgehammer_999 at hotmail.com To: gstreamer-devel at lists.sourceforge.net Date: Wed, 29 Sep 2010 16:06:24 +0300 Subject: Re: [gst-devel] Pipeline with multiple filesources is stalled. @Sandeep Prakash Yes, that was the problem. And it was pretty stupid too. @thiago You were right, it works fine in my example code. So I definetely do something wrong in my code. I will have a closer look into it. For anyone else whose interested here is a link to example code that plays a DivX/XviD-mp3 AVI file with external SRT subs(compile as C++ source)-->http://pastebin.com/vTJh7FcM > Date: Tue, 28 Sep 2010 21:49:53 -0700 > From: 123sandy at gmail.com > To: gstreamer-devel at lists.sourceforge.net > Subject: Re: [gst-devel] Pipeline with multiple filesources is stalled. 
> > > Hi > > One problem I found in the pad-added callback is: > > /*plug a queue in the pad and decided what to do according to the mimetype*/ > queue = gst_element_factory_make("queue", NULL); > sinkpad = gst_element_get_static_pad(queue, "sink"); > gst_pad_link(pad, sinkpad); > gst_object_unref(sinkpad); > > queue element has to be part of the bin for you to do a gst_pad_link. Here > gst_pad_link will > fail. > > /*plug a queue in the pad and decided what to do according to the mimetype*/ > queue = gst_element_factory_make("queue", NULL); > gst_bin_add(GST_BIN(pipeline), queue); > sinkpad = gst_element_get_static_pad(queue, "sink"); > gst_pad_link(pad, sinkpad); > gst_object_unref(sinkpad); > > Next time better check all the return values. > > Regards, > Sandeep Prakash > http://sandeepprakash.homeip.net > -- > View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Pipeline-with-multiple-filesources-is-stalled-tp2714279p2718318.html > Sent from the GStreamer-devel mailing list archive at Nabble.com. > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel ------------------------------------------------------------------------------ Start uncovering the many advantages of virtual appliances and start using them to simplify application deployment and accelerate your shift to cloud computing. 
http://p.sf.net/sfu/novell-sfdev2dev _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From antoni.silvestre at gmail.com Wed Sep 29 17:14:52 2010 From: antoni.silvestre at gmail.com (Antoni Silvestre Padrós) Date: Wed, 29 Sep 2010 17:14:52 +0200 Subject: [gst-devel] GST GL plugins for embedded devices Message-ID: Hello, can the gst-plugins-gl be compiled for embedded devices that only have egl and opengl es? When I run the configure script it looks like everything's fine, however when I compile it I see that the code has non-compatible opengl es directives like GL_QUADS. Maybe I should pass something to the configure script, or is it that it really needs regular openGL? Thanks, Toni Silvestre -------------- next part -------------- An HTML attachment was scrubbed... URL: From andreynech at googlemail.com Wed Sep 29 20:04:56 2010 From: andreynech at googlemail.com (Andrey Nechypurenko) Date: Wed, 29 Sep 2010 20:04:56 +0200 Subject: [gst-devel] receiving and displaying video streams in a non-GTK application In-Reply-To: <4CA34B6C.2010700@cs.cmu.edu> References: <4CA34B6C.2010700@cs.cmu.edu> Message-ID: Hi Paul, > 1) My first option is to try to integrate the gstreamer API directly > into my fltk application. I've started to look at the documentation for > how to encode gstreamer pipelines in a C application but one thing that > currently escapes me is how I get access to the raw uncompressed frames > of video at the end of the pipeline. The way I understand it, I should > be able to encode my pipeline so that the application receives the video > stream from a socket and decodes it (I'm using smokeenc) but then I'm > completely unclear as to how I might copy the image into a buffer that I > can feed into an FLTK widget for drawing. 
I would suggest taking a look at the appsink[1,2] and fakesink[3] elements. Somehow I feel like appsink is the preferred way. However, fakesink could be used as well with its hand-off mechanism. > I'm also completely unclear > how easy or difficult it would be to integrate the GTK main event loop > with the FLTK main event loop as the gstreamer API seems to be heavily > wedded to GTK. I have no experience programming with GTK at the moment > either. I think the simplest way would be to run the gtk (gstreamer) event loop in a separate thread. Using appsink or fakesink mentioned above, you will get access to the raw frames. Then you will need to implement a thread-safe mechanism to pass raw buffers from the gstreamer thread to your UI thread. For example, there is a set of examples on how to integrate Qt with gstreamer [4] where a similar technique is used. In particular, qglwtextureshare shows how to run the gstreamer event loop in a separate thread and interact with the Qt GUI (please note that this example uses a GL texture sharing mechanism instead of passing raw buffers through memory buffers). In addition, this example illustrates how to easily construct the pipeline in essentially the same way as with gst-launch using the gst_parse_launch() function. > 2) My second option is to keep the client gst-launch command as it > stands now but instead of piping the video to xvimagesink, I create a > new local socket (or pipe) I personally would not suggest such an approach because of its greater complexity compared to the first one. > Any thoughts, advice, or experiences that people could share with this? As I understand, you are working on robotics and remotely controlled vehicles. That is why it might be interesting for you to take a look at this project [6,7]. Here you can find a complete example of how to control a vehicle over wireless/internet. 
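The thread-safe hand-off described above (a GStreamer thread producing frames, a UI thread consuming them) can be sketched as a small mutex-guarded ring buffer that drops the oldest frame when the consumer falls behind. This is an illustrative sketch only, not code from the thread: the frame_queue_* names are invented, and a plain int stands in for a real frame buffer.

```c
#include <pthread.h>
#include <string.h>

#define RING_SIZE 4

typedef struct {
    int frames[RING_SIZE];      /* stand-in for raw frame buffers */
    int head, tail, count;
    pthread_mutex_t lock;
} frame_queue;

void frame_queue_init(frame_queue *q)
{
    memset(q, 0, sizeof *q);
    pthread_mutex_init(&q->lock, NULL);
}

/* Producer side (appsink/fakesink callback in the streaming thread):
 * if the UI is slow, drop the oldest frame rather than block. */
void frame_queue_push(frame_queue *q, int frame)
{
    pthread_mutex_lock(&q->lock);
    if (q->count == RING_SIZE) {            /* full: overwrite oldest */
        q->head = (q->head + 1) % RING_SIZE;
        q->count--;
    }
    q->frames[q->tail] = frame;
    q->tail = (q->tail + 1) % RING_SIZE;
    q->count++;
    pthread_mutex_unlock(&q->lock);
}

/* Consumer side (UI timer/idle callback): returns 0 if nothing new. */
int frame_queue_pop(frame_queue *q, int *frame)
{
    int got = 0;
    pthread_mutex_lock(&q->lock);
    if (q->count > 0) {
        *frame = q->frames[q->head];
        q->head = (q->head + 1) % RING_SIZE;
        q->count--;
        got = 1;
    }
    pthread_mutex_unlock(&q->lock);
    return got;
}
```

In this sketch, frame_queue_push() would be called from the GStreamer callback and frame_queue_pop() from an FLTK timer callback that redraws the widget whenever a new frame arrives.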
In particular there is gstreamer-based video capturing and encoding from the on-board camera, networking infrastructure to stream video/sensor data to the driver cockpit and transmit control signals back to the vehicle using Ice middleware [8]. In addition there is an SDL/OpenGL-based UI to display live video with hardware acceleration, which uses the appsink/fakesink/thread approach I mentioned above. Regards, Andrey. [1] http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-appsink.html [2] http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/tests/examples/app?id=e17b42181c2cbcc389f87a35539f7a1b07d3dd54 [3] http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-fakesink.html [4] http://cgit.freedesktop.org/gstreamer/gst-plugins-gl/tree/tests/examples/qt?id=fab824ea01f43c3fecaa2fed5e9e828774db5b24 [5] http://cgit.freedesktop.org/gstreamer/gst-plugins-gl/tree/tests/examples/qt/qglwtextureshare?id=fab824ea01f43c3fecaa2fed5e9e828774db5b24 [6] http://www.gitorious.org/veter/pages/Home [7] http://veter-project.blogspot.com/ [8] http://www.zeroc.com From julien.isorce at gmail.com Wed Sep 29 22:59:39 2010 From: julien.isorce at gmail.com (Julien Isorce) Date: Wed, 29 Sep 2010 22:59:39 +0200 Subject: [gst-devel] GST GL plugins for embedded devices In-Reply-To: References: Message-ID: Hi, 2010/9/29 Antoni Silvestre Padrós > Hello, can the gst-plugins-gl be compiled for embedded devices that only > have egl and opengl es? yes When I run the configure script it looks like everything's fine however when > I compile it I see that the code has non-compatible opengl es directives > like GL_QUADS. > This is not normal. I mean if ES is detected then it only compiles the code that is ES-compatible. Could you send us the log when running configure script and then the exact compile error. 
Maybe I should pass something to the configure script or is it that it > really needs to have regular openGL? > no Sincerely Julien > Thanks, > Toni Silvestre > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cristianurban86 at gmail.com Thu Sep 30 09:07:12 2010 From: cristianurban86 at gmail.com (cristiurban) Date: Thu, 30 Sep 2010 00:07:12 -0700 (PDT) Subject: [gst-devel] Video metadata Message-ID: <1285830432330-2720275.post@n4.nabble.com> Hello! I am trying to extract video metadata like: Film title, Main Actor, Director, Year, Genre, Languages, Playtime, Label Name, Film Cover. I have tried the id3demux element, but it doesn't work like it works for mp3 samples. Does anyone know how I can extract this metadata? -- View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Video-metadata-tp2720275p2720275.html Sent from the GStreamer-devel mailing list archive at Nabble.com. From Kaustubh.Raste at imgtec.com Thu Sep 30 09:10:39 2010 From: Kaustubh.Raste at imgtec.com (Kaustubh Raste) Date: Thu, 30 Sep 2010 12:40:39 +0530 Subject: [gst-devel] not able to export plugin Message-ID: Hi, I have built the gst plugin and want to use it with gstreamer. If I export the path where I have built the testplugin.so plugin to GST_PLUGIN_PATH, then gstreamer can find the plugin and use it. (build path is /home/test/build/bin/testplugin.so) I want to move the testplugin.so file to another folder. When I moved testplugin.so to the 'test' folder and exported GST_PLUGIN_PATH to 'test', gst-inspect gives this error: (gst-plugin-scanner:18118): GStreamer-WARNING **: Failed to load plugin '/home/test/testplugin.so': /home/test/testplugin.so: cannot open shared object file: No such file or directory. Can someone tell me if this is possible; if yes, then how can I do it? -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From antoni.silvestre at gmail.com Thu Sep 30 09:40:29 2010 From: antoni.silvestre at gmail.com (=?ISO-8859-1?Q?Antoni_Silvestre_Padr=F3s?=) Date: Thu, 30 Sep 2010 09:40:29 +0200 Subject: [gst-devel] GST GL plugins for embedded devices In-Reply-To: References: Message-ID: Hi, thanks a lot for your answer, here I'm attaching the log. Toni On Wed, Sep 29, 2010 at 22:59, Julien Isorce wrote: > Hi, > > 2010/9/29 Antoni Silvestre Padr?s > > Hello, can the gst-plugins-gl be compiled for embedded devices that only >> have egl and opengl es? > > > yes > > When I run the configure script it looks like everything's fine however >> when I compile it I see that the code has non-compatible opengl es >> directives like GL_QUADS. >> > > This is not normal. I mean if ES is detected then it only compiles the code > that is ES - compatible. > Could you send us the log when running configure script and then the exact > compile error. > > Maybe I should pass something to the configure script or is it that it >> really needs to have regular openGL? >> > > no > > Sincerely > Julien > > >> Thanks, >> Toni Silvestre >> > > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: config.log Type: text/x-log Size: 134920 bytes Desc: not available URL: From Kaustubh.Raste at imgtec.com Thu Sep 30 10:29:46 2010 From: Kaustubh.Raste at imgtec.com (Kaustubh Raste) Date: Thu, 30 Sep 2010 13:59:46 +0530 Subject: [gst-devel] not able to export plugin In-Reply-To: References: Message-ID: Found the problem. Now its working. -Kaustubh From: Kaustubh Raste [mailto:Kaustubh.Raste at imgtec.com] Sent: Thursday, September 30, 2010 12:41 PM To: gstreamer-devel at lists.sourceforge.net. Subject: [gst-devel] not able to export plugin Hi, I have built the gst plugin and want to use with gstreamer. If I export the path where I have built the testplugin.so plugin to GST_PLUGIN_PATH, then gstreamer can find the plugin and use it. (build path is /home/test/build/bin/testplugin.so) I want to move the testplugin.so file to other folder. When I moved testplugin.so to 'test' folder and exported the GST_PLUGIN_PATH to the 'test', then gst-inspect gives error as (gst-plugin-scanner:18118): GStreamer-WARNING **: Failed to load plugin '/home/test/testplugin.so': /home/test/testplugin.so: cannot open shared object file: No such file or directory. Can someone tell me if this is possible; if yes, then how can I do it. -------------- next part -------------- An HTML attachment was scrubbed... URL: From julien.isorce at gmail.com Thu Sep 30 12:47:59 2010 From: julien.isorce at gmail.com (Julien Isorce) Date: Thu, 30 Sep 2010 12:47:59 +0200 Subject: [gst-devel] GST GL plugins for embedded devices In-Reply-To: References: Message-ID: 2010/9/30 Antoni Silvestre Padr?s > Hi, thanks a lot for your answer, here I'm attaching the log. > > Toni > > On Wed, Sep 29, 2010 at 22:59, Julien Isorce wrote: > >> Hi, >> >> 2010/9/29 Antoni Silvestre Padr?s >> >> Hello, can the gst-plugins-gl be compiled for embedded devices that only >>> have egl and opengl es? 
>> >> >> yes >> >> When I run the configure script it looks like everything's fine however >>> when I compile it I see that the code has non-compatible opengl es >>> directives like GL_QUADS. >>> >> >> This is not normal. I mean if ES is detected then it only compiles the >> code that is ES - compatible. >> Could you send us the log when running configure script and then the exact >> compile error. >> > configure is ok. Compile error log now ? > >> Maybe I should pass something to the configure script or is it that it >>> really needs to have regular openGL? >>> >> >> no >> >> Sincerely >> Julien >> >> >>> Thanks, >>> Toni Silvestre >>> >> >> >> >> ------------------------------------------------------------------------------ >> Start uncovering the many advantages of virtual appliances >> and start using them to simplify application deployment and >> accelerate your shift to cloud computing. >> http://p.sf.net/sfu/novell-sfdev2dev >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> >> > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From wim.taymans at gmail.com Thu Sep 30 13:30:55 2010 From: wim.taymans at gmail.com (Wim Taymans) Date: Thu, 30 Sep 2010 13:30:55 +0200 Subject: [gst-devel] G722 in gstreamer In-Reply-To: References: Message-ID: <1285846255.2520.15.camel@metal> On Wed, 2010-09-29 at 16:16 +0200, Antoni Silvestre Padrós wrote: > Hello, > > > I've been looking for it and I guess there isn't one but just in case > I've missed it somehow I'll ask it here, is there any G.722 codec and > payloader for gstreamer? There is a G722 encoder/decoder in ffmpeg git so it could be added to gst-ffmpeg too. I'll commit a G722 payloader/depayloader soon. Wim > > > Thanks, > Toni Silvestre > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ gstreamer-devel mailing list gstreamer-devel at lists.sourceforge.net https://lists.sourceforge.net/lists/listinfo/gstreamer-devel From antoni.silvestre at gmail.com Thu Sep 30 13:46:00 2010 From: antoni.silvestre at gmail.com (Antoni Silvestre Padrós) Date: Thu, 30 Sep 2010 13:46:00 +0200 Subject: [gst-devel] G722 in gstreamer In-Reply-To: <1285846255.2520.15.camel@metal> References: <1285846255.2520.15.camel@metal> Message-ID: That's great news, thanks for the information. Toni On Thu, Sep 30, 2010 at 13:30, Wim Taymans wrote: > On Wed, 2010-09-29 at 16:16 +0200, Antoni Silvestre Padrós wrote: > > Hello, > > > > > > I've been looking for it and I guess there isn't one but just in case > > I've missed it somehow I'll ask it here, is there any G.722 codec and > > payloader for gstreamer? > > There is a G722 encoder/decoder in ffmpeg git so it could be added to > gst-ffmpeg too. I'll commit a G722 payloader/depayloader soon. > > Wim > > > > > > > Thanks, > > Toni Silvestre > > > ------------------------------------------------------------------------------ > > Start uncovering the many advantages of virtual appliances > > and start using them to simplify application deployment and > > accelerate your shift to cloud computing. 
> > http://p.sf.net/sfu/novell-sfdev2dev > > _______________________________________________ gstreamer-devel mailing > list gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From antoni.silvestre at gmail.com Thu Sep 30 14:05:39 2010 From: antoni.silvestre at gmail.com (=?ISO-8859-1?Q?Antoni_Silvestre_Padr=F3s?=) Date: Thu, 30 Sep 2010 14:05:39 +0200 Subject: [gst-devel] GST GL plugins for embedded devices In-Reply-To: References: Message-ID: Here is the compilation log Thanks, Toni On Thu, Sep 30, 2010 at 12:47, Julien Isorce wrote: > > > 2010/9/30 Antoni Silvestre Padr?s > > Hi, thanks a lot for your answer, here I'm attaching the log. >> >> Toni >> >> On Wed, Sep 29, 2010 at 22:59, Julien Isorce wrote: >> >>> Hi, >>> >>> 2010/9/29 Antoni Silvestre Padr?s >>> >>> Hello, can the gst-plugins-gl be compiled for embedded devices that only >>>> have egl and opengl es? >>> >>> >>> yes >>> >>> When I run the configure script it looks like everything's fine however >>>> when I compile it I see that the code has non-compatible opengl es >>>> directives like GL_QUADS. >>>> >>> >>> This is not normal. I mean if ES is detected then it only compiles the >>> code that is ES - compatible. >>> Could you send us the log when running configure script and then the >>> exact compile error. >>> >> > configure is ok. > > Compile error log now ? 
> > >> >>> Maybe I should pass something to the configure script or is it that it >>>> really needs to have regular openGL? >>>> >>> >>> no >>> >>> Sincerely >>> Julien >>> >>> >>>> Thanks, >>>> Toni Silvestre >>>> >>> >>> >>> >>> ------------------------------------------------------------------------------ >>> Start uncovering the many advantages of virtual appliances >>> and start using them to simplify application deployment and >>> accelerate your shift to cloud computing. >>> http://p.sf.net/sfu/novell-sfdev2dev >>> _______________________________________________ >>> gstreamer-devel mailing list >>> gstreamer-devel at lists.sourceforge.net >>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>> >>> >> >> >> ------------------------------------------------------------------------------ >> Start uncovering the many advantages of virtual appliances >> and start using them to simplify application deployment and >> accelerate your shift to cloud computing. >> http://p.sf.net/sfu/novell-sfdev2dev >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> >> > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: comp_log Type: application/octet-stream Size: 2516 bytes Desc: not available URL: From julien.isorce at gmail.com Thu Sep 30 15:00:55 2010 From: julien.isorce at gmail.com (Julien Isorce) Date: Thu, 30 Sep 2010 15:00:55 +0200 Subject: [gst-devel] GST GL plugins for embedded devices In-Reply-To: References: Message-ID: OK, this is a bug. This part of the code is not protected by the OPENGL_ES2 define. The problem was introduced in this commit: http://cgit.freedesktop.org/gstreamer/gst-plugins-gl/commit/?id=6184670652af5f378eec42fae84971b0754855cd Could you please open a bug: https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer Then I'll try to make a patch, but I don't have an ES env set up right now, so you will have to wait a little bit. But you can start the patch; you just have to put #ifdef OPENGL_ES2 ... #endif around some of the functions that use regular opengl (in gstglfilter.c). If you have a patch, please attach it to the bug you opened, but only after reading this: http://www.gstreamer.net/wiki/SubmittingPatches Sincerely Julien 2010/9/30 Antoni Silvestre Padrós > Here is the compilation log > > Thanks, > Toni > > > On Thu, Sep 30, 2010 at 12:47, Julien Isorce wrote: > >> >> >> 2010/9/30 Antoni Silvestre Padrós >> >> Hi, thanks a lot for your answer, here I'm attaching the log. >>> >>> Toni >>> >>> On Wed, Sep 29, 2010 at 22:59, Julien Isorce wrote: >>> >>>> Hi, >>>> >>>> 2010/9/29 Antoni Silvestre Padrós >>>> >>>> Hello, can the gst-plugins-gl be compiled for embedded devices that only >>>>> have egl and opengl es? >>>> >>>> >>>> yes >>>> >>>> When I run the configure script it looks like everything's fine however >>>>> when I compile it I see that the code has non-compatible opengl es >>>>> directives like GL_QUADS. >>>>> >>>> >>>> This is not normal. I mean if ES is detected then it only compiles the >>>> code that is ES-compatible. >>>> Could you send us the log when running configure script and then the >>>> exact compile error. 
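The #ifdef guard Julien describes above can be sketched as follows. This is an invented stand-in, not the real gstglfilter.c code: draw_mode() is a made-up name, and OPENGL_ES2 is the configure-time define in question. Built without the define, the desktop-GL path (GL_QUADS) is kept; built with -DOPENGL_ES2, the ES-safe branch is compiled instead.

```c
/* Made-up stand-in for the drawing code in gstglfilter.c: the
 * desktop-only GL_QUADS token must be compiled out on OpenGL ES. */
const char *draw_mode(void)
{
#ifdef OPENGL_ES2
    /* OpenGL ES has no GL_QUADS; a quad is drawn as two triangles */
    return "GL_TRIANGLE_STRIP";
#else
    return "GL_QUADS";
#endif
}
```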
>>>> >>> >> configure is ok. >> >> Compile error log now ? >> >> >>> >>>> Maybe I should pass something to the configure script or is it that it >>>>> really needs to have regular openGL? >>>>> >>>> >>>> no >>>> >>>> Sincerely >>>> Julien >>>> >>>> >>>>> Thanks, >>>>> Toni Silvestre >>>>> >>>> >>>> >>>> >>>> ------------------------------------------------------------------------------ >>>> Start uncovering the many advantages of virtual appliances >>>> and start using them to simplify application deployment and >>>> accelerate your shift to cloud computing. >>>> http://p.sf.net/sfu/novell-sfdev2dev >>>> _______________________________________________ >>>> gstreamer-devel mailing list >>>> gstreamer-devel at lists.sourceforge.net >>>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>>> >>>> >>> >>> >>> ------------------------------------------------------------------------------ >>> Start uncovering the many advantages of virtual appliances >>> and start using them to simplify application deployment and >>> accelerate your shift to cloud computing. >>> http://p.sf.net/sfu/novell-sfdev2dev >>> _______________________________________________ >>> gstreamer-devel mailing list >>> gstreamer-devel at lists.sourceforge.net >>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>> >>> >> >> >> ------------------------------------------------------------------------------ >> Start uncovering the many advantages of virtual appliances >> and start using them to simplify application deployment and >> accelerate your shift to cloud computing. 
>> http://p.sf.net/sfu/novell-sfdev2dev >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> >> > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ensonic at hora-obscura.de Thu Sep 30 16:32:03 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Thu, 30 Sep 2010 17:32:03 +0300 Subject: [gst-devel] receiving and displaying video streams in a non-GTK application In-Reply-To: <4CA34B6C.2010700@cs.cmu.edu> References: <4CA34B6C.2010700@cs.cmu.edu> Message-ID: <4CA49F63.3060105@hora-obscura.de> On 29.09.2010 17:21, Paul E. Rybski wrote: > Hi, > I've just recently discovered gstreamer and have been exploring its use > for transmitting audio and video from one linux machine to another. So > far I've only been exploring the use of gst-launch but now want to try > to receive the video stream in a separate application rather than using > xvimagesink. This is complicated by the fact that I'm using FLTK > (http://www.fltk.org/) rather than GTK for the GUI. (The reasons for > using FLTK are essentially historical and it's currently impractical for > me to consider throwing away my old legacy GUI code and rewriting it > from scratch in GTK.) I can see two different paths that I can try to > follow to achieve my goal: > > 1) My first option is to try to integrate the gstreamer API directly > into my fltk application. 
I've started to look at the documentation for > how to encode gstreamer pipelines in a C application but one thing that > currently escapes me is how I get access to the raw uncompressed frames > of video at the end of the pipeline. The way I understand it, I should > be able to encode my pipeline so that the application receives the video > stream from a socket and decodes it (I'm using smokeenc) but then I'm > completely unclear as to how I might copy the image into a buffer that I > can feed into an FLTK widget for drawing. I'm also completely unclear > how easy or difficult it would be to integrate the GTK main event loop > with the FLTK main event loop as the gstreamer API seems to be heavily > wedded to GTK. I have no experience programming with GTK at the moment > either. > If there is a drawable widget in fltk that is backed by a xwindow then you should be able to use the xoverlay interface just fine. A quick google search turned up this: http://www.fltk.org/doc-1.0/osissues.html Window fl_xid(const Fl_Window *) Stefan > 2) My second option is to keep the client gst-launch command as it > stands now but instead of piping the video to xvimagesink, I create a > new local socket (or pipe) and shove the frames of video into those > (perhaps encoded as jpegs) and then have my FLTK application receive the > data from this pipe, decode each jpeg, and display it. This seems > somewhat easier to achieve because then all I need to do is to figure > out how the data is encoded into the socket so I can write the code to > decode it. > > Any thoughts, advice, or experiences that people could share with this? > I'd kind of like to do the first option because it's conceptually > simpler for the end-user of my system but I'm concerned that I might end > up needing to rewrite my entire GUI in GTK which I'd rather not have to > do at this time. > > Here are the gst-launch commands that I'm using right now. > > Server: > > gst-launch-0.10 -vm oggmux name=mux ! 
filesink location=movie.ogg v4lsrc > ! video/x-raw-yuv,width=320,height=240 ! tee name=t_vnet ! queue ! > ffmpegcolorspace ! smokeenc qmin=1 qmax=50 ! udpsink port=5000 > host=localhost sync=false t_vnet. ! queue ! videorate ! > 'video/x-raw-yuv' ! theoraenc ! mux. alsasrc device=hw:0,0 ! > audio/x-raw-int,rate=48000,channels=2,depth=16 ! tee name=t_anet ! queue > ! audioconvert ! flacenc ! udpsink port=4000 host=localhost sync=false > t_anet. ! queue ! audioconvert ! vorbisenc ! mux. > > > Client: > > gst-launch-0.10 -vm tee name=vid -vm udpsrc port=5000 ! smokedec ! > xvimagesink vid. ! tee name=aud udpsrc port=4000 ! flacdec ! audioconvert > ! audioresample ! alsasink sync=false aud. > > > I'm on Ubuntu 8.04 LTS 64-bit using the gstreamer packages that come > with that distro. I've found that these commands also work for me on > Ubuntu 10.04 LTS 64-bit. > > Thanks, > > -Paul > > From ensonic at hora-obscura.de Thu Sep 30 16:35:16 2010 From: ensonic at hora-obscura.de (Stefan Kost) Date: Thu, 30 Sep 2010 17:35:16 +0300 Subject: [gst-devel] Video metadata In-Reply-To: <1285830432330-2720275.post@n4.nabble.com> References: <1285830432330-2720275.post@n4.nabble.com> Message-ID: <4CA4A024.8050503@hora-obscura.de> On 30.09.2010 10:07, cristiurban wrote: > Hello! I am trying to extract video metadata like: Film title, Main Actor, > Director, Year, Genre, Languages, Playtime, Label Name, Film Cover. I have > tried the id3demux element but it doesn't work like it works for mp3 samples. > Does anyone know how I can extract this metadata? > From which format are you trying to get these. 
Does the metadata show up when doing: gst-launch -t playbin2 uri=file:///path/to/file Stefan From antoni.silvestre at gmail.com Thu Sep 30 17:24:30 2010 From: antoni.silvestre at gmail.com (=?ISO-8859-1?Q?Antoni_Silvestre_Padr=F3s?=) Date: Thu, 30 Sep 2010 17:24:30 +0200 Subject: [gst-devel] GST GL plugins for embedded devices In-Reply-To: References: Message-ID: Hi, thanks a lot for your help! I posted the bug. Also I've made a temporary patch for myself, however I'm pretty sure disabling those parts of the code will make at least the plugins glfilterblur and glfiltersobel unusable, that's no problem for me as I don't need them right now, but I guess when you ask me to submit a patch you were not referring to upload something like that. I've tried to not disable those functions and change them to use openGL ES 2 code but my openGL is a bit rusty and I actually have no idea about openGL ES. I've looked for many examples or guides on the net in order to fix it but all of them were openGL ES 1.0 based and they didn't work... Also there's a lot of code in the tests folder that has non openGL ES 2 instructions so in the end I completely disabled the compilation of that folder. Toni On Thu, Sep 30, 2010 at 15:00, Julien Isorce wrote: > > ok this a bug. This part of the code is not protected by the var OPENGL_ES2 > > The problem was introduced in this commit: > > http://cgit.freedesktop.org/gstreamer/gst-plugins-gl/commit/?id=6184670652af5f378eec42fae84971b0754855cd > > > Could > you please open a bug: > https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer > > Then I'll try to make a patch but I have not ES env setup right now, so you > will have to wait a little bit. > > But you can start the patch, you just have to put > > #ifdef OPENGL_ES2 > ... 
> #endif > > around some of the functions that use regular opengl (in gstglfilter.c) > > If you have a patch then please attach it to the bug you open before, but > after reading this: > http://www.gstreamer.net/wiki/SubmittingPatches > > Sincerely > Julien > > > 2010/9/30 Antoni Silvestre Padr?s > >> Here is the compilation log >> >> Thanks, >> Toni >> >> >> On Thu, Sep 30, 2010 at 12:47, Julien Isorce wrote: >> >>> >>> >>> 2010/9/30 Antoni Silvestre Padr?s >>> >>> Hi, thanks a lot for your answer, here I'm attaching the log. >>>> >>>> Toni >>>> >>>> On Wed, Sep 29, 2010 at 22:59, Julien Isorce wrote: >>>> >>>>> Hi, >>>>> >>>>> 2010/9/29 Antoni Silvestre Padr?s >>>>> >>>>> Hello, can the gst-plugins-gl be compiled for embedded devices that >>>>>> only have egl and opengl es? >>>>> >>>>> >>>>> yes >>>>> >>>>> When I run the configure script it looks like everything's fine however >>>>>> when I compile it I see that the code has non-compatible opengl es >>>>>> directives like GL_QUADS. >>>>>> >>>>> >>>>> This is not normal. I mean if ES is detected then it only compiles the >>>>> code that is ES - compatible. >>>>> Could you send us the log when running configure script and then the >>>>> exact compile error. >>>>> >>>> >>> configure is ok. >>> >>> Compile error log now ? >>> >>> >>>> >>>>> Maybe I should pass something to the configure script or is it that it >>>>>> really needs to have regular openGL? >>>>>> >>>>> >>>>> no >>>>> >>>>> Sincerely >>>>> Julien >>>>> >>>>> >>>>>> Thanks, >>>>>> Toni Silvestre >>>>>> >>>>> >>>>> >>>>> >>>>> ------------------------------------------------------------------------------ >>>>> Start uncovering the many advantages of virtual appliances >>>>> and start using them to simplify application deployment and >>>>> accelerate your shift to cloud computing. 
>>>>> http://p.sf.net/sfu/novell-sfdev2dev >>>>> _______________________________________________ >>>>> gstreamer-devel mailing list >>>>> gstreamer-devel at lists.sourceforge.net >>>>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>>>> >>>>> >>>> >>>> >>>> ------------------------------------------------------------------------------ >>>> Start uncovering the many advantages of virtual appliances >>>> and start using them to simplify application deployment and >>>> accelerate your shift to cloud computing. >>>> http://p.sf.net/sfu/novell-sfdev2dev >>>> _______________________________________________ >>>> gstreamer-devel mailing list >>>> gstreamer-devel at lists.sourceforge.net >>>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>>> >>>> >>> >>> >>> ------------------------------------------------------------------------------ >>> Start uncovering the many advantages of virtual appliances >>> and start using them to simplify application deployment and >>> accelerate your shift to cloud computing. >>> http://p.sf.net/sfu/novell-sfdev2dev >>> _______________________________________________ >>> gstreamer-devel mailing list >>> gstreamer-devel at lists.sourceforge.net >>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>> >>> >> >> >> ------------------------------------------------------------------------------ >> Start uncovering the many advantages of virtual appliances >> and start using them to simplify application deployment and >> accelerate your shift to cloud computing. 
>> http://p.sf.net/sfu/novell-sfdev2dev >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> >> > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From julien.isorce at gmail.com Thu Sep 30 20:27:35 2010 From: julien.isorce at gmail.com (Julien Isorce) Date: Thu, 30 Sep 2010 20:27:35 +0200 Subject: [gst-devel] GST GL plugins for embedded devices In-Reply-To: References: Message-ID: Let's continue our discussion on bug #631019 2010/9/30 Antoni Silvestre Padr?s > Hi, thanks a lot for your help! I posted the bug. Also I've made a > temporary patch for myself, however I'm pretty sure disabling those parts of > the code will make at least the plugins glfilterblur and glfiltersobel > unusable, that's no problem for me as I don't need them right now, but I > guess when you ask me to submit a patch you were not referring to upload > something like that. > > I've tried to not disable those functions and change them to use openGL ES > 2 code but my openGL is a bit rusty and I actually have no idea about openGL > ES. I've looked for many examples or guides on the net in order to fix it > but all of them were openGL ES 1.0 based and they didn't work... > > Also there's a lot of code in the tests folder that has non openGL ES 2 > instructions so in the end I completely disabled the compilation of that > folder. 
> > Toni > > > On Thu, Sep 30, 2010 at 15:00, Julien Isorce wrote: > >> >> ok this a bug. This part of the code is not protected by the >> var OPENGL_ES2 >> >> The problem was introduced in this commit: >> >> http://cgit.freedesktop.org/gstreamer/gst-plugins-gl/commit/?id=6184670652af5f378eec42fae84971b0754855cd >> >> >> Could >> you please open a bug: >> https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer >> >> Then I'll try to make a patch but I have not ES env setup right now, so >> you will have to wait a little bit. >> >> But you can start the patch, you just have to put >> >> #ifdef OPENGL_ES2 >> ... >> #endif >> >> around some of the functions that use regular opengl (in gstglfilter.c) >> >> If you have a patch then please attach it to the bug you open before, but >> after reading this: >> http://www.gstreamer.net/wiki/SubmittingPatches >> >> Sincerely >> Julien >> >> >> 2010/9/30 Antoni Silvestre Padr?s >> >>> Here is the compilation log >>> >>> Thanks, >>> Toni >>> >>> >>> On Thu, Sep 30, 2010 at 12:47, Julien Isorce wrote: >>> >>>> >>>> >>>> 2010/9/30 Antoni Silvestre Padr?s >>>> >>>> Hi, thanks a lot for your answer, here I'm attaching the log. >>>>> >>>>> Toni >>>>> >>>>> On Wed, Sep 29, 2010 at 22:59, Julien Isorce wrote: >>>>> >>>>>> Hi, >>>>>> >>>>>> 2010/9/29 Antoni Silvestre Padr?s >>>>>> >>>>>> Hello, can the gst-plugins-gl be compiled for embedded devices that >>>>>>> only have egl and opengl es? >>>>>> >>>>>> >>>>>> yes >>>>>> >>>>>> When I run the configure script it looks like everything's fine >>>>>>> however when I compile it I see that the code has non-compatible opengl es >>>>>>> directives like GL_QUADS. >>>>>>> >>>>>> >>>>>> This is not normal. I mean if ES is detected then it only compiles the >>>>>> code that is ES - compatible. >>>>>> Could you send us the log when running configure script and then the >>>>>> exact compile error. >>>>>> >>>>> >>>> configure is ok. >>>> >>>> Compile error log now ? 
>>>> >>>> >>>>> >>>>>> Maybe I should pass something to the configure script or is it that it >>>>>>> really needs to have regular openGL? >>>>>>> >>>>>> >>>>>> no >>>>>> >>>>>> Sincerely >>>>>> Julien >>>>>> >>>>>> >>>>>>> Thanks, >>>>>>> Toni Silvestre >>>>>>> >>>>>> >>>>>> >>>>>> >>>>>> ------------------------------------------------------------------------------ >>>>>> Start uncovering the many advantages of virtual appliances >>>>>> and start using them to simplify application deployment and >>>>>> accelerate your shift to cloud computing. >>>>>> http://p.sf.net/sfu/novell-sfdev2dev >>>>>> _______________________________________________ >>>>>> gstreamer-devel mailing list >>>>>> gstreamer-devel at lists.sourceforge.net >>>>>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>>>>> >>>>>> >>>>> >>>>> >>>>> ------------------------------------------------------------------------------ >>>>> Start uncovering the many advantages of virtual appliances >>>>> and start using them to simplify application deployment and >>>>> accelerate your shift to cloud computing. >>>>> http://p.sf.net/sfu/novell-sfdev2dev >>>>> _______________________________________________ >>>>> gstreamer-devel mailing list >>>>> gstreamer-devel at lists.sourceforge.net >>>>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>>>> >>>>> >>>> >>>> >>>> ------------------------------------------------------------------------------ >>>> Start uncovering the many advantages of virtual appliances >>>> and start using them to simplify application deployment and >>>> accelerate your shift to cloud computing. 
>>>> http://p.sf.net/sfu/novell-sfdev2dev >>>> _______________________________________________ >>>> gstreamer-devel mailing list >>>> gstreamer-devel at lists.sourceforge.net >>>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>>> >>>> >>> >>> >>> ------------------------------------------------------------------------------ >>> Start uncovering the many advantages of virtual appliances >>> and start using them to simplify application deployment and >>> accelerate your shift to cloud computing. >>> http://p.sf.net/sfu/novell-sfdev2dev >>> _______________________________________________ >>> gstreamer-devel mailing list >>> gstreamer-devel at lists.sourceforge.net >>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >>> >>> >> >> >> ------------------------------------------------------------------------------ >> Start uncovering the many advantages of virtual appliances >> and start using them to simplify application deployment and >> accelerate your shift to cloud computing. >> http://p.sf.net/sfu/novell-sfdev2dev >> _______________________________________________ >> gstreamer-devel mailing list >> gstreamer-devel at lists.sourceforge.net >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> >> > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. > http://p.sf.net/sfu/novell-sfdev2dev > _______________________________________________ > gstreamer-devel mailing list > gstreamer-devel at lists.sourceforge.net > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gibrovacco at gmail.com Thu Sep 30 20:41:34 2010 From: gibrovacco at gmail.com (Marco Ballesio) Date: Thu, 30 Sep 2010 21:41:34 +0300 Subject: [gst-devel] [ANN] new release of GladSToNe available Message-ID: Hi all, for whoever is interested (maybe Farsight fellas) a new/first release of the GladSToNe g729 plugin is available at: http://www.gitorious.org/gladstone/ A big thank you to Youness Alaoui who gave his time to the elements' timing. News and noteworthy: - Semi-original revision brought to Gitorious with some changes. - Added timestamp handling improvements - credits to Youness Alaoui aka finikino. - Added reference code patches handling. - Added first reference code patch set with about 50% overall optimisation. - Added (optional) possibility to automatically download the reference code at compile time. - Added automatic conversion of the filenames to be Makefile-friendly. - Added option to specify a path where the reference code can be found. Plans for the future: - Improvement of automatic download through merge of the second patch from Kakaroto. - Annex B support. - Comfort Noise Generation support. Regards. -------------- next part -------------- An HTML attachment was scrubbed... URL: From rob.krakora at messagenetsystems.com Tue Sep 21 22:52:44 2010 From: rob.krakora at messagenetsystems.com (Robert Krakora) Date: Tue, 21 Sep 2010 20:52:44 -0000 Subject: [gst-devel] simultaneously showing and recording MPEG-2 video Message-ID: Hi Faruk, I believe that you want something like this: gst-launch -v gnomevfssrc location=http://admin:mncamera at 192.168.1.176/img/video.asf ! tee name=t ! queue ! fluasfdemux ! mpeg4videoparse ! flumpeg4vdec ! xvimagesink t. ! queue ! fluasfdemux ! mpeg4videoparse ! mp4mux ! filesink location=file.mp4 This pipeline plays back an MPEG4 video in an ASF container from a Linksys WVC200 PTZ camera over an HTTP connection and also transcodes and stores it as an 'mp4' file. The 'tee' and 'queue' elements are the keys here. 
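The same tee/queue pattern can be sketched with a generic source. This is an untested illustration, not the exact pipeline above: videotestsrc stands in for the camera source, and jpegenc/avimux stand in for whatever encoder and muxer the recording branch needs.

```shell
# One source fanned out by tee into two branches. Each branch starts with
# its own queue, so display and recording run in separate threads and one
# branch cannot stall the other during preroll.
gst-launch -v --eos-on-shutdown \
    videotestsrc ! tee name=t \
    t. ! queue ! ffmpegcolorspace ! xvimagesink \
    t. ! queue ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=capture.avi
```

Without the queues, both branches would be served from a single thread and the pipeline typically stalls during preroll — the same point Tim makes elsewhere in this thread: one queue per branch after a tee.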
Best Regards, -- Rob Krakora Senior Software Engineer MessageNet Systems 101 East Carmel Dr. Suite 105 Carmel, IN 46032 (317)566-1677 Ext. 206 (317)663-0808 Fax On Tue, Sep 21, 2010 at 8:45 AM, < gstreamer-devel-request at lists.sourceforge.net> wrote: > Send gstreamer-devel mailing list submissions to > gstreamer-devel at lists.sourceforge.net > > To subscribe or unsubscribe via the World Wide Web, visit > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > or, via email, send a message with subject or body 'help' to > gstreamer-devel-request at lists.sourceforge.net > > You can reach the person managing the list at > gstreamer-devel-owner at lists.sourceforge.net > > When replying, please edit your Subject line so it is more specific > than "Re: Contents of gstreamer-devel digest..." > > > Today's Topics: > > 1. Re: simultaneously showing and recording MPEG-2 video > (Michael Joachimiak) > 2. Re: simultaneously showing and recording MPEG-2 video > (Michael Joachimiak) > 3. Re: simultaneously showing and recording MPEG-2 video > (Michael Joachimiak) > 4. Re: H264 problems (Gary Thomas) > 5. question on playbin2 for RTP streaming. (wu jieke) > > > ---------------------------------------------------------------------- > > Message: 1 > Date: Tue, 21 Sep 2010 14:52:53 +0300 > From: Michael Joachimiak > Subject: Re: [gst-devel] simultaneously showing and recording MPEG-2 > video > To: Discussion of the development of GStreamer > > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > You could take a look at tee element. > It might be suitable for you. 
> > 2010/9/16 frknml > > > > > > > > > Hi everyone; > > > > I'm very new for gstreamer and i'm developing multimedi project.My first > > aim > > is showing video which is in my local file system and at the same time i > > want to record this video as a second copy of my original video.I can > show > > video and i can record this video individually but not simultaneously.I > > couldn't find enough information in Gstreamer Application Development > > Manual > > from gstreamer.net to solve my problem. > > If you have any document,ebook or example please share me :) because i > > couldn't find any useful resource about gstreamer. > > > > Faruk > > -- > > View this message in context: > > > http://gstreamer-devel.966125.n4.nabble.com/simultaneously-showing-and-recording-MPEG-2-video-tp2541735p2541735.html > > Sent from the GStreamer-devel mailing list archive at Nabble.com. > > > > > > > ------------------------------------------------------------------------------ > > Start uncovering the many advantages of virtual appliances > > and start using them to simplify application deployment and > > accelerate your shift to cloud computing. > > http://p.sf.net/sfu/novell-sfdev2dev > > _______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.sourceforge.net > > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > > -- > Your Sincerely > Michael Joachimiak > -------------- next part -------------- > An HTML attachment was scrubbed... > > ------------------------------ > > Message: 2 > Date: Tue, 21 Sep 2010 14:52:53 +0300 > From: Michael Joachimiak > Subject: Re: [gst-devel] simultaneously showing and recording MPEG-2 > video > To: Discussion of the development of GStreamer > > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > You could take a look at tee element. > It might be suitable for you. 
> > 2010/9/16 frknml > > > > > > > > > Hi everyone; > > > > I'm very new for gstreamer and i'm developing multimedi project.My first > > aim > > is showing video which is in my local file system and at the same time i > > want to record this video as a second copy of my original video.I can > show > > video and i can record this video individually but not simultaneously.I > > couldn't find enough information in Gstreamer Application Development > > Manual > > from gstreamer.net to solve my problem. > > If you have any document,ebook or example please share me :) because i > > couldn't find any useful resource about gstreamer. > > > > Faruk > > -- > > View this message in context: > > > http://gstreamer-devel.966125.n4.nabble.com/simultaneously-showing-and-recording-MPEG-2-video-tp2541735p2541735.html > > Sent from the GStreamer-devel mailing list archive at Nabble.com. > > > > > > > ------------------------------------------------------------------------------ > > Start uncovering the many advantages of virtual appliances > > and start using them to simplify application deployment and > > accelerate your shift to cloud computing. > > http://p.sf.net/sfu/novell-sfdev2dev > > _______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.sourceforge.net > > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > > -- > Your Sincerely > Michael Joachimiak > -------------- next part -------------- > An HTML attachment was scrubbed... > > ------------------------------ > > Message: 3 > Date: Tue, 21 Sep 2010 14:52:53 +0300 > From: Michael Joachimiak > Subject: Re: [gst-devel] simultaneously showing and recording MPEG-2 > video > To: Discussion of the development of GStreamer > > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > You could take a look at tee element. > It might be suitable for you. 
> > 2010/9/16 frknml > > > > > > > > > Hi everyone; > > > > I'm very new for gstreamer and i'm developing multimedi project.My first > > aim > > is showing video which is in my local file system and at the same time i > > want to record this video as a second copy of my original video.I can > show > > video and i can record this video individually but not simultaneously.I > > couldn't find enough information in Gstreamer Application Development > > Manual > > from gstreamer.net to solve my problem. > > If you have any document,ebook or example please share me :) because i > > couldn't find any useful resource about gstreamer. > > > > Faruk > > -- > > View this message in context: > > > http://gstreamer-devel.966125.n4.nabble.com/simultaneously-showing-and-recording-MPEG-2-video-tp2541735p2541735.html > > Sent from the GStreamer-devel mailing list archive at Nabble.com. > > > > > > > ------------------------------------------------------------------------------ > > Start uncovering the many advantages of virtual appliances > > and start using them to simplify application deployment and > > accelerate your shift to cloud computing. > > http://p.sf.net/sfu/novell-sfdev2dev > > _______________________________________________ > > gstreamer-devel mailing list > > gstreamer-devel at lists.sourceforge.net > > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel > > > > > > -- > Your Sincerely > Michael Joachimiak > -------------- next part -------------- > An HTML attachment was scrubbed... > > ------------------------------ > > Message: 4 > Date: Tue, 21 Sep 2010 06:13:01 -0600 > From: Gary Thomas > Subject: Re: [gst-devel] H264 problems > To: Discussion of the development of GStreamer > > Message-ID: <4C98A14D.8030508 at mlbassoc.com> > Content-Type: text/plain; charset=ISO-8859-1; format=flowed > > On 09/21/2010 02:43 AM, Andrey Nechypurenko wrote: > >> and this one stall? > >> > >> gst-launch -v v4l2src num-buffers=200 ! 
> >> 'video/x-raw-yuv,width=720,height=480,format=(fourcc)UYVY' ! > >> \ ffmpegcolorspace ! > >> 'video/x-raw-yuv,width=720,height=480,format=(fourcc)I420' ! > >> x264enc ! filesink location=/tmp/hold.h264 > > > > I would suggest to try the following. Run your pipeline with > > increased debug level, i.e. GST_DEBUG=3 gst-launch ... and search > > for any relevant messages which might give you the hint about > > what is going wrong. In addition, I remember some strange > > behavior got fixed by explicitly mentioning the framerate. In > > your case, for example, in caps filter right after v4l2src add > > framerate=30/1 or whatever is appropriate frame rate for your > > camera. > > I sure don't see anything, perhaps someone that understands this better > can. I put a level 4 dump of this at http://pastebin.com/iDDVuHgv > > > > > In addition, since you are using gstreamer on TI platform, you > > can consider asking the question also here: > > > https://gstreamer.ti.com/gf/project/gstreamer_ti/forum/?action=ForumBrowse&forum_id=187 > > Except that my problem is not with any of the TI components - it's > only the off-the-shelf encoder that stalls. > > -- > ------------------------------------------------------------ > Gary Thomas | Consulting for the > MLB Associates | Embedded world > ------------------------------------------------------------ > > > > ------------------------------ > > Message: 5 > Date: Tue, 21 Sep 2010 09:09:45 +0800 > From: wu jieke > Subject: [gst-devel] question on playbin2 for RTP streaming. 
> To: gstreamer-devel at lists.sourceforge.net > Message-ID: > > Content-Type: text/plain; charset="iso-8859-1" > > hi, all , > i am setting up RTP streaming environment between a X86 server and a > embedded system client, host app is VLC, and target/client is gst-launch, > commands are following: > > HOST: > # vlc -vvv big_buck_bunny_480p_h264.mov --sout > '#rtp{dst=, port=5004,sdp=rtsp://:8080/test.sdp}' > > Target/client: > # gst-launch udpsrc multicast-group= > > caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" > port=5004 ! rtph264depay ! my-codec-hw ! my-render-hw" > > the command works well, then i hope playbin2 creates hardware pipeline > automatically, command line here: > > # gst-launch playbin2 uri=rtsp://:8080/test.sdp > > it fails to play. btw : playbin2 works well with my optimized codec > and render, i test it with command. (gst-launch playbin2 > uri=file:///big_buck.mov ), it can find the right elements, such as > "my-codec-hw" and "my-render-hw". > > then i dump the log of gst-launch , and find playbin2 not perform > preroll for live streams, which cause full or real pipeline is not ready > before getting GstSystemClock. > in fact, my optimized render can only use the specified clock provided > with *_sink_provide_clock(), not GstSystemClock. so the pipeline hang even > it links the optimized elements. > > my question is that how can i tell playbin2 to use my provided clock > for live pipeline? > if any misunderstanding , pls correct me. > > > > -- > It's not the things you do in life that you regret , > but the things that you do not do > -------------- next part -------------- > An HTML attachment was scrubbed... > > ------------------------------ > > > ------------------------------------------------------------------------------ > Start uncovering the many advantages of virtual appliances > and start using them to simplify application deployment and > accelerate your shift to cloud computing. 
> http://p.sf.net/sfu/novell-sfdev2dev
>
> ------------------------------
>
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
>
> End of gstreamer-devel Digest, Vol 52, Issue 48
> ***********************************************
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From rob.krakora at messagenetsystems.com  Tue Sep 21 23:07:11 2010
From: rob.krakora at messagenetsystems.com (Robert Krakora)
Date: Tue, 21 Sep 2010 21:07:11 -0000
Subject: [gst-devel] simultaneously showing and recording MPEG-2 video
In-Reply-To:
References:
Message-ID:

Faruk:

One thing I forgot on the pipeline example in my response was the '--eos-on-shutdown' option after 'gst-launch'. This is needed by the 'filesink' element in order to properly close the file on a SIGINT.

gst-launch -v --eos-on-shutdown gnomevfssrc location=http://admin:mncamera at 192.168.1.176/img/video.asf ! tee name=t ! queue ! fluasfdemux ! mpeg4videoparse ! flumpeg4vdec ! xvimagesink t. ! queue ! fluasfdemux ! mpeg4videoparse ! mp4mux ! filesink location=file.mp4

Best Regards,

--
Rob Krakora
Senior Software Engineer
MessageNet Systems
101 East Carmel Dr. Suite 105
Carmel, IN 46032
(317)566-1677 Ext. 206
(317)663-0808 Fax

On Tue, Sep 21, 2010 at 4:52 PM, Robert Krakora < rob.krakora at messagenetsystems.com> wrote:

> Hi Faruk,
>
> I believe that you want something like this:
>
> gst-launch -v gnomevfssrc location=http://admin:mncamera at 192.168.1.176/img/video.asf ! tee name=t ! queue
> ! fluasfdemux ! mpeg4videoparse ! flumpeg4vdec ! xvimagesink t. ! queue !
> fluasfdemux ! mpeg4videoparse ! mp4mux ! filesink location=file.mp4
>
> This pipeline plays back an MPEG4 video in an ASF container from a Linksys WVC200 PTZ camera over an HTTP connection and also transcodes and stores
> it as an 'mp4' file.
The 'tee' and 'queue' elements are the keys here. > > Best Regards, > > -- > Rob Krakora > Senior Software Engineer > MessageNet Systems > 101 East Carmel Dr. Suite 105 > Carmel, IN 46032 > (317)566-1677 Ext. 206 > (317)663-0808 Fax > > On Tue, Sep 21, 2010 at 8:45 AM, < > gstreamer-devel-request at lists.sourceforge.net> wrote: > >> Send gstreamer-devel mailing list submissions to >> gstreamer-devel at lists.sourceforge.net >> >> To subscribe or unsubscribe via the World Wide Web, visit >> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel >> or, via email, send a message with subject or body 'help' to >> gstreamer-devel-request at lists.sourceforge.net >> >> You can reach the person managing the list at >> gstreamer-devel-owner at lists.sourceforge.net >> >> When replying, please edit your Subject line so it is more specific >> than "Re: Contents of gstreamer-devel digest..." >> >> >> Today's Topics: >> >> 1. Re: simultaneously showing and recording MPEG-2 video >> (Michael Joachimiak) >> 2. Re: simultaneously showing and recording MPEG-2 video >> (Michael Joachimiak) >> 3. Re: simultaneously showing and recording MPEG-2 video >> (Michael Joachimiak) >> 4. Re: H264 problems (Gary Thomas) >> 5. question on playbin2 for RTP streaming. (wu jieke) >> >> >> ---------------------------------------------------------------------- >> >> Message: 1 >> Date: Tue, 21 Sep 2010 14:52:53 +0300 >> From: Michael Joachimiak >> Subject: Re: [gst-devel] simultaneously showing and recording MPEG-2 >> video >> To: Discussion of the development of GStreamer >> >> Message-ID: >> >> Content-Type: text/plain; charset="iso-8859-1" >> >> You could take a look at tee element. >> It might be suitable for you. 
>>
>> 2010/9/16 frknml
>>
>> >
>> > Hi everyone;
>> >
>> > I'm very new for gstreamer and i'm developing multimedi project.My first
>> > aim
>> > is showing video which is in my local file system and at the same time i
>> > want to record this video as a second copy of my original video.I can show
>> > video and i can record this video individually but not simultaneously.I
>> > couldn't find enough information in Gstreamer Application Development
>> > Manual
>> > from gstreamer.net to solve my problem.
>> > If you have any document,ebook or example please share me :) because i
>> > couldn't find any useful resource about gstreamer.
>> >
>> > Faruk
>> > --
>> > View this message in context:
>> > http://gstreamer-devel.966125.n4.nabble.com/simultaneously-showing-and-recording-MPEG-2-video-tp2541735p2541735.html
>> > Sent from the GStreamer-devel mailing list archive at Nabble.com.
>> >
>> > ------------------------------------------------------------------------------
>> > Start uncovering the many advantages of virtual appliances
>> > and start using them to simplify application deployment and
>> > accelerate your shift to cloud computing.
>> > http://p.sf.net/sfu/novell-sfdev2dev
>> > _______________________________________________
>> > gstreamer-devel mailing list
>> > gstreamer-devel at lists.sourceforge.net
>> > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
>> >
>>
>> --
>> Your Sincerely
>> Michael Joachimiak
>> -------------- next part --------------
>> An HTML attachment was scrubbed...
>>
>> ------------------------------
>>
>> Message: 4
>> Date: Tue, 21 Sep 2010 06:13:01 -0600
>> From: Gary Thomas
>> Subject: Re: [gst-devel] H264 problems
>> To: Discussion of the development of GStreamer
>>
>> Message-ID: <4C98A14D.8030508 at mlbassoc.com>
>> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>>
>> On 09/21/2010 02:43 AM, Andrey Nechypurenko wrote:
>> >> and this one stall?
>> >> gst-launch -v v4l2src num-buffers=200 !
>> >> 'video/x-raw-yuv,width=720,height=480,format=(fourcc)UYVY' !
>> >> \ ffmpegcolorspace !
>> >> 'video/x-raw-yuv,width=720,height=480,format=(fourcc)I420' !
>> >> x264enc ! filesink location=/tmp/hold.h264
>> >
>> > I would suggest trying the following. Run your pipeline with an
>> > increased debug level, i.e. GST_DEBUG=3 gst-launch ... and search
>> > for any relevant messages which might give you a hint about
>> > what is going wrong. In addition, I remember some strange
>> > behavior got fixed by explicitly mentioning the framerate. In
>> > your case, for example, in the caps filter right after v4l2src add
>> > framerate=30/1 or whatever the appropriate frame rate is for your
>> > camera.
>>
>> I sure don't see anything; perhaps someone who understands this better
>> can. I put a level 4 dump of this at http://pastebin.com/iDDVuHgv
>>
>> >
>> > In addition, since you are using gstreamer on the TI platform, you
>> > can also consider asking the question here:
>> > https://gstreamer.ti.com/gf/project/gstreamer_ti/forum/?action=ForumBrowse&forum_id=187
>>
>> Except that my problem is not with any of the TI components - it's
>> only the off-the-shelf encoder that stalls.
>>
>> --
>> ------------------------------------------------------------
>> Gary Thomas         | Consulting for the
>> MLB Associates      | Embedded world
>> ------------------------------------------------------------
>>
>> ------------------------------
>>
>> Message: 5
>> Date: Tue, 21 Sep 2010 09:09:45 +0800
>> From: wu jieke
>> Subject: [gst-devel] question on playbin2 for RTP streaming.
>> To: gstreamer-devel at lists.sourceforge.net
>> Message-ID:
>> Content-Type: text/plain; charset="iso-8859-1"
>>
>> hi all,
>> I am setting up an RTP streaming environment between an x86 server and an
>> embedded system client. The host app is VLC, and the target/client is
>> gst-launch; the commands are the following:
>>
>> HOST:
>> # vlc -vvv big_buck_bunny_480p_h264.mov --sout
>> '#rtp{dst=, port=5004,sdp=rtsp://:8080/test.sdp}'
>>
>> Target/client:
>> # gst-launch udpsrc multicast-group=
>> caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264"
>> port=5004 ! rtph264depay ! my-codec-hw ! my-render-hw"
>>
>> The command works well. I then hoped playbin2 would create the hardware
>> pipeline automatically, with this command line:
>>
>> # gst-launch playbin2 uri=rtsp://:8080/test.sdp
>>
>> It fails to play. Btw: playbin2 works well with my optimized codec
>> and render; I tested it with a command (gst-launch playbin2
>> uri=file:///big_buck.mov ), and it can find the right elements, such as
>> "my-codec-hw" and "my-render-hw".
>>
>> I then dumped the log of gst-launch, and found that playbin2 does not
>> perform preroll for live streams, which means the full pipeline is not
>> ready before getting the GstSystemClock.
>> In fact, my optimized render can only use the specific clock provided
>> with *_sink_provide_clock(), not GstSystemClock. So the pipeline hangs
>> even though it links the optimized elements.
>>
>> My question is: how can I tell playbin2 to use my provided clock
>> for a live pipeline?
>> If I have misunderstood anything, please correct me.
>>
>> --
>> It's not the things you do in life that you regret,
>> but the things that you do not do
>> -------------- next part --------------
>> An HTML attachment was scrubbed...
>>
>> ------------------------------
>>
>> ------------------------------------------------------------------------------
>> Start uncovering the many advantages of virtual appliances
>> and start using them to simplify application deployment and
>> accelerate your shift to cloud computing.
>> http://p.sf.net/sfu/novell-sfdev2dev
>>
>> ------------------------------
>>
>> _______________________________________________
>> gstreamer-devel mailing list
>> gstreamer-devel at lists.sourceforge.net
>> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
>>
>> End of gstreamer-devel Digest, Vol 52, Issue 48
>> ***********************************************
>>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From confusosk8 at gmail.com  Wed Sep 22 21:17:27 2010
From: confusosk8 at gmail.com (Gabriel Duarte)
Date: Wed, 22 Sep 2010 16:17:27 -0300
Subject: [gst-devel] Grab frames from the stream
Message-ID:

Hello guys!

I'm using dv1394src from C, but I can't grab RGB data into a file... Is there some way?

Cheers!

--
Gabriel Duarte
Linux User #471185
Rio de Janeiro - RJ
http://w3.impa.br/~gabrield
Phones: (55) (21) 9463-7760 -> Mobile
        (55) (21) 2464-9302 -> Home
        (55) (21) 2529-5080 -> Work
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From lijinshyam at gmail.com  Thu Sep 23 17:53:33 2010
From: lijinshyam at gmail.com (liJin)
Date: Thu, 23 Sep 2010 21:23:33 +0530
Subject: [gst-devel] Using gst_object_unref
In-Reply-To: <1285256174.5374.2.camel@zingle>
References: <1285256174.5374.2.camel@zingle>
Message-ID:

Great... Thanks!

On Thu, Sep 23, 2010 at 9:06 PM, Tim-Philipp Müller wrote:

> On Thu, 2010-09-23 at 20:46 +0530, liJin wrote:
>
> > I was just confused about the "Hello World" example in the gstreamer
> > manual. I can see a pipeline and some elements are created in code.
> > In the end the pipeline is unrefed, using the method
> > gst_object_unref(pipeline).
> > But what about the elements we
> > created...? They're still there, and the reference count of the elements
> > doesn't seem to be decremented....
> >
> > In chapter 5.2 it says we need to use gst_object_unref for
> > elements also
>
> if you use
>
> gst_bin_add (bin_or_pipeline, element);
>
> then the bin/pipeline will take ownership of the element (if it is a
> newly-created element) and will take care of freeing it when the
> pipeline is freed. Also see
>
> http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstObject.html#GstObject.description
>
> Cheers
> -Tim
>
> ------------------------------------------------------------------------------
> Nokia and AT&T present the 2010 Calling All Innovators-North America contest
> Create new apps & games for the Nokia N8 for consumers in U.S. and Canada
> $10 million total in prizes - $4M cash, 500 devices, nearly $6M in marketing
> Develop with Nokia Qt SDK, Web Runtime, or Java and Publish to Ovi Store
> http://p.sf.net/sfu/nokia-dev2dev
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: not available
Type: image/png
Size: 569 bytes
Desc: not available
URL:

From pedro.rootss at yahoo.com.br  Fri Sep 24 19:06:38 2010
From: pedro.rootss at yahoo.com.br (Pedro.henrique)
Date: Fri, 24 Sep 2010 10:06:38 -0700 (PDT)
Subject: [gst-devel] Where can I get x264enc, x264dec, rtph264pay and depay?
In-Reply-To: <1285344008.27127.5.camel@zingle>
References: <1285343058605-2553663.post@n4.nabble.com> <1285344008.27127.5.camel@zingle>
Message-ID: <1285347998781-2553768.post@n4.nabble.com>

I'm using this pipeline:

gst-launch souphttpsrc location="http://gstramer.dvrdns.org:5000/CAM:1" ! jpegdec ! x264enc ! ffdec_h264 !
udpsink host = . . . port = 5000

but it isn't working. ERROR: -> "Unexpected x264 Error"

--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Where-can-I-get-x264enc-x264dec-rtph264pay-and-depay-tp2553663p2553768.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.

From pedro.rootss at yahoo.com.br  Fri Sep 24 19:17:55 2010
From: pedro.rootss at yahoo.com.br (Pedro.henrique)
Date: Fri, 24 Sep 2010 10:17:55 -0700 (PDT)
Subject: [gst-devel] Where can I get x264enc, x264dec, rtph264pay and depay?
In-Reply-To: <1285347998781-2553768.post@n4.nabble.com>
References: <1285343058605-2553663.post@n4.nabble.com> <1285344008.27127.5.camel@zingle> <1285347998781-2553768.post@n4.nabble.com>
Message-ID: <1285348675014-2554407.post@n4.nabble.com>

I also tried with webcam images:

autovideosrc ! udpsink host = ... port = ...

Setting pipeline to PAUSED . . .
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING . . .
New clock : GstSystemClock

Receiver:

udpsrc ! autovideosink

autovideosrc ! udpsink host = ... port = ...

Setting pipeline to PAUSED . . .
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING . . .
New clock : GstSystemClock

But I can't see the webcam images. Is this correct?

--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Where-can-I-get-x264enc-x264dec-rtph264pay-and-depay-tp2553663p2554407.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.

From pedro.rootss at yahoo.com.br  Sat Sep 25 00:56:36 2010
From: pedro.rootss at yahoo.com.br (Pedro.henrique)
Date: Fri, 24 Sep 2010 15:56:36 -0700 (PDT)
Subject: [gst-devel] JPEG, x264enc -> UDP
Message-ID: <1285368996352-2614089.post@n4.nabble.com>

Hi All,

Can someone help me do a stream in H.264? I am capturing images from a camera in MJPEG, then need to convert the MJPEG to H.264 and send it to the client.

Server

gst-launch souphttpsrc location ="http://teste.dvrdns.org:999/CAM:2" ! jpegdec ! x264dec !
udpsink host = . . . port = . . .

Unexpected x264 header

gst-launch souphttpsrc location ="http://teste.dvrdns.org:999/CAM:2" ! jpegdec ! x264dec ! rtph264pay ! udpsink host = . . . port = . . .

Unexpected x264 header

How can I send and receive this stream? Thanks!

--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/JPEG-x264enc-UDP-tp2614089p2614089.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.

From rob.krakora at messagenetsystems.com  Mon Sep 27 19:07:59 2010
From: rob.krakora at messagenetsystems.com (Robert Krakora)
Date: Mon, 27 Sep 2010 13:07:59 -0400
Subject: [gst-devel] EOS only on SIGINT and not on socket error causes file created by 'filesink' element to be invalid
Message-ID:

Hello All,

When I execute this pipeline from the command line...

gst-launch -e rtspsrc location=rtsp://192.168.1.211:8109/camera.sdp ! rtpmp4vdepay ! mpeg4videoparse ! mp4mux ! filesink location=myfile

...and send a SIGINT to the process I get a file that is playable because I specified the '-e' option, which requests an EOS on shutdown.

However, if I execute the same pipeline from the command line and remove the RTSP server (i.e. force a socket error) the process exits without performing an EOS on shutdown. A quick look at the code in "gst-launch.c" confirmed my suspicion that EOS on shutdown is only performed on a SIGINT. What if my application has recorded valuable video data and the network goes down? Does this mean I have lost all of that data, since the file will not be viewable due to the absence of EOS-on-shutdown processing on anything that is not a SIGINT?

Thanks in advance.

Best Regards,

--
Rob Krakora
Senior Software Engineer
MessageNet Systems
101 East Carmel Dr. Suite 105
Carmel, IN 46032
(317)566-1677 Ext. 206
(317)663-0808 Fax
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From kretschmann at kde.org  Tue Sep 28 19:41:47 2010
From: kretschmann at kde.org (Mark Kretschmann)
Date: Tue, 28 Sep 2010 19:41:47 +0200
Subject: [gst-devel] Switching two video sources with input-selector
Message-ID:

Heya,

I am trying to build an application with GStreamer that allows switching between two video sources dynamically, one being a live-video (videotestsrc for now), and the other a filesrc. The sink is a standard xvimagesink. The goal is to do the switching with as little latency as possible, and also only running the elements that are necessary for each source.

Wim Taymans recommended to use an "input-selector" for switching between the sources, which makes sense to me. However, how should the pipeline design look? Would I need two pipelines, one for each source, and play/pause them dynamically? Where must the input-selector and the sink go?

Thanks, Mark.

--
Mark Kretschmann
Amarok Developer
Fellow of the Free Software Foundation Europe
http://amarok.kde.org - http://www.fsfe.org

From robkrakora at att.net  Wed Sep 29 19:47:29 2010
From: robkrakora at att.net (Robert Krakora)
Date: Wed, 29 Sep 2010 10:47:29 -0700 (PDT)
Subject: [gst-devel] EOS only on SIGINT and not on socket error causes file created by 'filesink' element to be invalid
Message-ID: <114877.38795.qm@web180308.mail.gq1.yahoo.com>

Hello All,

When I execute this pipeline from the command line...

gst-launch -e rtspsrc location=rtsp://192.168.1.211:8109/camera.sdp ! rtpmp4vdepay ! mpeg4videoparse ! mp4mux ! filesink location=myfile

...and send a SIGINT to the process I get a file that is playable because I specified the '-e' which indicates that EOS on shutdown.

However, if I execute the same pipeline from the command line and remove the RTSP server (i.e. force a socket error) the process exits without performing an EOS on shutdown. A quick look at the code in "gst-launch.c" confirmed my suspicion that EOS on shutdown is only performed on a SIGINT.
What if my application has recorded valuable video data and the network goes down? Does this mean I have lost all of that data, since the file will not be viewable due to the absence of EOS-on-shutdown processing on anything that is not a SIGINT?

Thanks in advance.

Best Regards,

--
Rob Krakora
Senior Software Engineer
MessageNet Systems
101 East Carmel Dr. Suite 105
Carmel, IN 46032
(317)566-1677 Ext. 206
(317)663-0808 Fax
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From jonas.melin at saabgroup.com  Thu Sep 30 09:18:00 2010
From: jonas.melin at saabgroup.com (STJME)
Date: Thu, 30 Sep 2010 00:18:00 -0700 (PDT)
Subject: [gst-devel] Reduce latency for a MJPEG over UDP multicast pipe
Message-ID: <1285831080783-2720288.post@n4.nabble.com>

I am trying to reduce the latency for a live video feed from a camera connected to a video grabber board, compressing it with MJPEG and distributing it over UDP to a computer that decompresses the video and immediately renders it to the display.

Today, using dual-core Intel CPUs and a Gbit network, the total latency for the above scenario is 200 ms (from when the light hits the camera until the pixel illuminates).

Q: How can I tweak the latency to get it as short as possible? Basically, I want to remove all buffers and compress/decompress with such codecs and rates that it is as quick as possible, with less focus on packet loss etc.

--
View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Reduce-latency-for-a-MJPEG-over-UDP-multicast-pipe-tp2720288p2720288.html
Sent from the GStreamer-devel mailing list archive at Nabble.com.
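A low-latency sketch of such an MJPEG-over-UDP-multicast pipe, in the gst-launch style used throughout this thread (illustrative only: the v4l2src capture source, the caps, and the 224.1.1.1 multicast address are assumptions that would need adjusting for the actual grabber board and network):

    # Sender: capture, JPEG-encode, RTP-payload, multicast out.
    # sync=false keeps udpsink from waiting on the pipeline clock.
    gst-launch v4l2src ! video/x-raw-yuv,framerate=30/1 ! jpegenc ! rtpjpegpay ! \
        udpsink host=224.1.1.1 port=5000 sync=false

    # Receiver: depayload, decode, and render each frame as it arrives.
    # sync=false on the sink disables clock synchronisation, trading
    # smoothness for latency.
    gst-launch udpsrc multicast-group=224.1.1.1 port=5000 \
        caps="application/x-rtp,media=(string)video,encoding-name=(string)JPEG,clock-rate=(int)90000" ! \
        rtpjpegdepay ! jpegdec ! xvimagesink sync=false

Leaving out explicit queue elements keeps each branch single-threaded and avoids queue buffering, which matches the "remove all buffers" goal at the cost of robustness to jitter.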