[gst-devel] converting a gst-launch pipeline into source code problem

Terry Leung terry83 at gmail.com
Mon Oct 22 04:28:00 CEST 2007


Hi all,

First, I want to thank everyone for the help.
I want to ask whether a muxer behaves the same as a demuxer (in terms
of creating pads on the fly).
As described in my questions, the pipeline uses a muxer and udpsrc to
record a live stream into a file.
I've set GST_DEBUG to a higher level, and I saw that the pads are connected.

Second, I think I am using the newest (or a very recent) version of
GStreamer (downloaded about a month ago; the plugins I am using are
from CVS).

Third, I want to ask one more question: if I use gst_parse_launch(),
is it possible for me to stop the pipeline as described under
"Controlled shutdown of live sources in applications" on this page?

http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-libs/html/GstBaseSrc.html

Can I still control every element by the name I used in the
gst_parse_launch() description?
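(For reference, here is a minimal sketch of what I mean, assuming the
GStreamer 0.10 C API; the launch string is shortened and the fakesink is
a placeholder, but the name= labels survive parsing, so the elements can
be looked up afterwards:)

```c
#include <gst/gst.h>

int main (int argc, char *argv[])
{
  GError *error = NULL;
  GstElement *pipe, *mux;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* gst_parse_launch() keeps the name= labels from the description,
   * so elements remain addressable by those names afterwards. */
  pipe = gst_parse_launch (
      "udpsrc port=8100 name=audioudp ! fakesink "  /* shortened sketch */
      "fakesrc name=mux num-buffers=1 ! fakesink",
      &error);
  if (pipe == NULL) {
    g_printerr ("parse error: %s\n", error->message);
    g_error_free (error);
    return 1;
  }

  /* Retrieve an element by the name used in the launch string. */
  mux = gst_bin_get_by_name (GST_BIN (pipe), "mux");
  if (mux != NULL)
    gst_object_unref (mux);

  gst_element_set_state (pipe, GST_STATE_PLAYING);

  /* Controlled shutdown: instead of setting the pipeline to NULL
   * directly, send EOS and wait for it to reach the bus, so live
   * sources stop cleanly and the muxer can finalize the file. */
  gst_element_send_event (pipe, gst_event_new_eos ());
  bus = gst_pipeline_get_bus (GST_PIPELINE (pipe));
  msg = gst_bus_poll (bus, GST_MESSAGE_EOS | GST_MESSAGE_ERROR, -1);
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipe, GST_STATE_NULL);
  gst_object_unref (pipe);
  return 0;
}
```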


On 10/18/07, Stefan Kost <ensonic at hora-obscura.de> wrote:
> Hi,
>
> Quoting Terry Leung <terry83 at gmail.com>:
>
> > Hi all,
> >
> > I ran into a problem converting a pipeline into source code.
> > The pipeline is
> >
> > gst-launch -m ffmux_3gp name=mux ! filesink
> > location=/home/video/receive.3gp \
> > { udpsrc num-buffers=500 port=8100 name=audioudp
> > caps="application/x-rtp, media=(string)audio, payload=(int)98,
> > clock-rate=(int)8000, encoding-name=(string)AMR,
> > encoding-params=(string)1, octet-align=(string)1" ! rtpamrdepay
> > queue-delay=0 ! queue } ! mux.audio_0
> >
> You don't need the '{ }'; that was GStreamer 0.8 syntax.
>
> >
> > The code is shown below:
> >
> >         m_pPipe = gst_pipeline_new (
> > (string("VideoPipe").append(sId)).c_str());
> >
>
>   m_pPipe = gst_pipeline_new ("VideoPipe");
> is just fine.
>
> >
> >         m_pMux = gst_element_factory_make ("ffmux_3gp",
> > (string("ffmux_3gp").append(sId)).c_str());
> >         m_pFileSink = gst_element_factory_make
> > ("filesink",(string("filesink").append(sId)).c_str() );
> >
> >         m_pAudioUdpSrc= gst_element_factory_make ("udpsrc",
> > (string("audioudpsrc").append(sId)).c_str());
> >         m_pAudioJitter= gst_element_factory_make
> > ("gstrtpjitterbuffer", (string("audiojitter").append(sId)).c_str());
> >         m_pAudioDepay= gst_element_factory_make ("rtpamrdepay",
> > (string("audiodepay").append(sId)).c_str());
> >         m_pAudioQueue= gst_element_factory_make ("queue",
> > (string("audioqueue").append(sId)).c_str());
> >
> >         gst_bin_add_many (GST_BIN (m_pPipe),
> >         m_pMux , m_pFileSink ,
> >         m_pAudioUdpSrc , m_pAudioJitter , m_pAudioDepay ,
> > m_pAudioQueue ,NULL);
> >
> > GstCaps* pAudioUdpSrcCaps = gst_caps_new_simple ("application/x-rtp",
> >         "media",G_TYPE_STRING,"audio",
> >         "payload",G_TYPE_INT,98,
> >         "clock-rate", G_TYPE_INT, 8000,
> >         "encoding-name", G_TYPE_STRING, "AMR",
> >         "encoding-params",G_TYPE_STRING,"1",
> >         "octet-align",G_TYPE_STRING,"1",
> >         NULL);
> >         nResult =
> > gst_element_link_filtered(m_pAudioUdpSrc,m_pAudioJitter,pAudioUdpSrcCaps);
> >         gst_caps_unref(pAudioUdpSrcCaps);
> >
> >         nResult = gst_element_link (m_pMux,m_pFileSink);
> >         nResult = nResult &&
> > gst_element_link_many(m_pAudioJitter,m_pAudioDepay,m_pAudioQueue,m_pMux,NULL);
> >
> >
> > I am not sure whether this pipeline is the same as the gst-launch one.
> > I've tried my best to do the same thing as the gst-launch command does.
> > (I've also written another program that streams 3gp clips to a
> > specific IP and port, and that works without problems, so I think
> > there may be some trick needed for this case?)
> >
> The biggest difference is that demuxers create pads on the fly. So you
> would register a signal handler for the "pad-added" signal of GstElement
> and then link the pads from there.
> You should also check the results of gst_element_link() and
> gst_element_link_filtered() and print a message if one of them fails.
>
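A minimal sketch of the "pad-added" handler Stefan describes, assuming
the 0.10-era C API (the element passed as user_data is hypothetical,
whatever should sit downstream of the demuxer):

```c
#include <gst/gst.h>

/* "pad-added" callback for elements that create pads on the fly
 * (demuxers).  user_data is the downstream element to link to. */
static void
on_pad_added (GstElement *element, GstPad *new_pad, gpointer user_data)
{
  GstElement *downstream = GST_ELEMENT (user_data);
  GstPad *sinkpad = gst_element_get_static_pad (downstream, "sink");

  if (!gst_pad_is_linked (sinkpad)) {
    if (GST_PAD_LINK_FAILED (gst_pad_link (new_pad, sinkpad)))
      g_printerr ("failed to link pad %s\n", GST_PAD_NAME (new_pad));
  }
  gst_object_unref (sinkpad);
}

/* Registration:
 *   g_signal_connect (demux, "pad-added",
 *                     G_CALLBACK (on_pad_added), queue);
 */
```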
I want to point out that this is a muxer rather than a demuxer.
Does a muxer need this as well?
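(A muxer is usually the other way around from a demuxer: a demuxer
creates "sometimes" pads at runtime, hence the "pad-added" signal,
while a muxer like ffmux_3gp exposes request pads that the application
asks for explicitly. A sketch, assuming the GStreamer 0.10 pad template
name "audio_%d"; in newer GStreamer this template is "audio_%u" and the
call is gst_element_request_pad_simple():)

```c
#include <gst/gst.h>

/* Link a queue's src pad to a freshly requested audio pad on the
 * muxer -- the programmatic equivalent of "! mux.audio_0" in
 * gst-launch syntax. */
static gboolean
link_queue_to_mux (GstElement *queue, GstElement *mux)
{
  GstPad *srcpad, *sinkpad;
  GstPadLinkReturn ret;

  /* Muxer sink pads are request pads: ask the element for one
   * from its "audio_%d" pad template (0.10 naming). */
  sinkpad = gst_element_get_request_pad (mux, "audio_%d");
  if (sinkpad == NULL)
    return FALSE;

  srcpad = gst_element_get_static_pad (queue, "src");
  ret = gst_pad_link (srcpad, sinkpad);

  gst_object_unref (srcpad);
  gst_object_unref (sinkpad);

  return GST_PAD_LINK_SUCCESSFUL (ret);
}
```

In practice gst_element_link_many() can also pick a compatible request
pad on the muxer automatically, so the explicit request is mainly useful
when you need a particular pad.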

> Stefan
>
> >
> > But when I try to play the pipeline, the result is still the same
> > (the elements go to PLAYING while the pipeline and filesink are not
> > playing yet).
> > However, in the first case, when a stream comes in, the pipeline and
> > filesink are set to PLAYING; in the second case, there are no
> > messages from the bus at all.
> >
> > Can anyone tell me why there is such a difference?
> >
> > -------------------------------------------------------------------------
> > This SF.net email is sponsored by: Splunk Inc.
> > Still grepping through log files to find problems?  Stop.
> > Now Search log events and configuration files using AJAX and a browser.
> > Download your FREE copy of Splunk now >> http://get.splunk.com/
> > _______________________________________________
> > gstreamer-devel mailing list
> > gstreamer-devel at lists.sourceforge.net
> > https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
> >



