[gst-devel] converting a gst-launch pipeline into source code problem

Terry Leung terry83 at gmail.com
Thu Oct 18 11:08:33 CEST 2007


Also, I suspect it is related to the caps property of udpsrc.
I cannot set this property like the others by using g_object_set,
and hence I use gst_element_link_filtered to connect the udpsrc with the
jitterbuffer.
Does anyone know if this will make any difference?
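
What I would like to do is set the caps directly on the udpsrc, roughly
like this (just a sketch of the idea, not my real code, and the variable
names are only placeholders):

    GstElement *pUdpSrc = gst_element_factory_make ("udpsrc", "audioudpsrc");

    GstCaps *pCaps = gst_caps_new_simple ("application/x-rtp",
        "media", G_TYPE_STRING, "audio",
        "payload", G_TYPE_INT, 98,
        "clock-rate", G_TYPE_INT, 8000,
        "encoding-name", G_TYPE_STRING, "AMR",
        "encoding-params", G_TYPE_STRING, "1",
        "octet-align", G_TYPE_STRING, "1",
        NULL);

    /* "caps" is a GstCaps property, so in theory g_object_set should
     * accept it, and a plain gst_element_link to the jitterbuffer
     * would then be enough */
    g_object_set (G_OBJECT (pUdpSrc), "caps", pCaps, NULL);
    gst_caps_unref (pCaps);

But I could not get that to work, which is why I fall back to
gst_element_link_filtered.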

On 10/18/07, Terry Leung <terry83 at gmail.com> wrote:
> Hi all,
>
> I have a problem converting a gst-launch pipeline into source code.
> The pipeline is:
>
> gst-launch -m ffmux_3gp name=mux ! filesink location=/home/video/receive.3gp \
>   { udpsrc num-buffers=500 port=8100 name=audioudp \
>     caps="application/x-rtp, media=(string)audio, payload=(int)98, \
>       clock-rate=(int)8000, encoding-name=(string)AMR, \
>       encoding-params=(string)1, octet-align=(string)1" ! rtpamrdepay queue-delay=0 \
>     ! queue } ! mux.audio_0
>
> The code is shown below:
>
>         /* top-level pipeline */
>         m_pPipe = gst_pipeline_new ((string("VideoPipe").append(sId)).c_str());
>
>         /* muxer and file sink */
>         m_pMux = gst_element_factory_make ("ffmux_3gp",
>             (string("ffmux_3gp").append(sId)).c_str());
>         m_pFileSink = gst_element_factory_make ("filesink",
>             (string("filesink").append(sId)).c_str());
>
>         /* audio branch: udpsrc -> jitterbuffer -> depayloader -> queue */
>         m_pAudioUdpSrc = gst_element_factory_make ("udpsrc",
>             (string("audioudpsrc").append(sId)).c_str());
>         m_pAudioJitter = gst_element_factory_make ("gstrtpjitterbuffer",
>             (string("audiojitter").append(sId)).c_str());
>         m_pAudioDepay = gst_element_factory_make ("rtpamrdepay",
>             (string("audiodepay").append(sId)).c_str());
>         m_pAudioQueue = gst_element_factory_make ("queue",
>             (string("audioqueue").append(sId)).c_str());
>
>         gst_bin_add_many (GST_BIN (m_pPipe),
>             m_pMux, m_pFileSink,
>             m_pAudioUdpSrc, m_pAudioJitter, m_pAudioDepay, m_pAudioQueue, NULL);
>
>         /* RTP caps for the AMR stream arriving on the udpsrc */
>         GstCaps *pAudioUdpSrcCaps = gst_caps_new_simple ("application/x-rtp",
>             "media", G_TYPE_STRING, "audio",
>             "payload", G_TYPE_INT, 98,
>             "clock-rate", G_TYPE_INT, 8000,
>             "encoding-name", G_TYPE_STRING, "AMR",
>             "encoding-params", G_TYPE_STRING, "1",
>             "octet-align", G_TYPE_STRING, "1",
>             NULL);
>
>         /* link udpsrc -> jitterbuffer with the caps as a filter */
>         nResult = gst_element_link_filtered (m_pAudioUdpSrc, m_pAudioJitter,
>             pAudioUdpSrcCaps);
>         gst_caps_unref (pAudioUdpSrcCaps);
>
>         /* link the rest of the chain; the queue gets hooked up to mux.audio_0 */
>         nResult = gst_element_link (m_pMux, m_pFileSink);
>         nResult = nResult && gst_element_link_many (m_pAudioJitter,
>             m_pAudioDepay, m_pAudioQueue, m_pMux, NULL);
>
>
> I am not sure whether this pipeline is really the same as the one in gst-launch.
> I have tried my best to do the same thing as the gst-launch command does.
> (I have also written another program that streams 3gp clips to a specific
> IP and port, and that one works without problems, so I think there may be
> some trick needed for this case?)
>
> But when I try to play the pipeline, both cases still start out the same:
> the elements go to PLAYING while the pipeline and the filesink are not
> PLAYING yet. However, in the first case the pipeline and the filesink are
> set to PLAYING once a stream comes in, while in the second case there is
> no message from the bus at all (see the bus watch sketch at the bottom of
> this mail).
>
> Can anyone tell me why there is such a difference?
>
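
P.S. When I say "no message from the bus", what I mean is that a bus watch
roughly like the one below (only a sketch, not my actual code; it assumes a
GMainLoop is running) shows nothing at all in the second case:

    static gboolean
    bus_cb (GstBus *bus, GstMessage *msg, gpointer user_data)
    {
      switch (GST_MESSAGE_TYPE (msg)) {
        case GST_MESSAGE_ERROR: {
          GError *err = NULL;
          gchar *dbg = NULL;
          gst_message_parse_error (msg, &err, &dbg);
          g_print ("ERROR from %s: %s (%s)\n",
              GST_OBJECT_NAME (GST_MESSAGE_SRC (msg)),
              err->message, dbg ? dbg : "no debug info");
          g_error_free (err);
          g_free (dbg);
          break;
        }
        case GST_MESSAGE_STATE_CHANGED: {
          GstState old_state, new_state, pending;
          gst_message_parse_state_changed (msg, &old_state, &new_state, &pending);
          g_print ("%s: %s -> %s\n",
              GST_OBJECT_NAME (GST_MESSAGE_SRC (msg)),
              gst_element_state_get_name (old_state),
              gst_element_state_get_name (new_state));
          break;
        }
        default:
          break;
      }
      return TRUE;   /* keep the watch installed */
    }

    /* somewhere after creating m_pPipe */
    GstBus *pBus = gst_pipeline_get_bus (GST_PIPELINE (m_pPipe));
    gst_bus_add_watch (pBus, bus_cb, NULL);
    gst_object_unref (pBus);

(With gst-launch -m the bus messages are printed on the console.)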



