[gst-devel] converting a gst-launch pipeline into source code problem
Stefan Kost
ensonic at hora-obscura.de
Sat Oct 20 18:09:56 CEST 2007
hi,
Felipe Contreras wrote:
> On 10/18/07, Stefan Kost <ensonic at hora-obscura.de> wrote:
>> Hi,
>>
>> Quoting Terry Leung <terry83 at gmail.com>:
>>
>>> Hi all,
>>>
>>> I ran into a problem when converting a pipeline into source code.
>>> The pipeline is
>>>
>>> gst-launch -m ffmux_3gp name=mux ! filesink
>>> location=/home/video/receive.3gp \
>>> { udpsrc num-buffers=500 port=8100 name=audioudp
>>> caps="application/x-rtp, media=(string)audio, payload=(int)98,
>>> clock-rate=(int)8000, encoding-name=(string)AMR,
>>> encoding-params=(string)1, octet-align=(string)1" ! rtpamrdepay
>>> queue-delay=0 ! queue } ! mux.audio_0
>>>
>> You don't need the '{ }'; that was gstreamer-0.8 syntax.
>>
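>> In 0.10 the same thing should just be (untested, only the braces dropped):
>>
>> gst-launch -m ffmux_3gp name=mux ! filesink location=/home/video/receive.3gp \
>>     udpsrc num-buffers=500 port=8100 name=audioudp \
>>     caps="application/x-rtp, media=(string)audio, payload=(int)98, clock-rate=(int)8000, encoding-name=(string)AMR, encoding-params=(string)1, octet-align=(string)1" \
>>     ! rtpamrdepay queue-delay=0 ! queue ! mux.audio_0
>>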
>>> The code is shown below:
>>>
>>> m_pPipe = gst_pipeline_new (
>>> (string("VideoPipe").append(sId)).c_str());
>>>
>> m_pPipe = gst_pipeline_new ("VideoPipe");
>> is just fine.
>>
>>> m_pMux = gst_element_factory_make ("ffmux_3gp",
>>> (string("ffmux_3gp").append(sId)).c_str());
>>> m_pFileSink = gst_element_factory_make ("filesink",
>>> (string("filesink").append(sId)).c_str());
>>>
>>> m_pAudioUdpSrc = gst_element_factory_make ("udpsrc",
>>> (string("audioudpsrc").append(sId)).c_str());
>>> m_pAudioJitter = gst_element_factory_make ("gstrtpjitterbuffer",
>>> (string("audiojitter").append(sId)).c_str());
>>> m_pAudioDepay = gst_element_factory_make ("rtpamrdepay",
>>> (string("audiodepay").append(sId)).c_str());
>>> m_pAudioQueue = gst_element_factory_make ("queue",
>>> (string("audioqueue").append(sId)).c_str());
>>>
>>> gst_bin_add_many (GST_BIN (m_pPipe),
>>> m_pMux, m_pFileSink,
>>> m_pAudioUdpSrc, m_pAudioJitter, m_pAudioDepay,
>>> m_pAudioQueue, NULL);
>>>
>>> GstCaps* pAudioUdpSrcCaps = gst_caps_new_simple ("application/x-rtp",
>>> "media", G_TYPE_STRING, "audio",
>>> "payload", G_TYPE_INT, 98,
>>> "clock-rate", G_TYPE_INT, 8000,
>>> "encoding-name", G_TYPE_STRING, "AMR",
>>> "encoding-params", G_TYPE_STRING, "1",
>>> "octet-align", G_TYPE_STRING, "1",
>>> NULL);
>>> nResult = gst_element_link_filtered (m_pAudioUdpSrc, m_pAudioJitter,
>>> pAudioUdpSrcCaps);
>>> gst_caps_unref (pAudioUdpSrcCaps);
>>>
>>> nResult = gst_element_link (m_pMux, m_pFileSink);
>>> nResult = nResult &&
>>> gst_element_link_many (m_pAudioJitter, m_pAudioDepay, m_pAudioQueue,
>>> m_pMux, NULL);
>>>
>>>
>>> I am not sure whether this pipeline is the same as the gst-launch one;
>>> I've tried my best to do the same thing as the gst-launch command does.
>>> (I've also written another program that streams 3gp clips to a specific
>>> IP and port, and that works without problems, so I think there may be
>>> some trick needed for this case?)
>>>
>> The biggest difference is that demuxers create pads on the fly. So you
>> would register a signal handler for the "pad-added" signal of GstElement
>> and do the linking in that callback.
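>>
>> Something along these lines should do it (untested sketch; "demuxer" and
>> "decoder" are just placeholder names for your own elements):
>>
>> #include <gst/gst.h>
>>
>> static void
>> on_pad_added (GstElement * element, GstPad * new_pad, gpointer user_data)
>> {
>>   GstElement *next = GST_ELEMENT (user_data);
>>   GstPad *sink_pad = gst_element_get_static_pad (next, "sink");
>>
>>   /* link the pad that just appeared to the next element, but only once */
>>   if (!gst_pad_is_linked (sink_pad))
>>     gst_pad_link (new_pad, sink_pad);
>>
>>   gst_object_unref (sink_pad);
>> }
>>
>> /* in the setup code, after gst_bin_add_many(): */
>> g_signal_connect (demuxer, "pad-added", G_CALLBACK (on_pad_added), decoder);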
>
> This keeps coming up again and again.
>
> Wouldn't it make sense to allow things like:
>
> gst_element_link_pads (demuxer, "video_%02d", decoder, "sink");
>
You mean that this does the delayed linking for you automatically?
Stefan