camerabin2 question

Thiago Santos thiagoss at osg.samsung.com
Thu Sep 11 08:01:28 PDT 2014


On 09/11/2014 11:51 AM, Alexander Malaev wrote:
> I went another way and ran into another problem.
>
> I removed the source bin, added audiosink and videosink ghost pads to 
> camerabin, and linked those ghost pads to the avdec_h264 element passed 
> as camera-source and the avdec_aac element passed as audio-source.
>
> All pads were linked successfully, but I couldn't figure out why 
> buffers were not going through these ghost pads to the decoder 
> elements. My pipeline dump is attached as a PNG.

That won't work. Just adding ghost pads to an element doesn't mean that 
it will know what to do with those pads; camerabin was not programmed to 
use them. As your picture shows, those pads are not linked internally to 
anything, so they will just drop all data and return not-linked errors.
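
For reference, a ghost pad only forwards data when it proxies a pad of an 
element inside the bin. Here is a minimal sketch of that mechanism in 
Python (a standalone bin with a hypothetical decoder, not camerabin, 
which won't route data through pads it doesn't know about):

    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst

    Gst.init(None)

    mybin = Gst.Bin.new('mybin')
    decoder = Gst.ElementFactory.make('avdec_h264', None)
    mybin.add(decoder)

    # The ghost pad must target a real pad of an internal element;
    # without a target, buffers pushed to it fail with not-linked.
    ghost = Gst.GhostPad.new('sink', decoder.get_static_pad('sink'))
    ghost.set_active(True)
    mybin.add_pad(ghost)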

So, I see 2 ways you can solve your problem:

1) Use the appsink/appsrc pair as I suggested (see the first sketch below).

2) Reimplement part of camerabin yourself: add a tee for your video, 
send one branch to a video sink for display, and feed the other into the 
encoding part (using encodebin makes this easier). You should also link 
your audio to encodebin to get it into the output (see the second sketch 
below).
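
For 1), a rough sketch of the appsink/appsrc bridge, assuming your rtmp 
audio branch ends in an appsink and a separate appsrc is handed to 
camerabin as audio-source; the element names are placeholders and the 
caps string must match what your audio decoder actually outputs:

    import gi
    gi.require_version('Gst', '1.0')
    gi.require_version('GstApp', '1.0')
    from gi.repository import Gst, GstApp

    Gst.init(None)

    # appsink ends the rtmp audio branch; appsrc is what you pass to
    # camerabin as audio-source (hypothetical names).
    appsink = Gst.ElementFactory.make('appsink', 'audio_tap')
    appsrc = Gst.ElementFactory.make('appsrc', 'audio_feed')

    appsink.set_property('emit-signals', True)
    appsink.set_property('sync', False)

    # Make appsrc behave like a live capture source; placeholder caps.
    appsrc.set_property('is-live', True)
    appsrc.set_property('format', Gst.Format.TIME)
    appsrc.set_property('stream-type', GstApp.AppStreamType.STREAM)
    appsrc.set_property('caps', Gst.Caps.from_string(
        'audio/x-raw,format=S16LE,layout=interleaved,rate=44100,channels=2'))

    def on_new_sample(sink):
        # Pull the decoded audio from appsink and push it into appsrc.
        sample = sink.emit('pull-sample')
        return appsrc.emit('push-buffer', sample.get_buffer())

    appsink.connect('new-sample', on_new_sample)

    # camerabin.set_property('audio-source', appsrc)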

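For 2), a rough sketch of the tee + encodebin arrangement, assuming you 
already have decoded raw video and audio coming out of your rtmp bin; the 
MP4/H.264/AAC profile and the element choices are only examples:

    import gi
    gi.require_version('Gst', '1.0')
    gi.require_version('GstPbutils', '1.0')
    from gi.repository import Gst, GstPbutils

    Gst.init(None)
    pipeline = Gst.Pipeline.new('recorder')

    tee = Gst.ElementFactory.make('tee', None)
    display_queue = Gst.ElementFactory.make('queue', None)
    video_sink = Gst.ElementFactory.make('autovideosink', None)
    record_queue = Gst.ElementFactory.make('queue', None)
    audio_queue = Gst.ElementFactory.make('queue', None)
    encodebin = Gst.ElementFactory.make('encodebin', None)
    filesink = Gst.ElementFactory.make('filesink', None)
    filesink.set_property('location', 'recording.mp4')

    # Example profile: MP4 container with H.264 video and AAC audio.
    profile = GstPbutils.EncodingContainerProfile.new(
        'mp4', None, Gst.Caps.from_string('video/quicktime,variant=iso'), None)
    profile.add_profile(GstPbutils.EncodingVideoProfile.new(
        Gst.Caps.from_string('video/x-h264'), None, None, 0))
    profile.add_profile(GstPbutils.EncodingAudioProfile.new(
        Gst.Caps.from_string('audio/mpeg,mpegversion=4'), None, None, 0))
    encodebin.set_property('profile', profile)

    for element in (tee, display_queue, video_sink, record_queue,
                    audio_queue, encodebin, filesink):
        pipeline.add(element)

    # Video: one tee branch to the display sink, the other into encodebin.
    tee.link(display_queue)
    display_queue.link(video_sink)
    tee.link(record_queue)
    record_queue.get_static_pad('src').link(
        encodebin.get_request_pad('video_%u'))

    # Audio goes straight into encodebin.
    audio_queue.get_static_pad('src').link(
        encodebin.get_request_pad('audio_%u'))

    encodebin.link(filesink)

    # Your decoded rtmp video links into the tee and your decoded audio
    # into audio_queue (left out here).
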
>
> -- 
> Alexander Malaev
> Sent with Airmail
>
> On September 11, 2014 at 18:23:23, Thiago Santos 
> (thiagoss at osg.samsung.com) wrote:
>
>> On 09/10/2014 09:24 AM, Alexander Malaev wrote:
>>> Hi,
>>>
>>> I'm using GStreamer 1.2.1 and Python GObject. I'm writing a recorder 
>>> for an RTMP stream using the camerabin element. I set camera-source to 
>>> an RTMP source bin wrapped by wrappercamerabinsrc, but I can't set 
>>> audio-source from the same bin: I get an error that the source bin 
>>> already has a parent. If I don't set an audio-source, it uses 
>>> autoaudiosrc instead of the audio from the source bin.
>>>
>>> Here is gist with my code: 
>>> https://gist.github.com/spscream/4df7195a50b0e35ba63e
>>>
>>> I think I should make separate bins for the audio and video sources, 
>>> but I can't actually figure out how. What is the best way to feed my 
>>> RTMP source into camerabin?
>>
>> Unfortunately camerabin doesn't support having the same element 
>> provide both video and audio. What I can suggest is to link your 
>> audio stream to an appsink, create an appsrc, and use that appsrc as 
>> the audio source. When you get a buffer from the appsink, just push it 
>> into the appsrc. Remember to set the caps and the proper stream type 
>> on the appsrc so it behaves correctly for this scenario.
>>
>>
>>>
>>> --
>>> WBR
>>> Alexander Malaev
>>>
>>>
>>
>>


-- 
Thiago Sousa Santos
Senior Multimedia Engineer, Open Source Group
Samsung Research America - Silicon Valley
