jain1982 at gmail.com
Fri Jul 19 04:01:56 PDT 2013
On Fri, Jul 19, 2013 at 4:11 PM, Wim Taymans <wim.taymans at gmail.com> wrote:
> On 07/19/2013 12:33 PM, Deepak Jain wrote:
>> I am getting demuxed audio and video elementary streams from the server
>> for an audio/video file.
>> I am getting each buffer, its size, and its PTS. The data is H.264 video
>> and AAC audio.
>> I am looking at the two approaches below to get it working.
>> _Approach 1:_
>> _Approach 2:_
>> Now here are some of my questions on which I need help:
>> (1) Would either of the above two approaches work for me?
> Yes, the difference is that the second one creates a second thread.
So which approach do you think is easier to manage if we think of the whole
> (2) If yes, then how audio video sync would be achieved?
> By placing the same buffer running-time on buffers that need to be
> synchronized. The rules for
> placing timestamps are here http://cgit.freedesktop.org/
> and http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/
>> (3) If not, then what is the alternative approach?
>> gstreamer-devel mailing list
>> gstreamer-devel at lists.freedesktop.org