Synchronising audio and video in gnonlin

Ralph ralph.gucwa at racelogic.co.uk
Thu Jan 19 08:06:46 PST 2012


Problem solved!
The audiotestsrc was running at 44.1kHz mono while the audio in all my
videos was 48kHz stereo.

What I'm doing is:
Instead of adding audiotestsrc to the gnlsource directly, I create a bin
containing audiotestsrc and a capsfilter and add that bin as the source to
the gnlsource (which is later added to the gnlcomposition).  When the video
files are added to the gnlcomposition I read their properties using a
discoverer and set the caps of the capsfilter to the caps of the file's
audio stream.  This way the audio played in the gaps between the files has
exactly the same rate and channel layout as the audio played from those
files.
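
For reference, a rough sketch of the audio gap filler in Python
(gst-python 0.10) - not my exact code; the caps string is hard-coded here
to the 48kHz stereo layout of my files instead of being read from the
discoverer, and the total duration is a placeholder:

import pygst
pygst.require("0.10")
import gst

TOTAL_DURATION = 30 * gst.SECOND  # placeholder: length of the whole timeline

# Bin wrapping audiotestsrc + capsfilter; the capsfilter forces the test
# signal into the same format as the audio streams of the real files.
gapbin = gst.Bin("audio-gap-bin")
testsrc = gst.element_factory_make("audiotestsrc")
testsrc.set_property("volume", 0.0)   # silent test signal for the gaps
capsfilter = gst.element_factory_make("capsfilter")
capsfilter.set_property("caps",
    gst.Caps("audio/x-raw-int,rate=48000,channels=2"))
gapbin.add(testsrc, capsfilter)
testsrc.link(capsfilter)
gapbin.add_pad(gst.GhostPad("src", capsfilter.get_static_pad("src")))

# Wrap the bin in a gnlsource with the maximum priority value (lowest
# priority) so it only fills the gaps, and make it span the whole timeline.
gapsrc = gst.element_factory_make("gnlsource", "audio-gap")
gapsrc.add(gapbin)
gapsrc.set_property("priority", 2 ** 32 - 1)
gapsrc.set_property("start", 0)
gapsrc.set_property("duration", TOTAL_DURATION)

audiocomp = gst.element_factory_make("gnlcomposition", "audio-comp")
audiocomp.add(gapsrc)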

I did the same trick for videotestsrc.
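
Continuing the sketch above, the video gap filler is the same idea (again
not my exact code); the width, height and framerate are the ones of my Xvid
files:

vgapbin = gst.Bin("video-gap-bin")
vtestsrc = gst.element_factory_make("videotestsrc")
vtestsrc.set_property("pattern", 2)   # 2 = black frames in the gaps
vcapsfilter = gst.element_factory_make("capsfilter")
vcapsfilter.set_property("caps",
    gst.Caps("video/x-raw-yuv,width=720,height=576,framerate=25/1"))
vgapbin.add(vtestsrc, vcapsfilter)
vtestsrc.link(vcapsfilter)
vgapbin.add_pad(gst.GhostPad("src", vcapsfilter.get_static_pad("src")))
# vgapbin is then wrapped in a gnlsource exactly like the audio one above.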

This solution works, but imposes a limitation: all video files must have
the same audio and video properties.  Fortunately that is not a problem for
me.



Ralph wrote
> 
>> I don't know what your pipeline looks like
>> Basically I used 2 gnlcompositions, one for audio and another for video.
> 
> As I wrote in the beginning of my original post, my pipeline contains two
> separate gnlcomposition objects: one for audio and the other for video.
> The whole pipeline looks like this:
> 
> videotestsrc - gnlsource    \
>                gnlfilesource - gnlcomposition - d3dvideosink
>                gnlfilesource /
> 
> audiotestsrc - gnlsource    \
>                gnlfilesource - gnlcomposition - audioconvert - audioresample - autoaudiosink
>                gnlfilesource /
> 
> The sources (audiotestsrc, videotestsrc, gnlsource, gnlfilesource) are not
> added to the pipeline directly; they are added to the gnlcomposition
> objects.
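> 
> In Python (gst-python 0.10) the audio branch is wired up roughly like this
> (a sketch, not my exact code; the variable names are made up):
> 
> import pygst
> pygst.require("0.10")
> import gst
> 
> pipeline = gst.Pipeline("player")
> audiocomp = gst.element_factory_make("gnlcomposition", "audio-comp")
> aconv = gst.element_factory_make("audioconvert")
> aresample = gst.element_factory_make("audioresample")
> asink = gst.element_factory_make("autoaudiosink")
> pipeline.add(audiocomp, aconv, aresample, asink)
> gst.element_link_many(aconv, aresample, asink)
> 
> # gnlcomposition creates its source pad dynamically, so the composition is
> # only linked to audioconvert once the pad appears.
> def on_comp_pad(comp, pad):
>     pad.link(aconv.get_static_pad("sink"))
> audiocomp.connect("pad-added", on_comp_pad)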
> 
> 
>> I achieved this through the caps properties
> 
> As the AVI files contain both audio and video, I need to filter the pads
> of the gnlfilesources: I set their caps to "audio/x-raw-int;
> audio/x-raw-float" for audio sources or "video/x-raw-rgb; video/x-raw-yuv"
> for video sources.  The gnlcomposition objects forward those caps to the
> rest of the pipeline.
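> 
> Continuing the sketch above, an audio gnlfilesource is set up roughly like
> this (the location and the media offsets are placeholders; the timing is
> the first file from my timeline):
> 
> afile = gst.element_factory_make("gnlfilesource")
> afile.set_property("location", "/path/to/file1.avi")   # placeholder path
> # expose only the audio pads of the decoded file
> afile.set_property("caps", gst.Caps("audio/x-raw-int; audio/x-raw-float"))
> afile.set_property("start", 2 * gst.SECOND)    # position on the timeline
> afile.set_property("duration", int(659.24 * gst.SECOND))
> afile.set_property("media-start", 0)           # offset inside the file
> afile.set_property("media-duration", int(659.24 * gst.SECOND))
> audiocomp.add(afile)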
> 
> I noticed a queue element in the video branch in one of your messages; I
> tried adding one, but it didn't make any difference.
> 
> 
> 
> Rossana Guerra wrote
>> 
>> I don't know what your pipeline looks like, but when I did crossfading I
>> achieved this through the caps properties.  Even though mine was for 2
>> video files, I think the principle is the same.
>> Here's the thread:
>> 
>> http://gstreamer-devel.966125.n4.nabble.com/crossfading-audio-error-gst-mad-chain-mad-header-decode-had-an-error-lost-synchronization-td4037851.html#a4037881
>> 
>> Basically I used 2 gnlcompositions, one for audio and another for video.
>> Sorry, I don't have the pipeline on this computer, but it was something
>> like this:
>> 
>> http://gstreamer-devel.966125.n4.nabble.com/No-sound-in-my-pipeline-td3854506.html
>> 
>> Hope it helps.
>> Rossana
>> 
>> 2012/1/17 Ralph <ralph.gucwa at .co>
>> 
>>> I'm using gnonlin to play a sequence of video file fragments.  I
>>> constructed a pipeline containing two gnlcomposition objects - one for
>>> video and the other for audio; both play the same video files.  To be
>>> able to have gaps between the files, I created a videotestsrc and an
>>> audiotestsrc and added them through gnlsources (priority: uint.MaxValue)
>>> to the gnlcomposition objects.
>>>
>>> My "timeline" looks like this:
>>> 659.24s file, start position 2s (2s gap before the file)
>>> 881.04s file, start position 661.24s (4s gap before the file)
>>>
>>> When I disable the audio part, the video works nicely - the file
>>> fragments are played exactly when scheduled and the gaps between them
>>> are precisely the required length.  Perfect!  The problems occur when I
>>> enable the audio section - the gaps become much shorter and strange
>>> things start to happen.
>>>
>>> When I try to play both video and audio, the first file begins after
>>> just 0.4s, and when it ends, the video skips the second gap and jumps
>>> many seconds into the second file, where it remains frozen; it waits for
>>> the audio (which plays nicely except for the 1.6 second shorter gap) to
>>> reach that position and then continues.
>>> Interestingly, the initial gap is always 1.6 seconds shorter than it
>>> should be - when the gap should be 20s, it is played within 18.4
>>> seconds.  The second gap is always completely wrong.
>>> It's not a problem with the video alone: if I disable the video section,
>>> the audio is still wrong - the gaps are shorter than scheduled.
>>> What's wrong with my pipeline?
>>>
>>> My video files are:
>>> video: Xvid ISO MPEG-4 720x576 25fps
>>> audio: MPEG-1 Layer 2 48kHz 192kb/s stereo
>>>
>>> The audio gnlcomposition runs at 48kHz; the audio is then converted
>>> using audioconvert.
>>> The audiotestsrc runs at 44.1kHz.
>>> The autoaudiosink runs at 44.1kHz.
>>>
>>>
>> 
> 

