Dynamically added video doesn't start from the beginning
lminiero at gmail.com
Fri Jul 12 11:52:08 PDT 2013
I'm working on an application that replays old recordings by mixing them on
the fly. As I'll explain in this mail, it basically works, but I'm
encountering an issue when dynamically attaching videos, which don't play
from the beginning when attached: they start when they should, but as if
somebody had seeked forward right at the start.
To describe my scenario (I added a link to the source code at the end of
this mail, in case it may help you understand what I'm doing wrong), the
pipeline is composed of an adder element to mix audio, and a videomixer2
element to mix the video. At the beginning, they are only fed an
audiotestsrc to generate silence, and a static image with multifilesrc.
audiotestsrc (silence) --> adder --> alsasink
multifilesrc (static logo) --> jpegdec --> ffmpegcolorspace -->
    videomixer2 --> ffmpegcolorspace --> ximagesink
At specific times, audio and/or video files (so no live sources) are added
to the playout dynamically. The needed elements are created, added to the
pipeline, and properly linked, so that audio sources are attached to the
adder and video sources to the videomixer2.
This works fine for audio: if I want an audio file to play after 2 seconds,
it starts from its beginning after 2 seconds of pipeline playout. It
doesn't work as expected for video, though: videos are correctly attached
and started when they should (e.g., after 3 seconds), but instead of
starting from the beginning they play as if they had been seeked forward by
the same amount (3 seconds in this example).
I thought it could be related to some timestamps in the pipeline, but this
wouldn't explain why it doesn't happen for audio as well... I've tried
several things, but none of them worked. I tried seeking the videos back to
0 when attached, but although the seek returns success, nothing changes. I
also tried inserting an identity element with its single-segment property
set to TRUE, but unless I misunderstood how to use it, it didn't seem to
help either. Probing for buffers and manipulating timestamps there doesn't
look like an option, considering that the first seconds of playout never
seem to be pushed into the pipeline at all.
Can you help me understand what I'm doing wrong? At first I thought the
cause was that I was building the app using gstreamer-java, but I've
converted the whole application to C and the result is the same... you can
find the source code here:
If you need additional details about what I've tried feel free to ask, and
thanks in advance for any hint or help you'll give!