[gst-devel] Gnonlin as a constant source of audio

Mike MacHenry dskippy at ccs.neu.edu
Tue Jun 10 16:56:25 CEST 2008


On Tue, Jun 10, 2008 at 9:43 AM, Edward Hervey
<edward.hervey at collabora.co.uk> wrote:
>  How exactly do you detect that a 'file is playing' ? GnlComposition
> will preroll sources added to it, in order to have fast switching (from
> one source to another). You might want to track the newsegment events
> coming out of the composition (there will be one everytime the
> composition switches sources).

Whenever a gnlfilesource begins to play, a state change message is sent to
the pipeline's bus. If you look at get_message in the code, you can see
that I accept the state change messages, and if the new state is
gst.STATE_PLAYING I check which composition the source object that is
changing state belongs to by calling each composition's is_finishing
method. That method just compares the source object to that composition's
currently playing object, in effect asking "Is this you that just started
playing?" One of them returns true, and that composition is instructed to
queue up another source.
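To make sure I'm describing the dispatch clearly, here is a minimal pure-Python model of it (the names Composition, is_finishing, and queue_next are just stand-ins mirroring my description, not real gnonlin or GStreamer API):

```python
# Toy model: when a source reports it has entered PLAYING, find the
# composition whose currently playing clip it is, and tell that
# composition to queue up another clip.

PLAYING = "playing"  # stand-in for gst.STATE_PLAYING

class Composition:
    def __init__(self, name):
        self.name = name
        self.current = None   # the clip this composition expects to be playing
        self.queued = []      # clips queued so far

    def is_finishing(self, source):
        # "Is this you that just started playing?"
        return source is self.current

    def queue_next(self, clip):
        self.queued.append(clip)
        self.current = clip

def on_state_changed(compositions, source, new_state):
    """Dispatch a state-change notification to the owning composition."""
    if new_state != PLAYING:
        return None
    for comp in compositions:
        if comp.is_finishing(source):
            comp.queue_next(object())  # queue up another clip
            return comp
    return None
```

Exactly one composition answers true for a given source, so the loop queues a new clip on that composition and leaves the others alone.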

>  Try fixing the issue of when you add/remove the sources. Maybe the
> rate issues is only related to the cpu overload. The composition doesn't
> do any synchronization, it only modifies the values of the segment
> events to provide proper synchronization downstream (in the sink
> elements).

I'm afraid I don't quite understand this. What does it mean to modify
the values of the segment events? Is there a place where I can read more
about this? I'm pretty sure I've read all there is out there about
gnonlin, unfortunately. Here's my understanding; maybe you can explain
why it's wrong.

The timeline is represented by a gnlcomposition. Timeline in the video
editor's sense: the bar at the bottom. One creates gnlsources to put in
the composition. The start and end represent where on the timeline the
source starts and ends. The media-start and media-end represent where in
the file that source will draw from. I know that there must always be
something playing in the composition, or it will smoosh the sources
together to fill the gap. If I put down a blank audiotestsrc in the
composition for one hour, then a filesource from 1 second to 2 seconds
and another from 5 seconds to 7 seconds, I should hear 1 second of
silence, 1 second of a file, 3 seconds of silence, and then 2 seconds of
a file. This to me means that the composition must have some notion of
time in it that understands when to start playing those two files. Am I
way off base here?
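The behavior I'm expecting can be sketched as a toy priority model: at each instant, among the sources whose interval covers that time, the one with the lowest priority number plays, so an hour-long background source at a high priority number fills every gap. This is only an illustration of my understanding of the rules, not gnonlin code:

```python
# Toy model of gap-filling by priority: sources are (name, start, stop,
# priority) tuples with times in seconds; lower priority number wins.

def active_source(sources, t):
    """Return the name of the source that plays at time t, or None."""
    covering = [s for s in sources if s[1] <= t < s[2]]
    if not covering:
        return None
    return min(covering, key=lambda s: s[3])[0]

timeline = [
    ("silence", 0, 3600, 2),  # hour-long audiotestsrc, low priority
    ("file-a",  1,    2, 1),  # clip from 1 s to 2 s
    ("file-b",  5,    7, 1),  # clip from 5 s to 7 s
]
```

With this model, active_source(timeline, 0) gives "silence", at 1 s it gives "file-a", at 3 s "silence" again, and at 5 s "file-b", matching the silence/file/silence/file sequence I described.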

Thanks for your help,
-mike
