[gst-devel] Can a controller use a different clock?

Edward Hervey bilboed at gmail.com
Sun Jun 3 12:11:42 CEST 2007


On 6/1/07, Steve Fink <sphink at gmail.com> wrote:
> On 6/1/07, Edward Hervey <bilboed at gmail.com> wrote:
> > Hi,
> >
> > On 6/1/07, Steve Fink <sphink at gmail.com> wrote:
> > > On 6/1/07, Edward Hervey <bilboed at gmail.com> wrote:
> > > > volume_transform_ip (GstBaseTransform * base, GstBuffer * outbuf)
> > > > {
> > > >   GstVolume *this = GST_VOLUME (base);
> > > >   GstClockTime timestamp;
> > > >
> > > >   timestamp = GST_BUFFER_TIMESTAMP (outbuf);
> > >
> > > How does the buffer derive its timestamp? In a pipeline, would that
> > > normally be set from the active clock?
> >
> >   GStreamer 101 : Each buffer has a timestamp. If you don't grasp the
> > concept of timestamps on buffers, I'd recommend you read the
> > Application Developer Manual and Plugin Developer Guide.
>
> I'm taking that class right now.
>
> But I'm aware that the buffer has a timestamp (from reading the ADM,
> but also because it's clear that it has to). It's not clear to me
> where it gets it from, or what units it's in. I suppose it should be
> derived entirely from the original source, either as stored in the
> file or as computed from the playback rate and the byte offset, and
> that therefore it is in units of "seconds since beginning of stream".
> But those are my assumptions, which I am testing. (And yes, in
> retrospect it doesn't make any sense for it to be associated with a
> clock. The next line is what establishes that association?)

  Yes, that's exactly what timestamps are. The association with a
clock, though, only happens during synchronization, which is mostly
done in rendering sink elements (like xvimagesink, alsasink, ...).
Synchronization requires the NEWSEGMENT event, the buffer timestamps
and a running clock. More about this can be found in the design
documents (gstreamer/docs/design/).
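To make that concrete, here is a minimal self-contained sketch of the
arithmetic a sink performs (simplified from the synchronization design
document; the struct is an illustrative stand-in for GstSegment, with
rate fixed at 1.0, not the real layout):

```c
#include <assert.h>
#include <stdint.h>

typedef uint64_t ClockTime;     /* nanoseconds, like GstClockTime */

/* Illustrative segment: 'start' is where the segment begins in
 * buffer-timestamp units, 'accum' is the running time accumulated
 * by previously played segments. */
typedef struct {
  ClockTime start;
  ClockTime accum;
} Segment;

/* running-time = timestamp - segment.start + segment.accum */
static ClockTime
to_running_time (const Segment *s, ClockTime timestamp)
{
  return timestamp - s->start + s->accum;
}

/* A sink renders a buffer when the pipeline clock reaches the
 * element's base-time plus the buffer's running-time. */
static ClockTime
render_time (ClockTime base_time, const Segment *s, ClockTime timestamp)
{
  return base_time + to_running_time (s, timestamp);
}
```

For example, with a segment starting at 2s (accum 0) and a base-time
of 100s on the pipeline clock, a buffer stamped 5s is rendered when
the clock reads 103s.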

>
> > > >   timestamp =
> > > >       gst_segment_to_stream_time (&base->segment, GST_FORMAT_TIME, timestamp);
> > >
> >
> >   This is because stream time is the time domain used by GstController.
>
> And "which stream" is answered by "whatever stream the GstElement is
> within"? That seems to make sense in light of the "Clocks in
> GStreamer" chapter of the ADM.

  Stream time (and running time, for that matter) is common to the
whole pipeline, not to an individual stream (such as the video stream
of an audio/video demuxing/decoding/displaying pipeline).
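The difference between the two shows up as soon as you seek. A tiny
self-contained model (again an illustrative, simplified segment with
rate 1.0; real code uses GstSegment with gst_segment_to_stream_time()
and gst_segment_to_running_time()):

```c
#include <assert.h>
#include <stdint.h>

typedef uint64_t ClockTime;     /* nanoseconds */

/* Simplified segment: 'start' is where the segment begins in buffer
 * timestamps, 'time' is the stream-time position it maps to, and
 * 'accum' is the running time elapsed in previous segments. */
typedef struct {
  ClockTime start;
  ClockTime time;
  ClockTime accum;
} Segment;

/* Position inside the media: what GstController keys on. */
static ClockTime
stream_time (const Segment *s, ClockTime ts)
{
  return ts - s->start + s->time;
}

/* Time elapsed while playing: what sinks synchronize against. */
static ClockTime
running_time (const Segment *s, ClockTime ts)
{
  return ts - s->start + s->accum;
}
```

Suppose 10s of a clip were played and a new segment then jumps back to
position 4s (e.g. after a non-flushing segment seek): the new segment
is roughly { start = 4s, time = 4s, accum = 10s }, so a buffer stamped
4s has a stream time of 4s again while its running time continues on
from 10s.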

>
> > > >   gst_object_sync_values (G_OBJECT (this), timestamp);
> > >
> > > This is the first time the controller gets involved? Somehow, the
> > > 'this' aka 'base' object has a back-pointer to the controller, and can
> > > ask it to compute the appropriate value given the passed-in timestamp?
> >
> >   Exactly. Because it's the (user using the) controller that decides
> > which properties are controlled, what key-values are used and which
> > interpolation to use. And therefore it knows for a given timestamp (1)
> > what values to set and (2) for what properties.
>
> Right.
>
> > > So let me check if I'm basically following this correctly. Say I
> > > wanted to use a different clock for the controller value lookup, but
> > > only for a specific plugin that I am writing. I could do it by
> > > replacing the line
> >
> >    You do NOT want to modify the code of a plugin to do what you are
> > asking. The modification you wish to do should be done as an extra
> > interpolation mode in GstController. The behaviour of GstController
> > relies on the elements using it to give it stream time positions.
> >   Or else you are failing to give some information about what you
> > really want to achieve. Maybe giving the bigger picture would help.
>
> I want dynamic volume control with a stream that I am seeking around
> within. Specifically, I want two very different types: (1) I want to
> overlay a volume "track" on top of a stream where the volume at any
> point is tied to the offset within that stream, and (2) I want a
> volume adjustment that happens relative to playback time. #1 is used
> when using a single source sound clip and making several variations,
> where each variation has its own settings for what part of the clip
> should be loud and what part should be soft. These variants will be
> played back with occasional nonlinearities where I seek to different
> points within the clip, and the same part of the clip for a given
> variant should always have the same volume. #2 is used for fading out
> the playback, probably because you're switching to something else. (It
> may not be a global fade across all clips, though; there may be other
> streams getting mixed in on their own timelines.)
>
> From what I can tell (or what you told me, rather), GstController does
> #1. If I do a seek, then the controller will adjust the property
> values based on the position within the stream (the timestamp in the
> quoted code). #2 must be done manually, or by using a custom
> interpolation function that ignores most of its input and looks up the
> current playback time instead.

  Exactly.
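  A rough self-contained sketch of those two modes (plain C with
hypothetical names; in practice #1 would be GstController keyframes,
and #2 manual g_object_set() calls or a custom interpolation mode):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

typedef uint64_t ClockTime;

typedef struct { ClockTime when; double volume; } Key;

/* #1: volume tied to the position in the clip (stream time),
 * linearly interpolated between keyframes -- roughly what
 * GST_INTERPOLATE_LINEAR does in GstController. Seeking back to
 * the same stream time always yields the same volume. */
static double
clip_volume (const Key *keys, size_t n, ClockTime stream_time)
{
  size_t i;

  if (stream_time <= keys[0].when)
    return keys[0].volume;
  for (i = 1; i < n; i++) {
    if (stream_time <= keys[i].when) {
      double f = (double) (stream_time - keys[i - 1].when) /
          (double) (keys[i].when - keys[i - 1].when);
      return keys[i - 1].volume + f * (keys[i].volume - keys[i - 1].volume);
    }
  }
  return keys[n - 1].volume;
}

/* #2: a fade-out tied to playback (running) time, unaffected by
 * seeks inside the clip. */
static double
fade_volume (ClockTime running_time, ClockTime fade_start, ClockTime fade_len)
{
  if (running_time <= fade_start)
    return 1.0;
  if (running_time >= fade_start + fade_len)
    return 0.0;
  return 1.0 - (double) (running_time - fade_start) / (double) fade_len;
}

/* The effective gain is simply the product of the two. */
static double
effective_volume (const Key *keys, size_t n,
    ClockTime stream_time, ClockTime running_time,
    ClockTime fade_start, ClockTime fade_len)
{
  return clip_volume (keys, n, stream_time) *
      fade_volume (running_time, fade_start, fade_len);
}
```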

>
> Or am I being dense?
>


-- 
Edward Hervey
Multimedia editing developer / Fluendo S.A.
http://www.pitivi.org/
