appsrc, sparse stream, GAP events

Emil Andonov emil at prolancer.com.au
Wed Nov 13 20:33:00 UTC 2019


Dear all,

 

I am trying to get an appsrc working for a sparse stream using GAP events
between buffers with GStreamer 1.16.1. Unfortunately I am hitting a brick
wall whatever I try, so any help would be greatly appreciated.

 

The pipeline consists of a standard audio file replay chain in combination
with a simple appsrc to appsink chain for a sparse stream of metadata
related to the audio.

The latter is very much like a captions stream. Both chains are in a single
pipeline. For example:

 

filesrc ... ! audio/x-mulaw,rate=8000,channels=1 ! mulawdec ! audioconvert !
audioresample ! autoaudiosink

appsrc ! appsink
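
In code the two chains end up in the one pipeline roughly like this (a
simplified sketch; the file name and element names are placeholders and
error handling is trimmed):

#include <gst/gst.h>

/* Build the audio replay chain and the metadata appsrc -> appsink chain
 * inside a single pipeline. */
static GstElement *
build_pipeline (void)
{
  GError *error = NULL;
  GstElement *pipeline = gst_parse_launch (
      "filesrc location=audio.ulaw ! audio/x-mulaw,rate=8000,channels=1 ! "
      "mulawdec ! audioconvert ! audioresample ! autoaudiosink "
      "appsrc name=metasrc ! appsink name=metasink",
      &error);

  if (error != NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return NULL;
  }
  return pipeline;
}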

 

The audio is 10sec long and plays fine (in all scenarios that I list below)
for 10 sec; however, I cannot get the metadata part (appsrc to appsink) to
work. The metadata consists of 2 packets of text, one at 500ms and one at
9500ms. The metadata is 200 to 300 bytes.

 

I am using the "need data" signal in the appsrc to either create a GAP event
or a Buffer with data. This is what I have tried and the results:-
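
The need-data wiring is roughly this (a simplified sketch; names and error
handling are mine, only the callback body changes between the scenarios):

#include <gst/gst.h>

/* Called whenever appsrc wants more data. Depending on the scenario the
 * body either pushes a metadata Buffer or a GAP event (see below). */
static void
on_need_data (GstElement *appsrc, guint length, gpointer user_data)
{
  /* scenario-specific code goes here */
}

/* appsrc is the metadata appsrc element from the pipeline above. */
static void
connect_need_data (GstElement *appsrc)
{
  g_signal_connect (appsrc, "need-data", G_CALLBACK (on_need_data), NULL);
}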

 

Scenario 1:

On the first "need data" signal in appsrc I create a GAP event with PTS=0
and duration of 500000000 (500ms as ns) and send the event on the src pad of
the appsrc. The idea is to tell the pipeline that there will not be any
Buffers for 500ms.
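
In code the GAP event is created and pushed roughly like this (sketch; the
function name is just for illustration):

#include <gst/gst.h>

/* Scenario 1: on the first need-data, announce that there will be no
 * metadata Buffers for the first 500 ms. */
static void
push_initial_gap (GstElement *appsrc)
{
  GstEvent *gap = gst_event_new_gap (0, 500 * GST_MSECOND);
  GstPad *srcpad = gst_element_get_static_pad (appsrc, "src");

  if (!gst_pad_push_event (srcpad, gap))
    g_printerr ("pushing the GAP event failed\n");

  gst_object_unref (srcpad);
}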

 

In this case the appsrc does not get another "need-data" signal, so the
code does not do anything else and the appsink does not get any Buffers
with data. I was expecting a "need-data" signal when the pipeline reaches a
PTS of 500ms.

 

Scenario 2:

On the first "need data" signal in appsrc I create a Buffer with data and
with PTS=500000000 (500ms as ns) and duration 0 and send it.
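
In code roughly (sketch; the function name and where the metadata text
comes from are just for illustration):

#include <string.h>
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Scenario 2: push the first metadata packet with PTS = 500 ms and a
 * duration of 0. */
static void
push_first_packet (GstAppSrc *appsrc, const gchar *metadata)
{
  gsize len = strlen (metadata);
  GstBuffer *buf = gst_buffer_new_allocate (NULL, len, NULL);

  gst_buffer_fill (buf, 0, metadata, len);
  GST_BUFFER_PTS (buf) = 500 * GST_MSECOND;
  GST_BUFFER_DURATION (buf) = 0;

  /* gst_app_src_push_buffer() takes ownership of the buffer */
  if (gst_app_src_push_buffer (appsrc, buf) != GST_FLOW_OK)
    g_printerr ("pushing the metadata buffer failed\n");
}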

 

This buffer is delivered to the appsink at about the right time (some 500ms
after the audio starts playing).

 

On the second "need data" signal (which arrives around after 500ms) in
appsrc I create a GAP event with PTS=500000000 (500ms as ns) and duration of
9000000000 (9000ms as ns) and send the event on the src pad of the appsrc.
The idea is to tell the pipeline that there will not be any Buffers for
9000ms.

 

From this point on the appsrc does not get a "need-data" signal any more,
so the 2nd metadata buffer is never created and sent.

 

Scenario 3:

On the first "need data" signal in appsrc I create a Buffer with data with
PTS=500000000 (500ms as ns) and duration 1000000 (1ms as ns) and send it.
Unlike scenario 2, note that this buffer has duration of 1ms instead of 0ms.

 

This buffer is delivered to the appsink at about the right time (some 500ms
after the audio starts playing).

 

On the second "need data" signal (which arrives around after 500ms) in
appsrc I create a GAP event with PTS=501000000 (501ms as ns) and duration of
8999000000 (8999ms as ns) and send the event on the src pad of the appsrc.
The idea is to tell the pipeline that there will not be any Buffers for
8999ms (9000 - 1).

 

From this point on the appsrc does not get a "need-data" signal any more,
so the 2nd metadata buffer is never created and sent.

 

 

The appsrc configuration is as follows (default values used for the
properties that are not listed):

stream-type = 1 (seekable)

format = 3 (time)

is-live = false

duration = (10sec as ns)

emit-signals = true
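
In code that is roughly (the function name is just for illustration):

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Configure the appsrc as listed above; everything else is left at the
 * defaults. */
static void
configure_appsrc (GstElement *appsrc)
{
  g_object_set (appsrc,
      "stream-type", GST_APP_STREAM_TYPE_SEEKABLE,  /* 1 */
      "format", GST_FORMAT_TIME,                    /* 3 */
      "is-live", FALSE,
      "duration", (guint64) (10 * GST_SECOND),
      "emit-signals", TRUE,
      NULL);
}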

 

 

It appears that appsrc stops getting "need-data" signals once I send the
first GAP event!?

 

Any help with the issues that I am facing would be greatly appreciated, as
would examples of how to use GAP events correctly. I have already looked at
suggestions posted to the list to use GAP events with 1sec duration
(sketched below), however none of them works for me.
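
For completeness, the 1sec GAP variant from those suggestions looks roughly
like this in my code (sketch; next_gap_pts is just a running position that
my code keeps between need-data calls):

#include <gst/gst.h>

/* Send a 1 second GAP event on every need-data until the next metadata
 * packet is due, instead of one long GAP. */
static void
push_one_second_gap (GstElement *appsrc, GstClockTime *next_gap_pts)
{
  GstEvent *gap = gst_event_new_gap (*next_gap_pts, GST_SECOND);
  GstPad *srcpad = gst_element_get_static_pad (appsrc, "src");

  gst_pad_push_event (srcpad, gap);
  gst_object_unref (srcpad);

  *next_gap_pts += GST_SECOND;
}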

 

 

Thank you

Emil

--

Emil Andonov

Prolancer Pty Ltd
