decode byte-stream video without gmainloop

Karl Lattimer karl at
Mon Feb 3 07:24:47 UTC 2020

>> and push that data into a gstreamer pipeline. As far as I understand
>> the raop_ntp_t struct contains information regarding time
>> synchronisation and pts provides an accurate timestamp. In my tests
>> with gstreamer so far I’ve taken the buffer, dumped it to disk as an
>> MP4 file, then used a simple playbin pipeline to open the file and
>> play it. 
> Have you looked at the appsrc element ? Note that GStreamer processing
> is asynchronous, so you may have to copy the pointer, or make sure that
> your wrapped buffer is consumed before returning.
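To illustrate the point above about asynchronous processing: one safe pattern is to deep-copy the incoming frame into a GstBuffer that the pipeline owns, so the caller's pointer can be released as soon as the push returns. A minimal sketch, assuming `appsrc` is the appsrc element of an already-running pipeline (the function name is made up for illustration):

```c
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Sketch: copy an externally owned frame into a pipeline-owned
 * GstBuffer before pushing it, so the caller may free `data`
 * as soon as this function returns. */
static GstFlowReturn
push_frame_copy (GstAppSrc *appsrc, const guint8 *data, gsize size,
                 GstClockTime pts)
{
  GstBuffer *buf = gst_buffer_new_allocate (NULL, size, NULL);
  gst_buffer_fill (buf, 0, data, size);   /* deep copy */
  GST_BUFFER_PTS (buf) = pts;

  /* gst_app_src_push_buffer() takes ownership of buf */
  return gst_app_src_push_buffer (appsrc, buf);
}
```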

It’s my understanding that for that to work I’d need a GMainLoop to dispatch the callbacks for the need-data and enough-data signals, and to emit the push-buffer signal, which seems to be where the data is inserted into the pipeline. 

Is this correct? I don’t think I can shove a GMainLoop into RPiPlay; it’s pretty much driven by the data stream and has its own main loop for that purpose. 
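For what it’s worth, appsrc does not strictly require a GMainLoop: in push mode you can call gst_app_src_push_buffer() directly from your own thread (e.g. RPiPlay’s data callback), and the need-data/enough-data signals are only needed for pull-style flow control. A minimal sketch; the caps string and downstream elements are assumptions for an H.264 byte-stream and will need adjusting for the actual payload:

```c
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Sketch: build a pipeline around appsrc without running a GMainLoop.
 * The caps and downstream elements are assumptions for an H.264
 * byte-stream; adjust for the real RPiPlay payload. */
int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  GError *err = NULL;
  GstElement *pipeline = gst_parse_launch (
      "appsrc name=src is-live=true format=time "
      "caps=video/x-h264,stream-format=byte-stream "
      "! h264parse ! avdec_h264 ! videoconvert ! autovideosink",
      &err);
  if (!pipeline) {
    g_printerr ("parse error: %s\n", err->message);
    return 1;
  }

  GstElement *src = gst_bin_get_by_name (GST_BIN (pipeline), "src");
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* From here, whatever thread receives frames can call
   *   gst_app_src_push_buffer (GST_APP_SRC (src), buf);
   * no GMainLoop iteration is required for the data path. */

  /* ... run until the stream ends, then: */
  gst_app_src_end_of_stream (GST_APP_SRC (src));
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (src);
  gst_object_unref (pipeline);
  return 0;
}
```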

>> I’d also like to know if it’s possible to get a playbin to dump the
>> pipeline that it’s using out somehow, there’s quite a lot of
>> information in the verbose output, but I can’t seem to spot a
>> pipeline which I could use on the command line in place of playbin, I
>> think that would help me in the final goal. 
> When you are using gst-launch-1.0 you can dump the pipeline into DOT
> files using the env GST_DEBUG_DUMP_DOT_DIR=
> To get the same thing in your application, you have to call:

Perfect, thanks! 
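For later readers, the application-side call presumably meant above is the GST_DEBUG_BIN_TO_DOT_FILE macro; a sketch, assuming GST_DEBUG_DUMP_DOT_DIR is set in the environment:

```c
#include <gst/gst.h>

/* Sketch: dump the current topology of `pipeline` as a .dot file.
 * The file lands in $GST_DEBUG_DUMP_DOT_DIR (the macro is a no-op
 * if that variable is unset); render it with e.g. `dot -Tpng`. */
static void
dump_pipeline (GstElement *pipeline)
{
  GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (pipeline),
                             GST_DEBUG_GRAPH_SHOW_ALL,
                             "rpiplay-pipeline");
}
```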


More information about the gstreamer-devel mailing list