decode byte-stream video without GMainLoop

Karl Lattimer karl at qdh.org.uk
Sun Feb 2 22:10:58 UTC 2020


I’m trying to integrate GStreamer with RPiPlay (https://github.com/FD-/RPiPlay) for generic Linux support (https://github.com/FD-/RPiPlay/issues/24). Looking at the GStreamer documentation, I’m finding it difficult to see how to write a buffer of received bytes into a pipeline.

Specifically, I need to take the method call

void video_renderer_render_buffer(video_renderer_t *renderer, raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts, int type)

and push that data into a GStreamer pipeline. As far as I understand, the raop_ntp_t struct contains information about time synchronisation, and pts provides an accurate timestamp. In my tests with GStreamer so far, I’ve taken the buffer, dumped it to disk as an MP4 file, and then used a simple playbin pipeline to open the file and play it.
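
To make the question concrete, this is roughly the receiving side I have in mind, assuming the incoming data is an H.264 byte stream and that appsrc is the right entry point (both of which may be wrong):

/* Rough, untested sketch: build a decode pipeline fed by an appsrc.
 * Assumes an H.264 byte-stream and that gst-libav (avdec_h264) is
 * available; the element and variable names are my own choices. */
#include <gst/gst.h>

static GstElement *pipeline;
static GstElement *appsrc;

static void video_renderer_init_sketch(void)
{
    gst_init(NULL, NULL);

    pipeline = gst_parse_launch(
        "appsrc name=video_src is-live=true format=time "
        "caps=video/x-h264,stream-format=byte-stream,alignment=au "
        "! h264parse ! avdec_h264 ! videoconvert ! autovideosink",
        NULL);

    /* Keep a handle on the appsrc so buffers can be pushed into it later. */
    appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "video_src");

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
}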

I suppose what I’m looking for here are pointers to the appropriate documentation for pushing a block of data into a decode pipeline, preferably without going as far as writing a GStreamer plugin, as that wouldn’t sit well with the rest of the project.
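
The part I can’t find documented is the actual push. I imagine it looking something like this hypothetical helper, which video_renderer_render_buffer would call with the received bytes (the names and especially the timestamp handling are guesses on my part):

/* Hypothetical helper: wrap the received bytes in a GstBuffer and hand
 * it to the appsrc. How the RAOP pts maps onto GStreamer time is the
 * part I'm least sure about. */
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

static void push_encoded_frame(GstAppSrc *src, const unsigned char *data,
                               int data_len, uint64_t pts_ns)
{
    GstBuffer *buffer = gst_buffer_new_allocate(NULL, data_len, NULL);

    /* Copy the incoming bytes into the buffer. */
    gst_buffer_fill(buffer, 0, data, data_len);

    /* Timestamp in nanoseconds; deriving this from pts/raop_ntp_t is an
     * open question for me. */
    GST_BUFFER_PTS(buffer) = pts_ns;

    /* appsrc takes ownership of the buffer. */
    gst_app_src_push_buffer(src, buffer);
}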

I’d also like to know whether it’s possible to get playbin to dump the pipeline it builds. There’s quite a lot of information in the verbose output, but I can’t seem to spot a pipeline description that I could use on the command line in place of playbin; I think that would help with the final goal.
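
For completeness, the only mechanism I’ve found so far is the GraphViz .dot output, i.e. setting GST_DEBUG_DUMP_DOT_DIR and calling the GST_DEBUG_BIN_TO_DOT_FILE macro, roughly like this (untested, and I don’t know whether it gives me something usable as a gst-launch line):

/* Speculative: write a GraphViz .dot snapshot of a running pipeline.
 * GST_DEBUG_DUMP_DOT_DIR must point at a writable directory, e.g.
 *   GST_DEBUG_DUMP_DOT_DIR=/tmp ./myapp
 * and the result can then be rendered with:
 *   dot -Tpng /tmp/playbin-dump.dot -o playbin-dump.png */
#include <gst/gst.h>

static void dump_pipeline(GstElement *playbin)
{
    GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(playbin), GST_DEBUG_GRAPH_SHOW_ALL,
                              "playbin-dump");
}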

Advice, guidance, links, examples appreciated. 

—

Regards,
 K
