decode byte-stream video without gmainloop
Karl Lattimer
karl at qdh.org.uk
Mon Feb 3 10:59:31 UTC 2020
I seem to be getting *somewhere*, but I’m having an issue with creating a new GstBuffer.
This seems strange; I’ve tried both
buffer = gst_buffer_new_and_alloc(data_len);
and
buffer = gst_buffer_new ();
memory = gst_allocator_alloc (NULL, data_len, NULL);
gst_buffer_insert_memory (buffer, -1, memory);
Both methods cause a segfault… I’m not sure why; I’m only requesting a memory allocation, so that should be fine, right?
The backtrace isn’t entirely revealing:
#0 0x00007ffff7e6c07d in gst_allocator_alloc () at /usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0
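The only ordering issue I can think of is gst_init() not having run yet on this code path, since gst_allocator_alloc() with a NULL allocator falls back to the default allocator that gst_init() sets up. A minimal, self-contained sketch of what I believe the safe ordering looks like (the data_len value here is just a placeholder):

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  /* Without this the default allocator is never registered, and
   * gst_allocator_alloc (NULL, ...) can crash. */
  gst_init (&argc, &argv);

  gsize data_len = 4096;  /* placeholder size */

  /* Variant 1: allocate the buffer and its memory in one call */
  GstBuffer *buffer = gst_buffer_new_and_alloc (data_len);
  gst_buffer_unref (buffer);

  /* Variant 2: allocate memory separately and append it
   * (the buffer takes ownership of the GstMemory) */
  buffer = gst_buffer_new ();
  GstMemory *memory = gst_allocator_alloc (NULL, data_len, NULL);
  gst_buffer_insert_memory (buffer, -1, memory);
  gst_buffer_unref (buffer);

  return 0;
}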
Regards,
K
> On 3 Feb 2020, at 08:00, Matthew Waters <ystreet00 at gmail.com> wrote:
>
> On 3/2/20 6:24 pm, Karl Lattimer wrote:
>>
>>>> and push that data into a gstreamer pipeline. As far as I understand
>>>> the raop_ntp_t struct contains information regarding time
>>>> synchronisation and pts provides an accurate timestamp. In my tests
>>>> with gstreamer so far I’ve taken the buffer, dumped it to disk as an
>>>> MP4 file, then used a simple playbin pipeline to open the file and
>>>> play it.
>>>
>>> Have you looked at the appsrc element? Note that GStreamer processing
>>> is asynchronous, so you may have to copy the pointer, or make sure that
>>> your wrapped buffer is consumed before returning.
>>>
>>> https://gstreamer.freedesktop.org/documentation/tutorials/basic/short-cutting-the-pipeline.html?gi-language=c
>>
>> It’s my understanding that for that to work I’d need a GMainLoop in order to receive the callbacks for the need-data and enough-data signals, as well as to emit the push-data signal, which seems to be where the data is inserted into the pipeline.
>>
>> Is this correct? I don’t think I can shove a GMainLoop into RPiPlay; it’s pretty much dependent on the data stream and has its own mainloop for that purpose.
>
> You don't need a mainloop running to emit or retrieve those signals on appsrc. Nothing in GStreamer requires there to be any kind of mainloop running, except for platform or plugin specifics, which are few and far between (e.g. macOS video output).
>
> Cheers
> -Matt
>
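Good to know, thanks. For the record, here is the kind of thing I'm going to try without any GMainLoop — a rough, untested sketch (the helper name, and the assumption that the appsrc already sits in a PLAYING pipeline, are mine; it needs gstreamer-app-1.0):

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Push one chunk of encoded data into an appsrc; no GMainLoop needed.
 * 'appsrc' is assumed to belong to a pipeline that is already PLAYING;
 * 'data', 'data_len' and 'pts' come from the caller. */
static GstFlowReturn
push_chunk (GstAppSrc *appsrc, const guint8 *data, gsize data_len,
            GstClockTime pts)
{
  GstBuffer *buffer = gst_buffer_new_allocate (NULL, data_len, NULL);

  /* Copy the incoming bytes, since the caller may reuse its buffer
   * after we return and GStreamer consumes ours asynchronously. */
  gst_buffer_fill (buffer, 0, data, data_len);
  GST_BUFFER_PTS (buffer) = pts;

  /* Takes ownership of the buffer; returns GST_FLOW_OK on success. */
  return gst_app_src_push_buffer (appsrc, buffer);
}

As far as I understand it, the "push-buffer" action signal works from any thread as well, though the signal does not take ownership of the buffer, so it needs an extra unref.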
>>>
>>>>
>>>> I’d also like to know if it’s possible to get a playbin to dump the
>>>> pipeline that it’s using somehow. There’s quite a lot of information
>>>> in the verbose output, but I can’t seem to spot a pipeline which I
>>>> could use on the command line in place of playbin; I think that
>>>> would help me reach the final goal.
>>>
>>> When you are using gst-launch-1.0 you can dump the pipeline into DOT
>>> files by setting the environment variable GST_DEBUG_DUMP_DOT_DIR=
>>>
>>> To get the same thing in your application, you have to call:
>>>
>>> GST_DEBUG_BIN_TO_DOT_FILE(bin, GST_DEBUG_GRAPH_SHOW_ALL, file_name);
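One detail worth adding: as far as I can tell, GST_DEBUG_BIN_TO_DOT_FILE() only writes anything when GST_DEBUG_DUMP_DOT_DIR is set in the process environment. A rough usage sketch (the function name, the "pipeline" base name and the /tmp directory are just examples):

#include <gst/gst.h>

/* Call this once the pipeline is built (ideally while PLAYING).
 * Run the app with the dump directory set, e.g.:
 *   GST_DEBUG_DUMP_DOT_DIR=/tmp ./myapp
 * then render the result with:
 *   dot -Tpng /tmp/pipeline.dot > pipeline.png */
static void
dump_pipeline (GstElement *pipeline)
{
  GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (pipeline),
      GST_DEBUG_GRAPH_SHOW_ALL, "pipeline");
}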
>>
>> Perfect, thanks!
>>
>> K,
>>
>>