I want to design a source element which can share video frames.

Psychesnet Hsieh psychesnet at gmail.com
Sun Jul 19 06:01:22 UTC 2020


Hi All,

Thanks. It looks like I need to implement a source element similar to shmsink/shmsrc.

@Nicolas,

Yes, the services are separate processes.

Peter Maersk-Moller <pmaersk at gmail.com> wrote on Sat, Jul 18, 2020 at 11:57 PM:

> Hi
>
> You can do something like this:
>
>      gst-launch-1.0 -v videotestsrc ! queue ! x264enc ! queue ! shmsink
> socket-path=/tmp/video-ctr wait-for-connection=0
>
> Here, "videotestsrc ! queue ! x264enc" stands in for your capture
> device, and shmsink stands in for your mmap.
>
> Then the clients would be something like this:
>
>     gst-launch-1.0 -v shmsrc socket-path=/tmp/video-ctr ! identity silent=0 !
> h264parse ! queue ! decodebin ! queue ! videoconvert ! autovideosink
>
> The client is just an example. You need to consider shm size, security,
> optimization and a lot more.
>
> Best regards
> Peter Maersk-Moller
>
> On Sat, Jul 18, 2020 at 2:55 PM Psychesnet Hsieh <psychesnet at gmail.com>
> wrote:
>
>> Hi Joshi,
>>
>> Thanks. First of all, let me describe my setup in more detail so we can
>> fully address what I need.
>>
>> CAPTURE -> mmap -> service 1
>>                |-> service 2
>>                |-> service 3
>> The layout above is our current design. CAPTURE continuously captures
>> H264 video frames from the DSP (via a special ioctl, not V4L2).
>> Many services can access the H264 video frames at the same time;
>> the interface in the middle is mmap. This is working now.
>>
>> Now, we would like to design a source element that satisfies the
>> following requirements:
>> 1. All services can still access the H264 video frames at the same time.
>> 2. GStreamer can work with this source element.
>> 3. For our platform, we still need CAPTURE, which grabs the H264
>> video frames via ioctl.
>>
>> @Joshi,
>>
>> According to your reply, I think I still need to keep my mmap interface
>> and create a source element that reads from the mmap and delivers the
>> data through the whole pipeline, right?
>>
>> Thanks. Have a good day. ^^
>>
Mandar Joshi <emailmandar at gmail.com> wrote on Fri, Jul 17, 2020:
>>
>>> Hi,
>>> I've done something similar recently. Could you please go through this
>>> discussion
>>> https://lists.freedesktop.org/archives/gstreamer-devel/2020-July/075481.html
>>> Since this is recent, I should be able to answer any queries you have about
>>> the approach.
>>>
>>> For your first question, my suggestion is to create a source element
>>> based on GstPushSrc (like neonhttpsrc). In the _create function, map the
>>> buffer and write what you need into it. Make sure you set the element's
>>> SRC caps correctly, so that the right element to decode/view the video
>>> data can be linked to the source element.
>>>
>>> For your second question, I don't think I completely understand it. But
>>> maybe the answer to the first question will allow you to come up with the
>>> answer to the second one.
>>>
>>> Feel free to ask any questions.
>>>
>>> Regards
>>> Mandar Joshi
>>> Czar Softech
>>> https://www.czarsoftech.com
>>>
>> _______________________________________________
>> gstreamer-devel mailing list
>> gstreamer-devel at lists.freedesktop.org
>> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
>>
>