camtwist -> obs + pipeviz
Alain Culos
gstreamer at asoundmove.net
Mon Apr 13 22:20:24 UTC 2020
On Mon, 13 Apr 2020, at 20:31, Daniel Johnson wrote:
> On Thu, Apr 9, 2020, 2:00 PM Alain Culos <gstreamer at asoundmove.net> wrote:
>>
>> I meant to say that I found OBS -https://obsproject.com/ - which might answer your requirements or not.
>> It more or less does what I want - with an add-on, https://github.com/CatxFish/obs-v4l2sink.git - but I would much prefer to be able to do this with gstreamer, starting with gst-launch for proof of concept before I move on to more involved programming whether C or python.
>
> I mostly prototyped out the pipelines with a live streaming command line app. I did the compositing in OpenGL and used the Intel hardware video encoder. I tried to use hardware acceleration wherever possible, which had the negative side effect of being broken on anything other than Intel graphics.
>
> So glupload ! glvideomixer
>
> vaapiencode
>
> Etc
>
> When someone mentioned queues that's important because without them stuff will generally be running on the same cpu core so your 6 cores won't help you. Queues allow things on either side of the queue to be on different threads. If the queues add too much latency from buffers that's tuneable.
>
> I'm currently starting over with a cross platform UI, but maybe you can fork the code to get what you need.
>
> https://github.com/teknotus/bitcorder
>
> What's still broken in your pipeline?
bitcorder sounds really interesting, but I am not ready to delve into this kind of coding at this stage. One question though: does your code support outputting to a video device (/dev/videoXX)?
I'll also look at the gl elements, but presumably glupload needs to be added to both streams upstream of the mixer, right?
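For what it's worth, my reading of the quoted suggestion (untested guesswork on my side, and it needs a working GL display) is indeed one glupload per branch feeding a glvideomixer, roughly:

```shell
# Hypothetical sketch only: both branches are uploaded to GL textures and
# composited on the GPU; glimagesink displays without downloading back to
# system memory. A VAAPI encoder would only come in when recording/streaming.
gst-launch-1.0 \
  glvideomixer name=mix ! glimagesink \
  videotestsrc ! glupload ! glcolorconvert ! mix. \
  videotestsrc pattern=ball ! glupload ! glcolorconvert ! mix.
```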
What is vaapiencode for, can you elaborate?
My pipeline now works without error messages:
w=1280; h=800; x0=450; y0=200
gst-launch-1.0 -v \
v4l2src device=/dev/video0 \
! videocrop left=$x0 right=$(( 1920 - $w - $x0 )) \
    top=$y0 bottom=$(( 1080 - $h - $y0 )) \
! alpha method=green angle=70 \
! queue \
! mixer.sink_1 \
\
ximagesrc xid=0x06600003 use-damage=false \
! videoscale \
! video/x-raw,width=$w,height=$h \
! queue \
! mixer.sink_0 \
\
videomixer name=mixer sink_0::zorder=0 sink_1::zorder=1 \
! videoconvert \
! autovideosink sync=false
The latency is significant and the displayed framerate is poor (maybe 5 to 10 fps), but it works.
Note that my desktop is barely loaded (overall CPU average well below 10%; the busiest of my 12 cores peaks at 10-20%).
1/ It would be great if someone could chime in with ideas to improve the real-time performance of this pipeline - ideally reaching a framerate of at least 15 or 20 fps.
2/ My next step is to use the "alpha" element properly.
With the pipeline above my green screen gets replaced with a shaded background - i.e. the alpha ends up neither 0 nor 1, but something in between.
How do I increase the alpha channel values?
3/ My following step is to be able to output to /dev/video11 (v4l2loopback). It currently blocks because of the ximagesrc, but I do not understand why.
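On point 3, the usual shape (assuming v4l2sink from gst-plugins-good and a v4l2loopback node at /dev/video11 - untested here, so purely a sketch) replaces the display sink with v4l2sink:

```shell
# An explicit videoconvert plus a caps filter helps the loopback device
# negotiate a format it accepts (YUY2 is a common choice, but an assumption).
gst-launch-1.0 videotestsrc \
  ! videoconvert \
  ! video/x-raw,format=YUY2,width=1280,height=800 \
  ! v4l2sink device=/dev/video11
```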