need help to understand the pipeline

i.nixman at autistici.org
Mon Jul 19 10:56:49 UTC 2021


Hello Peter Maersk-Moller,


> Page 3: support for output formats: RAW RGB, RGB565/555/444, CCIR656,
> YUV422/420, YCbCr422, and compression
> Page 4: output formats: 8-/10-bit RGB RAW output

oh, my bad, I'm sorry %)

can you tell me please: if I specify `format=YUV422` in the caps for 
v4l2src (gst-launch-1.0 v4l2src device=/dev/video10 ! 
video/x-raw,format=YUV422 ! xvimagesink), how does the camera understand 
that it should switch to YUV422?
can the v4l2src plugin tell the v4l2 subsystem which mode to switch the 
camera to, or do I need to switch modes using a separate command?
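
for context, here is a minimal C++ sketch of what I assume v4l2src does 
internally after caps negotiation: it maps the negotiated caps to a V4L2 
fourcc and sets it on the device with the VIDIOC_S_FMT ioctl. The device 
path, resolution and the packed-4:2:2 (YUYV) choice are only placeholder 
assumptions. I also suspect the caps above need a real GStreamer format 
name such as YUY2, UYVY or Y42B rather than the literal string YUV422:

#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>
#include <linux/videodev2.h>

int main() {
    /* open the capture device (placeholder path) */
    int fd = open("/dev/video10", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    v4l2_format fmt;
    std::memset(&fmt, 0, sizeof(fmt));
    fmt.type                = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width       = 1280;
    fmt.fmt.pix.height      = 800;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;  /* packed 4:2:2, GStreamer "YUY2" */
    fmt.fmt.pix.field       = V4L2_FIELD_NONE;

    /* ask the driver to switch to this format; it may adjust the values */
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); close(fd); return 1; }

    std::printf("driver accepted %ux%u, fourcc %.4s\n",
                fmt.fmt.pix.width, fmt.fmt.pix.height,
                reinterpret_cast<char *>(&fmt.fmt.pix.pixelformat));
    close(fd);
    return 0;
}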


>> is there a way how i can save only one YUV frame without any 
>> processing?
>> I just want to make sure.
>> 
> 
> One way is listed below. Other ways include using multifilesink of 
> GStreamer etc.
> 
> gst-launch-1.0 -e -v \
>      v4l2src device=/dev/video0 ! \
>      video/x-raw,format=Y42B,width=1280,height=800,framerate=10/1 ! \
>      fdsink fd=3 3>&1 1>&2 | dd of=MyFile.yuv bs=16384000 count=1
> 

I got it.
then tell me please: how can I be sure that the frame I received is 
actually a frame in the format I specified?
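
one check I plan to do is to compare the size of the dump against the 
expected frame size for the format; for Y42B (planar 4:2:2) that is a 
full-size Y plane plus two half-width chroma planes. A tiny sketch, 
reusing the file name and geometry from the pipeline above:

#include <sys/stat.h>
#include <cstdio>

int main() {
    const long width = 1280, height = 800;
    /* Y42B = planar 4:2:2: full-size Y plane + two half-width chroma planes */
    const long expected = width * height + 2 * (width / 2) * height;

    struct stat st;
    if (stat("MyFile.yuv", &st) != 0) { std::perror("stat"); return 1; }

    std::printf("file size: %ld bytes, expected per Y42B frame: %ld bytes "
                "(about %ld frame(s) in the file)\n",
                static_cast<long>(st.st_size), expected,
                static_cast<long>(st.st_size) / expected);
    return 0;
}

and then I would feed the file back through something like 
filesrc ! rawvideoparse ! videoconvert ! xvimagesink to see whether the 
picture looks sane.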


> Are you trying to encode two separate streams or are you wanting to mix 
> two streams into one stream and encode that?

right now my question is only about understanding why the SOM is able to 
encode 1920x1080 video in real time, while encoding 320x240 video (with 
my own libx264-based code, see below) is not possible in real time.
and is the hardware encoder really only available for cameras supported 
by the v4l2 subsystem?


> I don't know your system. Assuming it is Linux, how much of the kernel 
> can you compile/modify?

I wrote about this above. Yes, this is Linux: Debian with a 4.14 kernel.
the kernel sources are available to me and I can configure, rebuild 
and flash the kernel onto the SOM.

> I don't understand your challenges as I don't know your limitations. If 
> you can read images from a camera into a Linux computer, you can always 
> loop it using a V4L2 loopback interface, making it into a V4L2 device. 
> But usually a normal camera using USB is available through V4L2 
> interfaces natively? Linux V4L2 has a loopback interface where you can 
> loop video back in.

yes, the Arducam works through v4l2 natively, but the thermal one does not.
for the last few days I have been trying to figure out how to create an 
additional /dev/videoN device and how to write the thermal camera's 
images into it from C++ code...
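
the route I am experimenting with is the out-of-tree v4l2loopback module: 
after loading it you get an extra /dev/videoN, you set the output format 
on it with VIDIOC_S_FMT and then simply write() raw frames into it, and 
readers such as v4l2src can pick them up from the same node. A rough C++ 
sketch, where the device path, the 160x120 resolution and the 32-bit RGB 
format are only assumptions about the thermal camera:

#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>
#include <vector>
#include <linux/videodev2.h>

int main() {
    const unsigned width = 160, height = 120;        /* assumed thermal resolution */

    int fd = open("/dev/video20", O_RDWR);           /* assumed v4l2loopback node */
    if (fd < 0) { perror("open"); return 1; }

    /* describe what we are going to write into the loopback device */
    v4l2_format fmt;
    std::memset(&fmt, 0, sizeof(fmt));
    fmt.type                 = V4L2_BUF_TYPE_VIDEO_OUTPUT;
    fmt.fmt.pix.width        = width;
    fmt.fmt.pix.height       = height;
    fmt.fmt.pix.pixelformat  = V4L2_PIX_FMT_RGB32;   /* 32-bit xRGB/ARGB */
    fmt.fmt.pix.field        = V4L2_FIELD_NONE;
    fmt.fmt.pix.bytesperline = width * 4;
    fmt.fmt.pix.sizeimage    = width * height * 4;
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); close(fd); return 1; }

    std::vector<unsigned char> frame(width * height * 4, 0);
    for (;;) {
        /* ...copy the latest thermal image into `frame` here... */
        if (write(fd, frame.data(), frame.size()) < 0) { perror("write"); break; }
        usleep(100 * 1000);                          /* crude ~10 fps pacing */
    }
    close(fd);
    return 0;
}

if that works, a pipeline like v4l2src device=/dev/video20 ! videoconvert 
! v4l2h264enc ! ... should hopefully be able to consume it.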

> Well if you have documentation and a working compiler, you can do 
> anything your brain can think of. That said, and assuming your initial 
> pipeline works - I don't think you answered if it does - then I am 
> guessing you get access to the system's hardware encoder using the 
> v4l2h264enc module of GStreamer.

yes, when using the Arducam everything works as it should, but I'm not 
sure the Arducam produces raw frames; I asked a question about this above.
and I still can't figure out how I can be sure that the hardware encoder 
is actually present in the SOM.
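
what I am going to try is to walk over the /dev/video* nodes and query 
each one: as far as I understand, a hardware codec shows up as a separate 
memory-to-memory device whose capture side advertises a compressed format 
such as V4L2_PIX_FMT_H264 (v4l2-ctl -d /dev/videoN --list-formats shows 
the same information from the command line). A C++ sketch, where the 
range of device numbers is an assumption:

#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>
#include <linux/videodev2.h>

int main() {
    for (int i = 0; i < 32; ++i) {                   /* assumed device range */
        char path[32];
        std::snprintf(path, sizeof(path), "/dev/video%d", i);
        int fd = open(path, O_RDWR);
        if (fd < 0) continue;

        v4l2_capability cap;
        std::memset(&cap, 0, sizeof(cap));
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == 0) {
            /* an encoder advertises H.264 on its capture (compressed) side;
             * check both the single-planar and multi-planar variants */
            const v4l2_buf_type types[] = { V4L2_BUF_TYPE_VIDEO_CAPTURE,
                                            V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE };
            for (v4l2_buf_type t : types) {
                v4l2_fmtdesc desc;
                std::memset(&desc, 0, sizeof(desc));
                desc.type = t;
                for (desc.index = 0; ioctl(fd, VIDIOC_ENUM_FMT, &desc) == 0; ++desc.index) {
                    if (desc.pixelformat == V4L2_PIX_FMT_H264)
                        std::printf("%s (driver %s): H.264 output -> looks like the HW encoder\n",
                                    path, reinterpret_cast<const char *>(cap.driver));
                }
            }
        }
        close(fd);
    }
    return 0;
}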

I found the source of the v4l2h264enc plugin: 
https://github.com/GStreamer/gst-plugins-good/tree/master/sys/v4l2

but I don't see any direct interaction with the hardware encoder here; 
it looks like all of it is hidden behind the GStreamer abstractions 
(presumably everything ends up as plain V4L2 ioctls on the encoder's 
device node)...


> If that is true, then what they are telling you is that to use the 
> hardware encoder, you must use the V4L2 subsystem. And you are. You are 
> using the GStreamer V4L2 encoder module called v4l2h264enc. It does not 
> mean that you have to use a camera that uses V4L2, although in many 
> cases you will. Do you understand the difference? You apparently have 
> to use the V4L2 subsystem to access the hardware encoder and send raw 
> video data to it and retrieve encoded video from it.

yes, I understand that. and in parallel I am trying to understand how to 
implement sending ARGB frames from the thermal camera to /dev/videoN.
but first I would like to make sure that the hardware encoder does exist, 
and then I would like to be able to interact with it using its API.


> What are you trying that works (pipeline examples) and what doesn't 
> work (pipeline examples)?

at the moment only the Arducam is supported by the v4l2 subsystem, and I 
have no problem with this cam.
the only reason I mention this camera at all is to try to understand:

a) why the SOM, using the v4l2h264enc plugin, can encode 1080p in real 
time, while my implementation using libx264 cannot encode even 320x240 
frames in real time.

b) I want to make sure that the hardware encoder is actually present in 
the SOM, and to understand how I can use it directly through its API 
(see the sketch below).
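
if the encoder is a standard V4L2 stateful (memory-to-memory) device, 
then as far as I understand the "API" is just V4L2: raw frames go in on 
the OUTPUT queue and encoded H.264 comes back on the CAPTURE queue. A 
compressed C++ sketch of the setup only, where the device path, the 
resolution and the MPLANE variant are assumptions, and the buffer and 
streaming loop is only outlined in comments:

#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>
#include <linux/videodev2.h>

int main() {
    int fd = open("/dev/video11", O_RDWR);           /* assumed encoder node */
    if (fd < 0) { perror("open"); return 1; }

    /* OUTPUT queue: the raw frames we feed into the encoder */
    v4l2_format out;
    std::memset(&out, 0, sizeof(out));
    out.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE;
    out.fmt.pix_mp.width       = 320;
    out.fmt.pix_mp.height      = 240;
    out.fmt.pix_mp.pixelformat = V4L2_PIX_FMT_YUV420;
    out.fmt.pix_mp.num_planes  = 1;
    if (ioctl(fd, VIDIOC_S_FMT, &out) < 0) { perror("S_FMT OUTPUT"); close(fd); return 1; }

    /* CAPTURE queue: the encoded H.264 bitstream we read back */
    v4l2_format capt;
    std::memset(&capt, 0, sizeof(capt));
    capt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
    capt.fmt.pix_mp.pixelformat = V4L2_PIX_FMT_H264;
    capt.fmt.pix_mp.num_planes  = 1;
    if (ioctl(fd, VIDIOC_S_FMT, &capt) < 0) { perror("S_FMT CAPTURE"); close(fd); return 1; }

    /* remaining steps, omitted here:
     *   VIDIOC_REQBUFS + mmap on both queues,
     *   VIDIOC_STREAMON on both queues,
     *   then loop: QBUF a raw frame on OUTPUT, DQBUF an encoded buffer from CAPTURE. */
    close(fd);
    return 0;
}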

