need help to understand the pipeline
Peter Maersk-Moller
pmaersk at gmail.com
Mon Jul 19 12:30:16 UTC 2021
See replies inline.
On Mon, Jul 19, 2021 at 12:57 PM niXman via gstreamer-devel <
gstreamer-devel at lists.freedesktop.org> wrote:
> can you tell me please if I specify `format=YUV422` (gst-launch-1.0
> v4l2src device=/dev/video10 ! video/x-raw,format=YUV422 ! xvimagesink)
> in caps for v4l2src, how does the camera understand that it should switch
> to YUV422?
>
I don't know if YUV422 is a valid name for a raw format in GStreamer. You
can check the correct names here:
https://gstreamer.freedesktop.org/documentation/additional/design/mediatype-video-raw.html?gi-language=c
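For 4:2:2 the GStreamer names are for example YUY2 or UYVY (packed) and Y42B
(planar). So, as an untested sketch and assuming your camera actually delivers
packed 4:2:2, the pipeline would look something like:

gst-launch-1.0 v4l2src device=/dev/video10 ! \
video/x-raw,format=YUY2 ! videoconvert ! xvimagesink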
> can the v4l2src plugin tell the v4l2 subsystem which mode to switch the
> camera to? Or do I need to switch modes using another command?
>
The GStreamer module v4l2src will ask the V4L2 subsystem for the raw format
you specify in the caps. If your hardware and camera driver support it,
and specifically support it for the combination of geometry and framerate
you select, then it will succeed.
Use the command "gst-inspect-1.0 v4l2src" to see the formats the module
supports on its src pad (output pad). Note that the formats listed will be
formats supported by the module, not the hardware/driver. If the driver
does not support it, it will fail.
Use the command "v4l2-ctl --all" to list supported formats by the camera
hardware/driver. You can also use "ffmpeg -hide_banner -f v4l2
-list_formats all -i /dev/video0" to get additional info. Of course this
imply the commands are installed on your platform.
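A more targeted check, assuming /dev/video10 is your camera node, is:

v4l2-ctl -d /dev/video10 --list-formats-ext

which lists every pixel format together with the frame sizes and frame rates
the driver accepts for it.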
>
> >> is there a way how i can save only one YUV frame without any
> >> processing?
> >> I just want to make sure.
> >>
> >
> > One way is listed below. Another way is to use GStreamer's multifilesink,
> > etc.
> >
> > gst-launch-1.0 -e -v \
> > v4l2src device=/dev/video0 ! \
> > video/x-raw,format=Y42B,width=1280,height=800,framerate=10/1 ! \
> > fdsink fd=3 3>&1 1>&2 | dd of=MyFile.yuv bs=2048000 count=1
>
> I got it.
> Then tell me please, how can I be sure the frame I received is actually
> a frame in the format I specified?
>
Well, you can take it as input to a GStreamer pipeline using the "filesrc"
module, specify the format to be
"video/x-raw,format=Y42B,width=1280,height=800" or similar, use a JPEG
or PNG encoder module to convert it to a JPEG or PNG file, and view it with
an image viewer. Or you can search Google for YUV viewing tools.
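A minimal sketch of such a check (untested; the blocksize must match the
exact frame size, which for Y42B at 1280x800 is 2048000 bytes, and MyFile.png
is just an example output name):

gst-launch-1.0 filesrc location=MyFile.yuv blocksize=2048000 ! \
video/x-raw,format=Y42B,width=1280,height=800,framerate=1/1 ! \
videoconvert ! pngenc ! filesink location=MyFile.png

If the resulting image looks sane, the frame you saved really was in the
format you specified.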
>
> > Are you trying to encode two separate streams, or are you wanting to mix
> > two streams into one stream and encode that?
>
You didn't answer what you are trying to do.
> now my question is only to understand why the SOM is able to encode
> video in 1920x1080 resolution, but is not capable of encoding video in
> 320x240?
> and is the hardware encoder really only available for cameras supported
> by the v4l2 subsystem?
> > I don't know your system. Assuming it is Linux, how much of the kernel
> > can
> > you compile/modify?
> I wrote about this above. Yes this is Linux, Debian-4.14.
> and the kernel sources are available to me and I can configure, rebuild
> and flash the kernel to the SOM.
> > I don't understand your challenges as I don't know your limitations. If
> > you
> > can read images from a camera into a Linux computer, you can always loop
> > it using a V4L2 loopback interface, making it into a V4L2 device. But
> > usually a normal USB camera is available through V4L2 interfaces natively.
> > Linux V4L2 has a loopback interface where you can loop video back in as
> > a new device.
> yes, that Arducam works using v4l2 natively, but not the thermal one.
> the last few days I have been trying to figure out how to create an
> additional /dev/videoN file and how to write thermal cam images into it
> from C++ code...
> > Well, if you have documentation and a working compiler, you can do
> > anything your brain can think of. That said, and assuming your initial
> > pipeline works - I don't think you answered whether it does - then I am
> > guessing you get access to the system's hardware encoder using the
> > v4l2h264enc module of GStreamer.
>
> yes, when using Arducam everything works as it should, but I'm not sure
> the Arducam produces raw frames. above I asked a question about this.
> and I still can't figure out how I can be sure that the hardware encoder
> is actually present in the SOM.
>
> I found the source of v4l2h264enc plugin:
> https://github.com/GStreamer/gst-plugins-good/tree/master/sys/v4l2
>
> but I don't see any direct interaction with the hardware encoder here.
> it looks like all the interaction with the encoder is hidden behind the
> GStreamer abstractions...
>
TL;DR. I usually solve such problems charging Euro 125 per hour. I didn't
check whether hardware encoding is supported on your hardware, as it isn't
completely clear what hardware you are talking about. GStreamer does
support some hardware-accelerated encoding, and some of it, not all, is free.
Others require paid libraries that some companies developed for a particular
use. Nothing wrong with that.
>
> > If that is true, then what they are telling you is that to use the
> > hardware encoder, you must use the V4L2 subsystem. And you are. You are
> > using the
> > GStreamer V4L2 encoder module called v4l2h264enc. It does not mean that
> > you
> > have to use a camera that uses V4L2, although in many cases you will.
> > Do
> > you understand the difference? You apparently have to use the V4L2
> > subsystem to access the hardware encoder and send raw video data to it
> > and
> > retrieve encoded video from it.
>
> yes, I understand that. and in parallel I am trying to understand how to
> implement sending ARGB frames from a Thermal camera to /dev/videoN.
> but I would like to make sure that the hardware encoder does exist and
> then I would like to be able to interact with it using its API.
>
If your camera is not a V4L2-supported camera and there is no GStreamer
module supporting it, then you have to write one yourself.
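One way around that, if the v4l2loopback kernel module can be built for your
Debian 4.14 kernel (untested sketch; /dev/video20 and the UYVY caps are just
placeholders), is to create a loopback device and push frames into it, which
then makes them visible to v4l2src like any other camera:

sudo modprobe v4l2loopback video_nr=20 card_label="thermal" exclusive_caps=1
gst-launch-1.0 videotestsrc ! \
video/x-raw,format=UYVY,width=320,height=240,framerate=9/1 ! \
v4l2sink device=/dev/video20

Your own C++ code would take the place of videotestsrc here, either through
an appsrc-based pipeline or by writing frames to the loopback device directly.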
> > What are you trying that works (pipeline examples) and what doesn't
> > work
> > (pipeline examples)?
>
I have lost the overview of what you have, what works, what doesn't work
and what your problem is. So before we go any further, a new clean mail with
a description of, and links to, what you have (hardware/software), what works
and what doesn't work is required.
> atm only Arducam is supported by the v4l2 subsystem and I have no
> problem with this cam.
> the only reason I mention this camera at all is to try to understand:
>
> a) why the SOM using the v4l2h264enc plugin can encode 1080p in real time,
> but my implementation using libx264 cannot encode frames at 320x240
> resolution in real time.
>
I don't even know what your hardware is. Is it an ARM (links, please)? Results
on x86 can only logically (not performance-wise) be compared to results
on an ARM platform.
The GStreamer module x264enc (using libx264) is purely software. The
v4l2h264enc module is most likely a solution using the underlying hardware.
On most ARM hardware, except very high-performance ARMs like Apple's M1 etc.,
software encoding of H.264 at larger video geometries is out of the
question.
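As a rough sanity check of the pure software path on your SOM (untested
sketch; videotestsrc stands in for your thermal frames), you can see whether
x264enc itself keeps up at 320x240:

gst-launch-1.0 videotestsrc num-buffers=300 ! \
video/x-raw,format=I420,width=320,height=240,framerate=30/1 ! \
x264enc speed-preset=ultrafast tune=zerolatency ! fakesink sync=false

If that finishes much faster than the 10 seconds of video it represents, the
bottleneck is more likely in your own libx264 integration than in the CPU.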
> b) I want to make sure that the hardware encoder is actually present in
> the SOM and how I can use it directly through API.
>
Your findings suggest it is likely you have hardware-assisted encoding, but
I can't verify that you have not done something completely incomparable
with the two encoder setups, thus negating that conclusion.
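Two quick checks you can run yourself (assuming gst-plugins-good with the
v4l2 plugin and v4l2-ctl are installed; /dev/videoN is a placeholder):

gst-inspect-1.0 | grep v4l2
v4l2-ctl --list-devices
v4l2-ctl -d /dev/videoN --list-formats

As far as I know, the v4l2 codec elements are registered dynamically, so
v4l2h264enc only shows up in the gst-inspect output if GStreamer actually
found a matching V4L2 M2M encoder device, and an encoder node will list H264
among its formats.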
Look, do you have a way to make GStreamer read video data from your camera (I
guess it is the IR camera)? Yes or no?
If yes, then you can feed it to v4l2h264enc. So what is the problem?
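For example, something like this untested sketch (the caps are placeholders,
the explicit level cap is a workaround some boards need, and test.mkv is just
an example output file):

gst-launch-1.0 -e v4l2src device=/dev/video0 ! \
video/x-raw,format=Y42B,width=1280,height=800,framerate=10/1 ! \
videoconvert ! v4l2h264enc ! 'video/x-h264,level=(string)4' ! \
h264parse ! matroskamux ! filesink location=test.mkv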
Unless you do a full recap of what you have, what is working, what is not,
and what your problem is, I can't advise any further. This is already way,
way, way too confusing.
Regards.