encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

Constantine Elster constantine.elster at valerann.com
Mon Jan 20 18:13:09 UTC 2020


Hi Nicolas,

Thank you very much!!

I removed output-io-mode=dmabuf-import from the v4l2video8convert and
v4l2h264enc elements and got the *pipeline working*!
This pipeline worked for me: gst-launch-1.0 -vvv v4l2src
device="/dev/video9" num-buffers=1000 ! "video/x-raw, format=(string)UYVY,
width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1,
framerate=(fraction)45/1, colorimetry=bt709" ! v4l2video8convert !
v4l2h264enc ! h264parse ! mp4mux ! filesink location=aha.mp4

I get about 15% CPU usage with the above pipeline, though I see a few
dropped frames in the output file.

So, as a next step, I want to perform a few optimizations.
1) I set io-mode=dmabuf-import on the v4l2src element, but I get this
error:
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Failed to
allocate required memory.
Additional debug info:
gstv4l2src.c(658): gst_v4l2src_decide_allocation ():
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Buffer pool activation failed

So, I presume zero-copy is not possible?
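For reference, one more variant that might be worth trying before giving up on zero-copy (a sketch, untested on this board: it assumes the UVC capture driver supports exporting its buffers via VIDIOC_EXPBUF, which many do not): let v4l2src export dmabufs with io-mode=dmabuf and have the downstream elements import them. As Nicolas noted, the i.MX6 has no IOMMU, so its blocks may still reject the scattered buffers a USB camera produces.

```shell
# Hypothetical sketch: v4l2src *exports* its capture buffers as dmabufs
# (io-mode=dmabuf) rather than importing downstream buffers
# (io-mode=dmabuf-import). This only works if the capture driver supports
# VIDIOC_EXPBUF and the converter accepts the resulting buffers; otherwise
# keep the default (copying) mode.
gst-launch-1.0 v4l2src device="/dev/video9" io-mode=dmabuf num-buffers=1000 \
  ! "video/x-raw, format=UYVY, width=640, height=480, framerate=45/1" \
  ! v4l2video8convert output-io-mode=dmabuf-import \
  ! v4l2h264enc output-io-mode=dmabuf-import \
  ! h264parse ! mp4mux ! filesink location=aha.mp4
```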

2) I would like to drop frames as close as possible to the v4l2 source to
achieve 10 fps at the input of the encoder. However, my v4l2 source only
accepts 45 and 60 fps configurations. Is there an elegant way to reduce
the frame rate?
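For what it's worth, the usual way to do this in GStreamer (a sketch, assuming the stock videorate element; untested on this exact setup) is to negotiate a rate the sensor supports and let videorate drop buffers down to 10 fps before the converter and encoder:

```shell
# Hypothetical sketch: capture at 45 fps (a rate the camera accepts), then
# drop frames down to 10 fps with videorate. drop-only=true tells videorate
# to only ever drop buffers, never duplicate them, which keeps it cheap.
gst-launch-1.0 v4l2src device="/dev/video9" \
  ! "video/x-raw, format=UYVY, width=640, height=480, framerate=45/1" \
  ! videorate drop-only=true ! "video/x-raw, framerate=10/1" \
  ! v4l2video8convert ! v4l2h264enc ! h264parse ! mp4mux \
  ! filesink location=aha.mp4
```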

3) Any other recommendations I can pursue to optimize the pipeline?
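One generic option worth considering here (a sketch, not i.MX6-specific): inserting queue elements splits the pipeline into separate streaming threads, which often reduces dropped frames on multi-core systems at the cost of some latency and memory:

```shell
# Hypothetical sketch: each queue decouples the elements before and after
# it into their own threads, so a slow encoder no longer stalls capture.
gst-launch-1.0 v4l2src device="/dev/video9" num-buffers=1000 \
  ! "video/x-raw, format=UYVY, width=640, height=480, framerate=45/1" \
  ! queue ! v4l2video8convert ! queue ! v4l2h264enc \
  ! h264parse ! mp4mux ! filesink location=aha.mp4
```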

Thank you very very much,
  -- Constantine.


On Sun, Jan 19, 2020 at 7:57 PM Nicolas Dufresne <nicolas at ndufresne.ca>
wrote:

> On Sunday, 19 January 2020 at 15:12 +0200, Constantine Elster wrote:
> > Thank you very much Milian! It looks better with h264parse. I'm still
> > getting an error, though now a different one.
> >
> > I added h264parse to the pipeline:
> > gst-launch-1.0 -vvv v4l2src device="/dev/video9" num-buffers=200 !
> > "video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480,
> > pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1,
> > colorimetry=bt709" ! v4l2video8convert output-io-mode=dmabuf-import !
> > v4l2h264enc output-io-mode=dmabuf-import ! h264parse ! mp4mux ! filesink
> > location=aha.mp4
> >
> > Output:
> > Setting pipeline to PAUSED ...
> > Pipeline is live and does not need PREROLL ...
> > Setting pipeline to PLAYING ...
> > New clock: GstSystemClock
> > /GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps =
> video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480,
> pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1,
> colorimetry=(string)bt709, interlace-mode=(string)progressive
> > /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps =
> video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480,
> pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1,
> colorimetry=(string)bt709, interlace-mode=(string)progressive
> > /GstPipeline:pipeline0/v4l2video8convert:v4l2video8convert0.GstPad:src:
> caps = video/x-raw, framerate=(fraction)60/1,
> interlace-mode=(string)progressive, format=(string)I420, width=(int)640,
> height=(int)480, colorimetry=(string)bt709
> > /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:src: caps =
> video/x-h264, stream-format=(string)byte-stream, alignment=(string)au,
> profile=(string)baseline, level=(string)4, width=(int)640, height=(int)480,
> pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1,
> interlace-mode=(string)progressive, colorimetry=(string)bt709
> > /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps =
> video/x-h264, stream-format=(string)byte-stream, alignment=(string)au,
> profile=(string)baseline, level=(string)4, width=(int)640, height=(int)480,
> pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1,
> interlace-mode=(string)progressive, colorimetry=(string)bt709
> > Redistribute latency...
> > /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:sink: caps =
> video/x-raw, framerate=(fraction)60/1, interlace-mode=(string)progressive,
> format=(string)I420, width=(int)640, height=(int)480,
> colorimetry=(string)bt709
> > /GstPipeline:pipeline0/v4l2video8convert:v4l2video8convert0.GstPad:sink:
> caps = video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480,
> pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1,
> colorimetry=(string)bt709, interlace-mode=(string)progressive
> > /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps =
> video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480,
> pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1,
> colorimetry=(string)bt709, interlace-mode=(string)progressive
> > ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
> > Internal data stream error.
> > Additional debug info:
> > gstbasesrc.c(3055): gst_base_src_loop ():
> > /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
> > streaming stopped, reason error (-5)
>
> Looks like the real error was not propagated (let me know which version
> you are using; Ubuntu tends to be far behind, or only cherry-picks fixes
> rather than tracking stable branches).
>
> To find out more, set the GST_DEBUG="v4l2*:7" environment variable. That
> will give you a lot more detail. From experience, I suspect that your
> output-io-mode=dmabuf-import is faulty on the v4l2convert element (on
> newer GStreamer, we now have a fixed name for the converter too).
>
> The i.MX6 platform (and i.MX8 too) does not include an IOMMU. On the
> other side, USB cameras (like UVC) produce scattered memory (virtual
> memory). That makes importing dmabufs from UVC into v4l2convert
> impossible in this case. You may want to try the other way around and
> set io-mode=dmabuf-import on the v4l2src element. If that does not
> work, you'll have no choice but to let the pipeline make a copy (the
> default setting). Normally, I suggest first making it work without
> touching these advanced settings, and then tweaking to gain more
> performance.
>
> > Execution ended after 0:00:00.436501205
> > Setting pipeline to PAUSED ...
> > Setting pipeline to READY ...
> > Setting pipeline to NULL ...
> > Freeing pipeline ...
> >
> > Any hint on debugging would be appreciated!
> > Thank you,
> >   -- Constantine.
> >
> >
> >
> >
> >
> > On Sun, Jan 19, 2020 at 2:13 PM Milian Wolff <milian.wolff at kdab.com>
> wrote:
> > > On Sunday, 19 January 2020 11:06:55 CET Constantine Elster wrote:
> > > > Hi devs,
> > > >
> > > > I'm trying to construct a pipeline that captures frames from a USB
> > > > camera (YUV), encodes them with a HW encoder, and saves them into a
> > > > file. My setup is an iMX6 board running Ubuntu 18.04 on a 4.20
> > > > mainline kernel.
> > > >
> > > > When I try a SW encoder, it works okay, albeit with very high (100%)
> > > > CPU usage. The working pipeline based on software plugins:
> > > > gst-launch-1.0 -v v4l2src device="/dev/video2" num-buffers=200 !
> > > > "video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480,
> > > > pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)45/1,
> > > > colorimetry=bt709" ! videoconvert ! x264enc ! mp4mux ! filesink
> > > > location=aha.mp4
> > > >
> > > > My attempt to replace the SW-based plugins with HW-based ones with
> > > > efficient memory management:
> > > > gst-launch-1.0 -v v4l2src device="/dev/video9" num-buffers=200 !
> > > > "video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480,
> > > > pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)45/1,
> > > > colorimetry=bt709" ! v4l2video8convert output-io-mode=dmabuf-import !
> > > > v4l2h264enc output-io-mode=dmabuf-import ! mp4mux ! filesink
> > > > location=aha.mp4
> > > >
> > > > I get the following error: "WARNING: erroneous pipeline: could not
> > > > link v4l2h264enc0 to mp4mux0"
> > > >
> > > > I would appreciate any ideas on how to understand what's wrong, how
> > > > to debug it, and how to make it work.
> > >
> > > To debug, I suggest you compare the SRC pads of `gst-inspect-1.0
> > > v4l2h264enc` with the SINK pads of `gst-inspect-1.0 mp4mux`. My guess
> > > is that you may be able to fix the issue by adding an `h264parse`
> > > element in the middle to fix the alignment, since `mp4mux` requires
> > > `au` alignment, whereas the encoder may output `nal` frames.
> > >
> > > Good luck
> > >
> > > --
> > > Milian Wolff | milian.wolff at kdab.com | Senior Software Engineer
> > > KDAB (Deutschland) GmbH, a KDAB Group company
> > > Tel: +49-30-521325470
> > > KDAB - The Qt, C++ and OpenGL Experts
> >
> >
> > _______________________________________________
> > gstreamer-devel mailing list
> > gstreamer-devel at lists.freedesktop.org
> > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
>


-- 

Director, Software Engineering
www.valerann.com



