MJPEG USB camera to H264 Encoding Pipeline

Alexandru Stefan alexandru.n.stefan at gmail.com
Mon Jul 4 17:12:48 UTC 2022


Hi,

I'm running gstreamer on an i.MX8QM with H264 and JPEG HW
encoding/decoding. My end goal is to process two MJPEG streams from two
different USB cameras, do some light processing in OpenCV and then encode
it in H264. My first step was simply to create a GStreamer pipeline that
transcodes the frames coming from the camera into H264 and writes them to a
file, as a small proof of concept. However, I'm running into issues linking
the v4l2jpegdec and v4l2h264enc elements. This is the pipeline I'm using
(taken from NXP's official user guide on using the i.MX8 with GStreamer [1]).
I have tried many variations, but the results (even with detailed logging)
have all looked very similar:

gst-launch-1.0 v4l2src device=/dev/video2 ! jpegparse ! v4l2jpegdec \
    ! queue ! videoconvert ! v4l2h264enc ! h264parse ! matroskamux \
    ! filesink location=out.mkv
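For reference, one variation I'm planning to try next pins the formats with
explicit caps filters, so the failing link should show up in the -v output.
The resolution, framerate, and NV12 format below are assumptions about what
my camera and the i.MX8 encoder support, not something I've confirmed:

```shell
# Same pipeline, but with explicit caps at each conversion point; -v prints
# the caps actually negotiated on every pad, which should expose the link
# that refuses to negotiate.
gst-launch-1.0 -v v4l2src device=/dev/video2 \
    ! image/jpeg,width=1920,height=1080,framerate=30/1 \
    ! jpegparse ! v4l2jpegdec ! queue ! videoconvert \
    ! video/x-raw,format=NV12 \
    ! v4l2h264enc ! h264parse ! matroskamux ! filesink location=out.mkv
```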

And this is the output I'm getting:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal
data stream error.
Additional debug info:
../git/libs/gst/base/gstbasesrc.c(3072): gst_base_src_loop ():
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)

I have a 7 MB log file obtained by running with GST_DEBUG=6, but I'm not
sure whether the correct etiquette is to attach it to this email. I've also
tried caps filtering and used gst-inspect-1.0 to make sure the src caps of
the JPEG decoder match the sink caps of the H264 encoder. I'm not looking
for a complete answer, just a few hints on where to look next, as I'm
feeling a bit lost.
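One thing I've been considering is bisecting the pipeline with fakesink,
which accepts any caps, to narrow down which link fails to negotiate; a
rough sketch of what I have in mind:

```shell
# Step 1: does the decode half run on its own?
gst-launch-1.0 -v v4l2src device=/dev/video2 \
    ! jpegparse ! v4l2jpegdec ! fakesink

# Step 2: add the converter and encoder back. If step 1 plays but this
# fails, the not-negotiated error is on the videoconvert -> v4l2h264enc
# link rather than on the camera or decoder side.
gst-launch-1.0 -v v4l2src device=/dev/video2 \
    ! jpegparse ! v4l2jpegdec ! queue ! videoconvert \
    ! v4l2h264enc ! fakesink
```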

Also, since my goal is to have the OpenCV app between the MJPEG decoder and
the H264 encoder, should I just abandon the goal of having a manual
pipeline working first and simply start testing straight with the
application?

Thirdly, is it feasible to expect the i.MX8QM to decode the streams from
two MJPEG cameras (1080p @ 60 FPS), combine them, and then encode
the result using H264? Or do I need a different processor?
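In case it helps frame that question, the dual-camera pipeline I have in
mind looks roughly like this. The second device node (/dev/video3) is an
assumption, and compositor is the generic software element; I suspect a
hardware-accelerated compositing element from NXP's BSP would be needed to
keep up with two 1080p60 streams:

```shell
# Decode both cameras, place them side by side, encode the composite.
gst-launch-1.0 -e \
    compositor name=comp sink_1::xpos=1920 ! videoconvert \
        ! v4l2h264enc ! h264parse ! matroskamux ! filesink location=dual.mkv \
    v4l2src device=/dev/video2 ! jpegparse ! v4l2jpegdec \
        ! videoconvert ! queue ! comp.sink_0 \
    v4l2src device=/dev/video3 ! jpegparse ! v4l2jpegdec \
        ! videoconvert ! queue ! comp.sink_1
```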

Thank you!

[1]
https://community.nxp.com/t5/i-MX-Processors-Knowledge-Base/i-MX-8-GStreamer-User-Guide/ta-p/1098942