MJPEG USB camera to H264 Encoding Pipeline

Nicolas Dufresne <nicolas@ndufresne.ca>
Tue Jul 5 13:31:39 UTC 2022


Le lundi 04 juillet 2022 à 20:12 +0300, Alexandru Stefan via gstreamer-devel a
écrit :
> Hi,
> 
> I'm running gstreamer on an i.MX8QM with H264 and JPEG HW encoding/decoding.
> My end goal is to process two MJPEG streams from two different USB cameras, do
> some light processing in OpenCV and then encode it in H264. My first step was
> to simply create a gstreamer pipeline to transcode the frames coming from the
> camera into H264 and write them to a file. Small proof-of-concept. However,
> I'm running into issues linking the v4l2jpegdec and v4l2h264enc elements. This is
> the pipeline I'm using (it's taken from NXP's official user guide on using
> i.MX8 with GStreamer [1]). Have tried a lot of variations, but the results
> (even with detailed logging) have looked very similar:
> 
> gst-launch-1.0 v4l2src device=/dev/video2 ! jpegparse ! v4l2jpegdec ! queue !
> videoconvert ! v4l2h264enc ! h264parse ! matroskamux ! filesink
> location=out.mkv
> 
> And this is the output I'm getting:
> Setting pipeline to PAUSED ...
> Pipeline is live and does not need PREROLL ...
> Setting pipeline to PLAYING ...
> New clock: GstSystemClock
> ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data
> stream error.
> Additional debug info:
> ../git/libs/gst/base/gstbasesrc.c(3072): gst_base_src_loop ():
> /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
> streaming stopped, reason not-negotiated (-4)
> 
> I have a 7MB log file obtained by running GST_DEBUG_LEVEL=6, but I'm not sure
> if the correct etiquette is to attach it to this email. I've also tried using
> caps filtering and used gst-inspect to make sure the src of the jpegdecoder
> matches the sink of the h264 encoder. Not looking for a straight response, but
> a few hints on where I can look next, as I'm feeling a bit lost.

It's most likely a colorimetry mismatch. I've seen similar reports coming from
Raspberry Pi folks, though each case needs to be analyzed separately, as there
are also a lot of driver bugs around.

On the GStreamer side, only a handful of colorimetries are allowed. The JPEG
colorimetry may also be reported by the parser, which means it will override
what the v4l2 driver says and may not be compatible with the following v4l2
decoder. Please try tracing with GST_DEBUG="*CAPS*:7,v4l2*:7" and share that
trace; the 7MB level-6 trace is likely not very useful.
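As a concrete starting point, here is one way to capture that focused trace, and a
sketch of pinning the caps between the converter and the encoder once the trace
reveals the mismatch. The format and colorimetry values in the second command are
assumptions for illustration; use whatever the trace shows v4l2h264enc actually
accepts:

```shell
# Capture only caps negotiation and v4l2 details (much smaller than a level-6 log):
GST_DEBUG="*CAPS*:7,v4l2*:7" GST_DEBUG_FILE=caps-trace.log \
  gst-launch-1.0 v4l2src device=/dev/video2 ! jpegparse ! v4l2jpegdec ! queue ! \
  videoconvert ! v4l2h264enc ! h264parse ! matroskamux ! filesink location=out.mkv

# If the trace shows a colorimetry mismatch, forcing explicit caps before the
# encoder is one thing to try (NV12/bt709 here are assumed values, not verified
# against the i.MX8QM driver):
gst-launch-1.0 v4l2src device=/dev/video2 ! jpegparse ! v4l2jpegdec ! queue ! \
  videoconvert ! 'video/x-raw,format=NV12,colorimetry=bt709' ! \
  v4l2h264enc ! h264parse ! matroskamux ! filesink location=out.mkv
```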

> 
> Also, since my goal is to have the OpenCV app between the MJPEG decoder and
> the H264 encoder, should I just abandon the goal of having a manual pipeline
> working first and simply start testing straight with the application?
> 
> Thirdly, is it feasible to expect the i.MX8QM to be able to decode the streams
> from two MJPEG cameras (1080p@60FPS), combine them and then encode the result
> using H264? Or do I need a different processor?
> 
> Thank you!
> 
> [1] 
> https://community.nxp.com/t5/i-MX-Processors-Knowledge-Base/i-MX-8-GStreamer-User-Guide/ta-p/1098942
> 

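On the OpenCV question quoted above: getting the plain gst-launch pipeline to
negotiate first is still worthwhile, because OpenCV's GStreamer backend is
driven by the same pipeline strings, split around appsink/appsrc. A rough
sketch of how that split might look (element choices and caps here are
assumptions, not tested on the i.MX8QM):

```shell
# Hypothetical pipeline strings for OpenCV's GStreamer backend
# (cv2.VideoCapture / cv2.VideoWriter). The BGR conversion is assumed
# because OpenCV works on BGR frames; it costs CPU, so a different
# format may be preferable in practice.
CAPTURE_PIPELINE="v4l2src device=/dev/video2 ! jpegparse ! v4l2jpegdec ! \
  videoconvert ! video/x-raw,format=BGR ! appsink drop=true"
WRITER_PIPELINE="appsrc ! videoconvert ! v4l2h264enc ! h264parse ! \
  matroskamux ! filesink location=out.mkv"
```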