How does gstreamer interface with H264 hardware encoders and create videos?

Nicolas Dufresne nicolas at ndufresne.ca
Thu Apr 26 15:50:49 UTC 2018


On Wednesday, April 25, 2018 at 22:10 +0000, simon.zz at yahoo.com wrote:
> Hello Rand,
> 
> Yes I recompiled the libs as 96boards suggests.
> The pipeline I use is:
> 
> gst-launch-1.0 -v -e v4l2src device=/dev/video3 ! videoconvert!
> video/x-raw,width=1920,height=1080 ! v4l2h264enc extra-
> controls="controls,h264_profile=4,video_bitrate=2000000;" ! h264parse
> ! mp4mux ! filesink location=video.mp4

I strongly discourage using extra-controls to select the profile.
Please report this to 96boards so they can fix their wiki. Instead, the
profile should be selected using a caps filter downstream of the
encoder. This applies to both profile and level. Note that neither the
Venus driver nor GStreamer validates the profile/level combination, so
you can easily produce an invalid stream at the moment.
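As a sketch, the same pipeline with the profile and level moved into a
caps filter after the encoder would look like this (the profile/level
values are illustrative, assuming the same DragonBoard setup as above):

```shell
# Select the H.264 profile and level through downstream caps instead of
# extra-controls; GStreamer negotiates these caps with the encoder.
# The bitrate is still set via extra-controls.
gst-launch-1.0 -v -e v4l2src device=/dev/video3 ! videoconvert ! \
    video/x-raw,width=1920,height=1080 ! \
    v4l2h264enc extra-controls="controls,video_bitrate=2000000;" ! \
    'video/x-h264,profile=high,level=(string)4' ! \
    h264parse ! mp4mux ! filesink location=video.mp4
```

The caps string is quoted so the shell does not interpret the
parentheses in `level=(string)4`.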

> 
> and this pipeline creates a quite good video.
> In this case the output is:

That's good news.

> 
> Setting pipeline to PAUSED ...
> Pipeline is live and does not need PREROLL ...
> Setting pipeline to PLAYING ...
> New clock: GstSystemClock
> /GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps =
> video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps
> = video/x-raw, width=(int)1920, height=(int)1080,
> format=(string)NV12, framerate=(fraction)30/1, interlace-
> mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-
> ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps =
> video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstTee:t.GstTeePad:src_0: caps = video/x-raw,
> width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-
> raw, width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:src: caps =
> video/x-h264, stream-format=(string)byte-stream,
> alignment=(string)au, profile=(string)high, level=(string)1,
> width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1
> /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps =
> video/x-h264, stream-format=(string)byte-stream,
> alignment=(string)au, profile=(string)high, level=(string)1,
> width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1
> /GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-
> raw, width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstTee:t.GstTeePad:src_1: caps = video/x-raw,
> width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> Redistribute latency...
> /GstPipeline:pipeline0/GstQueue:queue1.GstPad:src: caps = video/x-
> raw, width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = video/x-
> raw, width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstTee:t.GstPad:sink: caps = video/x-raw,
> width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstTee:t.GstPad:sink: caps = video/x-raw,
> width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps =
> video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink:
> caps = video/x-raw, width=(int)1920, height=(int)1080,
> format=(string)NV12, framerate=(fraction)30/1, interlace-
> mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-
> ratio=(fraction)1/1
> /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:sink: caps =
> video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
> /GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps =
> video/x-h264, stream-format=(string)avc, alignment=(string)au,
> profile=(string)high, level=(string)1, width=(int)1920,
> height=(int)1080, pixel-aspect-ratio=(fraction)1/1,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, parsed=(boolean)true,
> codec_data=(buffer)0164000affe100176764000aacd201e0089a100fe502b3b9ac
> a008da1426a001000568ce06e2c0
> /GstPipeline:pipeline0/GstMP4Mux:mp4mux0.GstPad:video_0: caps =
> video/x-h264, stream-format=(string)avc, alignment=(string)au,
> profile=(string)high, level=(string)1, width=(int)1920,
> height=(int)1080, pixel-aspect-ratio=(fraction)1/1,
> framerate=(fraction)30/1, interlace-mode=(string)progressive,
> colorimetry=(string)2:4:7:1, parsed=(boolean)true,
> codec_data=(buffer)0164000affe100176764000aacd201e0089a100fe502b3b9ac
> a008da1426a001000568ce06e2c0
> /GstPipeline:pipeline0/GstMP4Mux:mp4mux0.GstPad:src: caps =
> video/quicktime, variant=(string)iso
> /GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps =
> video/quicktime, variant=(string)iso
> 
> Otherwise, If I use the following pipeline to extract raw frames
> while I am recording the video
> 
> sudo gst-launch-1.0 -e v4l2src device=/dev/video3 ! videoconvert! 

There is a syntax error here, a missing space after videoconvert. Copy-
paste error?

> video/x-raw,width=1920,height=1080 ! tee name=t ! queue ! v4l2h264enc
> extra-controls="controls,h264_profile=4,video_bitrate=2000000;" !
> h264parse ! mp4mux ! filesink location=video.mp4 t. ! queue !
> multifilesink location=file%1d.raw

multifilesink does not support GstVideoMeta, so it's very likely that
v4l2src is forced to copy the frames into "standard" stride/offset
buffers. This is very CPU-intensive, which can lead to frames being
dropped. I added fakevideosink recently to work around similar issues;
it's a bit annoying, but maybe we need something similar for
filesink/multifilesink.
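One way to confirm that the raw-frame branch is the bottleneck is to
swap multifilesink for fakevideosink, which does support GstVideoMeta.
This is a diagnostic sketch, assuming GStreamer 1.14 or newer (where
fakevideosink is available):

```shell
# fakevideosink advertises GstVideoMeta support, so v4l2src does not
# have to copy frames into standard-stride buffers for that branch.
# If the encoded file comes out clean with this pipeline, the frame
# copies required by multifilesink were causing the drops.
gst-launch-1.0 -e v4l2src device=/dev/video3 ! videoconvert ! \
    video/x-raw,width=1920,height=1080 ! tee name=t \
    t. ! queue ! v4l2h264enc ! h264parse ! mp4mux ! \
        filesink location=video.mp4 \
    t. ! queue ! fakevideosink
```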

> 
> the resulting video is quite poor: not all frames are being
> recorded, and the sequence stays locked on a single frame for a few
> seconds. Clearly the frame-extraction task does not work well while
> recording a video. If gstreamer is really using the h264 encoder,
> why does it cause these problems in this case?
> I doubt it, because my C/C++ code doesn't generate a bad video like
> this.

The encoder's bitrate adaptation is based on the provided framerate. By
dropping frames we decrease the actual rate, and we end up confusing
the firmware, which results in bad quality. We also break the motion,
which makes the encoding less efficient.
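If upstream drops cannot be avoided, one hedged workaround is to
re-time the stream with the videorate element so the encoder actually
receives the framerate it was told about. A minimal sketch (this trades
duplicated frames for a stable rate; the bitrate controls are omitted
for brevity):

```shell
# videorate duplicates or drops buffers so the stream really matches
# the negotiated 30/1 framerate, keeping the encoder's bitrate
# adaptation consistent with what it was configured for.
gst-launch-1.0 -e v4l2src device=/dev/video3 ! videoconvert ! \
    videorate ! video/x-raw,width=1920,height=1080,framerate=30/1 ! \
    v4l2h264enc ! h264parse ! mp4mux ! filesink location=video.mp4
```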

> 
> >> It is then using a parser and a muxer and a filesink to create an
> mp4 file. 
> 
> This is an interesting point. So in this case gstreamer uses an mp4
> library on the h264 encoded data?
> This makes sense to me.
> Simon
> On Wednesday, April 25, 2018 at 23:49:53 CEST, Rand Graham <rand.graham@
> zenith.com> wrote:
> 
> 
> Hello,
>  
> Did you recompile according to the release notes? Are you using the
> pipeline shown in the release notes?
>  
> To know what is being done by gstreamer, you should copy paste the
> exact pipeline you are using.
>  
> The release notes show this pipeline
>  
> gst-launch-1.0 -e v4l2src device=/dev/video3 ! video/x-
> raw,format=NV12,width=1280,height=960 ! v4l2h264enc extra-
> controls="controls,h264_profile=4,video_bitrate=2000000;" ! h264parse
> ! mp4mux ! filesink location=video.mp4
>  
> It looks like this pipeline is using v4l2h264enc to do the h.264
> encoding.
>  
> It is then using a parser and a muxer and a filesink to create an mp4
> file. What this does is use an mp4 container that contains an h264
> video track.
>  
> It looks like the h264 encoder takes some parameters. You may be able
> to get better video quality by adjusting the parameters of the h264
> encoder. For example, there is typically a “high” setting that can be
> used for h264 quality. You might also try increasing the bitrate to
> see if that improves quality. (The height and width dimensions seem
> odd to me. I would expect something like 1280x720 or 1920x1080)
>  
> Regards,
> Rand
>  
>  
>  
> From: gstreamer-devel [mailto:gstreamer-devel-bounces at lists.freedeskt
> op.org] On Behalf Of simon.zz at yahoo.com
> Sent: Wednesday, April 25, 2018 4:33 PM
> To: Discussion of the Development of and With GStreamer <gstreamer-de
> vel at lists.freedesktop.org>
> Subject: Re: RE: How do gstreamer interfaces with H264 hardware
> encoders and creates videos ?
>  
> Hello Rand,
>  
> You are right. The board is a Dragonboard 410c by 96boards.
>  
> https://developer.qualcomm.com/hardware/snapdragon-410/tools
>  
> 96boards in their release notes
>  
> http://releases.linaro.org/96boards/dragonboard410c/linaro/debian/lat
> est/
>  
> write that the gstreamer pipeline uses the video encoder.
> But as I said before, I noticed notable differences in the video
> results, which make me doubt that gstreamer really uses the encoder.
>  
> The C/C++ code I am using is based on this one:
>  
> stanimir.varbanov/v4l2-decode.git - Unnamed repository
>  
> I basically changed Varbanov's code to capture the camera frames
> and feed the encoder. My code works, in the sense that I can record
> an h264 video (not mp4 as gstreamer does), but I noticed the results
> I described in my previous mail.
>  
> So what additional processing does gstreamer apply to the hardware
> video encoding?
>  
> Regards,
> Simon
>  
> On Wednesday, April 25, 2018 at 22:49:38 CEST, Rand Graham <rand.graham@
> zenith.com> wrote:
>  
>  
> Hello,
>  
> It might help if you mention which embedded board you are using.
>  
> In order to use custom hardware from a vendor such as nVidia, you
> would compile gstreamer plugins provided by the vendor and then
> specify them in your pipeline.
>  
> Regards,
> Rand
>  
> From: gstreamer-devel [mailto:gstreamer-devel-bounces at lists.freedeskt
> op.org] On Behalf Of simon.zz at yahoo.com
> Sent: Wednesday, April 25, 2018 1:01 PM
> To: gstreamer-devel at lists.freedesktop.org
> Subject: How do gstreamer interfaces with H264 hardware encoders and
> creates videos ?
>  
> Hello,
>  
> I am using an embedded board which has a hardware H264 encoder, and
> I am testing video generation both with gst-launch and with C++ code
> written by myself.
>  
> Comparing my code's results to the gst-launch results, it is clear
> and obvious that gstreamer applies additional processing compared to
> what I get from the hardware encoder buffer.
> The first obvious difference is that it generates an mp4 video, while
> I can only generate a raw h264 video, and I am not using any
> additional mp4 muxer in my code.
>  
> For example, the image quality of the gst-launch video is quite a
> bit better, and that video has the correct framerate, whereas the
> video I obtain appears slightly "accelerated". In addition, the
> timestamp (minutes - seconds) is present, while in the video I obtain
> from my C++ code it's not.
>  
> So I suspect that gstreamer doesn't use the hardware encoder.
> How can I be sure that gstreamer uses the hardware encoder instead of
> an h264 software library, and how can I see in real time which V4L2
> settings gstreamer applies to the encoder?
>  
> Thanks.
> Regards,
> Simon
>  
>  
> _______________________________________________
> gstreamer-devel mailing list
> gstreamer-devel at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

