How does GStreamer interface with H264 hardware encoders and create videos?

Rand Graham rand.graham at zenith.com
Wed Apr 25 21:48:07 UTC 2018


Hello,

Did you recompile according to the release notes? Are you using the pipeline shown in the release notes?

To understand what gstreamer is doing, you should copy and paste the exact pipeline you are using.

The release notes show this pipeline:

gst-launch-1.0 -e v4l2src device=/dev/video3 ! video/x-raw,format=NV12,width=1280,height=960 ! v4l2h264enc extra-controls="controls,h264_profile=4,video_bitrate=2000000;" ! h264parse ! mp4mux ! filesink location=video.mp4

It looks like this pipeline is using v4l2h264enc to do the h.264 encoding.

It then uses a parser (h264parse), a muxer (mp4mux), and a filesink to create an mp4 file. In other words, it produces an mp4 container that contains an h264 video track.

It looks like the h264 encoder takes some parameters. You may be able to get better video quality by adjusting them. For example, h264 typically has a "high" profile setting that improves quality, and you might also try increasing the bitrate. (The 1280x960 dimensions seem odd to me; I would expect something like 1280x720 or 1920x1080.)
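As a concrete example, here is the release-notes pipeline with the bitrate raised. (This is only a sketch: the control names and the values the driver accepts are board-specific, so verify them on your device first.)

```shell
# Same pipeline as the release notes, with video_bitrate raised to 6 Mbit/s.
# h264_profile=4 typically selects the High profile in V4L2, but which
# controls and values the encoder accepts is driver-dependent.
gst-launch-1.0 -e v4l2src device=/dev/video3 ! \
    video/x-raw,format=NV12,width=1280,height=960 ! \
    v4l2h264enc extra-controls="controls,h264_profile=4,video_bitrate=6000000;" ! \
    h264parse ! mp4mux ! filesink location=video.mp4
```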

Regards,
Rand



From: gstreamer-devel [mailto:gstreamer-devel-bounces at lists.freedesktop.org] On Behalf Of simon.zz at yahoo.com
Sent: Wednesday, April 25, 2018 4:33 PM
To: Discussion of the Development of and With GStreamer <gstreamer-devel at lists.freedesktop.org>
Subject: Re: RE: How does GStreamer interface with H264 hardware encoders and create videos?

Hello Rand,

You are right. The board is a Dragonboard 410c by 96boards.

https://developer.qualcomm.com/hardware/snapdragon-410/tools

96boards in their release notes

http://releases.linaro.org/96boards/dragonboard410c/linaro/debian/latest/

write that the gstreamer pipeline uses the video encoder.
But as I said before, I noticed notable differences in the video results, which make me doubt that gstreamer really uses the hardware encoder.

The C/C++ code I am using is based on this one:

stanimir.varbanov/v4l2-decode.git - Unnamed repository<https://git.linaro.org/people/stanimir.varbanov/v4l2-decode.git/tree>


I basically adapted Varbanov's code to capture the camera frames and feed the encoder. My code works, in the sense that I can record an h264 video (raw h264, not mp4 as gstreamer produces), but I noticed the issues I described in my previous mail.

So what additional processing does gstreamer apply on top of the hardware encoding?
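(For reference, if the difference turns out to be only the container, a raw h264 bytestream can be wrapped after the fact. A sketch, assuming the capture was saved as video.h264:)

```shell
# Wrap an existing raw h264 bytestream into an mp4 container without
# re-encoding; mp4mux adds the timing and index metadata players rely on.
gst-launch-1.0 -e filesrc location=video.h264 ! h264parse ! mp4mux ! \
    filesink location=video.mp4
```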

Regards,
Simon

On Wednesday, April 25, 2018 at 22:49:38 CEST, Rand Graham <rand.graham at zenith.com<mailto:rand.graham at zenith.com>> wrote:



Hello,



It might help if you mention which embedded board you are using.



In order to use custom hardware from a vendor such as nVidia, you would compile gstreamer plugins provided by the vendor and then specify them in your pipeline.



Regards,

Rand



From: gstreamer-devel [mailto:gstreamer-devel-bounces at lists.freedesktop.org] On Behalf Of simon.zz at yahoo.com<mailto:simon.zz at yahoo.com>
Sent: Wednesday, April 25, 2018 1:01 PM
To: gstreamer-devel at lists.freedesktop.org<mailto:gstreamer-devel at lists.freedesktop.org>
Subject: How does GStreamer interface with H264 hardware encoders and create videos?



Hello,



I am using an embedded board which has a hardware H264 encoder, and I am testing video generation both with gst-launch and with C++ code I wrote myself.



Comparing my code's results to the gst-launch results, it is clear and obvious that gstreamer applies additional processing compared to what I get from the hardware encoder buffer.

The first obvious difference is that it generates an mp4 video, while I can only generate a raw h264 stream; I am not using an additional mp4 muxer in my code.



For example, the image quality of the gst-launch video is quite a bit better; that video also plays at the correct framerate, whereas the one I obtain plays back slightly "accelerated"; and in addition, the timestamp (minutes and seconds) is present in the gst-launch video but missing from the one produced by my C++ code.



So I suspect that gstreamer doesn't use the hardware encoder.

How can I be sure that gstreamer uses the hardware encoder instead of an h264 software library, and how can I see in real time which V4L2 settings gstreamer applies to the encoder?
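A few checks commonly suggested for this (the element and device names below are examples; adjust them for your board):

```shell
# 1) Confirm the element exists and comes from the video4linux2 plugin;
#    v4l2h264enc talks to a kernel V4L2 driver, i.e. the hardware codec.
gst-inspect-1.0 v4l2h264enc

# 2) Run the pipeline with v4l2 debug output enabled to watch device
#    access and the controls gstreamer sets on the encoder.
GST_DEBUG=v4l2*:5 gst-launch-1.0 ... 2>&1 | grep -i v4l2

# 3) Inspect the encoder device's V4L2 controls directly.
v4l2-ctl -d /dev/video3 --list-ctrls
```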



Thanks.

Regards,

Simon





