How does GStreamer interface with H264 hardware encoders and create videos?

simon.zz at yahoo.com
Wed Apr 25 22:10:38 UTC 2018


 Hello Rand,

Yes I recompiled the libs as 96boards suggests.
The pipeline I use is:

gst-launch-1.0 -v -e v4l2src device=/dev/video3 ! videoconvert ! video/x-raw,width=1920,height=1080 ! v4l2h264enc extra-controls="controls,h264_profile=4,video_bitrate=2000000;" ! h264parse ! mp4mux ! filesink location=video.mp4

and this pipeline creates a quite good video.
In this case the output is:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstTee:t.GstTeePad:src_0: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)high, level=(string)1, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)high, level=(string)1, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstTee:t.GstTeePad:src_1: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
Redistribute latency...
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstTee:t.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstTee:t.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)1, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, parsed=(boolean)true, codec_data=(buffer)0164000affe100176764000aacd201e0089a100fe502b3b9aca008da1426a001000568ce06e2c0
/GstPipeline:pipeline0/GstMP4Mux:mp4mux0.GstPad:video_0: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)1, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:7:1, parsed=(boolean)true, codec_data=(buffer)0164000affe100176764000aacd201e0089a100fe502b3b9aca008da1426a001000568ce06e2c0
/GstPipeline:pipeline0/GstMP4Mux:mp4mux0.GstPad:src: caps = video/quicktime, variant=(string)iso
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)iso
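A side note on reading this log: the codec_data in the h264parse/mp4mux caps is an ISO avcC configuration record, and its first bytes encode the negotiated profile and level. A small sketch (plain Python, hex string copied from the log above; the profile table covers only the common values) decodes them and matches the profile=high, level=1 shown in the caps:

```python
# Decode the first bytes of the avcC codec_data printed in the caps log.
codec_data = bytes.fromhex(
    "0164000affe100176764000aacd201e0089a100f"
    "e502b3b9aca008da1426a001000568ce06e2c0"
)

# Common AVCProfileIndication values (not an exhaustive table).
profiles = {66: "baseline", 77: "main", 100: "high"}

version = codec_data[0]   # configurationVersion, always 1
profile = codec_data[1]   # AVCProfileIndication
level = codec_data[3]     # AVCLevelIndication, level * 10

print(version, profiles[profile], level / 10)  # → 1 high 1.0
```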

However, if I use the following pipeline to extract raw frames while recording the video:

sudo gst-launch-1.0 -e v4l2src device=/dev/video3 ! videoconvert ! video/x-raw,width=1920,height=1080 ! tee name=t ! queue ! v4l2h264enc extra-controls="controls,h264_profile=4,video_bitrate=2000000;" ! h264parse ! mp4mux ! filesink location=video.mp4 t. ! queue ! multifilesink location=file%1d.raw

the resulting video is quite bad: not all frames are recorded, and the sequence stays locked on a single frame for a few seconds. The frame-extraction branch clearly interferes with the recording. If GStreamer is using the h264 hardware encoder, why does it cause these problems in this case?
This makes me doubt it, because my C/C++ code doesn't produce a bad video like this one.
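One plausible explanation (an assumption, not something the log proves): the multifilesink branch writes uncompressed NV12 frames to storage, and at 1080p30 that is far more bandwidth than typical eMMC or SD storage can sustain, so that tee branch blocks and frames get dropped upstream. A quick back-of-envelope check:

```python
# Raw bandwidth of the uncompressed tee branch, assuming NV12
# (12 bits per pixel: full-resolution luma plane plus half-resolution
# interleaved chroma plane).
width, height, fps = 1920, 1080, 30

bytes_per_frame = width * height * 3 // 2   # NV12 = 1.5 bytes per pixel
bytes_per_second = bytes_per_frame * fps

print(bytes_per_frame)          # → 3110400   (~3 MB per frame)
print(bytes_per_second / 1e6)   # → 93.312    (~93 MB/s to storage)
```

For comparison, the encoded branch is only ~2 Mbit/s (0.25 MB/s), which would explain why recording alone works fine. If this assumption holds, reducing the tee-branch resolution or making that queue leaky should relieve the stall.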
>> It is then using a parser and a muxer and a filesink to create an mp4 file. 

This is an interesting point. So in this case GStreamer must be using an mp4 muxing library on the h264-encoded data?
This makes sense to me.
Simon
    On Wednesday, April 25, 2018, 23:49:53 CEST, Rand Graham <rand.graham at zenith.com> wrote:
 
Hello,

  

Did you recompile according to the release notes? Are you using the pipeline shown in the release notes?

  

To know what is being done by gstreamer, you should copy paste the exact pipeline you are using.

  

The release notes show this pipeline

  

gst-launch-1.0 -e v4l2src device=/dev/video3 ! video/x-raw,format=NV12,width=1280,height=960 ! v4l2h264enc extra-controls="controls,h264_profile=4,video_bitrate=2000000;" ! h264parse ! mp4mux ! filesink location=video.mp4

  

It looks like this pipeline is using v4l2h264enc to do the h.264 encoding.

  

It is then using a parser and a muxer and a filesink to create an mp4 file. What this does is use an mp4 container that contains an h264 video track.
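To illustrate the container/codec distinction: an MP4 file begins with an ftyp box, and the h264 track lives inside the container alongside the timing metadata. A minimal sketch (plain Python, hand-built bytes for illustration, not the actual output of this pipeline) parses a box header the way a muxer writes it:

```python
import struct

# A hand-built ftyp box of the kind mp4mux emits: 4-byte big-endian size,
# 4-byte box type, then major brand, minor version, compatible brands.
payload = b"isom" + struct.pack(">I", 512) + b"isomiso2avc1mp41"
box = struct.pack(">I", 8 + len(payload)) + b"ftyp" + payload

size, box_type = struct.unpack(">I4s", box[:8])
major_brand = box[8:12]

print(size, box_type, major_brand)  # → 32 b'ftyp' b'isom'
```

A raw h264 elementary stream has none of this: the player has to guess the framerate, which is one reason a bare .h264 file can play back at the wrong speed.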

  

It looks like the h264 encoder takes some parameters. You may be able to get better video quality by adjusting the parameters of the h264 encoder. For example, there is typically a “high” setting that can be used for h264 quality. You might also try increasing the bitrate to see if that improves quality. (The height and width dimensions seem odd to me. I would expect something like 1280x720 or 1920x1080)

  

Regards,

Rand

  

  

  

From: gstreamer-devel [mailto:gstreamer-devel-bounces at lists.freedesktop.org] On Behalf Of simon.zz at yahoo.com
Sent: Wednesday, April 25, 2018 4:33 PM
To: Discussion of the Development of and With GStreamer <gstreamer-devel at lists.freedesktop.org>
Subject: Re: RE: How does GStreamer interface with H264 hardware encoders and create videos?

  

Hello Rand,

  

You are right. The board is a Dragonboard 410c by 96boards.

  

https://developer.qualcomm.com/hardware/snapdragon-410/tools

  

96boards in their release notes

  

http://releases.linaro.org/96boards/dragonboard410c/linaro/debian/latest/

  

write that the gstreamer pipeline uses the video encoder.

But as I said before, I noticed notable differences in the video results, which makes me doubt that GStreamer really uses the encoder.

  

The C/C++ code I am using is based on this one:

  

stanimir.varbanov/v4l2-decode.git - Unnamed repository

  

  

I basically changed Varbanov's code to capture the camera frames and feed the encoder. My code works, in the sense that I can record an h264 video (not mp4 as GStreamer does), but I noticed the issues I described in my previous mail.

  

So what additional processing does GStreamer apply on top of the hardware video encoding?

  

Regards,

Simon

  

On Wednesday, April 25, 2018, 22:49:38 CEST, Rand Graham <rand.graham at zenith.com> wrote:

  

  

Hello,

 

It might help if you mention which embedded board you are using.

 

In order to use custom hardware from a vendor such as nVidia, you would compile gstreamer plugins provided by the vendor and then specify them in your pipeline.

 

Regards,

Rand

 

From: gstreamer-devel [mailto:gstreamer-devel-bounces at lists.freedesktop.org] On Behalf Of simon.zz at yahoo.com
Sent: Wednesday, April 25, 2018 1:01 PM
To: gstreamer-devel at lists.freedesktop.org
Subject: How does GStreamer interface with H264 hardware encoders and create videos?

 

Hello,

 

I am using an embedded board which has a hardware H264 encoder, and I am testing video generation both with gst-launch and with C++ code written by myself.

 

Comparing my code's results to the gst-launch results, it is clear that GStreamer applies additional processing beyond what I get from the hardware encoder buffer.

The first obvious difference is that it generates an mp4 video, while I can only generate a raw h264 stream; I am not using an mp4 muxer in my code.

 

For example, the image quality of the gst-launch video is quite a bit better, and it plays at the correct framerate, while the video I obtain appears slightly "accelerated". In addition, the timestamp (minutes - seconds) is present, while it is not in the video from my C++ code.
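A raw h264 elementary stream carries no presentation timestamps, so a player typically assumes a default rate; the "accelerated" playback is consistent with missing or wrong PTS. In a hand-rolled V4L2 capture loop, each frame's timestamp can be derived from its index and the capture framerate. A sketch of the arithmetic (illustrative only, not taken from the Varbanov code):

```python
from fractions import Fraction

def frame_pts_ns(frame_index, framerate=Fraction(30, 1)):
    """Presentation timestamp in nanoseconds for a constant-rate stream."""
    return int(frame_index * 1_000_000_000 / framerate)

# At 30 fps, consecutive frames are ~33.33 ms apart.
print(frame_pts_ns(0), frame_pts_ns(1), frame_pts_ns(30))
# → 0 33333333 1000000000
```

A container such as mp4 stores this timing per sample, which is why the gst-launch output plays at the right speed.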

 

So I suspect that gstreamer doesn't use the hardware encoder.

How can I be sure that GStreamer uses the hardware encoder instead of an h264 software library, and how can I know in real time which V4L2 settings GStreamer applies to the encoder?
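One way to check (a hedged suggestion, assuming a gst-launch -v log like the one at the top of this thread): the element name in each caps line identifies which element does the work. v4l2h264enc is the V4L2 memory-to-memory hardware encoder wrapper, while a software path would show up as x264enc or avenc_h264 instead. A small parse of one verbose log line shows how to extract the element name (the regex and helper are illustrative, not a GStreamer API):

```python
import re

# One caps line in the format printed by gst-launch-1.0 -v.
line = ("/GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:src: "
        "caps = video/x-h264, stream-format=(string)byte-stream")

def element_of(caps_line):
    """Return the factory name of the element that owns this pad."""
    match = re.match(r"/GstPipeline:[^/]+/([^:]+):", caps_line)
    return match.group(1) if match else None

print(element_of(line))  # → v4l2h264enc
```

For the live V4L2 settings, `v4l2-ctl --list-ctrls` against the encoder's device node, or running the pipeline with `GST_DEBUG=v4l2*:5`, are the usual inspection tools (both assume v4l-utils and GStreamer's debug output are available on the board).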

 

Thanks.

Regards,

Simon

 

 
  