How does GStreamer interface with H264 hardware encoders and create videos?

simon.zz at yahoo.com simon.zz at yahoo.com
Wed Apr 25 21:33:24 UTC 2018


 Hello Rand,
You are right. The board is a Dragonboard 410c by 96boards.
https://developer.qualcomm.com/hardware/snapdragon-410/tools

96boards in their release notes
http://releases.linaro.org/96boards/dragonboard410c/linaro/debian/latest/

state that the GStreamer pipeline uses the hardware video encoder. But as I said before, I noticed notable differences in the video results, which make me doubt that GStreamer really uses the encoder.

The C/C++ code I am using is based on this one:
stanimir.varbanov/v4l2-decode.git - Unnamed repository


I basically modified Varbanov's code to capture the camera frames and feed the encoder. My code works in the sense that I can record a raw H264 video (not an MP4 as GStreamer does), but I noticed the differences I described in my previous mail.

So what additional processing does GStreamer apply on top of the hardware video encoding?
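For reference, the extra step is typically just container muxing: gst-launch pipelines that produce MP4 files end with a muxer element. A minimal sketch of such a pipeline, assuming a V4L2 stateful encoder exposed as `v4l2h264enc` (element names and caps vary per board, so treat these as placeholders):

```shell
# Hypothetical pipeline sketch: capture, hardware-encode, and mux into MP4.
# v4l2h264enc is the generic V4L2 encoder element; your board may expose a
# vendor-specific element instead.
gst-launch-1.0 -e v4l2src device=/dev/video0 \
    ! video/x-raw,width=1280,height=720,framerate=30/1 \
    ! v4l2h264enc \
    ! h264parse \
    ! mp4mux \
    ! filesink location=out.mp4
```

The `-e` flag makes gst-launch send an EOS on Ctrl-C so that mp4mux can finalize the file index; without it the resulting MP4 is typically unplayable.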
Regards,
Simon
    On Wednesday, April 25, 2018, 22:49:38 CEST, Rand Graham <rand.graham at zenith.com> wrote:
 
Hello,

  

It might help if you mention which embedded board you are using.

  

In order to use custom hardware from a vendor such as NVIDIA, you would compile the GStreamer plugins provided by the vendor and then specify them in your pipeline.
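A quick way to see which H264 encoder elements are available, and which plugin (software or vendor-provided) supplies each, is gst-inspect. A sketch, assuming GStreamer 1.x and the `v4l2h264enc` element name:

```shell
# List every installed element whose name or description mentions H264;
# the left column shows the providing plugin (e.g. x264, video4linux2, omx).
gst-inspect-1.0 | grep -i h264

# Show the full details of one element, including its pad caps and properties:
gst-inspect-1.0 v4l2h264enc
```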

  

Regards,

Rand

  

From: gstreamer-devel [mailto:gstreamer-devel-bounces at lists.freedesktop.org] On Behalf Of simon.zz at yahoo.com
Sent: Wednesday, April 25, 2018 1:01 PM
To: gstreamer-devel at lists.freedesktop.org
Subject: How does GStreamer interface with H264 hardware encoders and create videos?

  

Hello,

  

I am using an embedded board which has a hardware H264 encoder, and I am testing video generation both with gst-launch and with C++ code I wrote myself.

  

Comparing my code's results to the gst-launch results, it is clear that GStreamer applies additional processing beyond what I get from the hardware encoder buffer.

The first obvious difference is that it generates an MP4 video, while I can only generate a raw H264 stream, since I am not using an additional MP4 muxer in my code.
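Incidentally, the container is the one part that is easy to add after the fact: an already-recorded raw H264 elementary stream can be wrapped in an MP4 container without re-encoding. A sketch, where `capture.h264` is a placeholder for the recorded file:

```shell
# Wrap a raw H264 elementary stream into an MP4 container without
# re-encoding; h264parse recovers the stream structure for the muxer.
gst-launch-1.0 filesrc location=capture.h264 \
    ! h264parse \
    ! mp4mux \
    ! filesink location=capture.mp4
```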

  

For example, the image quality of the gst-launch video is quite a bit better, the video has the correct framerate rather than appearing slightly "accelerated" like mine, and in addition the timestamp (minutes and seconds) is present, while in the video I obtain from my C++ code it is not.

  

So I suspect that GStreamer doesn't use the hardware encoder.

How can I be sure that GStreamer uses the hardware encoder instead of an H264 software library, and how can I know in real time which V4L2 settings GStreamer applies to the encoder?
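Two checks that usually answer both questions (a sketch, assuming the V4L2 elements and the standard v4l-utils tools; element names and device nodes may differ on your board):

```shell
# 1) Confirm the element in use is the V4L2 hardware encoder, not x264enc:
#    -v prints the elements and negotiated caps as the pipeline runs.
gst-launch-1.0 -v v4l2src ! video/x-raw,width=640,height=480 \
    ! v4l2h264enc ! fakesink

# 2) Watch the V4L2 ioctls GStreamer issues (S_FMT, S_CTRL, etc.) by raising
#    the v4l2 debug categories:
GST_DEBUG='v4l2*:6' gst-launch-1.0 v4l2src ! v4l2h264enc ! fakesink

# 3) Dump the encoder device's current controls directly with v4l2-ctl
#    (the encoder node is often not /dev/video0; adjust as needed):
v4l2-ctl --device /dev/video1 --all
```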

  

Thanks.

Regards,

Simon

  

 
  