[Libva] gen7 h264 encode bitrate behaviour

Chris Healy cphealy at gmail.com
Sun Aug 17 18:27:12 PDT 2014


I've done some further analysis with our real stream and we see the
same inconsistent bitrate behaviour as with the test app.  It seems to
me that the rate control doesn't handle certain input video sequences
well, and the encoded bitrate spikes as a result.

To help understand what I'm dealing with, I've posted a video on YouTube
showing the video being encoded:

www.youtube.com/watch?v=LpYS_9IB0jU

I've also posted a bitrate graph that shows what happens when encoding
the video referenced above:

http://snag.gy/imvBe.jpg

In the above graph, I set the target encode bitrate to 3.7Mbps, CBR,
High Profile H.264.  Most of the time the bitrate hovers around
3.7Mbps, but sometimes it drops very low and then spikes up very high.
I also notice that when the bitrate drops low and then spikes, how high
it spikes seems to be a function of how much and how long the bitrate
was under 3.7Mbps.  It seems that the rate control logic is taking a 20
second running bitrate average and trying its best to keep the
aggregate bitrate at 3.7Mbps, so when the scene complexity drops, the
rate control reacts by cranking the QP down to a very low value (high
quality) to bring the bitrate back up.  That behaviour, combined with
the fact that the video sits on a simple fixed image and then
crossfades to something complex in less than 20 seconds while the QP is
still very low, results in the massive spike in bitrate.  (This is my
naive understanding of what's going on.)
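
If my guess about a running average is right, I suspect it corresponds
to the window_size field of VAEncMiscParameterRateControl.  Below is a
rough sketch of the fields I believe are in play; the 3.7Mbps target
matches the graph above, but the window_size, QP and flag values are
only examples of knobs I could try, not what I'm currently passing, and
I don't know how much of this the gen7 driver actually honours:

VAEncMiscParameterRateControl rc = {0};

rc.bits_per_second   = 3700000;   /* CBR target from the graph above */
rc.target_percentage = 100;       /* aim for 100% of bits_per_second */
rc.window_size       = 1500;      /* averaging window in ms; much shorter
                                     than the ~20s behaviour I think I see */
rc.initial_qp        = 26;        /* example starting QP */
rc.min_qp            = 18;        /* keep the QP from dropping so far on the
                                     static image that the crossfade bursts */
rc.rc_flags.bits.disable_bit_stuffing = 0;   /* keep padding so CBR holds */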

The code I'm using to encode and stream is based in large part on
libva/test/encode/h264encode.c.  I'm not sure whether the rate control
logic lives in libva, in libva-intel-driver, or is supposed to be
driven by the code that uses libva.  Am I dealing with an issue in the
encoder itself, or is it more likely that my code isn't driving the
encoder correctly?
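
For reference, the parameter path I'm using follows h264encode.c: the
application just packs the rate control parameters into a misc
parameter buffer and hands it to the driver, which makes me think the
actual rate control algorithm lives in the driver rather than in libva
or in my code, but I may be wrong.  A trimmed sketch of that path
(error checks omitted; va_dpy and context_id come from the usual setup,
and the types and entry points are from <va/va.h>):

VABufferID rc_param_buf;
VAEncMiscParameterBuffer *misc_param;
VAEncMiscParameterRateControl *misc_rate_ctrl;

vaCreateBuffer(va_dpy, context_id, VAEncMiscParameterBufferType,
               sizeof(VAEncMiscParameterBuffer) +
               sizeof(VAEncMiscParameterRateControl),
               1, NULL, &rc_param_buf);

vaMapBuffer(va_dpy, rc_param_buf, (void **)&misc_param);
misc_param->type = VAEncMiscParameterTypeRateControl;
misc_rate_ctrl = (VAEncMiscParameterRateControl *)misc_param->data;
/* ...fill misc_rate_ctrl as in the sketch above... */
vaUnmapBuffer(va_dpy, rc_param_buf);

/* The buffer is rendered along with the other parameter buffers;
   everything past this point is up to the driver. */
vaRenderPicture(va_dpy, context_id, &rc_param_buf, 1);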

What can be changed to keep the actual bitrate from being so bursty
given this kind of video content?
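
One thing I'm planning to try, in case it's the right knob: shrinking
the HRD buffer via a VAEncMiscParameterTypeHRD misc buffer, so the
encoder can't bank many seconds of unused bits during the static image
and then spend them all on the crossfade.  I don't know whether the
gen7 driver honours this for CBR, so treat the values below as guesses;
it goes through the same misc parameter path as the rate control sketch
above:

VAEncMiscParameterHRD hrd = {0};

hrd.buffer_size             = 3700000;       /* ~1 second of bits at 3.7Mbps */
hrd.initial_buffer_fullness = 3700000 / 2;   /* start the buffer half full */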

Regards,

Chris

On Fri, Aug 15, 2014 at 6:03 PM, Chris Healy <cphealy at gmail.com> wrote:

> I've been encoding h264 content using HD 4000 HW and am not able to make
> heads or tails of the way the encoder is behaving from the standpoint of
> the data size coming out of the encoder.
>
> I have a 24 fps 720p video that is the same image for ~8 seconds, then a
> 1.5 second fade to the next image, followed by another ~8 seconds on that
> image.  This goes on indefinitely.  I would have expected the bitrate to
> be pretty low, spike for 1.5 seconds, then go back to a similarly low
> value.
>
> When I look at the data coming out of the encoder with a 4Mbps bitrate
> set and CBR, I'm seeing almost the inverse: most of the time the bitrate
> is pretty close to 4Mbps, then it spikes above 4Mbps (presumably for the
> fade), then it drops down to ~2Mbps for a second or so before going back
> up to ~4Mbps.
>
> The strangest part is that for the first ~30 seconds of encode, across the
> board, the bitrate is ~2x the bitrate from second 31 -> end of encode.
> (So, I'm hitting a typical rate of 7Mbps and peaking out at 13Mbps.)
>
> Is this behaviour expected with gen7 HW?  Is there something I can do in
> the initial setup that will cap the MAX bitrate regardless of the impact on
> encode quality?
>
> Regards,
>
> Chris
>