[Libva] Interlaced encoding

Zhao, Yakui yakui.zhao at intel.com
Wed Jul 23 00:16:41 PDT 2014


On Wed, 2014-07-23 at 00:26 -0600, Artem Makhutov wrote:
> Hello,
> 
> Zhao, Yakui schrieb:
> > On Tue, 2014-07-22 at 03:58 -0600, Artem Makhutov wrote:
> >> Hello,
> >>
> >> I need to encode interlaced content to interlaced H.264.
> >> De-interlacing is not an option for me.
> >>
> >> Right now I am using Intel Media SDK for this, but I would like to go with libva.
> >
> > The libva-intel-driver doesn't support interlaced encoding, as
> > interlacing is an outdated technique for reducing the encoding
> > bit-rate. (Although it can reduce the bit-rate, it degrades the
> > image quality very significantly.)
> 
> Yes, this is correct. However, in the real world interlaced encoding
> still plays a big role. It is required for "professional" applications
> like TV broadcasts, video conferencing, and so on.
> 
> How hard is it to implement it in the driver?

It would require adding a new path to handle interlacing, and that is
difficult to implement in the driver (for example, both the GPU shaders
and the encoding settings would need changes, and the GPU shader for
interlaced content is more complex than the one for progressive,
especially once quality is taken into account).
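If it helps, you can confirm at runtime what the driver actually
exposes. Below is a minimal sketch using only the standard libva query
calls (vaGetDisplayDRM, vaInitialize, vaQueryConfigEntrypoints); the
render node path /dev/dri/renderD128 is an assumption about your setup.
Note that even when VAEntrypointEncSlice is listed for an H.264
profile, it means frame (progressive) encoding only on this driver.

/* Sketch: probe the VA driver for H.264 encode entrypoints.
 * Assumes /dev/dri/renderD128 is your render node; adjust as needed.
 * Build with: gcc probe.c -lva -lva-drm */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <va/va.h>
#include <va/va_drm.h>

int main(void)
{
    int fd = open("/dev/dri/renderD128", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    VADisplay dpy = vaGetDisplayDRM(fd);
    int major, minor;
    if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
        fprintf(stderr, "vaInitialize failed\n");
        return 1;
    }

    int num = vaMaxNumEntrypoints(dpy);
    VAEntrypoint *eps = malloc(num * sizeof(*eps));
    if (vaQueryConfigEntrypoints(dpy, VAProfileH264High, eps, &num) ==
        VA_STATUS_SUCCESS) {
        for (int i = 0; i < num; i++) {
            if (eps[i] == VAEntrypointEncSlice)
                printf("H264 High: encode entrypoint found "
                       "(progressive frames only on this driver)\n");
        }
    }

    free(eps);
    vaTerminate(dpy);
    close(fd);
    return 0;
}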


For interlaced encoding, the encoder would still use a progressive
image as the input source, and two interlaced fields would be output
for one frame. In that case, to get good quality, it needs a
pre-processing step from progressive to interlaced, which is the
converse of interlaced-to-progressive (deinterlacing) processing. And
that progressive-to-interlaced pre-processing step itself affects the
encoding quality.
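To make the field-output step concrete, here is an illustrative sketch
(not driver code) of the basic field separation for one 8-bit luma
plane: the top field takes the even lines, the bottom field the odd
lines. The function name and buffer parameters are hypothetical.

/* Illustrative only: split a progressive 8-bit luma plane into
 * top/bottom fields (even lines -> top, odd lines -> bottom).
 * 'frame', 'top', 'bottom' and the dimensions are placeholders. */
#include <stdint.h>
#include <string.h>

static void split_fields(const uint8_t *frame, int width, int height,
                         uint8_t *top, uint8_t *bottom)
{
    for (int y = 0; y < height; y++) {
        uint8_t *dst = (y & 1) ? bottom : top;
        /* Each field is half the frame height; frame line y lands
         * on line y/2 of its field. */
        memcpy(dst + (y / 2) * width, frame + y * width, width);
    }
}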

Thanks.
     Yakui
> 
> Thanks, Artem
> 



