[Mesa-dev] 10bit HEVC decoding for RadeonSI
Mark Thompson
sw at jkqxz.net
Thu Jan 26 12:14:14 UTC 2017
On 26/01/17 11:00, Christian König wrote:
> Hi Peter,
>
> On 25.01.2017 at 19:45, Peter Frühberger wrote:
>>
>> Peter, Rainer, any idea what I'm missing here? Do you guys use some
>> modified ffmpeg for Kodi, or how does that work for you?
>>
>> do you set the format correctly, e.g.: https://github.com/FernetMenta/kodi-agile/blob/master/xbmc/cores/VideoPlayer/DVDCodecs/Video/VAAPI.cpp#L2697 to create the surfaces?
>
> Well the problem here is that the VA-API interface is not consistent and I'm not sure how to implement it correctly.
>
> See your code for example:
>> VASurfaceAttrib attribs[1], *attrib;
>>
>> attrib = attribs;
>> attrib->flags = VA_SURFACE_ATTRIB_SETTABLE;
>> attrib->type = VASurfaceAttribPixelFormat;
>> attrib->value.type = VAGenericValueTypeInteger;
>> attrib->value.value.i = VA_FOURCC_NV12;
>>
>
> First Kodi specifies that NV12 should be used, which implies that this is an 8-bit surface.
>
>> // create surfaces
>>
>> VASurfaceID surfaces[32];
>> unsigned int format = VA_RT_FORMAT_YUV420;
>> if (m_config.profile == VAProfileHEVCMain10)
>>     format = VA_RT_FORMAT_YUV420_10BPP;
> But then Kodi requests a 10-bit surface. Now what is the correct thing to do here?
>
> I can either create an NV12 surface, which would be 8-bit but would result in either an error message or only 8-bit dithering during decode.
>
> Or I can promote the surface to 10-bit, which would result in a P010 or rather P016 format.
>
> Or, and that is actually what I think would be best, the VA-API driver should throw an error indicating that the application requested something impossible.
I prefer the last. IMO that code is just wrong: you can't specify an 8-bit format for a 10-bit surface. (I'm not really sure what it's trying to do; I would have expected it to barf with the Intel driver as well, which doesn't have any dithering support, so Main10 video must be decoded to P010 surfaces.)
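
For reference, a minimal sketch of what I would consider a consistent request for Main10 surfaces. P010 and the helper name are just assumptions for illustration; substitute whatever fourcc the radeonsi driver actually advertises:

#include <va/va.h>

/* Hypothetical helper, error handling trimmed. */
static VAStatus create_main10_surfaces(VADisplay dpy,
                                       unsigned int width,
                                       unsigned int height,
                                       VASurfaceID *surfaces,
                                       unsigned int count)
{
    VASurfaceAttrib attrib = {
        .type          = VASurfaceAttribPixelFormat,
        .flags         = VA_SURFACE_ATTRIB_SETTABLE,
        .value.type    = VAGenericValueTypeInteger,
        .value.value.i = VA_FOURCC_P010,   /* 10-bit fourcc, not NV12 */
    };

    /* Render-target format and fourcc now agree on bit depth. */
    return vaCreateSurfaces(dpy, VA_RT_FORMAT_YUV420_10BPP,
                            width, height, surfaces, count,
                            &attrib, 1);
}

Anything else (an NV12 fourcc combined with a 10BPP render-target format, as in the code above) should in my opinion fail with an error rather than have the driver guess.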
>>
>> afterwards we just do drm / egl interop, via:
>> https://github.com/FernetMenta/kodi-agile/blob/master/xbmc/cores/VideoPlayer/DVDCodecs/Video/VAAPI.cpp#L1374
>
> I'm not sure if that will ever work correctly. The problem is that VA-API leaks to the application what the data layout in the surface is. As soon as we turn on tiling that will only work with rather crude hacks.
>
> I will try to get it working, but probably need help from you guys as well.
I don't think tiling should be relevant, but do correct me if this has more issues than it does on Intel.
To my mind, the PixelFormat attribute (fourcc) is only specifying the appearance of the format from the point of view of the user. That is, what you will get if you call vaDeriveImage() and then map the result. Tiling then doesn't cause any problems because it is a property of the DRM object and mapping can automagically take it into account. With the Intel driver, when a surface is mapped to userspace it goes through GTT and you see a linear layout; when a surface is mapped to some other component (OpenGL or whatever) then the tiling information is available there as a property of the object and can be handled appropriately (possibly by throwing an error if it can't handle it, but hopefully it can).
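
To make that concrete, a rough sketch of the user-side view I mean, assuming the usual vaDeriveImage()/vaMapBuffer() path (the helper name is made up):

#include <stdint.h>
#include <va/va.h>

/* Hypothetical helper, error handling trimmed: derive an image from the
 * surface and map it.  Whatever the driver does internally (tiling etc.),
 * the mapped pointer plus the offsets/pitches in the VAImage are expected
 * to describe the fourcc that was requested. */
static void inspect_luma(VADisplay dpy, VASurfaceID surface)
{
    VAImage image;
    uint8_t *data;

    vaDeriveImage(dpy, surface, &image);
    vaMapBuffer(dpy, image.buf, (void **)&data);

    /* For NV12/P010 the luma plane starts at offsets[0] and is
     * pitches[0] bytes per row: the linear view the fourcc promises. */
    const uint8_t *luma = data + image.offsets[0];
    (void)luma;  /* ... read pixels here ... */

    vaUnmapBuffer(dpy, image.buf);
    vaDestroyImage(dpy, image.image_id);
}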
Thanks,
- Mark