NVDEC and 10-bit HEVC decode

Matthew Waters ystreet00 at gmail.com
Sat Jun 23 00:16:50 UTC 2018


On 23/06/18 02:57, Samuel Hurst wrote:
> On 19/06/18 10:04, Samuel Hurst wrote:
>> On 18/06/18 18:36, Nicolas Dufresne wrote:
>>> When I look at this decoder, I see no code to support 10bit decoding.
>>> This suggests that it outputs 10bit as if it was 8bit NV12. Please file
>>> a bug, the decoder should fail, or support this, right now it is
>>> incorrect.
>>
>> As requested, I've created a new bug for this:
>> https://bugzilla.gnome.org/show_bug.cgi?id=796629
>>
>> On 19/06/18 01:55, Matthew Waters wrote:
>>> Currently nvdec doesn't work for 10 (or 12) bit decoding and always
>>> assumes 8-bit mode is used.  Mostly for legacy where the nvdec API
>>> didn't have support for higher bit depths when the code was written.
>>>
>>> Changing that requires adding 10-bit pixel formats (actually 10-bit in
>>> 16-bit) to libgstgl and plumbing that through for the decoder to
>>> output.
>
> I've been trying to work on this through the week, but I seem to have
> just gotten myself tied into knots and I'm now lost. I'd appreciate
> some advice here.
>
> I've managed to detect when the NVIDIA decoder is outputting video at a
> bit depth other than 8. In parser_sequence_callback, if
> format->bit_depth_luma_minus8 is non-zero, I set the CUDA output
> format to cudaVideoSurfaceFormat_P016 (the only other format).
> According to the source I've seen, this internally corresponds to a set
> of 16-bit values; for 10-bit video the least significant bits are packed as 0.
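[A rough sketch of the check being described, assuming the stock NVDEC
struct and field names (CUVIDEOFORMAT / CUVIDDECODECREATEINFO); the helper
function itself is made up:]

#include <nvcuvid.h>

/* Illustrative only: pick the CUDA surface format from the parsed
 * sequence header's bit depth. */
static void
pick_surface_format (const CUVIDEOFORMAT * format,
    CUVIDDECODECREATEINFO * create_info)
{
  if (format->bit_depth_luma_minus8 > 0) {
    /* 10/12-bit content is only returned in a 16-bit container, with the
     * significant bits in the MSBs and the low bits zeroed. */
    create_info->OutputFormat = cudaVideoSurfaceFormat_P016;
  } else {
    create_info->OutputFormat = cudaVideoSurfaceFormat_NV12;
  }
  create_info->bitDepthMinus8 = format->bit_depth_luma_minus8;
}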
>
> Later, in handle_pending_frames, I'm setting the GstVideoFormat that
> goes into gst_video_decoder_set_output_state to
> GST_VIDEO_FORMAT_P010_10BE. It now attempts to negotiate the new
> format downstream, which is good.
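[For reference, a sketch of roughly what that step looks like; the "nvdec"
fields, width and height are placeholders from the surrounding decoder
code, but gst_video_decoder_set_output_state() is the real API. Note that
NVDEC's P016 surfaces are little-endian with the valid bits in the MSBs, so
P010_10LE is probably a closer description than P010_10BE:]

GstVideoFormat out_format;
GstVideoCodecState *state;

if (nvdec->surface_format == cudaVideoSurfaceFormat_P016)
  out_format = GST_VIDEO_FORMAT_P010_10LE;      /* 10-bit-in-16-bit, LE */
else
  out_format = GST_VIDEO_FORMAT_NV12;

state = gst_video_decoder_set_output_state (GST_VIDEO_DECODER (nvdec),
    out_format, width, height, nvdec->input_state);
gst_video_codec_state_unref (state);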
>
> I've then, for want of a better word, bodged support for P010_10BE
> into the GL elements so that it negotiates fine. At this point the
> video output is no longer scrambled green; it's now scrambled yellow
> and purple, and it looks like it's not putting pixels in the right
> places. So it's possibly still treating the stream as 8-bit when
> trying to convert.
>
> Is what I'm seeing a consequence of the GL elements reading those
> internal buffers as 8-bit when doing the YUV->RGB conversion, meaning
> I need to force it to understand them in a different way? I've tried
> adding GST_GL_RGB10 to GstGLFormat and plugging that into various
> places in the gst-libs GL code, but it started exploding in quite
> varied ways, so I've pulled all those changes back out.
>
> I'm sadly not that well versed in OpenGL so I might be trying all the
> wrong things, but I'm willing to try and fix it if someone can offer
> me some pointers as to where to look and in general what needs doing.
> Ticket 703347 says to refer back to a summary, but I can't seem to
> find one. I'm assuming this has been swallowed by bugzilla somewhere.
> It also mentions shaders, so is this something that needs to be done
> in the GLSL stuff in glcolorconvert?

Almost :)

You want to add 16-bit formats to GstGL rather than 10/12-bit formats.
Either that, or you need to convert to 10/12-bit in the decoder itself.
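As an example (sketch only, not an actual patch), following the pattern of
the commit below, the two P010/P016-style planes could map onto 16-bit
single- and dual-channel textures. GST_GL_R16 and GST_GL_RG16 are assumed
new GstGLFormat entries here, mirroring the existing 8-bit GST_GL_RED /
GST_GL_RG ones, and the function is only indicative of where the mapping
would live in the GstGL format code:

/* Illustrative only: map a video format + plane index to a GL texture
 * format.  Plane 0 of a P010-style frame is 16-bit luma, plane 1 is
 * interleaved 16-bit CbCr. */
static GstGLFormat
gl_format_for_plane (GstVideoFormat format, guint plane)
{
  switch (format) {
    case GST_VIDEO_FORMAT_P010_10LE:
      return plane == 0 ? GST_GL_R16 : GST_GL_RG16; /* assumed new enums */
    case GST_VIDEO_FORMAT_NV12:
      return plane == 0 ? GST_GL_RED : GST_GL_RG;
    default:
      return GST_GL_RGBA;
  }
}

With something like that in place, the glcolorconvert shaders should mostly
keep working as-is, since normalised 16-bit samples come out of the sampler
in the same 0..1 range as 8-bit ones.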

Here's a similar commit that added ARGB64 support with RGBA16 textures:
https://cgit.freedesktop.org/gstreamer/gst-plugins-base/commit/gst-libs/gst/gl?id=3cfff727b19d450898dbe7931c53ea05bc2a9ac3
I've just noticed a bug in it, though: it may add the YUY2/UYVY formats
twice!

Cheers
-Matt

> Best Regards,
> Sam
