EGL_MESA_screen_surface proposal

Jon Smirl jonsmirl at gmail.com
Wed Mar 16 14:16:30 PST 2005


On Wed, 16 Mar 2005 16:51:20 -0500, Michel Danzer <mdanzer at ati.com> wrote:
> On Wed, 2005-03-16 at 16:35 -0500, Jon Smirl wrote:
> >
> > I think that info is quite critical. If you set 1080i you're going to
> > need two surfaces in the alternating buffer model.
> 
> That's still an implementation detail. Sure, this won't work with this
> extension alone (keep in mind what Brian and Adam have been saying about
> API extensibility though), but why make it impossible for no good
> reason?

I don't think the hardware displays interlaced video by flipping
between two half-size buffers. Can you ask someone how it works?

> > Just because you have 1080i input doesn't mean that it can't be
> > scanned out at 1080p.
> 
> Not without artifacts.

I don't see how this generates artifacts. If your monitor supports
1080p and 1080i, exactly the same pixels will get lit in both modes. On
a digital display there is no difference with a 30Hz input source. On
a CRT the pixels will only get lit half of the time with 1080i, but the
CRT knows this and automatically adjusts the brightness as part of the
multisync process.

When you refer to de-interlacing, I believe that refers to the file
format. My LCD does not offer any interlaced modes, but I am able to
watch interlaced video without problems. I think this works because XV
deinterlaces on the fly by writing each alternating field to
alternating lines in the scanout buffer.
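For what it's worth, here is a rough sketch of the kind of on-the-fly
weave I have in mind. The buffer layout and names are only assumptions
for illustration, not the actual XV code path:

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Hypothetical illustration: copy one field of an interlaced source
     * into the matching rows of a progressive scanout buffer.
     * 'field' is 0 for the top/even field, 1 for the bottom/odd field. */
    static void weave_field(uint8_t *scanout, size_t scanout_pitch,
                            const uint8_t *field_data, size_t field_pitch,
                            size_t width_bytes, int field_lines, int field)
    {
        for (int y = 0; y < field_lines; y++) {
            uint8_t *dst = scanout + (size_t)(2 * y + field) * scanout_pitch;
            memcpy(dst, field_data + (size_t)y * field_pitch, width_bytes);
        }
    }

Each incoming field only touches every other row, so two consecutive
fields fill the whole progressive buffer.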

I don't think graphics hardware supports anything but a progressive
buffer, but it does support scanning out a progressive buffer in
interlaced mode. Of course there may be an interlaced scanout buffer
mode I am unfamiliar with.
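
To illustrate what I mean by scanning out a progressive buffer in
interlaced mode: conceptually the CRTC just fetches every other line of
the one buffer per vertical pass. A conceptual sketch, not any specific
chip's programming model:

    #include <stddef.h>

    /* Conceptual sketch only: byte offset an interlaced scanout would
     * fetch for a given line of one field, given a single progressive
     * buffer with the given pitch. 'field' is 0 for the even field,
     * 1 for the odd field. */
    static size_t interlaced_line_offset(size_t pitch, int line_in_field,
                                         int field)
    {
        return (size_t)(2 * line_in_field + field) * pitch;
    }

So the memory layout stays progressive; only the fetch order changes
between the even and odd passes.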

-- 
Jon Smirl
jonsmirl at gmail.com

