[Mesa-dev] Mesa (d3d1x): d3d1x: add new Direct3D 10/11 COM state tracker for Gallium

Luca Barbieri luca at luca-barbieri.com
Mon Sep 20 16:28:03 PDT 2010


> A couple of questions - it looks like this is a drop-in for the
> d3d10/11 runtime, rather than an implementation of the DDI.
Yes.

> I think
> that makes sense, but it could also be possible to split it into two
> pieces implementing either side of the d3d10 DDI interface.  Any
> thoughts on whether that's interesting to you?

I wrote it this way first of all because it's clearly easier to write
code supporting a single interface than to write two separate pieces,
and it avoids unnecessary reliance on Microsoft interfaces, which tend
to be imperfectly documented.
Not going through the DDI also reduces CPU overhead and keeps the
codebase simpler.

I think a DDI implementation over Gallium could simply live alongside
the COM implementation as a sibling, sharing common code, which is
already split out into modules such as d3d1xshader and d3d1xstutil.
The shader parser and translator can be fully shared and several
conversions (e.g. DXGI_FORMAT -> pipe_format) are already separate
from the main code, although perhaps more could be factored out.

Conversely, layering the COM API over the DDI doesn't seem like a
clear win, especially because Gallium is so close to the D3D10/11
interfaces that it's not even clear using the DDI would be much easier
than using Gallium directly.

I don't think I'll do it myself as a hobby project, though.

> Just trying to wrap my head around this, and how D3D10 works on
> linux...  Right now, the test applications must be some sort of hybrid
> between a regular posix/linux app and a win32/com/etc application,
> right?  So you've essentially taken d3d10 plus a minimal amount of
> dxgi, com, etc, sufficient to support the graphics apis, and
> implemented that on linux+gallium?

Currently the interface is the "Gallium DXGI API" and in particular
the function GalliumDXGIUseX11Display.

This function tells the DXGI implementation which X11 display to use
and how to convert the HWNDs it is passed into X11 windows; those
windows are then rendered to using the Gallium driver and the X11
display support from the EGL state tracker's native.h interface.

So the test application just creates an X11 window, calls this
function, and then goes on using DXGI and D3D10/11 as a normal Windows
application.

It can also, in theory, support the other platforms supported by EGL
(currently DRM, fbdev and GDI).

Wine DLLs will also call that function, providing a callback to do the
HWND->Window translation according to Wine x11drv.

> I can see this being a useful tool for people porting win32 games to
> linux, but at the same time there will probably be ongoing confusion
> about which bits of win32 are a part of this platform -- I guess
> that's where wine comes in.

Right now we use the Wine Windows headers, but they could be replaced
with a single "minimal Win32 for D3D" header.
This is mostly for type definitions of things like LPCSTR, RECT, and so on.

The Direct3D 10/11 interfaces themselves are fully separate from the
Windows API, and the glue code is DXGI.

> Will this codebase work on windows, ie as a drop-in replacement for
> the d3d10 runtime?  Or would it, with a bit of work?

Once the Wine DLLs are done, it might indeed be possible to run them
on Windows with some work, since the EGL native interface we use does
have a GDI backend.
For that purpose, though, it would likely be much better to add a DDI
implementation, or simply to run the applications on Linux under Wine.

Along with a Gallium driver using OpenGL as a backend, it could be
used to provide accelerated Direct3D 10/11 on Windows XP, but I'm not
sure anyone cares about that nowadays.

