<br><br><div class="gmail_quote">On Mon, Feb 27, 2012 at 1:49 PM, David Jackson <span dir="ltr"><<a href="mailto:djackson452@gmail.com">djackson452@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br><br><div class="gmail_quote"><div class="im">On Mon, Feb 27, 2012 at 12:04 PM, Renaud Hebert <span dir="ltr"><<a href="mailto:renozyx@gmail.com" target="_blank">renozyx@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Hello,<br>
<br></blockquote></div><div>
>
> As far as X12 goes: X12 is not needed, because X11 provides an extension
> mechanism that lets you do anything you need to extend the protocol, and
> to do so in a backwards-compatible way. That was one of the many things X
> did right.
</div><div class="im"><blockquote class="gmail_quote" style="margin:0pt 0pt 0pt 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
Also you're wrong about this:<br>
"from what i hear of wayland, applications give wayland a pixmap, meaning<br>
the application has to do all rendering on the CPU"<br>
applications give Wayland the address of a buffer in GPU's memory not<br>
in main memory so the application doesn't have to do the rendering on<br>
the CPU.<br>
Criticising before doing your homework ensure that nobody will listen:<br>
well done (not)!<br>
>
> Oops, sorry about that. You see, I read the documents, but I guess I must
> have missed that detail. It was a misunderstanding, and I am sorry.
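
(On the extension point quoted above: a minimal sketch, assuming plain Xlib,
of how a client probes for an extension at runtime. "RENDER" is just an
example name here; a client that doesn't find an extension falls back to
core protocol requests, which is what keeps old clients working.)

#include <stdio.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    int opcode, event, error;
    /* Extensions are discovered by name at runtime, not baked into X11. */
    if (XQueryExtension(dpy, "RENDER", &opcode, &event, &error))
        printf("RENDER present, major opcode %d\n", opcode);
    else
        printf("RENDER missing; use core requests instead\n");

    XCloseDisplay(dpy);
    return 0;
}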

OK, well, to have the GPU render a square, the app has to be able to tell
the GPU, "place a square which is 30x20 at coordinate (40,40)". Apps need a
way to give the GPU a command to render a square. How does a Wayland app
give the GPU a command to render a square? What about triangles, spheres,
cubes, and even more complex graphics?
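
(My understanding is that the usual answer is an API such as OpenGL ES,
reached through EGL: the app issues drawing calls and the GL implementation
turns them into hardware commands. A rough sketch of just the draw side,
assuming an EGL context on a Wayland surface is already current, and that
"program" and "pos_attrib" (both hypothetical names) came from compiling a
trivial shader whose projection maps pixel coordinates:)

#include <GLES2/gl2.h>

/* a 30x20 square at (40,40), as a strip of two triangles */
static const GLfloat square[] = {
    40.0f, 40.0f,
    70.0f, 40.0f,
    40.0f, 60.0f,
    70.0f, 60.0f,
};

void draw_square(GLuint program, GLint pos_attrib)
{
    glUseProgram(program);
    glVertexAttribPointer(pos_attrib, 2, GL_FLOAT, GL_FALSE, 0, square);
    glEnableVertexAttribArray(pos_attrib);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);  /* the GPU rasterizes the quad */
}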

If all the app has is a GPU video buffer mapped into its memory space, all
it can do is render the square *on the CPU* and then copy the resulting
pixmap into the GPU video buffer. So in that case the CPU is still doing
the rendering work.
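
(To make that concrete, here is a sketch of the CPU-only path, with "buf"
standing in for a hypothetical mapped pixel buffer; note that the CPU has
to write every single pixel itself:)

#include <stdint.h>

void fill_square_cpu(uint32_t *buf, int stride_px,
                     int x, int y, int w, int h, uint32_t argb)
{
    for (int row = y; row < y + h; row++)
        for (int col = x; col < x + w; col++)
            buf[row * stride_px + col] = argb;  /* CPU touches each pixel */
}

/* e.g. fill_square_cpu(buf, 800, 40, 40, 30, 20, 0xff00ff00); */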

Also, video hardware may provide different hardware interfaces, different
video buffer formats, and different command formats for sending OpenGL
commands to the hardware. Applications cannot know about every kind of
hardware, so a driver of some sort is needed.
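
(As a sketch of where such a driver sits: with EGL the app only asks for "a
display", and whatever vendor implementation is installed answers; the app
never hard-codes the hardware. This assumes only that some EGL library is
present, and it skips the Wayland display connection:)

#include <stdio.h>
#include <EGL/egl.h>

int main(void)
{
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    EGLint major, minor;

    if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, &major, &minor)) {
        fprintf(stderr, "no usable EGL driver found\n");
        return 1;
    }
    /* the vendor string reveals which driver actually answered */
    printf("EGL %d.%d via %s\n", major, minor,
           eglQueryString(dpy, EGL_VENDOR));
    eglTerminate(dpy);
    return 0;
}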