Benchmark of Wayland

jonsmirl at gmail.com
Wed Nov 17 13:12:57 PST 2010


On Wed, Nov 17, 2010 at 4:06 PM, jonsmirl at gmail.com <jonsmirl at gmail.com> wrote:
> On Wed, Nov 17, 2010 at 3:56 PM, Dana Jansens <dana at orodu.net> wrote:
>> On Wednesday, November 17, 2010, Mohamed Ikbel Boulabiar
>> <boulabiar at gmail.com> wrote:
>>>
>>> On Wed, Nov 17, 2010 at 8:47 PM, jonsmirl at gmail.com <jonsmirl at gmail.com> wrote:
>>> How are apps going to handle sub-pixel anti-aliasing? We have CRTs,
>>> LCDs (horizontal and vertical), ePaper, Sharp Yellow, PixelQi, etc., all
>>> with different pixel arrangements.
>>> Isn't Wayland supposed to stay for the next 20 years? What about 3D
>>> screens (with active 3D glasses) and holographic rendering? (3D
>>> screens/projectors are already everywhere...)
>>>
>>>
>>> And holographic ones are already here too: 3D interactive holographic
>>> mid-air displays <http://www.youtube.com/watch?v=jCx6dZXLe5A>
>>>
>>> 3D input can be captured with cheap devices like Kinect. Even 3D force
>>> feedback can be made: Tangible hologram projector
>>> <http://www.youtube.com/watch?v=pLa1rdfu6Bg>
>>
>> Can you point to a single production kernel driver for 3D projection?
>> If not, this smells like trolling.
>
> How does this work?
> http://www.nvidia.com/object/3d-vision-main.html

If someone has hardware like this, do a 3D desktop mock-up in Wayland.
You're guaranteed to get 15 minutes of fame.

3D is not hard to do. It is just pairs of frames rendered from
slightly different angles. The shutter glasses then control which
frame is seen by each eye.
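As a rough sketch of the idea (names and the interocular distance are my own illustrative choices, not anything from Wayland), the two frames come from two view matrices whose camera positions are offset horizontally:

```python
import numpy as np

def look_at(eye, target, up):
    """Build a right-handed view matrix (column-vector convention)."""
    f = target - eye
    f = f / np.linalg.norm(f)
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)
    u = np.cross(s, f)
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f   # rotation rows
    m[:3, 3] = -m[:3, :3] @ eye               # translation
    return m

def stereo_views(eye, target, up, iod=0.065):
    """Return (left, right) parallel-axis view matrices.

    iod is the interocular distance in scene units; each eye is
    shifted by half of it along the camera's right vector.
    """
    f = (target - eye) / np.linalg.norm(target - eye)
    right = np.cross(f, up)
    right = right / np.linalg.norm(right)
    offset = right * (iod / 2.0)
    # Shift eye and target together so the view axes stay parallel.
    return (look_at(eye - offset, target - offset, up),
            look_at(eye + offset, target + offset, up))

eye = np.array([0.0, 0.0, 5.0])
target = np.array([0.0, 0.0, 0.0])
up = np.array([0.0, 1.0, 0.0])
left, right = stereo_views(eye, target, up)
# The two matrices share the same rotation and differ only by a
# horizontal translation of one interocular distance.
```

Render the scene once with each matrix and the glasses alternate which frame reaches which eye.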

This brings up a valid point: should Wayland be manipulating everything
in 3D coordinates? Most of us who own 2D screens would just see the 2D
projection of the space, which equates to what we are seeing today.
But if you owned a 3D screen, the 3D projection would become visible.
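The 2D case would then just be a degenerate projection. A minimal sketch, assuming a simple pinhole model (the function and focal length here are illustrative, not anything Wayland defines):

```python
import numpy as np

def project(point, focal=1.0):
    """Perspective-project a 3D point onto the z = focal image plane,
    discarding depth -- what a 2D screen would show of a 3D scene."""
    x, y, z = point
    return np.array([focal * x / z, focal * y / z])

# A surface corner 2 units deep lands at half its lateral offset:
p = project(np.array([1.0, 0.5, 2.0]))  # -> [0.5, 0.25]
```

A 3D display would instead keep the depth information and render both eye views from it.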

Think of coverflow in 3D!



-- 
Jon Smirl
jonsmirl at gmail.com


More information about the wayland-devel mailing list