From what I know about Nvidia GPUs, CUDA is essentially just a C language extension plus a library, so if our anti-aliasing is not hardware based I don't think it would be hard to offload it to the GPU. I'm not sure, though, what AMD does with their APUs.
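To show what I mean by a C extension, here's a minimal, purely illustrative CUDA sketch (nothing to do with our rendering code; the kernel name and sizes are made up):

#include <cuda_runtime.h>
#include <stdio.h>

/* A trivial kernel: plain C plus the __global__ qualifier.
 * Each GPU thread scales one array element. */
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main(void) {
    const int n = 1024;
    float host[1024];
    for (int i = 0; i < n; ++i) host[i] = (float)i;

    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    /* Launch 4 blocks of 256 threads; the <<<...>>> syntax is the
     * main extension CUDA adds to ordinary C function calls. */
    scale<<<4, 256>>>(dev, 2.0f, n);

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
    printf("host[10] = %f\n", host[10]);  /* expect 20.0 */
    return 0;
}

The point being, if the anti-aliasing math is already expressed in C, restructuring it into kernels like this is more of a plumbing job than a rewrite.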
On Wed, Dec 12, 2012 at 4:54 PM, Kohei Yoshida <kohei.yoshida@gmail.com> wrote:
> Hi there,
>
> I'm just wondering whether our anti-aliasing is done strictly at the
> software level or via hardware, and if the former, what it would take
> to do it via hardware so that we can take advantage of the GPU to
> speed it up when it's there.
>
> I could look into it to find that out myself, but I'm not familiar
> with our graphics layer. I was wondering if someone knows this off the
> top of his/her head.
>
> TIA,
>
> Kohei
-- 
Jonathan Aquilina