On Sat, Mar 24, 2018 at 2:27 PM, Marek Olšák <maraeo@gmail.com> wrote:
> On Sat, Mar 24, 2018 at 1:36 PM, Connor Abbott <cwabbott0@gmail.com> wrote:
>> If Gallium was being lazy and not specifying the bounds for internal
>> shaders, that needs to be fixed for the HW to work properly.
>
> I don't understand the sentence. Shaders don't interact with vertex
> indices. I also don't like the word "lazy". The proper expression is
> "saving time".

I figured he meant for things like u_blitter, but why those internal
draws would be using an index buffer is beyond me...
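
For anyone following along, "specifying the bounds" here means filling in
the min_index/max_index fields of pipe_draw_info rather than the
0..~0 catch-all. A minimal sketch of what an internal draw would set,
assuming the single-struct draw_vbo interface of that era; the helper
function itself is made up for illustration:

#include <string.h>
#include "pipe/p_context.h"
#include "pipe/p_state.h"

/* Hypothetical u_blitter-style internal quad draw. */
static void draw_internal_quad(struct pipe_context *pipe)
{
   struct pipe_draw_info info;

   memset(&info, 0, sizeof(info));
   info.mode = PIPE_PRIM_TRIANGLE_FAN;
   info.start = 0;
   info.count = 4;
   info.instance_count = 1;

   /* The point under discussion: provide real vertex bounds instead of
    * the time-saving catch-all (min_index = 0, max_index = ~0u), so
    * hardware that consumes the bounds behaves correctly. */
   info.index_size = 0;   /* non-indexed, as internal blits normally are */
   info.min_index = 0;
   info.max_index = 3;

   pipe->draw_vbo(pipe, &info);
}

Since the draw above is non-indexed (index_size = 0), the bounds are just
the vertex range, which is why an index buffer showing up in u_blitter at
all is the surprising part.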