Trace compression improvements

Eugene Velesevich eugvelesevich at gmail.com
Tue Jun 4 13:14:51 PDT 2013


The best improvements we've seen were on an ARM/Android board, where
the threaded approach eliminated very noticeable periodic stutters
caused by compression; however, it's hard to provide quantitative data
on that. On a modern x86 system, we're seeing 7-8% higher fps with
ipers, even with LZ4HC compression, which consumes 30-40% of a CPU in
its thread.

With regard to compression ratio, the LZ4-compressed trace of ipers is
11-12% larger than the Snappy one, but LZ4HC compresses 35-36% better
than Snappy (you can verify this with "apitrace repack").

On Sun, Jun 2, 2013 at 5:09 PM, José Fonseca <jose.r.fonseca at gmail.com> wrote:
> I haven't looked at the changes in detail yet -- I'll do it as soon as I
> find the time -- but it sounds good in principle. Indeed, trying out LZ4 has
> been on the to-do list for some time, so thanks for doing this.
>
> Did you get any figures (speed & compression ratio) on how it compares with
> snappy? A good benchmark is the "ipers" demo, part of the Mesa demos. It is
> what Zack used when he was improving the compression speed w/ snappy.
>
> Jose
>
>
>
> On Sat, Jun 1, 2013 at 2:39 PM, evel <evel at ispras.ru> wrote:
>>
>> Hello,
>>
>> This patch series improves trace compression by offloading compression to
>> a separate thread and by providing alternative compression methods.  The
>> main difference from the previously implemented threaded compression is
>> that locking overhead is significantly reduced by using a double-buffered
>> output buffer instead of a ring buffer that was locked on every call-dump
>> operation.
>>
>> https://github.com/Testiki/apitrace/tree/threaded-file
>> _______________________________________________
>> apitrace mailing list
>> apitrace at lists.freedesktop.org
>> http://lists.freedesktop.org/mailman/listinfo/apitrace
