<html>
<head>
<base href="https://bugs.documentfoundation.org/">
</head>
<body><span class="vcard"><a class="email" href="mailto:fredgib@free.fr" title="fredgib@free.fr">fredgib@free.fr</a>
</span> changed
<a class="bz_bug_link
bz_status_UNCONFIRMED "
title="UNCONFIRMED - Memory leak"
href="https://bugs.documentfoundation.org/show_bug.cgi?id=107979">bug 107979</a>
<br>
<table border="1" cellspacing="0" cellpadding="8">
<tr>
<th>What</th>
<th>Removed</th>
<th>Added</th>
</tr>
<tr>
<td style="text-align:right;">Status</td>
<td>NEEDINFO
</td>
<td>UNCONFIRMED
</td>
</tr>
<tr>
<td style="text-align:right;">Ever confirmed</td>
<td>1
</td>
<td>
</td>
</tr></table>
<p>
<div>
<b><a class="bz_bug_link
bz_status_UNCONFIRMED "
title="UNCONFIRMED - Memory leak"
href="https://bugs.documentfoundation.org/show_bug.cgi?id=107979#c3">Comment # 3</a>
on <a class="bz_bug_link
bz_status_UNCONFIRMED "
title="UNCONFIRMED - Memory leak"
href="https://bugs.documentfoundation.org/show_bug.cgi?id=107979">bug 107979</a>
from <span class="vcard"><a class="email" href="mailto:fredgib@free.fr" title="fredgib@free.fr">fredgib@free.fr</a>
</span></b>
<pre>Hi
thanks for your answer.
Unfortunately, the files I am working on consist only of sensitive data. I can
however bring some more information:
* Opening a 45 kiB xlsx file (6 sheets) with no formulas, no external links,
no named ranges, no filtering, basically only hand-filled data, nothing other
than plain text EXCEPT conditional formatting involving 16 conditions/rules,
makes the used RAM increase by 25 MiB. Closing this file will free 1 MiB.
Re-opening it will increase the used RAM by 6 MiB. Re-closing it will free 2
MiB. Re-re-opening it will use 5 MiB...
* Opening a 2.2 MiB xlsx file (7 sheets) with only hand-filled data, no
formulas, 5000 fields linked to an external file (broken links, since the other
file does not exist any more, so I answered "No" to "refreshing linked data"),
auto-filtering, and no formatting except a manually set background colour for
headers, will increase the used RAM by 100 MiB. Closing this file (without
saving) will free 6 MiB. Re-opening it will increase used RAM by 20 MiB.
Re-closing it will free 20 MiB. Re-re-opening it will use 30 MiB. Re-re-closing
it frees 4 MiB...
* Opening a 14 kiB ods file (7 sheets of pure text, no formatting except a few
manually set background cell colours, no formulas, no links, no named ranges)
that is already open uses 6.5 MiB more. Closing it frees 5 MiB...
* Opening a CSV file of 3.1 kiB increases the RAM usage by 6 MiB. Closing it
frees 5 MiB. Re-opening it costs 5 MiB; re-closing it frees 5 MiB.
I cannot see a predictable pattern, but it is clear that running through
iterations of opening-closing-opening the very same xlsx or ods file without
modifying it makes the RAM usage inflate.
While I started with a RAM usage of around 86 MiB, working a few hours on my
8-10 files (4-5 opened simultaneously, entering text, a few added/removed
lines) plus running the little tests described above led me to a RAM usage of
260 MiB.
I am aware that this is weak information, but I don't think I can do much
better for now. Sorry, I hope it helps a bit though.
Cheers</pre>
</div>
</p>
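<p>The measurement procedure in the comment above (open a document, note the change in resident memory, close, repeat) can be sketched as a small shell helper. This is a minimal sketch, assuming a Linux or macOS system with a POSIX <code>ps</code>; the LibreOffice process name <code>soffice.bin</code> is an assumption and may differ per installation.</p>

```shell
#!/bin/sh
# Sketch: sample the resident set size (RSS) of a process across
# open/close cycles, as described in the report above.
# NOTE: "soffice.bin" is an assumed process name; adjust as needed.

# Print the RSS in KiB for a given PID (POSIX ps output, whitespace stripped).
rss_kib() {
  ps -o rss= -p "$1" | tr -d ' '
}

# Demonstration: measure this shell's own RSS as a stand-in process.
# To track LibreOffice instead, use e.g.: pid=$(pgrep -f soffice.bin)
pid=$$
echo "RSS of PID ${pid}: $(rss_kib "$pid") KiB"
```

Sampling the value before opening and after closing the same file a few times would quantify the per-cycle growth the reporter describes.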
<hr>
<span>You are receiving this mail because:</span>
<ul>
<li>You are the assignee for the bug.</li>
</ul>
</body>
</html>