dbus-glib memory leak or ...? without GMainLoop?
thockin at hockin.org
Sat Feb 20 14:45:54 PST 2010
On Sat, Feb 20, 2010 at 1:55 PM, Colin Walters <walters at verbum.org> wrote:
> On Sat, Feb 20, 2010 at 7:10 PM, Tim Hockin <thockin at hockin.org> wrote:
>> On Fri, Feb 19, 2010 at 4:22 PM, Tim Hockin <thockin at hockin.org> wrote:
>>> I've built a trivial dbus server and client. To test I am just
>>> sending a signal in a loop. I'm seeing a memory runaway on the server
>>> app. The interesting thing I see is that it is ONLY when a client is
>>> connected that memory runs away. After a few minutes of continuous
>>> sending it is over 1GB of RSS.
> By "dbus server" do you mean service on the system/session bus? Or do you mean
> a custom DBusServer instance? Or do you mean a direct dbus-protocol connection
> between two processes?
sorry, I meant a dbus service on the system bus which emits signals.
>>> This leads me to believe that there is some sort of response data that
>>> libdbus is receiving on my behalf that I am not catching.
> I think it's a lot more likely you're not freeing something in one of
> the implementation methods of the service. Have you tried valgrind?
valgrind finds nothing.
I had a breakthrough on this a few minutes ago. Adding a delay between
events makes it not fail. Here's my theory: I am sending events faster
than they can be written out, so they get buffered in-process with no
bound. I do not know why it does not happen with no client present.
Maybe the system daemon tells the sender whether there are clients or
not as an optimization. This would also explain why valgrind finds
nothing - the buffered events are still referenced, and they get
cleared out on exit().
My app is trivial and does not have any non-dbus logic at all yet. No leaks.
Is there a way to limit the amount of RAM libdbus uses for buffering?
It would help prove/disprove this.
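For what it's worth, libdbus does let a sender see how much is queued and
block until it drains, so you can throttle yourself even without a main
loop. A rough sketch (untested here; the object path, interface, signal
name, and the 1MB threshold are placeholders, not anything from my app):

```c
/* Emit signals in a loop, but bound the in-process outgoing queue.
 * Build roughly like: gcc emit.c $(pkg-config --cflags --libs dbus-1)
 */
#include <stdio.h>
#include <dbus/dbus.h>

int main(void)
{
    DBusError err;
    dbus_error_init(&err);

    DBusConnection *conn = dbus_bus_get(DBUS_BUS_SYSTEM, &err);
    if (!conn) {
        fprintf(stderr, "connect failed: %s\n", err.message);
        dbus_error_free(&err);
        return 1;
    }

    for (int i = 0; i < 1000000; i++) {
        /* Placeholder path/interface/member for the real service. */
        DBusMessage *sig = dbus_message_new_signal("/com/example/Object",
                                                   "com.example.Interface",
                                                   "Ping");
        if (!sig)
            break;
        dbus_connection_send(conn, sig, NULL);
        dbus_message_unref(sig);

        /* Without a main loop nothing drains the queue for us; if more
         * than ~1MB is pending, block until it has been written out. */
        if (dbus_connection_get_outgoing_size(conn) > 1024 * 1024)
            dbus_connection_flush(conn);
    }

    dbus_connection_flush(conn);
    return 0;
}
```

As far as I can tell, dbus_connection_set_max_received_size() caps the
incoming side only; there doesn't seem to be an equivalent hard cap for
the outgoing queue, so flushing manually is the only bound.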