[systemd-devel] Systemctl causes Spark native thread creation issue
Lennart Poettering
lennart at poettering.net
Tue Feb 21 08:42:47 UTC 2017
On Mon, 20.02.17 16:44, Rao Vz (raoatvz at gmail.com) wrote:
> Hi, Guys
>
> We have an Apache Spark cluster of 3 nodes: one is both master and slave,
> the other two are slaves. We start the Spark worker with "systemctl start
> spark-worker". When running our apps, sometimes but not always it generates
> a "java.lang.OutOfMemoryError: unable to create new native thread" error in
> the Spark worker logs.
I figure the error message is misleading and this is not about memory at
all: you are most likely hitting the service's task limit. You need to bump
the default TasksMax= setting for the service, or turn the limit off
entirely by setting it to infinity.
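
For example (assuming your unit is actually called spark-worker.service),
a drop-in along these lines should do it:

    # systemctl edit spark-worker
    # -> add the following to the drop-in:
    [Service]
    TasksMax=infinity

    # then reload and restart:
    systemctl daemon-reload
    systemctl restart spark-worker

You can check the current limit and usage with
"systemctl show -p TasksMax -p TasksCurrent spark-worker".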
Lennart
--
Lennart Poettering, Red Hat