[systemd-devel] Systemctl causes Spark native thread creation issue

Rao Vz raoatvz at gmail.com
Mon Feb 20 21:44:40 UTC 2017


Hi guys,

We have an Apache Spark cluster of 3 nodes: one acts as both master and slave, and
the other two are slaves. When we start the Spark worker with "systemctl start
spark-worker" and then run our apps, it sometimes (but not always) produces a
"java.lang.OutOfMemoryError: unable to create new native thread" error in the
Spark worker logs.

If we instead start the Spark worker directly (/opt/spark/sbin/start-slave.sh
spark://masterip:7077), the error never occurs.

We tried tweaking ulimits and Java options, but had no luck.
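For what it's worth, here is roughly how we compared the limits the worker
process actually inherits under the two launch paths (a diagnostic sketch only;
the pgrep pattern assumes the stock Spark worker main class, and we fall back to
the current shell if no worker is running):

```shell
# Find the running Spark worker, if any; otherwise inspect the current shell.
pid=$(pgrep -f org.apache.spark.deploy.worker.Worker | head -n1)

# Kernel-enforced per-process limits (thread creation is bounded by
# "Max processes", not by the Java heap).
grep -E 'Max (processes|open files)' "/proc/${pid:-self}/limits"

# Which cgroups the process lives in; under systemd the two launch paths
# place the worker in different cgroups with different controllers applied.
cat "/proc/${pid:-self}/cgroup"
```

When started via systemctl, the worker sits in the spark-worker.service cgroup;
when started from a shell, it sits in the user session's cgroup, which is why
the effective limits can differ even with identical unit-file Limit*= settings.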

The unit file (spark-worker.service) is as follows:
[Unit]
Description=Spark Worker
After=network.target

[Service]
Type=forking
ExecStart=/opt/spark/sbin/start-slave.sh spark://masterIP:7077
ExecStop=/opt/spark/sbin/stop-slave.sh
StandardOutput=journal
StandardError=journal
LimitNOFILE=infinity
LimitMEMLOCK=infinity
LimitNPROC=infinity
LimitAS=infinity
CPUAccounting=true
CPUShares=100
Restart=always

[Install]
WantedBy=multi-user.target
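One difference between the two launch paths that we are unsure about: on recent
systemd versions the pids cgroup controller lets systemd enforce a per-service
task limit (TasksMax, which counts threads as tasks) on top of the Limit*=
ulimits above. If that is what is biting us, would a drop-in like the following
sketch be the right approach? (The TasksMax= directive is our assumption based
on systemd >= 226; the drop-in path and value are a guess on our part.)

```
# /etc/systemd/system/spark-worker.service.d/override.conf
# Hypothetical drop-in: lift systemd's per-service task (thread) cap.
# Requires systemd >= 226; run "systemctl daemon-reload" afterwards.
[Service]
TasksMax=infinity
```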

Any help is appreciated.

Thanks,
Rao
