Insufficient space for shared memory file when I try to run the nutch generate command

peter · Jan 12, 2013 · Viewed 57.9k times

I have been running nutch crawling commands for the past 3 weeks, and now I get the error below when I try to run any nutch command:

Java HotSpot(TM) 64-Bit Server VM warning: Insufficient space for shared memory file: /tmp/hsperfdata_user/27050 Try using the -Djava.io.tmpdir= option to select an alternate temp location.

Error: Could not find or load main class ___.tmp.hsperfdata_user.27055

How do I solve this issue?

Answer

Kingz · Sep 20, 2014

Yeah, this is really an issue with the space available on the volume that your /tmp is mounted on. If you are running this on EC2, or any other cloud platform, attach a new volume and mount /tmp on it (a sketch follows below). If you are running locally, there is no option other than cleaning up to make more room.
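For example, on EC2 you could attach a fresh EBS volume and mount it over /tmp. A minimal sketch, assuming the new volume appears as /dev/xvdf (the device name will vary by instance):

# Assumption: the newly attached volume shows up as /dev/xvdf
sudo mkfs -t ext4 /dev/xvdf          # put a filesystem on the new volume
sudo mkdir -p /mnt/tmp               # create a mount point for it
sudo mount /dev/xvdf /mnt/tmp        # mount the volume
sudo chmod 1777 /mnt/tmp             # /tmp needs world-write plus the sticky bit
sudo mount --bind /mnt/tmp /tmp      # overlay the empty volume onto /tmp

Anything already in /tmp is hidden (not deleted) while the bind mount is in place, so stop processes holding files there first, and add matching entries to /etc/fstab if you want this to survive a reboot.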

Try a command like df -h to see the percentage used and the available space on each volume mounted on your instance. You will see something like:

Filesystem            Size  Used Avail Use% Mounted on
/dev/xvda1            7.9G  7.9G     0 100% /
tmpfs                  30G     0   30G   0% /dev/shm
/dev/xvda3             35G  1.9G   31G   6% /var
/dev/xvda4             50G   44G  3.8G  92% /opt
/dev/xvdb             827G  116G  669G  15% /data/1
/dev/xvdc             827G  152G  634G  20% /data/2
/dev/xvdd             827G  149G  637G  19% /data/3
/dev/xvde             827G  150G  636G  20% /data/4
cm_processes           30G   22M   30G   1% /var/run/cloudera-scm-agent/process

You will begin to see this error when a disk fills up, as the root filesystem has in the dump above (100% used on /, which is where /tmp lives unless it is mounted separately).
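If you cannot attach a new volume, clear out whatever is filling the disk, or point the JVM at a roomier temp location as the warning itself suggests. A hedged sketch: NUTCH_OPTS is the "extra Java runtime options" variable honored by the bin/nutch script, and /data/1/tmp is just an example path on a volume with free space:

# Find what is eating the full volume
sudo du -sh /tmp/* 2>/dev/null | sort -h | tail

# Remove stale HotSpot perf files left behind by dead JVMs
# ("user" here is the account name from the warning above)
rm -rf /tmp/hsperfdata_user

# Or redirect Java's temp directory for future nutch runs
mkdir -p /data/1/tmp
export NUTCH_OPTS="-Djava.io.tmpdir=/data/1/tmp"
bin/nutch generate crawl/crawldb crawl/segments

Note that on Linux the HotSpot perf-data file itself may still land under /tmp regardless of java.io.tmpdir; if so, adding -XX:-UsePerfData disables the shared memory file entirely, at the cost of tools like jps and jstat no longer seeing the process.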