java - Question regarding hadoop-env.sh -


I am facing two errors: "Error: Java heap space" and "Error: GC overhead limit exceeded".

So I started looking at hadoop-env.sh.

Here is my understanding so far; please correct me if I am wrong.

If HADOOP_HEAPSIZE=7168 is set in hadoop-env.sh,

then both the DataNode daemon and the TaskTracker daemon running on the slave node are started with 7 GB of heap each (DataNode (7 GB) + TaskTracker (7 GB) = 14 GB).
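For reference, the relevant line in hadoop-env.sh would look roughly like this (a sketch of the setting I mean, not my exact file):

    # hadoop-env.sh
    # The maximum amount of heap to use, in MB. Default is 1000.
    export HADOOP_HEAPSIZE=7168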

and

if mapred.tasktracker.reduce.tasks.maximum = 3, mapred.tasktracker.map.tasks.maximum = 6, and mapred.child.java.opts = -Xmx1024m are set,

then the TaskTracker can spawn up to 9 child JVMs (6 map + 3 reduce), each with 1 GB of heap, for a total of 9 GB.
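For reference again, those three properties would sit in mapred-site.xml roughly like this (illustrative values matching the numbers above, not my exact file):

    <!-- mapred-site.xml -->
    <property>
      <name>mapred.tasktracker.map.tasks.maximum</name>
      <value>6</value>
    </property>
    <property>
      <name>mapred.tasktracker.reduce.tasks.maximum</name>
      <value>3</value>
    </property>
    <property>
      <name>mapred.child.java.opts</name>
      <value>-Xmx1024m</value>
    </property>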

But the TaskTracker itself was only started with 7 GB, so there will be a conflict: as I understand it, the maximum memory available to the TaskTracker and the child JVMs it launches is 7 GB, yet the children try to consume 9 GB.
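To spell out the sum I am comparing (assuming my reading of the settings above is right):

    TaskTracker heap (HADOOP_HEAPSIZE=7168)            = ~7 GB
    Child JVMs: (6 map + 3 reduce) x 1 GB (-Xmx1024m)  = ~9 GB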

Is that why the heap space / GC overhead errors occurred? Is my calculation correct?

