I launched a Hadoop cluster and submitted a job to the master. The jar file exists only on the master. Does Hadoop ship the jar to all the slave machines at the start of the job? Is there a possibility that a slave machine will run a previous version of the code, shipped during the last run?
Thank you,
Bala

From the MapReduce tutorial:
"The framework will copy the necessary files to the slave node before any tasks for the job are executed on that node. Its efficiency stems from the fact that the files are only copied once per job and the ability to cache archives which are un-archived on the slaves."
More info here:
http://hadoop.apache.org/common/docs/current/mapred_tutorial.html
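In other words, the job jar you pass at submission time is copied into the job's staging area on the distributed filesystem, and each slave downloads a fresh copy for that job, so a stale jar from a previous run is not reused. A minimal sketch of a submission, where the jar name, class name, and paths are placeholders, not anything from your cluster:

```shell
# Hypothetical submission: "myjob.jar", "com.example.MyJob", and the
# input/output paths are placeholders. The client uploads myjob.jar to
# the job's staging directory on HDFS; each slave node fetches that
# per-job copy before running its tasks, so the jar only needs to
# exist on the machine you submit from.
hadoop jar myjob.jar com.example.MyJob /user/bala/input /user/bala/output
```

Side files beyond the job jar can be distributed the same way via the DistributedCache mechanism described in the tutorial linked above.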