hadoop fs -put command

skfeng · Aug 28, 2013 · Viewed 73.7k times

I have set up a single-node Hadoop environment on CentOS using the Cloudera CDH repository. When I want to copy a local file to HDFS, I use the command:

sudo -u hdfs hadoop fs -put /root/MyHadoop/file1.txt /

But the result disappointed me:

put: '/root/MyHadoop/file1.txt': No such file or directory

I'm sure this file does exist.

Please help me, thanks!

Answer

Alfonso Nishikawa · Aug 28, 2013

As user hdfs, do you have access rights to /root/ (on your local disk)? Usually you don't. You must copy file1.txt to a place where the local hdfs user has read rights before trying to copy it to HDFS.
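
If you want to confirm that this is the cause, a quick diagnostic (a sketch; run as root, with the path from the question) is to try reading the file as the hdfs user, which will typically fail with a permission error:

# check whether the hdfs user can read the source file at all
sudo -u hdfs ls -l /root/MyHadoop/file1.txt
sudo -u hdfs cat /root/MyHadoop/file1.txt > /dev/null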

Try:

cp /root/MyHadoop/file1.txt /tmp
chown hdfs:hdfs /tmp/file1.txt
# older versions of Hadoop
sudo -u hdfs hadoop fs -put /tmp/file1.txt /
# newer versions of Hadoop
sudo -u hdfs hdfs dfs -put /tmp/file1.txt /
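
Once the put succeeds, you can double-check that the file landed in HDFS (a quick verification, not part of the original steps):

# list the HDFS root to confirm file1.txt is there
sudo -u hdfs hdfs dfs -ls /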

--- edit:

Take a look at roman-nikitchenko's cleaner answer below.