HDFS (2)

Copying HDFS data to a remote HDFS with hadoop distcp
A distcp command like the one above usually copies everything without trouble, but every so often a copy fails partway through (a sketch of a typical invocation follows the excerpts below):

Error: java.io.IOException: File copy failed: hdfs://devnodem:8020/apps/hive/warehouse/logdata.db/onenavi_logtext_kt/dt=2017-07-21/000035_0 --> hdfs://10.10.82.223:8020/apps/hive/warehouse/logdata.db/onenavi_logtext_kt/dt=2017-07-21/000035_0
    at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:287)
    at org.a..

2018. 11. 15.

Fix Under-replicated blocks in HDFS manually

su -
# collect every file that fsck reports as under replicated
bash-4.1$ hdfs fsck / | grep 'Under replicated' | awk -F':' '{print $1}' >> /tmp/under_replicated_files
# reset the replication factor of each collected file to 3
-bash-4.1$ for hdfsfile in `cat /tmp/under_replicated_files`; do echo "Fixing $hdfsfile :"; hadoop fs -setrep 3 $hdfsfile; done

2018. 11. 15.
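On the distcp failure in the first excerpt: the excerpt cuts off the actual invocation, so the lines below are only a minimal sketch, reusing the source and target NameNodes from the error message. The -update/-skipcrccheck/-i options are common workarounds for per-file copy failures (for example checksum or block-size mismatches between clusters), an assumption rather than the command the original post used.

# Minimal sketch of a distcp run between the two clusters named in the error.
#   -update       copy only files missing or changed on the target
#   -skipcrccheck skip checksum comparison (valid only together with -update);
#                 helps when the clusters use different checksum settings
#   -i            ignore individual file failures so one bad file does not kill the job
hadoop distcp -update -skipcrccheck -i \
  hdfs://devnodem:8020/apps/hive/warehouse/logdata.db/onenavi_logtext_kt \
  hdfs://10.10.82.223:8020/apps/hive/warehouse/logdata.db/onenavi_logtext_kt

Because -update only transfers files that are missing or differ on the target, re-running the same command after a failure should pick up roughly where the previous job stopped.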
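A small follow-up to the second excerpt: after the setrep loop runs, the NameNode still needs time to create the extra replicas, so it is worth re-checking. The fsck check below is the same one used to build the file list; the -setrep -w form and the example path are illustrative assumptions, not part of the original post.

# Re-run the same check; no output means nothing is still under replicated.
hdfs fsck / | grep 'Under replicated'
# For a single path, -w makes setrep block until the target replication factor
# is actually reached (example path borrowed from the first excerpt):
hadoop fs -setrep -w 3 /apps/hive/warehouse/logdata.db/onenavi_logtext_kt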