Apache-Hadoop-Developer Free Practice Questions: Online Access
| Exam code: | Apache-Hadoop-Developer |
| Exam name: | Hadoop 2.0 Certification exam for Pig and Hive Developer |
| Vendor: | Hortonworks |
| Free practice questions: | 110 |
| Upload date: | 2026-01-04 |
In a MapReduce job, you want each of your input files processed by a single map task. How do you configure a MapReduce job so that a single map task processes each input file regardless of how many blocks the input file occupies?
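In Hadoop, the standard way to achieve this is to subclass `FileInputFormat` (or `TextInputFormat`) and override `isSplitable()` to return `false`, so the framework generates exactly one input split, and therefore one map task, per file. The toy method below (not the Hadoop API; class and method names are illustrative) models how the split count changes when a file is marked non-splittable:

```java
// Toy model of input-split calculation. In real Hadoop code you would
// override FileInputFormat.isSplitable() to return false; here we just
// show the effect that flag has on the number of splits (map tasks).
public class SplitCountSketch {
    /** Number of input splits a file of the given size would yield. */
    static long numSplits(long fileSize, long blockSize, boolean splittable) {
        if (!splittable) {
            return 1;  // non-splittable: whole file goes to a single map task
        }
        // splittable: roughly one split per HDFS block (ceiling division)
        return (fileSize + blockSize - 1) / blockSize;
    }

    public static void main(String[] args) {
        long mb = 1024L * 1024;
        // A 300 MB file with 128 MB blocks spans 3 blocks...
        System.out.println(numSplits(300 * mb, 128 * mb, true));   // 3 splits
        // ...but yields a single split when marked non-splittable.
        System.out.println(numSplits(300 * mb, 128 * mb, false));  // 1 split
    }
}
```

Note that the number of map tasks tracks splits, not files or blocks directly, which is why disabling splitting is the lever for one-mapper-per-file behavior.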
Analyze each scenario below and identify which one best describes the behavior of the default partitioner.
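For reference, Hadoop's default partitioner is `HashPartitioner`, which routes each key by hashing it and taking the result modulo the number of reducers, so all records sharing a key land on the same reducer. The sketch below (class name is illustrative) reproduces that formula with plain Java:

```java
// Sketch of the logic in Hadoop's default HashPartitioner:
// partition = (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks
// The bitmask clears the sign bit so the modulo result is never negative.
public class HashPartitionerSketch {
    static int getPartition(Object key, int numReduceTasks) {
        return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
    }

    public static void main(String[] args) {
        // The same key always maps to the same partition,
        // which is what guarantees per-key grouping at the reducers.
        System.out.println(getPartition("weblogs", 4));
        System.out.println(getPartition("weblogs", 4));
    }
}
```

With a single reducer every key maps to partition 0, which is why partitioning only becomes observable once `numReduceTasks > 1`.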
You need to move a file titled "weblogs" into HDFS. When you try to copy the file, you can't. You know you have ample space on your DataNodes. Which action should you take to relieve this situation and store more files in HDFS?