a) HashPar
b) Partitioner
c) HashPartitioner
d) None of the mentioned
Answer: c
Explanation: The default partitioner in Hadoop ...
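The rule behind this answer can be sketched in Python. This is a hypothetical analogue of Hadoop's HashPartitioner, whose Java implementation returns `(key.hashCode() & Integer.MAX_VALUE) % numReduceTasks`; the function name here is illustrative, not part of any Hadoop API.

```python
# Sketch (assumption: mirrors HashPartitioner's modulo-hash rule) of how the
# default partitioner routes an intermediate key to one of the reduce tasks.
def hash_partition(key: str, num_reduce_tasks: int) -> int:
    # Mask keeps the hash non-negative, as Hadoop does with
    # hashCode() & Integer.MAX_VALUE before taking the modulus.
    return (hash(key) & 0x7FFFFFFF) % num_reduce_tasks
```

Because the partition depends only on the key, every record with the same key is sent to the same reducer, which is what makes per-key aggregation possible.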
The number of maps is usually driven by the total size of ____________
a) inputs
b) outputs
c) tasks
d) None of the mentioned
Answer: a
Explanation: Total size of inputs means ...
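A rough way to see how input size drives the map count: with one map task per input split, the number of maps is approximately the input size divided by the split size. The function below is a back-of-envelope sketch assuming the common default split size of 128 MB (the actual count also depends on file boundaries and job configuration).

```python
import math

# Assumption: one map task per input split, split size = HDFS block size.
DEFAULT_SPLIT_BYTES = 128 * 1024 * 1024  # 128 MB

def estimated_maps(input_size_bytes: int,
                   split_size_bytes: int = DEFAULT_SPLIT_BYTES) -> int:
    # Each (partial) split gets its own map task.
    return math.ceil(input_size_bytes / split_size_bytes)
```

For example, a 10 GB input with 128 MB splits yields about 80 map tasks.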
_________ maps input key/value pairs to a set of intermediate key/value pairs.
a) Mapper
b) Reducer
c) Both Mapper and Reducer
d) None of the mentioned
Answer: a
Explanation: Maps are ...
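The Mapper's key/value-to-key/value contract can be illustrated with the classic word-count example. This is a hypothetical pure-Python analogue of `Mapper.map()`, not Hadoop's actual Java API: the input pair is a (byte offset, line of text), and the output is a list of intermediate (word, 1) pairs.

```python
# Sketch of a word-count map function: one input (key, value) pair in,
# a set of intermediate (key, value) pairs out.
def word_count_map(offset: int, line: str) -> list[tuple[str, int]]:
    # The input key (byte offset) is ignored; each word becomes an
    # intermediate key with count 1 as its value.
    return [(word, 1) for word in line.split()]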
________ is a utility which allows users to create and run jobs with any executables as the mapper and/or the reducer.
a) Hadoop Strdata
b) Hadoop Streaming
c) Hadoop Stream
d) None of the mentioned
Answer: b
Explanation: Hadoop streaming ...
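The contract that makes "any executable" work as a mapper is simple: Streaming feeds input records as lines on stdin and parses tab-separated `key<TAB>value` lines from stdout. The sketch below is a minimal word-count mapper honoring that contract (the helper names are illustrative; such a script would be passed to the streaming jar via its `-mapper` option).

```python
import sys

def map_line(line: str) -> list[str]:
    # Emit one "word<TAB>1" record per word, in Streaming's
    # tab-separated key/value text format.
    return [f"{word}\t1" for word in line.split()]

def main(stream=sys.stdin) -> None:
    # In a real streaming job, Hadoop pipes each input split's lines
    # into this process and collects whatever it prints.
    for line in stream:
        for record in map_line(line):
            print(record)
```

A matching reducer script would read the sorted `word<TAB>1` lines back from stdin and sum the counts per word.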
Although the Hadoop framework is implemented in Java, MapReduce applications need not be written in ____________
a) Java
b) C
c) C#
d) None of the mentioned
Answer: a
Explanation: Hadoop Pipes is a SWIG- ...
Point out the wrong statement.
a) A MapReduce job usually splits the input data-set into independent chunks which are processed by the map ...
_________ function is responsible for consolidating the results produced by each of the Map() functions/tasks.
a) Reduce
b) Map
c) Reducer
d) All of the mentioned
Answer: a
Explanation: Reduce function collates the work ...
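The consolidation step can be sketched as follows. This is a hypothetical Python analogue of Reduce(), under the assumption (which the framework guarantees) that intermediate pairs arrive grouped and sorted by key after the shuffle phase; the function name is illustrative.

```python
from itertools import groupby
from operator import itemgetter

def reduce_counts(sorted_pairs: list[tuple[str, int]]) -> dict[str, int]:
    # Each key's group of intermediate values is collapsed into a single
    # consolidated result -- here, a sum of per-occurrence counts.
    return {key: sum(v for _, v in group)
            for key, group in groupby(sorted_pairs, key=itemgetter(0))}
```

Combined with the map side emitting `(word, 1)` pairs, this yields the final per-word totals.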
___________ part of the MapReduce is responsible for processing one or more chunks of data and producing the output results.
a) Maptask
b) Mapper
c) Task execution
d) All of the mentioned
Answer: a
Explanation: Map Task in MapReduce ...
Point out the correct statement.
a) MapReduce tries to place the data and the compute as close as possible
b) Map Task in MapReduce ...
A ________ node acts as the Slave and is responsible for executing a Task assigned to it by the JobTracker.
a) MapReduce
b) Mapper
c) TaskTracker
d) JobTracker
Answer: c
Explanation: TaskTracker receives the information necessary for the execution ...