Task Execution


Although the Hadoop framework is implemented in Java, MapReduce applications need not be written in: a) Java b) C c) C# d) None of the mentioned

a) Java Hadoop Pipes is a SWIG-compatible C++ API to implement MapReduce applications (non-JNI™ based).

Running a ___________ program involves running mapping tasks on many or all of the nodes in our cluster. a) MapReduce b) Map c) Reducer d) All of the mentioned

a) MapReduce In some applications, component tasks need to create and/or write to side-files, which differ from the actual job-output files.

Point out the correct statement: a) MapReduce tries to place the data and the compute as close as possible b) Map Task in MapReduce is performed using the Mapper() function c) Reduce Task in MapReduce is performed using the Map() function d) All of the mentioned

a) MapReduce tries to place the data and the compute as close as possible This feature of MapReduce is "Data Locality".

__________ maps input key/value pairs to a set of intermediate key/value pairs. a) Mapper b) Reducer c) Both Mapper and Reducer d) None of the mentioned

a) Mapper Maps are the individual tasks that transform input records into intermediate records.
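The mapping step described above can be sketched as a plain Python function; this is an illustrative word-count mapper, not tied to any specific Hadoop API:

```python
def map_words(record):
    """Transform one input record (a line of text) into a list of
    intermediate (word, 1) key/value pairs."""
    return [(word, 1) for word in record.split()]

# Each input record yields its own set of intermediate pairs:
pairs = map_words("the quick brown fox")
```

Each map task applies this transformation independently to its chunk of records, which is why the mapping phase parallelizes so easily.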

___________ part of the MapReduce is responsible for processing one or more chunks of data and producing the output results. a) Maptask b) Mapper c) Task execution d) All of the mentioned

a) Maptask Map Task in MapReduce is performed using the Map() function.

_________ function is responsible for consolidating the results produced by each of the Map() functions/tasks. a) Reduce b) Map c) Reducer d) All of the mentioned

a) Reduce Reduce function collates the work and resolves the results.
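A minimal sketch of the consolidation step, continuing the hypothetical word-count example: the reduce function gathers all intermediate pairs that share a key and combines their values.

```python
from collections import defaultdict

def reduce_counts(intermediate_pairs):
    """Consolidate (word, count) pairs emitted by the map step
    into a single total per key."""
    totals = defaultdict(int)
    for key, value in intermediate_pairs:
        totals[key] += value
    return dict(totals)

# All pairs for the same key end up merged into one result:
result = reduce_counts([("a", 1), ("b", 1), ("a", 1)])
```

In a real MapReduce job the framework's shuffle phase guarantees that all values for a given key reach the same reducer; the function above simulates that grouping in-process.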

The number of maps is usually driven by the total size of: a) inputs b) outputs c) tasks d) none of the mentioned

a) inputs The total size of the inputs, that is, the total number of blocks of the input files, determines the number of maps.
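As a rough illustration of "one map task per input block" (the 128 MB block size is a common HDFS default; the file sizes are made-up numbers):

```python
import math

BLOCK_SIZE = 128 * 1024 * 1024  # 128 MB, a common HDFS default

def estimate_map_count(input_file_sizes):
    """Estimate map tasks as the total block count: each input file
    contributes ceil(size / block_size) blocks."""
    return sum(math.ceil(size / BLOCK_SIZE) for size in input_file_sizes)

# e.g. a 1 GB file (8 blocks) plus a 200 MB file (2 blocks) -> 10 map tasks
maps = estimate_map_count([1024 * 1024 * 1024, 200 * 1024 * 1024])
```

The actual count can differ in practice (splittable vs. non-splittable formats, configured split sizes), so treat this as the baseline intuition rather than a guarantee.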

________ is a utility which allows users to create and run jobs with any executable as the mapper and/or the reducer. a) Hadoop Strdata b) Hadoop Streaming c) Hadoop Stream d) None of the mentioned

b) Hadoop Streaming Hadoop streaming is one of the most important utilities in the Apache Hadoop distribution.
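Hadoop Streaming works with any executable that reads records from stdin and writes tab-separated key/value lines to stdout. A minimal Streaming-style mapper script might look like this (the script name and job invocation below are illustrative assumptions):

```python
#!/usr/bin/env python3
# Minimal Streaming-style word-count mapper: reads input lines from
# stdin and emits "word<TAB>1" lines on stdout.
import sys

def run_mapper(lines):
    """Turn input lines into Streaming-format 'key\\t1' output lines."""
    out = []
    for line in lines:
        for word in line.split():
            out.append(f"{word}\t1")
    return out

if __name__ == "__main__":
    for record in run_mapper(sys.stdin):
        print(record)
```

Such a script would typically be passed to the job via the streaming jar, e.g. `-mapper mapper.py -reducer reducer.py` (the exact jar path and file names vary by installation).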

Which of the following nodes is responsible for executing a Task assigned to it by the JobTracker? a) MapReduce b) Mapper c) TaskTracker d) JobTracker

c) TaskTracker TaskTracker receives the information necessary for execution of a Task from the JobTracker, executes the Task, and sends the results back to the JobTracker.

Point out the wrong statement: a) A MapReduce job usually splits the input data-set into independent chunks which are processed by the map tasks in a completely parallel manner b) The MapReduce framework operates exclusively on key/value pairs c) Applications typically implement the Mapper and Reducer interfaces to provide the map and reduce methods d) None of the mentioned

d) None of the mentioned The MapReduce framework takes care of scheduling tasks, monitoring them and re-executes the failed tasks.

