There might be a requirement to pass additional parameters to the mappers and reducers, besides the inputs which they process. Say we are interested in matrix multiplication, and there are multiple ways/algorithms of doing it. We could send an input parameter to the mappers and reducers, based on which the appropriate way/algorithm is picked. There are multiple ways of doing this.
Setting the parameter:
1. Use the -D command line option to set the parameter while running the job.
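For example, assuming the driver is launched through ToolRunner (which wires in GenericOptionsParser, the piece that actually understands -D), and with a made-up jar name, driver class and paths, the job could be run as:

    hadoop jar matrixmult.jar MatrixMultiplicationDriver -D test=123 /input/matrices /output/result

The value is then available in the job Configuration under the key test.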
2. Before launching the job using the old MR API
JobConf job = new JobConf(getConf());   // wrap the driver's Configuration (e.g. from a class that extends Configured)
job.set("test", "123");                 // the parameter is now visible to every mapper and reducer
3. Before launching the job using the new MR API
Configuration conf = new Configuration();
conf.set("test", "123");    // set before creating the Job, since Job copies the Configuration
Job job = new Job(conf);
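To see where this fits, here is a minimal sketch of a new-API driver. The class name MatrixMultiplicationDriver, the output key/value types and the rest of the job wiring are assumptions for illustration, and it assumes a MatrixMapper class like the one sketched under "Getting the parameter" below:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MatrixMultiplicationDriver {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("test", "123");                       // the extra parameter for the tasks
        Job job = new Job(conf, "matrix multiplication");
        job.setJarByClass(MatrixMultiplicationDriver.class);
        job.setMapperClass(MatrixMapper.class);        // a mapper like the one sketched below
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}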
Getting the parameter:
1. Using the old API in the Mapper and Reducer. The JobConfigurable#configure method has to be implemented in the Mapper and Reducer classes.
private static Long N;

// configure() is called once per task, before any map()/reduce() calls
public void configure(JobConf job) {
    N = Long.parseLong(job.get("test"));
}
The variable N can then be used within the map and reduce functions.
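Putting the old-API pieces together, a minimal mapper might look like the sketch below; the class name OldApiMatrixMapper, the key/value types and what map() emits are only illustrative assumptions:

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class OldApiMatrixMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

    private static Long N;

    @Override
    public void configure(JobConf job) {
        N = Long.parseLong(job.get("test"));    // read the parameter once per task
    }

    @Override
    public void map(LongWritable key, Text value,
                    OutputCollector<Text, Text> output, Reporter reporter)
            throws IOException {
        // N is available here, e.g. to choose the multiplication algorithm
        output.collect(new Text(N.toString()), value);
    }
}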
2. Using the new API in the Mapper and Reducer. The Context object is passed to the setup, map, reduce and cleanup methods.
// typically done once in setup(), then reused in map()/reduce()
Configuration conf = context.getConfiguration();
String param = conf.get("test");
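A corresponding new-API sketch, again with an assumed class name (MatrixMapper) and key/value types, reads the parameter once in setup() and reuses it in map():

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MatrixMapper extends Mapper<LongWritable, Text, Text, Text> {

    private long N;

    @Override
    protected void setup(Context context) {
        Configuration conf = context.getConfiguration();
        N = Long.parseLong(conf.get("test"));   // read the parameter once per task
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // N is available to every map() call, e.g. to pick the algorithm
        context.write(new Text(Long.toString(N)), value);
    }
}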