MapReduce Combiner

Date: 2023-09-26

This article looks at a question about the MapReduce combiner; the recommended answer below should be a useful reference for anyone running into the same problem.

Problem Description

I have a simple MapReduce job with a mapper, a reducer, and a combiner. The output from the mapper is passed to the combiner. But the reducer receives the mapper's output instead of the combiner's output. Please help.

Code:

    package Combiner;

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.Mapper.Context;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.util.GenericOptionsParser;

    public class AverageSalary {

        public static class Map extends Mapper<LongWritable, Text, Text, DoubleWritable> {
            public void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] empDetails = value.toString().split(",");
                Text unit_key = new Text(empDetails[1]);
                DoubleWritable salary_value = new DoubleWritable(Double.parseDouble(empDetails[2]));
                context.write(unit_key, salary_value);
            }
        }

        public static class Combiner extends Reducer<Text, DoubleWritable, Text, Text> {
            public void reduce(final Text key, final Iterable<DoubleWritable> values, final Context context) {
                String val;
                double sum = 0;
                int len = 0;
                while (values.iterator().hasNext()) {
                    sum += values.iterator().next().get();
                    len++;
                }
                val = String.valueOf(sum) + ":" + String.valueOf(len);
                try {
                    context.write(key, new Text(val));
                } catch (IOException e) {
                    e.printStackTrace();
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }

        public static class Reduce extends Reducer<Text, Text, Text, Text> {
            public void reduce(final Text key, final Text values, final Context context) {
                //String[] sumDetails=values.toString().split(":");
                //double average;
                //average=Double.parseDouble(sumDetails[0]);
                try {
                    context.write(key, values);
                } catch (IOException e) {
                    e.printStackTrace();
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }

        public static void main(String[] args) {
            Configuration conf = new Configuration();
            try {
                String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
                if (otherArgs.length != 2) {
                    System.err.println("Usage: Main <in> <out>");
                    System.exit(-1);
                }
                Job job = new Job(conf, "Average salary");
                //job.setInputFormatClass(KeyValueTextInputFormat.class);
                FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
                FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
                job.setJarByClass(AverageSalary.class);
                job.setMapperClass(Map.class);
                job.setCombinerClass(Combiner.class);
                job.setReducerClass(Reduce.class);
                job.setOutputKeyClass(Text.class);
                job.setOutputValueClass(Text.class);
                System.exit(job.waitForCompletion(true) ? 0 : -1);
            } catch (ClassNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }

Recommended Answer

It seems that you forgot about an important property of a combiner:

the input types for the key/value pair and the output types for the key/value pair need to be the same.
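
The reason is that the framework treats the combiner as an optional optimization: it may apply it zero, one, or several times to the map output before the reduce phase, so whatever the combiner emits must have exactly the key/value types the reducer expects from the mapper. As a minimal type-correct illustration for the question's mapper (a hypothetical IdentityCombiner, my own sketch, not part of the original post):

    import java.io.IOException;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    // Consumes Text/DoubleWritable and emits Text/DoubleWritable, so the
    // framework may apply it any number of times without the reducer noticing.
    public class IdentityCombiner
            extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
        @Override
        protected void reduce(Text key, Iterable<DoubleWritable> values, Context context)
                throws IOException, InterruptedException {
            for (DoubleWritable v : values) {
                context.write(key, v);  // pass each value through unchanged
            }
        }
    }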

You can't take in a Text/DoubleWritable pair and return a Text/Text pair. I suggest you use Text instead of DoubleWritable and do the appropriate parsing inside the combiner, as sketched below.
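
Here is a minimal sketch of that suggestion (my own illustration, not code from the original answer; the class name AverageSalaryFixed and the "sum:count" encoding are assumptions). The mapper emits each salary as Text in the form "salary:1", so the combiner can take Text in and put Text out while still carrying the count that the average needs:

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    public class AverageSalaryFixed {

        // Emits Text values of the form "salary:1" so that partial sums and
        // counts can be merged by the combiner and the reducer alike.
        public static class Map extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] empDetails = value.toString().split(",");
                context.write(new Text(empDetails[1]), new Text(empDetails[2] + ":1"));
            }
        }

        // Input and output are both Text/Text, satisfying the combiner contract.
        public static class Combine extends Reducer<Text, Text, Text, Text> {
            @Override
            protected void reduce(Text key, Iterable<Text> values, Context context)
                    throws IOException, InterruptedException {
                double sum = 0;
                long count = 0;
                for (Text v : values) {  // merge partial "sum:count" pairs
                    String[] parts = v.toString().split(":");
                    sum += Double.parseDouble(parts[0]);
                    count += Long.parseLong(parts[1]);
                }
                context.write(key, new Text(sum + ":" + count));
            }
        }

        // Performs the same merge, then emits the final average as Text,
        // matching the job's setOutputValueClass(Text.class).
        public static class Reduce extends Reducer<Text, Text, Text, Text> {
            @Override
            protected void reduce(Text key, Iterable<Text> values, Context context)
                    throws IOException, InterruptedException {
                double sum = 0;
                long count = 0;
                for (Text v : values) {
                    String[] parts = v.toString().split(":");
                    sum += Double.parseDouble(parts[0]);
                    count += Long.parseLong(parts[1]);
                }
                context.write(key, new Text(String.valueOf(sum / count)));
            }
        }
    }

With these classes, the question's driver needs only its class references updated: job.setMapperClass(Map.class), job.setCombinerClass(Combine.class), job.setReducerClass(Reduce.class). The existing setOutputKeyClass(Text.class) and setOutputValueClass(Text.class) calls already match, and the job produces the same averages whether or not the combiner actually runs.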

That concludes this article on the MapReduce combiner. We hope the recommended answer is helpful, and we hope you will continue to support html5模板网!
