Wrong key class: Text is not IntWritable

Date: 2023-09-27
This article covers how to resolve the Hadoop error "wrong key class: Text is not IntWritable" and should be a useful reference for anyone hitting the same problem.

Problem Description

This may seem like a stupid question, but I fail to see the problem with the types in my MapReduce code for Hadoop.

As stated in the question, the problem is that it is expecting an IntWritable but I'm passing it a Text object in the reducer's collector.collect.

My job configuration has the following mapper output classes:

                conf.setMapOutputKeyClass(IntWritable.class);
                conf.setMapOutputValueClass(IntWritable.class);
                

And the following reducer output classes:

                conf.setOutputKeyClass(Text.class);
                conf.setOutputValueClass(IntWritable.class);
                

My reduce class has the following definition:

                public static class Reduce extends MapReduceBase implements Reducer<IntWritable, IntWritable, Text, IntWritable>
                

with the required function:

                public void reduce(IntWritable key, Iterator<IntWritable> values, OutputCollector<Text,IntWritable> output, Reporter reporter) 
                

And then it fails when I call:

                output.collect(new Text(),new IntWritable());
                

I'm fairly new to MapReduce, but all the types seem to match; it compiles but then fails on that line, saying it's expecting an IntWritable as the key for the reduce class. If it matters, I'm using Hadoop version 0.21.

Here is my map class:

                public static class Map extends MapReduceBase implements Mapper<LongWritable, Text, IntWritable, IntWritable> {
                    private IntWritable node = new IntWritable();
                    private IntWritable edge = new IntWritable();
                
                    public void map(LongWritable key, Text value, OutputCollector<IntWritable, IntWritable> output, Reporter reporter) throws IOException {
                        String line = value.toString();
                        StringTokenizer tokenizer = new StringTokenizer(line);
                
                        while (tokenizer.hasMoreTokens()) {
                            node.set(Integer.parseInt(tokenizer.nextToken()));
                            edge.set(Integer.parseInt(tokenizer.nextToken()));
                            if(node.get() < edge.get())
                                output.collect(node, edge);
                        }
                    }
                }
                

And my reduce class:

                public static class Reduce extends MapReduceBase implements Reducer<IntWritable, IntWritable, Text, IntWritable> {
                
                    IntWritable $ = new IntWritable(Integer.MAX_VALUE);
                    Text keyText = new Text();
                
                    public void reduce(IntWritable key, Iterator<IntWritable> values, OutputCollector<Text, IntWritable> output, Reporter reporter) throws IOException {
                        ArrayList<IntWritable> valueList = new ArrayList<IntWritable>();
                
                        //outputs original edge pair as key and $ for value
                        while (values.hasNext()) {
                            IntWritable value = values.next();
                            valueList.add(value);
                            keyText.set(key.get() + ", " + value.get());
                            output.collect(keyText, $);
                        }
                
                        //outputs all the 2 length pairs 
                        for(int i = 0; i < valueList.size(); i++)
                            for(int j = i+1; j < valueList.size(); j++)  // loop condition should test j, not i
                                output.collect(new Text(valueList.get(i).get() + ", " + valueList.get(j).get()), key);
                    }
                }
                

And my job configuration:

                JobConf conf = new JobConf(Triangles.class);
                conf.setJobName("mapred1");
                
                conf.setMapOutputKeyClass(IntWritable.class);
                conf.setMapOutputValueClass(IntWritable.class);
                
                conf.setOutputKeyClass(Text.class);
                conf.setOutputValueClass(IntWritable.class);
                
                conf.setMapperClass(Map.class);
                conf.setCombinerClass(Reduce.class);
                conf.setReducerClass(Reduce.class);
                
                conf.setInputFormat(TextInputFormat.class);
                conf.setOutputFormat(TextOutputFormat.class);
                
                FileInputFormat.setInputPaths(conf, new Path(args[0]));
                FileOutputFormat.setOutputPath(conf, new Path("mapred1"));
                
                JobClient.runJob(conf);
                

Recommended Answer

Your problem is that you set the Reduce class as the combiner:

                conf.setCombinerClass(Reduce.class);
                

Combiners run in the map phase, and they need to emit the same key/value types as the map output (IntWritable, IntWritable in your case). Remove this line and you should be OK.
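For reference, here is a minimal sketch of the corrected job configuration: it is simply the asker's original JobConf setup with the setCombinerClass line removed, and nothing else changed.

                JobConf conf = new JobConf(Triangles.class);
                conf.setJobName("mapred1");

                conf.setMapOutputKeyClass(IntWritable.class);
                conf.setMapOutputValueClass(IntWritable.class);

                conf.setOutputKeyClass(Text.class);
                conf.setOutputValueClass(IntWritable.class);

                conf.setMapperClass(Map.class);
                // No setCombinerClass(Reduce.class) here: a combiner runs on map output,
                // so it would have to consume and emit (IntWritable, IntWritable), but the
                // Reduce class emits Text keys, which triggers the "wrong key class" error.
                conf.setReducerClass(Reduce.class);

                conf.setInputFormat(TextInputFormat.class);
                conf.setOutputFormat(TextOutputFormat.class);

                FileInputFormat.setInputPaths(conf, new Path(args[0]));
                FileOutputFormat.setOutputPath(conf, new Path("mapred1"));

                JobClient.runJob(conf);

If a combiner is still wanted later, it would need to be a separate Reducer<IntWritable, IntWritable, IntWritable, IntWritable> implementation so that its output types match the map output types.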

