Developing a Hadoop 2.x Map/Reduce Project in Eclipse

This article demonstrates how to develop a Map/Reduce project in Eclipse.

1. Environment
  • Hadoop 2.2.0
  • Eclipse Juno SR2
  • Hadoop2.x-eclipse-plugin — for the steps to build, install, and configure the plugin, see: http://www.micmiu.com/bigdata/hadoop/hadoop2-x-eclipse-plugin-build-install/

2. Create a new MR project
Click File → New → Other..., select "Map/Reduce Project", enter the project name micmiu_MRDemo, and create the project. (figures: eclipse-mr-01, eclipse-mr-02)

3. Create the Mapper and Reducer
Click File → New → Other... and select Mapper; the generated class extends Mapper automatically. (figures: eclipse-mr-03, eclipse-mr-04) Creating the Reducer works the same way; fill in your own business logic in the generated classes. This article uses the bundled WordCount example for testing:
    package com.micmiu.mr;
    /**
     * Licensed to the Apache Software Foundation (ASF) under one
     * or more contributor license agreements. See the NOTICE file
     * distributed with this work for additional information
     * regarding copyright ownership. The ASF licenses this file
     * to you under the Apache License, Version 2.0 (the
     * "License"); you may not use this file except in compliance
     * with the License. You may obtain a copy of the License at
     *
     * http://www.apache.org/licenses/LICENSE-2.0
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */
    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.util.GenericOptionsParser;

    public class WordCount {

        // Mapper: splits each input line into tokens and emits (word, 1) pairs.
        public static class TokenizerMapper
                extends Mapper<Object, Text, Text, IntWritable> {

            private final static IntWritable one = new IntWritable(1);
            private Text word = new Text();

            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, one);
                }
            }
        }

        // Reducer (also used as the combiner): sums the counts for each word.
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {

            private IntWritable result = new IntWritable();

            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
            if (otherArgs.length != 2) {
                System.err.println("Usage: wordcount <in> <out>");
                System.exit(2);
            }
            // Uncomment to point the job at a remote HDFS instead of the local default:
            //conf.set("fs.defaultFS", "hdfs://192.168.6.77:9000");
            Job job = new Job(conf, "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
            FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }
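
The job can also be launched outside Eclipse once the project has been exported as a jar (File → Export... → Java → JAR file). The following is only a sketch of the standard hadoop launcher invocation; the jar name micmiu_MRDemo.jar is an assumption, and <in>/<out> stand for the HDFS input and output paths:

    # Assumed jar name; export the project from Eclipse first.
    # <in> and <out> are the HDFS input and output paths (see sections 4 and 5 below).
    hadoop jar micmiu_MRDemo.jar com.micmiu.mr.WordCount <in> <out>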
4. Prepare test data

micmiu-01.txt:

    Hi Michael welcome to Hadoop
    more see micmiu.com

micmiu-02.txt:

    Hi Michael welcome to BigData
    more see micmiu.com

micmiu-03.txt:

    Hi Michael welcome to Spark
    more see micmiu.com

Upload the three micmiu-* files to HDFS:

    micmiu-mbp:Downloads micmiu$ hdfs dfs -copyFromLocal micmiu-*.txt /user/micmiu/test/input
    micmiu-mbp:Downloads micmiu$ hdfs dfs -ls /user/micmiu/test/input
    Found 3 items
    -rw-r--r-- 1 micmiu supergroup 50 2014-04-15 14:53 /user/micmiu/test/input/micmiu-01.txt
    -rw-r--r-- 1 micmiu supergroup 50 2014-04-15 14:53 /user/micmiu/test/input/micmiu-02.txt
    -rw-r--r-- 1 micmiu supergroup 49 2014-04-15 14:53 /user/micmiu/test/input/micmiu-03.txt
    micmiu-mbp:Downloads micmiu$
5. Configure run parameters
Open Run As → Run Configurations... and set the program arguments (the input and output paths) on the Arguments tab. (figure: eclipse-mr-05)

6. Run
Choose Run As → Run on Hadoop. When the job completes, you should see output similar to the following: (figure: eclipse-mr-06)

This completes the demonstration of running an MR job from Eclipse against Hadoop 2.x in local pseudo-distributed mode.

PS: Running the MR job against the cluster environment kept failing; the cause has not been found yet.

----- EOF @Michael Sun -----
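
If the output argument in the run configuration was set to, say, /user/micmiu/test/output (an assumed path, since the article only shows it in a screenshot), the job's result can be inspected directly in HDFS:

    # Assumed output path; adjust to whatever was set in the run configuration.
    hdfs dfs -cat /user/micmiu/test/output/part-r-00000

Given the three sample files above, the expected counts are BigData 1, Hadoop 1, Spark 1, and 3 each for Hi, Michael, welcome, to, more, see, and micmiu.com; these figures are derived from the sample data for illustration, not taken from the article.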
