Building Apache Hadoop 2.7.1 from source

Posted at 2015-08-17

Notes on the steps for building Hadoop 2.7.1 from source. The environment is CentOS 7 (64-bit).

Installing the JDK

Download the JDK installer RPM from Oracle's site, put it somewhere convenient, and install it with the rpm command.

# rpm -ivh jdk-8u51-linux-x64.rpm
Preparing...                          ################################# [100%]
Updating / installing...
   1:jdk1.8.0_51-2000:1.8.0_51-fcs    ################################# [100%]
Unpacking JAR files...
        rt.jar...
        jsse.jar...
        charsets.jar...
        tools.jar...
        localedata.jar...
        jfxrt.jar...
        plugin.jar...
        javaws.jar...
        deploy.jar...

Use alternatives to point the java and javac commands at the Oracle Java you just installed.

# alternatives --config java

There are 4 programs which provide 'java'.

  Selection    Command
-----------------------------------------------
   1           /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.51-1.b16.el7_1.x86_64/jre/bin/java
*+ 2           /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85-2.6.1.2.el7_1.x86_64/jre/bin/java
   3           /usr/lib/jvm/jre-1.6.0-openjdk.x86_64/bin/java
   4           /usr/java/jdk1.8.0_51/jre/bin/java

Enter to keep the current selection[+], or type selection number: 4
# java -version
java version "1.8.0_51"
Java(TM) SE Runtime Environment (build 1.8.0_51-b16)
Java HotSpot(TM) 64-Bit Server VM (build 25.51-b03, mixed mode)
# alternatives --config javac

There are 4 programs which provide 'javac'.

  Selection    Command
-----------------------------------------------
   1           /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.51-1.b16.el7_1.x86_64/bin/javac
*+ 2           /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85-2.6.1.2.el7_1.x86_64/bin/javac
   3           /usr/lib/jvm/java-1.6.0-openjdk.x86_64/bin/javac
   4           /usr/java/jdk1.8.0_51/bin/javac

Enter to keep the current selection[+], or type selection number: 4
# javac -version
javac 1.8.0_51
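The Hadoop build (and Maven) also consult the JAVA_HOME environment variable, so it is worth setting it now. A minimal snippet for .bash_profile, assuming the install path created by the Oracle RPM above; the jdk1.8.0_51 directory name depends on the JDK version you installed:

```shell
# Point JAVA_HOME at the Oracle JDK installed by the RPM above.
# The jdk1.8.0_51 directory name depends on the JDK version.
export JAVA_HOME=/usr/java/jdk1.8.0_51
export PATH=$JAVA_HOME/bin:$PATH
```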

Installing Maven

Install it with yum.

# yum install maven
<snip>
# mvn --version
Apache Maven 3.0.5 (Red Hat 3.0.5-16)
Maven home: /usr/share/maven
Java version: 1.8.0_51, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_51/jre
Default locale: ja_JP, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-229.11.1.el7.x86_64", arch: "amd64", family: "unix"

Creating a user for the build

Any user will do. Looking ahead to the eventual installation, I created a hadoop user:

# useradd hadoop
# passwd hadoop

Installing Protocol Buffers

Download the version 2.5.0 tarball:
https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz

$ cd /home/hadoop/src/
$ tar zxf protobuf-2.5.0.tar.gz
$ cd protobuf-2.5.0

Use a prefix under the hadoop user's home directory as the install destination:

$ ./configure --prefix=/home/hadoop/bin/protoc
$ make
$ make install

Add the installation's bin directory to the PATH environment variable:

$ tail -n 3 .bash_profile
PATH=$PATH:$HOME/.local/bin:$HOME/bin:$HOME/bin/protoc/bin

export PATH

Verify it works with the --version option:

$ protoc --version
libprotoc 2.5.0
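For a smoke test beyond --version, you can compile a trivial schema. The file below is a hypothetical example, not part of the Hadoop sources; Hadoop's own build invokes protoc on similar schemas under hadoop-common and hadoop-hdfs. Note that protobuf 2.5.0 uses proto2 syntax:

```
// greeting.proto -- hypothetical schema for smoke-testing protoc
// (proto2 syntax, which is what protobuf 2.5.0 supports)
message Greeting {
  required string text = 1;
  optional int32 repeat = 2 [default = 1];
}
```

Running `protoc --java_out=. greeting.proto` should generate the Java classes without errors; if it does, the toolchain is ready for the Hadoop build.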

Installing required libraries

All of them can be installed with yum:

# yum install cmake
# yum install zlib-devel
# yum install openssl-devel
# yum install jansson-devel
# yum install fuse-devel

Building Hadoop

Build the binary package, including the native libraries.
The build downloads a large number of libraries over the Internet, so it is easiest to run it in an environment that does not need proxy settings.
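If you do have to build behind a proxy, Maven reads proxy settings from ~/.m2/settings.xml. A minimal sketch, where the host and port values are placeholders for your own proxy:

```xml
<!-- ~/.m2/settings.xml: proxy settings for Maven (placeholder host/port) -->
<settings>
  <proxies>
    <proxy>
      <id>corp-proxy</id>
      <active>true</active>
      <protocol>http</protocol>
      <host>proxy.example.com</host>
      <port>8080</port>
    </proxy>
  </proxies>
</settings>
```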

$ tar zxf hadoop-2.7.1-src.tar.gz
$ cd hadoop-2.7.1-src
$ mvn package -Pdist,native -DskipTests -Dtar

The build took about 30 minutes, just long enough to take a bath.

This is a virtual machine on an SSD-equipped Windows laptop, so maybe that is about how long it takes.
The library downloads only happen on the first build, so subsequent builds should be somewhat faster.

[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [1:37.992s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [37.652s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [29.966s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.214s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [18.754s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [30.347s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [2:58.356s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [2:00.952s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [9.951s]
[INFO] Apache Hadoop Common .............................. SUCCESS [3:49.372s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [7.127s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [1:30.649s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.071s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [6:47.746s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2:01.649s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [56.341s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [4.201s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.051s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.049s]
[INFO] hadoop-yarn-api ................................... SUCCESS [54.403s]
[INFO] hadoop-yarn-common ................................ SUCCESS [2:07.544s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.099s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [11.254s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [21.571s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [4.204s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [9.027s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [30.189s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [6.033s]
[INFO] hadoop-yarn-client ................................ SUCCESS [6.639s]
[INFO] hadoop-yarn-server-sharedcachemanager ............. SUCCESS [3.717s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.045s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [2.638s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.116s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.037s]
[INFO] hadoop-yarn-registry .............................. SUCCESS [4.931s]
[INFO] hadoop-yarn-project ............................... SUCCESS [5.193s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.054s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [26.446s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [19.436s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.734s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [9.185s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [5.368s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [13.602s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.801s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [5.428s]
[INFO] hadoop-mapreduce .................................. SUCCESS [3.561s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [7.672s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [18.428s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.371s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [5.796s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [5.066s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [2.658s]
[INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [2.596s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [2.822s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [8.798s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [4.781s]
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [54.172s]
[INFO] Apache Hadoop Azure support ....................... SUCCESS [11.695s]
[INFO] Apache Hadoop Client .............................. SUCCESS [8.984s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.086s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [4.807s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [10.210s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.022s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [47.733s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:41.819s
[INFO] Finished at: Sun Aug 16 22:55:41 JST 2015
[INFO] Final Memory: 105M/239M
[INFO] ------------------------------------------------------------------------
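If the build succeeds, the distribution tarball should end up under hadoop-dist/target/ (that is where the 2.7.x -Pdist profile writes its output). A sketch of checking the result:

```
# The distribution build writes its output under hadoop-dist/target/.
$ ls hadoop-dist/target/hadoop-2.7.1.tar.gz

# After extracting the tarball somewhere, "hadoop checknative" reports
# whether the native libraries (zlib, openssl, etc.) were compiled in.
$ bin/hadoop checknative -a
```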