
Commit cdbeb16

Apache Hadoop 2.0.4-alpha RC2.
git-svn-id: https://svn.apache.org/repos/asf/hadoop/common/tags/release-2.0.4-alpha-rc1@1467479 13f79535-47bb-0310-9956-ffa450edef68
1 parent 19a3c62 commit cdbeb16

5,021 files changed: 1,703,091 additions and 0 deletions


branch-2.0.4-alpha/.gitattributes

Lines changed: 18 additions & 0 deletions
@@ -0,0 +1,18 @@
# Auto detect text files and perform LF normalization
* text=auto

*.cs text diff=csharp
*.java text diff=java
*.html text diff=html
*.py text diff=python
*.pl text diff=perl
*.pm text diff=perl
*.css text
*.js text
*.sql text

*.sh text eol=lf

*.bat text eol=crlf
*.csproj text merge=union eol=crlf
*.sln text merge=union eol=crlf
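
To sanity-check how these rules resolve for a given path, git's check-attr command can be queried against a working tree (a quick sketch; the file names below are placeholders, not paths that necessarily exist in the repository):

  $ git check-attr text eol diff -- build.sh Foo.java run.bat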

branch-2.0.4-alpha/.gitignore

Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
*.iml
*.ipr
*.iws
.idea
.svn
.classpath
.project
.settings
target
hadoop-hdfs-project/hadoop-hdfs/downloads
hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads
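
To confirm which pattern is responsible for ignoring a particular path, git check-ignore can be used (a sketch; target/classes is only an example of a build output path):

  $ git check-ignore -v target/classes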

branch-2.0.4-alpha/BUILDING.txt

Lines changed: 140 additions & 0 deletions
@@ -0,0 +1,140 @@
Build instructions for Hadoop

----------------------------------------------------------------------------------
Requirements:

* Unix System
* JDK 1.6
* Maven 3.0
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.4.1+ (for MapReduce and HDFS)
* CMake 2.6 or newer (if compiling native code)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
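
Before the first build it can help to confirm that the tools on the PATH match the versions above (a quick sketch; output formats vary by platform):

  $ java -version
  $ mvn -version
  $ protoc --version
  $ cmake --version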

----------------------------------------------------------------------------------
Maven main modules:

hadoop (Main Hadoop project)
  - hadoop-project (Parent POM for all Hadoop Maven modules.)
    (All plugins & dependencies versions are defined here.)
  - hadoop-project-dist (Parent POM for modules that generate distributions.)
  - hadoop-annotations (Generates the Hadoop doclet used to generate the Javadocs)
  - hadoop-assemblies (Maven assemblies used by the different modules)
  - hadoop-common-project (Hadoop Common)
  - hadoop-hdfs-project (Hadoop HDFS)
  - hadoop-mapreduce-project (Hadoop MapReduce)
  - hadoop-tools (Hadoop tools like Streaming, Distcp, etc.)
  - hadoop-dist (Hadoop distribution assembler)

----------------------------------------------------------------------------------
Where to run Maven from?

It can be run from any module. The only catch is that if not run from trunk,
all modules that are not part of the build run must be installed in the local
Maven cache or be available in a Maven repository.

----------------------------------------------------------------------------------
Maven build goals:

 * Clean                     : mvn clean
 * Compile                   : mvn compile [-Pnative]
 * Run tests                 : mvn test [-Pnative]
 * Create JAR                : mvn package
 * Run findbugs              : mvn compile findbugs:findbugs
 * Run checkstyle            : mvn compile checkstyle:checkstyle
 * Install JAR in M2 cache   : mvn install
 * Deploy JAR to Maven repo  : mvn deploy
 * Run clover                : mvn test -Pclover [-DcloverLicenseLocation=${user.name}/.clover.license]
 * Run Rat                   : mvn apache-rat:check
 * Build javadocs            : mvn javadoc:javadoc
 * Build distribution        : mvn package [-Pdist][-Pdocs][-Psrc][-Pnative][-Dtar]
 * Change Hadoop version     : mvn versions:set -DnewVersion=NEWVERSION

Build options:

 * Use -Pnative to compile/bundle native code
 * Use -Pdocs to generate & bundle the documentation in the distribution (using -Pdist)
 * Use -Psrc to create a project source TAR.GZ
 * Use -Dtar to create a TAR with the distribution (using -Pdist)
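
For example, several of the goals and options above can be combined in a single invocation (a sketch; drop -Pnative if you do not have the native toolchain installed):

  $ mvn clean package -Pdist,native -DskipTests -Dtar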

Snappy build options:

  Snappy is a compression library that can be utilized by the native code.
  It is currently an optional component, meaning that Hadoop can be built with
  or without this dependency.

 * Use -Drequire.snappy to fail the build if libsnappy.so is not found.
   If this option is not specified and the snappy library is missing,
   we silently build a version of libhadoop.so that cannot make use of snappy.
   This option is recommended if you plan on making use of snappy and want
   to get more repeatable builds.

 * Use -Dsnappy.prefix to specify a nonstandard location for the libsnappy
   header files and library files. You do not need this option if you have
   installed snappy using a package manager.
 * Use -Dsnappy.lib to specify a nonstandard location for the libsnappy library
   files. Similarly to snappy.prefix, you do not need this option if you have
   installed snappy using a package manager.
 * Use -Dbundle.snappy to copy the contents of the snappy.lib directory into
   the final tar file. This option requires that -Dsnappy.lib is also given,
   and it ignores the -Dsnappy.prefix option.
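
As an illustration, a native build that insists on snappy support and bundles a locally installed copy might look like this (a sketch; /usr/local/lib is only a placeholder for wherever libsnappy actually lives):

  $ mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy -Dsnappy.lib=/usr/local/lib -Dbundle.snappy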

Test options:

 * Use -DskipTests to skip tests when running the following Maven goals:
   'package', 'install', 'deploy' or 'verify'
 * -Dtest=<TESTCLASSNAME>,<TESTCLASSNAME#METHODNAME>,....
 * -Dtest.exclude=<TESTCLASSNAME>
 * -Dtest.exclude.pattern=**/<TESTCLASSNAME1>.java,**/<TESTCLASSNAME2>.java
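
For instance, to run a handful of tests from a submodule (a sketch; TestFoo, TestBar and testSomething are hypothetical names used only to show the syntax):

  $ mvn test -Dtest=TestFoo,TestBar#testSomething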

----------------------------------------------------------------------------------
Building components separately

If you are building a submodule directory, all the Hadoop dependencies this
submodule has will be resolved like all other 3rd party dependencies: from the
Maven cache or from a Maven repository (if not available in the cache or if the
SNAPSHOT has 'timed out').
An alternative is to run 'mvn install -DskipTests' from the Hadoop source top
level once, and then work from the submodule. Keep in mind that SNAPSHOTs time
out after a while; using the Maven '-nsu' option will stop Maven from trying to
update SNAPSHOTs from external repos.
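
A typical sequence then looks like the following (a sketch; hadoop-common-project/hadoop-common is used only as an example submodule):

  $ mvn install -DskipTests
  $ cd hadoop-common-project/hadoop-common
  $ mvn test -nsu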

----------------------------------------------------------------------------------
Importing projects to Eclipse

When you import the project to Eclipse, install hadoop-maven-plugins first.

  $ cd hadoop-maven-plugins
  $ mvn install

Then, generate Eclipse project files.

  $ mvn eclipse:eclipse -DskipTests

Finally, import to Eclipse by specifying the root directory of the project via
[File] > [Import] > [Existing Projects into Workspace].

----------------------------------------------------------------------------------
Building distributions:

Create binary distribution without native code and without documentation:

  $ mvn package -Pdist -DskipTests -Dtar

Create binary distribution with native code and with documentation:

  $ mvn package -Pdist,native,docs -DskipTests -Dtar

Create source distribution:

  $ mvn package -Psrc -DskipTests

Create source and binary distributions with native code and documentation:

  $ mvn package -Pdist,native,docs,src -DskipTests -Dtar

Create a local staging version of the website (in /tmp/hadoop-site):

  $ mvn clean site; mvn site:stage -DstagingDirectory=/tmp/hadoop-site

----------------------------------------------------------------------------------
