0.20.2 API hadoop version with java 5

Source: https://www.devze.com, 2023-01-02 08:35 (web)
I have started a Maven project to implement the MapReduce algorithm in Java 1.5.0_14, using the Hadoop 0.20.2 API. I'm therefore using the following dependency in my pom.xml:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>0.20.2</version>
</dependency>

But when I'm using an import to the org.apache.hadoop classes, I get the following error:

bad class file: ${HOME_DIR}\repository\org\apache\hadoop\hadoop-core\0.20.2\hadoop-core-0.20.2.jar(org/apache/hadoop/fs/Path.class) class file has wrong version 50.0, should be 49.0.

Does anyone know how I can solve this issue?

Thanks.


Maven by default compiles to JDK 1.4 compatibility. You need to change this.

You need to add this to your pom.xml:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>2.0.2</version>
    <configuration>
        <source>1.6</source>
        <target>1.6</target>
    </configuration>
</plugin>

[Edit: thank you Sean for pointing out Hadoop requires JDK 6]
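For context on the error text: a compiled .class file begins with the magic number 0xCAFEBABE followed by minor and major version fields, and a JVM (or compiler reading the jar) rejects files whose major version is newer than it supports. Major version 49 is Java 5 and 50 is Java 6, which is exactly the mismatch in the message. A minimal sketch decoding such a header (the class name and hard-coded bytes are illustrative, not from Hadoop):

```java
public class ClassVersion {
    // For major versions 49 (Java 5) and up, the release number is major - 44.
    // (Earlier majors map to the 1.x line: 45 = 1.1, 48 = 1.4, etc.)
    static String javaRelease(int major) {
        return "Java " + (major - 44);
    }

    public static void main(String[] args) {
        // First 8 bytes of a class compiled for Java 6:
        // magic 0xCAFEBABE, minor_version = 0, major_version = 50 (0x32)
        byte[] header = {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE,
                         0x00, 0x00, 0x00, 0x32};
        int major = ((header[6] & 0xFF) << 8) | (header[7] & 0xFF);
        System.out.println(major + " -> " + javaRelease(major)); // prints "50 -> Java 6"
    }
}
```

So "wrong version 50.0, should be 49.0" means the Hadoop 0.20.2 classes were built for Java 6, while the toolchain here only accepts Java 5 classes.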


I ran into this exact same problem. It turned out sbt itself was running on Java 5, which is the default on my Mac for a silly but valid reason. Once I changed my sbt script to explicitly start with Java 6, everything worked fine.
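A quick way to see which JVM a build tool actually ends up on is to print the runtime's own system properties from inside the build. A small sketch (the class name is mine):

```java
public class WhichJvm {
    public static void main(String[] args) {
        // java.version and java.home describe the JVM executing this code,
        // which may differ from whatever `java` is first on your PATH.
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
    }
}
```

If this reports a 1.5 runtime when launched the same way your build tool is launched, that tool is the thing that needs to be pointed at a Java 6 JAVA_HOME.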


Regardless of your maven-compiler-plugin's source and target configuration (that only controls how your own source code is compiled), you must use a 1.6 JVM to run Hadoop's code, since it is compiled targeting a 1.6 JVM.

So, just install a Java 1.6 runtime and use it to run your program.
