Connecting to Cloudera VM from my desktop

https://www.devze.com 2023-03-27 07:27 Source: web
I downloaded the Cloudera VM on my Windows 7 laptop to play around. I am trying to connect to the Hadoop instance running in the VM from Windows. I ran ifconfig and got the IP address of the VM. I can reach the web interfaces running in the VM from Firefox on my Windows box, so I know I can connect at least to those.

So next, I tried to connect to Hadoop from Java.

import java.net.URI;
import java.net.URL;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;

public class FileSystemWriter
{
    static
    {
        // Register the hdfs:// URL stream handler once per JVM.
        URL.setURLStreamHandlerFactory( new FsUrlStreamHandlerFactory() );
    }

    public static void main( String[] args ) throws Exception
    {
        String uri = "hdfs://192.168.171.128/user";
        Configuration conf = new Configuration();

        System.out.println( "uri: " + uri );

        FileSystem fs = FileSystem.get( URI.create( uri ), conf );
    }
}

But I get errors.

uri: hdfs://192.168.171.128/user

Aug 9, 2011 8:29:26 AM org.apache.hadoop.ipc.Client$Connection
handleConnectionFailure
INFO: Retrying connect to server: /192.168.171.128:8020. Already tried
0 time(s).
Aug 9, 2011 8:29:28 AM org.apache.hadoop.ipc.Client$Connection
handleConnectionFailure
INFO: Retrying connect to server: /192.168.171.128:8020. Already tried
1 time(s).
Aug 9, 2011 8:29:30 AM org.apache.hadoop.ipc.Client$Connection
handleConnectionFailure
INFO: Retrying connect to server: /192.168.171.128:8020. Already tried
2 time(s).

Can anyone help me out?


First, try to connect over hftp.

        import java.net.URI;
        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;

        String uri = "hftp://172.16.xxx.xxx:50070/";
        System.out.println( "uri: " + uri );

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get( URI.create( uri ), conf );
        fs.printStatistics();

If you see something (no exceptions), then you are connected.

If you do not, then your problem is not HDFS; rather, you have a bad IP, Hadoop isn't running, your ports are blocked, etc.
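That reachability check can also be done below the HDFS layer. The sketch below (class name is mine; the host and port are assumptions taken from the retry log above) uses a plain TCP socket to tell "NameNode unreachable or port blocked" apart from an HDFS-level problem:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Sketch: probe the NameNode's RPC port with a raw TCP connect.
// If this fails, the problem is networking (bad IP, Hadoop not
// running, firewall), not the HDFS client code.
public class PortProbe {
    static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Substitute your VM's IP and the port from fs.default.name.
        System.out.println(reachable("192.168.171.128", 8020, 2000));
    }
}
```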


  1. Make sure your NameNode is listening on port 8020. You can check with this command:

    hadoop fs -ls hdfs://namenode(ip):8020
    
  2. If this check fails, open HADOOP_HOME/conf/core-site.xml (e.g. with vim) and look up the NameNode port in the fs.default.name entry.

  3. Change your java code:

    String uri = "hdfs://192.168.171.128:portOfNameNode/user";
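As a side note on the original error: the retry log shows port 8020 even though the URI had no port, because the HDFS client falls back to the default NameNode RPC port (8020) when none is given. A minimal sketch of that fallback behavior (the class and helper are mine, for illustration only):

```java
import java.net.URI;

// Sketch: when an hdfs:// URI omits the port, the client falls back to
// 8020, HDFS's default NameNode RPC port -- which is why the log above
// shows 192.168.171.128:8020 even though the URI had no port.
public class UriPortCheck {
    static int port(String uri) {
        int p = URI.create(uri).getPort(); // -1 when no port is present
        return p == -1 ? 8020 : p;
    }

    public static void main(String[] args) {
        System.out.println(port("hdfs://192.168.171.128/user"));      // falls back to 8020
        System.out.println(port("hdfs://192.168.171.128:9000/user")); // explicit 9000
    }
}
```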
    