Error Connecting to Cloudera Kerberized Hadoop Cluster from Unix CLI


I am trying to connect to a Kerberos-secured Cloudera Hadoop cluster from the Unix CLI of a devbox and am running into an issue.

Below is the command I am trying to run, and the error it gives me:

    ./bin/hadoop fs -ls hdfs://devhost02:8020

    WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    WARN security.UserGroupInformation: PriviledgedActionException as:abc (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    WARN security.UserGroupInformation: PriviledgedActionException as:abc (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "mylocalhost"; destination host is: "devhost02":8020;
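
If it helps the diagnosis, I can rerun the command with JDK Kerberos debugging turned on; a minimal sketch (sun.security.krb5.debug is a standard JDK system property, and HADOOP_OPTS is how the hadoop CLI picks up extra JVM options):

    # Turn on JDK Kerberos debug output, then rerun the failing command
    export HADOOP_OPTS="$HADOOP_OPTS -Dsun.security.krb5.debug=true"
    ./bin/hadoop fs -ls hdfs://devhost02:8020

The debug output shows which credential cache the JVM reads and which encryption types it can use, which should narrow down this kind of failure.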

My Kerberos/Hadoop admin created a keytab file and shared it with me. I have copied the keytab file to /home/myuser/abc.keytab on the devbox.
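
For reference, the contents of the keytab can be verified with klist (standard MIT Kerberos flags: -k lists keytab entries, -t adds timestamps, -e adds encryption types):

    # List the entries in the shared keytab, with enctypes
    klist -kte /home/myuser/abc.keytab

The principal shown there should match abc@xyz.refinery.dev exactly.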

From the devbox Unix CLI I am able to run the commands below.

The kinit command runs fine, with no error and no password prompt:

    kinit -k -t /home/myuser/abc.keytab abc@xyz.refinery.dev

The klist command also works; its output is below:

    Ticket cache: FILE:/tmp/krb5cc_30074
    Default principal: abc@xyz.refinery.dev

    Valid starting     Expires            Service principal
    07/20/17 16:11:42  07/21/17 16:11:42  krbtgt/xyz.refinery.dev@xyz.refinery.dev
            renew until 07/27/17 16:11:38
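
One thing worth confirming is that the hadoop command reads the same credential cache that klist reports; KRB5CCNAME is the standard MIT Kerberos variable for this:

    # The Hadoop client should read the same cache klist is reporting
    echo "$KRB5CCNAME"                      # empty, or FILE:/tmp/krb5cc_30074
    export KRB5CCNAME=FILE:/tmp/krb5cc_30074
    ./bin/hadoop fs -ls hdfs://devhost02:8020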

I have configured the krb5.conf file on the devbox under /etc/krb5.conf with the values below.

    [libdefaults]
     default_realm = xyz.refinery.dev
     dns_lookup_kdc = false
     dns_lookup_realm = false
     ticket_lifetime = 86400
     renew_lifetime = 604800
     forwardable = true
     default_tgs_enctypes = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac des-hmac-sha1 des-cbc-md5
     default_tkt_enctypes = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac des-hmac-sha1 des-cbc-md5
     permitted_enctypes = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96 des3-cbc-sha1 arcfour-hmac des-hmac-sha1 des-cbc-md5
     udp_preference_limit = 1

    [realms]
     xyz.refinery.dev = {
      kdc = devhost01
      admin_server = devhost01
     }
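
Since aes256-cts-hmac-sha1-96 is listed first in default_tkt_enctypes, the TGT I get is most likely AES-256; the enctypes actually in use can be confirmed with klist's -e flag:

    # Show the encryption types of the cached ticket
    klist -e

This ties into the Java 7 JCE note further below.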

Below is the HDFS site XML snippet (NameNode HA settings) of the Hadoop environment:

    <configuration>
      <property>
        <name>dfs.nameservices</name>
        <value>namsvc10</value>
      </property>
      <property>
        <name>dfs.ha.namenodes.namsvc10</name>
        <value>namenode1,namenode2</value>
      </property>
      <property>
        <name>dfs.namenode.rpc-address.namsvc10.namenode1</name>
        <value>devhost01:8020</value>
      </property>
      <property>
        <name>dfs.namenode.rpc-address.namsvc10.namenode2</name>
        <value>devhost02:8020</value>
      </property>
      <property>
        <name>dfs.namenode.http-address.namsvc10.namenode1</name>
        <value>devhost01:50070</value>
      </property>
      <property>
        <name>dfs.namenode.http-address.namsvc10.namenode2</name>
        <value>devhost02:50070</value>
      </property>
      <property>
        <name>dfs.client.failover.proxy.provider.namsvc10</name>
        <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
      </property>
    </configuration>
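
Given this HA setup, the cluster is normally addressed through the nameservice rather than one NameNode host, since devhost02 could be the standby at any given time. The equivalent command using the nameservice defined above would be:

    # Address the HA nameservice instead of one NameNode directly
    ./bin/hadoop fs -ls hdfs://namsvc10/

One thing I should also double-check on the client side is that hadoop.security.authentication is set to kerberos in core-site.xml, since the client otherwise falls back to simple auth; a sketch of what I would expect there (standard Hadoop property names, values assumed):

    <property>
      <name>hadoop.security.authentication</name>
      <value>kerberos</value>
    </property>
    <property>
      <name>hadoop.security.authorization</name>
      <value>true</value>
    </property>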

I am able to generate a ticket from the shell using the kinit and klist commands, yet from that same shell I get the error listed above. I am also able to ping the Hadoop hosts that appear in the [./bin/hadoop fs -ls hdfs://devhost02:8020] command from the CLI on the devbox.
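
Since ping only proves basic network reachability, the NameNode RPC port itself can be checked separately (nc is the standard netcat utility; -z scans without sending data, -v is verbose):

    # Verify the NameNode RPC port is reachable, not just ICMP
    nc -vz devhost02 8020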

Also, our Hadoop admin suggested that I need a local Unix account on the devbox with the same name as the Kerberos user: our Kerberos user is "abc", so on the devbox I need to be logged in as "abc" and then run the above command for it to work. I implemented this but still get the same error.
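
For what it's worth, the short name that Hadoop derives from the principal (via the hadoop.security.auth_to_local rules) can be checked directly; Hadoop ships a small utility class for this:

    # Print the local short name Hadoop maps the principal to
    whoami                                                        # should print: abc
    hadoop org.apache.hadoop.security.HadoopKerberosName abc@xyz.refinery.dev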

Any idea what I am missing and how to go about getting this to work? Your guidance is appreciated.

Please let me know if you need more information from our end.

Below are the complete details of the various artifacts installed on the devbox.

    java version "1.7.0_121"
    Java(TM) SE Runtime Environment (build 1.7.0_121-b15)
    Java HotSpot(TM) 64-Bit Server VM (build 24.121-b15, mixed mode)
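
A known gotcha with this combination: Java 7 ships with the limited-strength JCE policy, which caps AES keys at 128 bits, while the krb5.conf above puts aes256-cts first, so the TGT is likely AES-256. If the unlimited-strength JCE policy files are not installed under $JAVA_HOME/jre/lib/security, the JVM cannot read the ticket and reports exactly "Failed to find any Kerberos tgt". A quick check from the same JDK (jrunscript ships with the JDK; the threshold logic is the standard JCE check):

    # Prints true only if the unlimited-strength JCE policy is installed
    jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES") >= 256)'

If this prints false, installing Oracle's JCE Unlimited Strength policy jars for Java 7 (local_policy.jar and US_export_policy.jar) is the usual fix.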

- hadoop-2.6.0-cdh5.10.1 package
- krb5-libs and krb5-workstation packages
- krb5.conf with the required configuration
- the keytab file
- Linux myhostname 2.6.32-642.13.1.el6.x86_64 #1 SMP Wed Nov 23 16:03:01 EST 2016 x86_64 x86_64 x86_64 GNU/Linux

This is a prerequisite for a Splunk-Hadoop integration, but the task above has no dependency on Splunk, so I am hoping you can guide me.

Thanking you in advance.

