hadoop - Can't put files into HDFS


I'm trying to set up a Hadoop multi-node cluster and I'm running into the following problem. I have one master node and one slave node.

Everything seems right, because when I execute `jps` I get the following processes on the master:

    29983 SecondaryNameNode
    30596 Jps
    29671 NameNode
    30142 ResourceManager

and these on the slave:

    18096 NodeManager
    17847 DataNode
    18197 Jps

Unfortunately, when I try the -put command, I get this error:

    hduser@master:/usr/local/hadoop/bin$ ./hdfs dfs -put /home/hduser/ejemplos/fichero /ejemplos/
    14/03/24 12:49:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    14/03/24 12:49:07 WARN hdfs.DFSClient: DataStreamer Exception
    org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /ejemplos/fichero._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.

When I go to the WebUI, it shows 0 live nodes and I don't know why! I can't fix this error and I would appreciate any help!
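As a first sanity check (a sketch, assuming the standard Hadoop CLI is on the PATH, as in the prompt above), you can ask the NameNode directly how many datanodes it sees, instead of relying on the WebUI:

```shell
# Print a cluster summary from the NameNode's point of view.
# "Live datanodes (0)" here would match the 0 live nodes in the WebUI.
hdfs dfsadmin -report
```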

You want to check the log files of the data node (slave) for errors in your setup. If you run Cloudera CDH, you'll find these in /var/log/hadoop-hdfs, otherwise in the directory specified in your config.
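For example (a sketch; the log directory and the `hadoop-hduser-datanode-slave.log` file name assume a plain tarball install under /usr/local/hadoop with the user and hostnames from the question, so adjust them for your setup):

```shell
# On the slave, inspect the most recent DataNode log entries.
tail -n 100 /usr/local/hadoop/logs/hadoop-hduser-datanode-slave.log
```

An exception in this log (for instance a failure to connect to the NameNode, or an incompatible/mismatched clusterID after a reformat) usually explains why the node never registers and you end up with "0 datanode(s) running".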

The error "could only be replicated to 0 nodes" points to a problem there.

Also make sure the slave and master can connect to each other via SSH key authentication.
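A quick way to verify that (assuming the `hduser` account and the `slave` hostname from the question):

```shell
# From the master: this should log in and exit without any password prompt.
ssh hduser@slave exit && echo "passwordless SSH to slave works"
```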

Just a quick question: did you format the namenode?
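If not, the usual first-time sequence looks like this (a sketch; note that formatting erases HDFS metadata, and reformatting a cluster that already has data can leave datanodes with a clusterID that no longer matches the NameNode's):

```shell
# Stop HDFS, format the NameNode's metadata directory, then start again.
stop-dfs.sh
hdfs namenode -format
start-dfs.sh
```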

