FIRST, YOU NEED TO HAVE THE HDFS JAVA API
-Go to the namenode: ssh namenode1
-Go to the home directory: cd /home/
-Check which files you have: ls
-Download the HDFS Java API: copy the link of the file from the website, then run wget (paste the link here)
-Check that the file was downloaded: ls
-Extract the downloaded file: tar -zxvf hdfs-java-api.tar.gz
-Check that the file was extracted: ls

SECOND, CONFIGURE YOUR JAVA API
-Go to the hdfs-java-api folder: cd hdfs-java-api
-Check the files available in the hdfs-java-api folder: ls
-Edit the build.xml file: vi build.xml, press i to enter insert mode, then replace /usr/local/ with /home/; save it (press Esc, then type :wq)
-Edit start.sh the same way as build.xml
-Now compile your API: ant

THIRD, CREATE INPUT FILES
-Create a text file: vi (filename).txt, type your text, then save it (press Esc, then type :wq)
-Check that the file exists: ls
-Check the content of the file: cat (filename).txt
-Copy the file to HDFS (storage for input and output): hadoop fs -put (filename).txt /

FOURTH, START YOUR JAVA API TO EXECUTE
-Go to hdfs-java-api: cd hdfs-java-api
-Find your class (in the bin folder), copy its path, and run it: ./start.sh org/apache/hadoop/examples/WordCount /(inputfilename).txt /(outputFolderName)
-Check the output: hadoop fs -ls /(outputFolderName)
-Check the content of the result file: hadoop fs -cat /(outputFolderName)/part-r-00000
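
The WordCount job above reads the input file from HDFS and writes one word-and-count line per distinct word into part-r-00000. To see what output to expect before running the cluster job, the same counts can be reproduced locally with standard shell tools (a sketch only, not a Hadoop command; sample.txt and its contents are made up for illustration):

```shell
# Create a tiny sample input file (hypothetical contents)
printf 'hello world\nhello hadoop\n' > sample.txt

# One word per line, sort so duplicates are adjacent, count them,
# then print in WordCount's "word<TAB>count" format
tr -s ' ' '\n' < sample.txt | sort | uniq -c | awk '{print $2"\t"$1}'
# hadoop  1
# hello   2
# world   1
```

If the output of hadoop fs -cat for part-r-00000 looks like this (each word, a tab, then its count), the job ran correctly.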