JAVA APIs for Copying Files from HDFS to LFS


In our previous blog, we discussed copying files from the Local File System (LFS) to HDFS.

In this blog, we will implement copying a file from HDFS to the Local File System.

We will start our discussion with the code snippet below, which needs to be written in Eclipse. We then make a jar file from this code and execute it to copy the file from HDFS to the Local File System.

You can refer to this link to understand how to write a MapReduce program in Java and then execute it by making its jar file.

Find below the code snippet for copying a file from HDFS to LFS:

import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class HdfsReader extends Configured implements Tool {

    public int run(String[] args) throws Exception {
        // First command-line argument is the destination path on the local file system
        String localOutputPath = args[0];
        Configuration conf = getConf();
        // Get a handle to the default (HDFS) file system
        FileSystem fs = FileSystem.get(conf);
        // Open the source file in HDFS; open() returns an InputStream over its contents
        InputStream is = fs.open(new Path("hdfs:/acadgild.txt"));
        // The data is copied into the local path through a buffered output stream
        OutputStream os = new BufferedOutputStream(new FileOutputStream(localOutputPath));
        // Copy all bytes from HDFS to the local file and close both streams when done
        IOUtils.copyBytes(is, os, conf, true);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        int returnCode = ToolRunner.run(new HdfsReader(), args);
        System.exit(returnCode);
    }
}

 

Explanation of the above code:

HdfsReader calls the FileSystem method open() to open a file in HDFS, which returns an InputStream object that can be used to read the contents of the file. IOUtils.copyBytes() then writes those bytes into the local output file through the buffered output stream and closes both streams when it is done.
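
As a side note, the FileSystem API also provides copyToLocalFile(), which achieves the same result without handling the streams manually. Below is a minimal sketch; the class name HdfsCopyToLocal is only illustrative, and it assumes the same /acadgild.txt source in HDFS with the local destination path passed as the first argument:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsCopyToLocal {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // copyToLocalFile(src, dst): src is the HDFS path, dst is the local path
        fs.copyToLocalFile(new Path("/acadgild.txt"), new Path(args[0]));
        fs.close();
    }
}

This variant is shorter, while the stream-based version above gives more control, for example over buffering.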

Steps to execute the above code:

Step 1: We need to ensure that the file we want to copy to the Local File System is present in HDFS.

 

hadoop dfs -ls /

 

Step 2: Make a jar file of the above code and run that jar file in the Hadoop environment to which the file needs to be copied.
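
If you prefer building the jar from the command line instead of exporting it from Eclipse, a rough equivalent (assuming Hadoop is installed on the machine and HdfsReader.java sits in the current directory) could be:

javac -classpath $(hadoop classpath) HdfsReader.java
jar cfe HDFS_READ.jar HdfsReader HdfsReader*.class

The e option of jar sets HdfsReader as the entry point, so the jar can be run without naming the main class, as in the command below.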

With the execution of the command below, the contents of the file acadgild.txt will be copied into the local file test_ip at the path /home/acadgild/Desktop:

hadoop jar HDFS_READ.jar /home/acadgild/Desktop/test_ip

 

Step 3: Type a command to check whether the content of the file acadgild.txt has been copied to the specified location or not.
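
For example, the copied content can be viewed on the local machine with a simple cat of the destination path used in Step 2:

cat /home/acadgild/Desktop/test_ip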

Refer to the screenshot below for details:

 

We can see that the content of the file acadgild.txt has been copied to the specified location.

We hope this blog helped you understand the Java APIs used for copying files from HDFS to LFS.

Keep visiting our website https://acadgild.com/blog for more blogs on Big Data and other technologies.
