Copying Data Using NFS for the HPE Ezmeral Data Fabric
Describes how to copy files from one data-fabric cluster to another using NFS for the HPE Ezmeral Data Fabric.
If NFS for the HPE Ezmeral Data Fabric is installed on the data-fabric cluster, you can mount the data-fabric cluster on the HDFS cluster and then copy files from one cluster to the other using hadoop distcp. If you do not have NFS for the HPE Ezmeral Data Fabric installed and a mount point configured, see Accessing Data with NFS v3 and Managing the HPE Ezmeral Data Fabric NFS Service.
<Data Fabric NFS Server> - the IP address or hostname of the NFS server in the data-fabric cluster
<maprfs_nfs_mount> - the NFS export mount point configured on the data-fabric cluster; the default is /mapr
<hdfs_nfs_mount> - the NFS for the HPE Ezmeral Data Fabric mount point configured on the HDFS cluster
<NameNode> - the IP address or hostname of the NameNode in the HDFS cluster
<NameNode Port> - the port on the NameNode in the HDFS cluster
<HDFS path> - the path to the HDFS directory from which you plan to copy data
<MapR filesystem path> - the path in the data-fabric cluster to which you plan to copy HDFS data
- Mount the data-fabric cluster. Issue the following command on the HDFS cluster to mount the data-fabric cluster at the NFS for the HPE Ezmeral Data Fabric mount point:
mount <Data Fabric NFS Server>:/<maprfs_nfs_mount> /<hdfs_nfs_mount>
Example: mount 10.10.100.175:/mapr /hdfsmount
- Copy data.
- Issue the following command to copy data from the HDFS cluster to the data-fabric cluster:
hadoop distcp hdfs://<NameNode>:<NameNode Port>/<HDFS path> file:///<hdfs_nfs_mount>/<MapR filesystem path>
Example: hadoop distcp hdfs://nn1:8020/user/sara/file.txt file:///hdfsmount/user/sara
- Issue the following command from the data-fabric cluster to verify that the file was copied to the data-fabric cluster:
hadoop fs -ls /<MapR filesystem path>
Example: hadoop fs -ls /user/sara
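The steps above can be sketched end-to-end as a small shell script. The values below (NFS server address, export, mount point, NameNode host and port, and paths) are hypothetical placeholders taken from the examples in this topic; substitute your own before use. As a safety measure, the sketch echoes each command for review rather than executing it; remove the echo wrappers to run the commands.

```shell
#!/bin/sh
# Hypothetical values; replace with the details of your clusters.
NFS_SERVER="10.10.100.175"      # data-fabric NFS server
MAPR_EXPORT="mapr"              # NFS export on the data-fabric cluster (default /mapr)
HDFS_NFS_MOUNT="/hdfsmount"     # mount point configured on the HDFS cluster
NAMENODE="nn1"                  # HDFS NameNode host
NN_PORT="8020"                  # HDFS NameNode port
HDFS_PATH="user/sara/file.txt"  # source path in HDFS
MAPR_PATH="user/sara"           # destination path in the data-fabric cluster

# 1. Mount the data-fabric cluster on the HDFS cluster.
echo "mount ${NFS_SERVER}:/${MAPR_EXPORT} ${HDFS_NFS_MOUNT}"

# 2. Copy data from HDFS to the data-fabric cluster through the NFS mount.
echo "hadoop distcp hdfs://${NAMENODE}:${NN_PORT}/${HDFS_PATH} file://${HDFS_NFS_MOUNT}/${MAPR_PATH}"

# 3. Verify the copy from the data-fabric cluster.
echo "hadoop fs -ls /${MAPR_PATH}"
```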