Importing an External Network File System Server

Import an external NFS server into Data Fabric so that you can transfer data from Data Fabric to the external NFS server and share that data across the clusters in the global namespace or cluster group.

About this task

You can import an external network file system (NFS) server into the global namespace so that the external NFS server is available and accessible to all clusters in the global namespace.

After you import an external NFS server, you can transfer data from the Data Fabric cluster to the external NFS server. The transferred data is then shareable among all clusters in the global namespace.

NOTE
NFSv4-compliant servers can be imported into Data Fabric.
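
One way to confirm NFSv4 compliance before importing a server is to attempt an NFSv4 mount from any Linux client. The server name and export path below are placeholders, not values from this guide:

  # Attempt an NFSv4 mount (placeholder server and export path)
  sudo mount -t nfs4 extnfs.example.com:/export /mnt/nfstest
  # Check the negotiated NFS version on existing mounts
  nfsstat -m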

When you import an external NFS server, you can specify one or more hostnames or IP addresses assigned to the NFS server. If multiple network interface controllers are attached to the NFS server, the server can be identified by multiple IP addresses or hostnames.
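
For example, an NFS server reachable over two network interfaces and a DNS name could be specified as the following comma-separated string (the values are placeholders):

  10.10.30.1,10.10.30.2,extnfs.example.com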

Follow the steps given below to import an external NFS server into Data Fabric.

Procedure

  1. Log on to the Data Fabric UI.
  2. Select Fabric manager on the Home page.
  3. Click Global namespace.
  4. Click Import External NFS.
  5. Enter the name for the NFS server in NFS name.
  6. Enter the IP addresses or the hostnames for the external NFS server as a comma-separated string in IP address or hostname.
  7. Click Import.

Results

The NFS server is imported into Data Fabric and is visible in the list of resources in the global namespace.

After the import, you can transfer data to the NFS server.

Related maprcli Commands
To implement the features described on this page, the Data Fabric UI relies on the following maprcli command. The command is provided for general reference. For more information, see maprcli Commands in This Guide.
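
For illustration only, a command-line import might resemble the following sketch. The command name (clustergroup addexternal) and the parameter names shown here are assumptions and should be verified against the maprcli command reference before use; the server name and addresses are placeholders.

  # Hypothetical sketch -- confirm the actual command and parameters in the maprcli reference
  maprcli clustergroup addexternal -type nfs -name extnfs1 -hosts 10.10.30.1,10.10.30.2,extnfs.example.com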