Getting Started with Data Fabric MCP
The following sections describe the prerequisites and steps required to use the MCP server.
Prerequisites
The Data Fabric MCP server requires the following components:
- Data Access Gateway (DAG): the MCP server runs within DAG.
- To use the MCP Object Store endpoints, you must have the Data Fabric Object Store installed and configured in the cluster. See Installing the HPE Data Fabric Object Store for more information.
- To use the MCP Apache Iceberg endpoints, you must have Apache Spark, Apache Livy, and the Apache Hive metastore installed and configured in the cluster. See the following documentation for more information:
  - For Apache Livy, see Livy.
  - For Apache Spark, see Apache Spark.
  - For Apache Hive, see Hive.
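Once these components are in place, one way to confirm that the MCP server is reachable is to connect with an MCP client and list the tools it advertises. The following is a minimal sketch, assuming the official MCP Python SDK (`pip install mcp`) and a streamable-HTTP MCP endpoint exposed by DAG; the host, port, and path in MCP_URL are placeholders, not documented values.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# Placeholder endpoint -- replace with your DAG host, HTTPS port, and MCP path.
MCP_URL = "https://dag.example.com:8443/mcp"

async def main() -> None:
    # Open a streamable-HTTP connection to the MCP server.
    async with streamablehttp_client(MCP_URL) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # tools advertised by the server
            for tool in tools.tools:
                print(tool.name)

if __name__ == "__main__":
    asyncio.run(main())
```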
Security
Encryption
The Data Fabric MCP server listens on an HTTPS port. Communication between the LLM agent and the Data Fabric MCP server is encrypted using SSL/TLS.
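In practice, a client needs to trust the certificate authority that signed the server's TLS certificate. The sketch below illustrates this with the httpx library; the endpoint URL and CA-bundle path are placeholders and depend on your cluster.

```python
import httpx

MCP_URL = "https://dag.example.com:8443/mcp"  # placeholder: DAG host, HTTPS port, and MCP path
CA_BUNDLE = "/path/to/cluster-ca.pem"         # placeholder: CA bundle that signed the server certificate

# verify= points certificate validation at the cluster CA instead of the
# system trust store; the request fails if the TLS chain cannot be verified.
with httpx.Client(verify=CA_BUNDLE) as client:
    response = client.get(MCP_URL)
    print(response.status_code)
```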
Authentication
OAuth is recommended for authentication. See Understanding Authorization in MCP.
NOTE
The Data Fabric MCP server supports both the OAuth and HTTP Basic authentication schemes. However, OAuth is recommended for production environments.
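As an illustration, the sketch below shows how a client could attach each credential type to its HTTPS requests, again using httpx; the endpoint, token, and account values are placeholders.

```python
import httpx

MCP_URL = "https://dag.example.com:8443/mcp"  # placeholder endpoint
CA_BUNDLE = "/path/to/cluster-ca.pem"         # placeholder CA bundle

# OAuth (recommended): send the access token issued by your identity
# provider as a bearer token on every request.
oauth_client = httpx.Client(
    verify=CA_BUNDLE,
    headers={"Authorization": "Bearer <access-token>"},  # placeholder token
)

# HTTP Basic: cluster user credentials; better suited to test environments.
basic_client = httpx.Client(
    verify=CA_BUNDLE,
    auth=httpx.BasicAuth("cluster-user", "cluster-password"),  # placeholder credentials
)
```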
Access Control
The Data Fabric MCP server provides role-based access control for the data stored in the Data Fabric cluster. For details, see the respective endpoint pages under Data Fabric MCP Endpoints.
Configure MCP
To configure the MCP server, see Configuring Data Fabric MCP Server.