Building a Flink Job with Maven
Describes how to build a Flink job with Maven.
Overview
Currently, only the DataStream API is supported for Flink on the Data Fabric Ecosystem Pack (DEP).
Application development for DEP Flink differs from Apache Flink only in the use of DEP-specific Maven dependencies in place of the Apache-specific ones.
For more information on how to write Flink jobs, see the Apache Flink DataStream API documentation.
The following sections describe the differences in project configuration between Flink on DEP and Apache Flink.
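The job code itself is standard Apache Flink. The following is a minimal sketch of a DataStream job that builds and runs unchanged on DEP, assuming a Flink version that provides StreamExecutionEnvironment.fromData (1.18 or later); the class and job names are illustrative.
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class WordLengthJob {
    public static void main(String[] args) throws Exception {
        // Standard DataStream API; nothing here is DEP-specific.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromData("flink", "on", "dep")
           .map((MapFunction<String, Integer>) String::length)
           .print();

        env.execute("Word length job");
    }
}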
DF Streams integration
To enable DF Streams integration in flink-connector-kafka, use the applicable version of the dependency in your project. The following is an example for the DEP 10.0.0 version:
<dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka</artifactId>
        <version>4.0.0.0-dep-1000</version>
</dependency>
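With this dependency in place, the connector is used exactly as in Apache Flink. The following is a minimal sketch of a KafkaSource reading from a DF Streams topic; the bootstrap server address, topic, and group ID are placeholders.
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class DfStreamsReadJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder addresses and names; replace with your DF Streams endpoints.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("broker-1:9092")
                .setTopics("input-topic")
                .setGroupId("dep-flink-example")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "DF Streams Source")
           .print();

        env.execute("DF Streams read job");
    }
}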
DEP Schema Registry integration
If you use Schema Registry client libraries (for example, the Avro serializer), you must switch to a DEP-specific version. The following is an example for the DEP 10.0.0 version:
<dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <version>7.6.0.400-dep-1000</version>
        <exclusions>
                <!-- Use the ZooKeeper provided by the cluster; otherwise it might cause a NoSuchMethodError -->
                <exclusion>
                        <groupId>org.apache.zookeeper</groupId>
                        <artifactId>zookeeper</artifactId>
                </exclusion>
        </exclusions>
</dependency>
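As a usage sketch, Avro records can then be read against the registry with Flink's Confluent registry format. This assumes the flink-avro-confluent-registry artifact (at its DEP-specific version) is also on the classpath; the registry URL, broker address, topic, group ID, and reader schema below are placeholders.
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SchemaRegistryReadJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder reader schema; in practice, use the schema registered for your topic.
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"string\"}]}");

        KafkaSource<GenericRecord> source = KafkaSource.<GenericRecord>builder()
                .setBootstrapServers("broker-1:9092")
                .setTopics("avro-topic")
                .setGroupId("dep-flink-avro-example")
                .setStartingOffsets(OffsetsInitializer.earliest())
                // Looks up writer schemas in the Schema Registry at the given URL.
                .setValueOnlyDeserializer(ConfluentRegistryAvroDeserializationSchema.forGeneric(
                        schema, "http://schema-registry:8081"))
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "Avro DF Streams Source")
           .map(GenericRecord::toString)
           .print();

        env.execute("Schema Registry read job");
    }
}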