
Spark Streaming and Kafka Integration


Here are the Maven dependencies of our project:

Note: To convert your Java project into a Maven project, right-click on the project → Configure → Convert to Maven Project.

Now, in the project's pom.xml file, add the following dependency configuration. All the required dependencies will then be downloaded automatically.

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>1.6.3</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka_2.11 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka_2.11</artifactId>
        <version>1.6.3</version>
    </dependency>
</dependencies>

This is how you can perform Spark Streaming and Kafka integration in a simple way: create the producers, topics, and brokers from the command line, and then consume them from your application through the KafkaUtils createStream method.
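For reference, here is a minimal sketch of the receiver-based approach with KafkaUtils.createStream (available in spark-streaming-kafka_2.11 1.6.3). The topic name, consumer group id, ZooKeeper address, and batch interval are assumed values; adjust them to match the topic and broker you created from the command line.

import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class KafkaSparkStreamingExample {
    public static void main(String[] args) throws InterruptedException {
        // Local streaming context with a 10-second batch interval (assumed)
        SparkConf conf = new SparkConf()
                .setAppName("KafkaSparkStreaming")
                .setMaster("local[2]");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

        // Topic name and number of consumer threads per topic (assumed values)
        Map<String, Integer> topics = new HashMap<>();
        topics.put("acadgild-topic", 1);

        // Receiver-based stream: connects through ZooKeeper (localhost:2181 assumed)
        // using an arbitrary consumer group id
        JavaPairReceiverInputDStream<String, String> kafkaStream =
                KafkaUtils.createStream(jssc, "localhost:2181", "spark-streaming-consumer-group", topics);

        // Print the message values received in each batch
        kafkaStream.map(tuple -> tuple._2()).print();

        jssc.start();
        jssc.awaitTermination();
    }
}

With this running, any messages typed into the command-line Kafka console producer for the same topic should appear in the streaming application's output for the next batch.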

We hope this blog helped you understand how to build an application with Spark Streaming and Kafka integration.

Enroll for Apache Spark Training conducted by Acadgild for successful career growth.
