Spring and Kafka
Commit 178f5e5b authored Apr 06, 2023 by Lokesh Singh
spring boot and kafka poc
Showing 14 changed files with 793 additions and 0 deletions
.gitignore  +40 -0
README.md  +259 -0
pom.xml  +55 -0
src/main/java/com/lokesh/spring/kafka/SpringBootKafkaApplication.java  +12 -0
src/main/java/com/lokesh/spring/kafka/config/KafkaConsumerConfig.java  +104 -0
src/main/java/com/lokesh/spring/kafka/config/KafkaProducerConfig.java  +92 -0
src/main/java/com/lokesh/spring/kafka/controllers/KafkaController.java  +36 -0
src/main/java/com/lokesh/spring/kafka/model/Message.java  +19 -0
src/main/java/com/lokesh/spring/kafka/model/SuperHero.java  +76 -0
src/main/java/com/lokesh/spring/kafka/service/ConsumerService.java  +27 -0
src/main/java/com/lokesh/spring/kafka/service/ProducerService.java  +38 -0
src/main/resources/Output.PNG  +0 -0
src/main/resources/application.yml  +19 -0
src/test/java/com/lokesh/spring/kafka/SpringBootKafkaApplicationTests.java  +16 -0
.gitignore
0 → 100644
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**
!**/src/test/**
/mvnw.bat
/mvnw.cmd
/mvnw
/.mvn/
### STS ###
.apt_generated
.classpath
.factorypath
.project
.settings
.springBeans
.sts4-cache
### IntelliJ IDEA ###
.idea
*.iws
*.iml
*.ipr
### NetBeans ###
/nbproject/private/
/nbbuild/
/dist/
/nbdist/
/.nb-gradle/
build/
### VS Code ###
.vscode/
README.md
0 → 100644
# spring-boot-kafka
Spring Boot Kafka application with multiple producers and multiple consumers for String data and a JSON object.

This project explains how to **publish** a message to a Kafka topic and **consume** a message from a Kafka topic. Messages are in String and JSON object format.

In this application there are two publishers: one for String data and another for publishing an object. Two `KafkaTemplate`s are used for these publishers. To consume those messages and objects, two consumers are used, each with its own `@KafkaListener`.
## Prerequisites
- Java
- [Spring Boot](https://spring.io/projects/spring-boot)
- [Maven](https://maven.apache.org/guides/index.html)
- [Zookeeper](https://zookeeper.apache.org/)
- [Kafka](https://kafka.apache.org/documentation/)
## Tools
- Eclipse or IntelliJ IDEA (or any preferred IDE) with embedded Maven
- Maven (version >= 3.6.0)
- Postman (or any RESTful API testing tool)
- Kafka command-line tools
### Install Zookeeper
Step 1: Download apache-zookeeper-x.x.x from the [Zookeeper site](https://zookeeper.apache.org/releases.html).

Step 2: Extract the folder to c:\apache-zookeeper-x.x.x.

Step 3: Add c:\apache-zookeeper-x.x.x\bin to the PATH environment variable.
### Install Kafka
Step 1: Download kafka_x.xx-x.x.x from the [Apache Kafka site](https://kafka.apache.org).

Step 2: Extract the folder to c:\kafka_x.xx-x.x.x.

Step 3: Add c:\kafka_x.xx-x.x.x\bin to the PATH environment variable.
##### Start the ZooKeeper and Kafka servers using the commands below
##### Start ZooKeeper
Go to the Zookeeper installation directory in a terminal and run:
> `cd c:\apache-zookeeper-x.x.x`
> `zkserver`

<!-- or -->
<!-- > `.\bin\zookeeper-server-start.sh .\config\zookeeper.properties` -->

If there are no errors on the console, Zookeeper is started and running.
##### Start Kafka Server
Go to the Kafka installation directory in a terminal and run:
> `cd c:\kafka_x.xx-x.x.x`
> `.\bin\windows\kafka-server-start.bat .\config\server.properties`

If there are no errors on the console, Apache Kafka is started and running.
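##### Create the topics (optional)
With default broker settings Kafka auto-creates topics on first use, so this step is usually not needed for a local POC. If topic auto-creation is disabled on your broker, the two topics used by this project can be created with the `kafka-topics` script (syntax assumes Kafka 2.2+; older releases take `--zookeeper localhost:2181` instead of `--bootstrap-server`):
> `.\bin\windows\kafka-topics.bat --create --bootstrap-server localhost:9092 --topic message-topic`
> `.\bin\windows\kafka-topics.bat --create --bootstrap-server localhost:9092 --topic superhero-topic`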
### Code Snippets
1. #### Maven Dependencies
Add the below dependency in **pom.xml** to enable Kafka support.
```
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
```
2. #### Properties file
Some properties are read from the **application.yml** file, such as the bootstrap servers, group id and topics.
Here we have two topics to publish and consume data:
> message-topic (for String data)
> superhero-topic (for SuperHero objects)

**src/main/resources/application.yml**
```
spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      group-id: group_id
    producer:
      bootstrap-servers: localhost:9092
    topic: message-topic
    superhero-topic: superhero-topic
```
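The topics themselves are not declared anywhere in this commit; the broker is expected to create them (automatically, or manually as described in the install section). As an illustrative alternative that is not part of this project, Spring Kafka also lets you declare topics as `NewTopic` beans, which Spring Boot's auto-configured `KafkaAdmin` creates on startup:
```
// hypothetical sketch, not included in this commit
@Bean
public NewTopic messageTopic() {
    // assumes a single-broker local setup: 1 partition, replication factor 1
    return new NewTopic("message-topic", 1, (short) 1);
}

@Bean
public NewTopic superHeroTopic() {
    return new NewTopic("superhero-topic", 1, (short) 1);
}
```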
3. #### Model class
This is the model class that we publish to the Kafka topic using `KafkaTemplate` and consume from the same topic using `@KafkaListener`.
**src/main/java/com/lokesh/spring/kafka/model/SuperHero.java**
```
public class SuperHero implements Serializable {

    private String name;
    private String superName;
    private String profession;
    private int age;
    private boolean canFly;

    // Constructor, Getter and Setter
}
```
4. #### Kafka Configuration
The Kafka producer related configuration is in the **src/main/java/com/lokesh/spring/kafka/config/KafkaProducerConfig.java** class.
This class is marked with the `@Configuration` annotation. For the JSON producer we set the value serializer property to `JsonSerializer.class` and pass that factory to a `KafkaTemplate`.
For the String producer we set the value serializer property to `StringSerializer.class` and pass that factory to a new `KafkaTemplate`.
- Json Producer configuration
```
configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
```
```
@Bean
public <T> KafkaTemplate<String, T> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}
```
- String Producer configuration
```
configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
```
```
@Bean
public KafkaTemplate<String, String> kafkaStringTemplate() {
    return new KafkaTemplate<>(producerStringFactory());
}
```
The Kafka consumer related configuration is in the **src/main/java/com/lokesh/spring/kafka/config/KafkaConsumerConfig.java** class.
This class is marked with the `@Configuration` and `@EnableKafka` annotations (`@EnableKafka` is mandatory, on a config class or the main class, to consume messages).
For the JSON consumer we set the value deserializer property to `JsonDeserializer.class` and pass that to the `ConsumerFactory`.
For the String consumer we set the value deserializer property to `StringDeserializer.class` and pass that to a new `ConsumerFactory`.
- Json Consumer configuration
```
@Bean
public ConsumerFactory<String, SuperHero> consumerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    config.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
    config.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    config.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(), new JsonDeserializer<>(SuperHero.class));
}

@Bean
public <T> ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerJsonFactory() {
    ConcurrentKafkaListenerContainerFactory<String, SuperHero> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setMessageConverter(new StringJsonMessageConverter());
    factory.setBatchListener(true);
    return factory;
}
```
- String Consumer configuration
```
@Bean
public ConsumerFactory<String, String> stringConsumerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    config.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
    config.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    config.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(), new StringDeserializer());
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerStringFactory() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(stringConsumerFactory());
    factory.setBatchListener(true);
    return factory;
}
```
5. #### Publishing data to Kafka Topic
In the **src/main/java/com/lokesh/spring/kafka/service/ProducerService.java** class, both the String and the JSON `KafkaTemplate` are autowired,
and using their `send()` method we can publish data to the Kafka topics (a sketch of inspecting the send result follows these snippets).
- Publishing Json Object
```
@Autowired
private KafkaTemplate<String, T> kafkaTemplateSuperHero;

public void sendSuperHeroMessage(T superHero) {
    logger.info("#### -> Publishing SuperHero :: {}", superHero);
    kafkaTemplateSuperHero.send(superHeroTopic, superHero);
}
```
- Publishing String message
```
@Autowired
private KafkaTemplate<String, String> kafkaTemplate;

public void sendMessage(String message) {
    logger.info("#### -> Publishing message -> {}", message);
    kafkaTemplate.send(topic, message);
}
```
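`send()` is asynchronous and returns immediately, so a failed publish is not visible from the snippets above. As an illustrative sketch (not part of this commit), assuming spring-kafka 2.x where `send()` returns a `ListenableFuture<SendResult<K, V>>`, the outcome can be logged with a callback:
```
// hypothetical sketch: log success or failure of the asynchronous send
kafkaTemplate.send(topic, message).addCallback(
        result -> logger.info("#### -> Delivered to partition {}", result.getRecordMetadata().partition()),
        ex -> logger.error("#### -> Failed to publish message", ex));
```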
6. #### Consuming data from Kafka Topic
In the **src/main/java/com/lokesh/spring/kafka/service/ConsumerService.java** class, we consume data from the topics using the `@KafkaListener` annotation.
The listener container factories from **KafkaConsumerConfig.java** are bound via the **containerFactory** attribute of each listener.
```
// String Consumer
@KafkaListener(topics = {"${spring.kafka.topic}"}, containerFactory = "kafkaListenerStringFactory", groupId = "group_id")
public void consumeMessage(String message) {
    logger.info("**** -> Consumed message -> {}", message);
}

// Object Consumer
@KafkaListener(topics = {"${spring.kafka.superhero-topic}"}, containerFactory = "kafkaListenerJsonFactory", groupId = "group_id")
public void consumeSuperHero(SuperHero superHero) {
    logger.info("**** -> Consumed Super Hero :: {}", superHero);
}
```
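**KafkaConsumerConfig.java** also registers a `KafkaListenerErrorHandler` bean named `myTopicErrorHandler` that the listeners above do not reference. As a hedged sketch of how it could be wired up (not done in this commit), `@KafkaListener` accepts an `errorHandler` attribute with the bean name:
```
// hypothetical sketch: route listener exceptions to the myTopicErrorHandler bean
@KafkaListener(topics = {"${spring.kafka.topic}"}, containerFactory = "kafkaListenerStringFactory",
        groupId = "group_id", errorHandler = "myTopicErrorHandler")
public void consumeMessage(String message) {
    logger.info("**** -> Consumed message -> {}", message);
}
```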
### API Endpoints
> **POST Mapping** http://localhost:8080/kafka/publish/message?message=test message
> **POST Mapping** http://localhost:8080/kafka/publish

(Both endpoints are POST mappings in **KafkaController.java**; note that **application.yml** in this commit sets `server.port` to 8095, so adjust the port if you use that configuration.)

Request Body
```
{
    "name": "Tony",
    "superName": "Iron Man",
    "profession": "Business",
    "age": 50,
    "canFly": true
}
```
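For reference, the same requests can be sent from Postman or any other HTTP client; a sketch with `curl` (assuming the port shown above):
> `curl -X POST "http://localhost:8080/kafka/publish/message?message=test message"`
> `curl -X POST http://localhost:8080/kafka/publish -H "Content-Type: application/json" -d '{"name": "Tony", "superName": "Iron Man", "profession": "Business", "age": 50, "canFly": true}'`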
### Console Output
```
2020-08-12 16:10:30.737  INFO 7376 --- [nio-8080-exec-4] service.com.lokesh.spring.kafka.ProducerService : #### -> Publishing message -> test message
2020-08-12 16:10:30.744  INFO 7376 --- [ntainer#1-0-C-1] service.com.lokesh.spring.kafka.ConsumerService : **** -> Consumed message -> test message
2020-08-12 16:10:35.615  INFO 7376 --- [nio-8080-exec-5] service.com.lokesh.spring.kafka.ProducerService : #### -> Publishing SuperHero :: SuperHero [name=Tony, superName=Iron Man, profession=Business, age=50, canFly=true]
2020-08-12 16:10:35.626  INFO 7376 --- [ntainer#0-0-C-1] service.com.lokesh.spring.kafka.ConsumerService : **** -> Consumed Super Hero :: SuperHero [name=Tony, superName=Iron Man, profession=Business, age=50, canFly=true]
```

pom.xml
0 → 100644
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.lokesh.spring.kafka</groupId>
    <artifactId>spring-boot-kafka</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>spring-boot-kafka</name>
    <description>POC project for Spring Boot Kafka</description>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.3.2.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
src/main/java/com/lokesh/spring/kafka/SpringBootKafkaApplication.java
0 → 100644
package com.lokesh.spring.kafka;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SpringBootKafkaApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBootKafkaApplication.class, args);
    }
}
src/main/java/com/lokesh/spring/kafka/config/KafkaConsumerConfig.java
0 → 100644
package com.lokesh.spring.kafka.config;

import com.lokesh.spring.kafka.model.SuperHero;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.KafkaListenerErrorHandler;
import org.springframework.kafka.support.converter.StringJsonMessageConverter;
import org.springframework.kafka.support.serializer.JsonDeserializer;

import java.util.HashMap;
import java.util.Map;

@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    private final Logger logger = LoggerFactory.getLogger(getClass());

    @Value("${spring.kafka.consumer.bootstrap-servers: localhost:9092}")
    private String bootstrapServers;

    @Value("${spring.kafka.consumer.group-id: group_id}")
    private String groupId;

    // +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    // Json Consumer
    // +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    @Bean
    public ConsumerFactory<String, SuperHero> consumerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        config.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        config.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        config.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(), new JsonDeserializer<>(SuperHero.class));
    }

    @Bean
    public <T> ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerJsonFactory() {
        ConcurrentKafkaListenerContainerFactory<String, SuperHero> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setMessageConverter(new StringJsonMessageConverter());
        factory.setBatchListener(true);
        return factory;
    }

    // +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    // String Consumer
    // +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    @Bean
    public ConsumerFactory<String, String> stringConsumerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        config.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        config.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        config.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(), new StringDeserializer());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerStringFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(stringConsumerFactory());
        factory.setBatchListener(true);
        return factory;
    }

    @Bean
    public KafkaListenerErrorHandler myTopicErrorHandler() {
        return (m, e) -> {
            logger.error("❌Got an error {}", e.getMessage());
            return "some info about the failure";
        };
    }
}
src/main/java/com/lokesh/spring/kafka/config/KafkaProducerConfig.java
0 → 100644
package com.lokesh.spring.kafka.config;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

import java.util.HashMap;
import java.util.Map;

@Configuration
public class KafkaProducerConfig {

    @Value("${spring.kafka.producer.bootstrap-servers: localhost:9092}")
    private String bootstrapServers;

    // +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    // Json Producer
    // +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    @Bean
    public <T> ProducerFactory<String, T> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        configProps.put(ProducerConfig.ACKS_CONFIG, "all");
        configProps.put(ProducerConfig.RETRIES_CONFIG, 3);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

        // **___ configure the following three settings for SSL Encryption ___**
        //configProps.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");
        //configProps.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/var/private/ssl/kafka.client.truststore.jks");
        //configProps.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "test1234");

        // **___configure the following three settings for SSL Authentication ___**
        //configProps.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, "/var/private/ssl/kafka.client.keystore.jks");
        //configProps.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, "test1234");
        //configProps.put(SslConfigs.SSL_KEY_PASSWORD_CONFIG, "test1234");

        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Primary
    @Bean
    public <T> KafkaTemplate<String, T> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    // +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    // String Producer
    // +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    @Bean
    public ProducerFactory<String, String> producerStringFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        configProps.put(ProducerConfig.ACKS_CONFIG, "all");
        configProps.put(ProducerConfig.RETRIES_CONFIG, 3);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        // **___ configure the following three settings for SSL Encryption ___**
        //configProps.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");
        //configProps.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/var/private/ssl/kafka.client.truststore.jks");
        //configProps.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "test1234");

        // **___configure the following three settings for SSL Authentication ___**
        //configProps.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, "/var/private/ssl/kafka.client.keystore.jks");
        //configProps.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, "test1234");
        //configProps.put(SslConfigs.SSL_KEY_PASSWORD_CONFIG, "test1234");

        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaStringTemplate() {
        return new KafkaTemplate<>(producerStringFactory());
    }
}
src/main/java/com/lokesh/spring/kafka/controllers/KafkaController.java
0 → 100644
package com.lokesh.spring.kafka.controllers;

import com.lokesh.spring.kafka.model.SuperHero;
import com.lokesh.spring.kafka.service.ProducerService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;

import java.util.HashMap;
import java.util.Map;

@RestController
@RequestMapping(value = "/kafka")
public class KafkaController {

    @Autowired
    private ProducerService<SuperHero> producerService;

    @PostMapping(value = "/publish/message")
    public String sendMessageToKafkaTopic(@RequestParam("message") String message) {
        producerService.sendMessage(message);
        return "🍻🍻Successfully publisher message..!";
    }

    @PostMapping(value = "/publish")
    public Map<String, Object> sendObjectToKafkaTopic(@RequestBody SuperHero superHero) {
        producerService.sendSuperHeroMessage(superHero);
        Map<String, Object> map = new HashMap<>();
        map.put("message", "🍻🍻🍻Successfully publisher Super Hero..!");
        map.put("payload", superHero);
        return map;
    }
}
src/main/java/com/lokesh/spring/kafka/model/Message.java
0 → 100644
package com.lokesh.spring.kafka.model;

public class Message {

    private static final long serialVersionUID = 1L;

    private String message;

    public Message() {}

    public Message(String message) {
        this.message = message;
    }

    public String getMessage() {
        return message;
    }

    public void setMessage(String message) {
        this.message = message;
    }
}
src/main/java/com/lokesh/spring/kafka/model/SuperHero.java
0 → 100644
package com.lokesh.spring.kafka.model;

import java.io.Serializable;

public class SuperHero implements Serializable {

    private static final long serialVersionUID = 1L;

    private String name;
    private String superName;
    private String profession;
    private int age;
    private boolean canFly;

    public SuperHero() {
    }

    public SuperHero(String name, String superName, String profession, int age, boolean canFly) {
        super();
        this.name = name;
        this.superName = superName;
        this.profession = profession;
        this.age = age;
        this.canFly = canFly;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getSuperName() {
        return superName;
    }

    public void setSuperName(String superName) {
        this.superName = superName;
    }

    public String getProfession() {
        return profession;
    }

    public void setProfession(String profession) {
        this.profession = profession;
    }

    public int getAge() {
        return age;
    }

    public void setAge(int age) {
        this.age = age;
    }

    public boolean isCanFly() {
        return canFly;
    }

    public void setCanFly(boolean canFly) {
        this.canFly = canFly;
    }

    @Override
    public String toString() {
        return "SuperHero [name=" + name + ", superName=" + superName + ", profession=" + profession
                + ", age=" + age + ", canFly=" + canFly + "]";
    }
}
\ No newline at end of file
src/main/java/com/lokesh/spring/kafka/service/ConsumerService.java
0 → 100644
package com.lokesh.spring.kafka.service;

import com.lokesh.spring.kafka.model.SuperHero;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class ConsumerService {

    private final Logger logger = LoggerFactory.getLogger(getClass());

    @KafkaListener(topics = {"${spring.kafka.topic}"}, containerFactory = "kafkaListenerStringFactory", groupId = "group_id")
    public void consumeMessage(String message) {
        logger.info("**** -> Consumed message -> {}", message);
    }

    @KafkaListener(topics = {"${spring.kafka.superhero-topic}"}, containerFactory = "kafkaListenerJsonFactory", groupId = "group_id")
    public void consumeSuperHero(SuperHero superHero) {
        logger.info("**** -> 😋😋 Consumed Super Hero :: {}", superHero);
    }
}
src/main/java/com/lokesh/spring/kafka/service/ProducerService.java
0 → 100644
package com.lokesh.spring.kafka.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class ProducerService<T> {

    private final Logger logger = LoggerFactory.getLogger(getClass());

    @Value("${spring.kafka.topic}")
    private String topic;

    @Value("${spring.kafka.superhero-topic}")
    private String superHeroTopic;

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Autowired
    private KafkaTemplate<String, T> kafkaTemplateSuperHero;

    public void sendMessage(String message) {
        logger.info("#### -> Publishing message -> {}", message);
        kafkaTemplate.send(topic, message);
    }

    public void sendSuperHeroMessage(T superHero) {
        logger.info("#### -> Publishing SuperHero :: {}", superHero);
        kafkaTemplateSuperHero.send(superHeroTopic, superHero);
    }
}
src/main/resources/Output.PNG
0 → 100644
18.5 KB
src/main/resources/application.yml
0 → 100644
server:
  port: 8095

spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      group-id: group_id
#      auto-offset-reset: earliest
#      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
#      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
    producer:
      bootstrap-servers: localhost:9092
#      key-serializer: org.apache.kafka.common.serialization.StringSerializer
#      value-serializer: org.apache.kafka.common.serialization.StringSerializer
    topic: message-topic
    superhero-topic: superhero-topic
\ No newline at end of file
src/test/java/com/lokesh/spring/kafka/SpringBootKafkaApplicationTests.java
0 → 100644
package com.lokesh.spring.kafka;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class SpringBootKafkaApplicationTests {

    @Test
    public void contextLoads() {
    }
}