
Table of Contents
1. Understanding the Core Components
2. Setting Up Apache Kafka Locally
3. Adding Kafka Dependencies in Java (Maven)
4. Creating a Kafka Producer in Java
5. Creating a Kafka Consumer in Java
6. Best Practices for EDA with Kafka & Java
7. Extending with Spring Boot (Optional but Recommended)
Final Thoughts

Implementing Event-Driven Architecture with Java and Apache Kafka

Jul 23, 2025

1. Understand the core components: producers publish events to topics, consumers subscribe to and process events, and the Kafka broker manages message storage and delivery; 2. Set up Kafka locally: use Docker to quickly start ZooKeeper and Kafka services, exposing port 9092; 3. Integrate Kafka in Java: add the kafka-clients dependency, or use Spring Kafka for better productivity; 4. Write a producer: configure a KafkaProducer to send JSON-formatted order events to the orders topic; 5. Write a consumer: subscribe to the orders topic with a KafkaConsumer, poll for messages, and apply business logic; 6. Follow best practices: standardize the event structure, make consumers idempotent, name topics meaningfully, monitor consumer lag, and handle errors gracefully; 7. Optionally extend with Spring Boot: simplify development with @KafkaListener and KafkaTemplate for better maintainability. This architecture decouples services through events, achieving high scalability and responsiveness, and suits modern distributed systems.


Event-Driven Architecture (EDA) is a powerful design pattern that enables systems to communicate through events—changes in state that are published, routed, and consumed asynchronously. When combined with Java and Apache Kafka, it becomes a scalable, resilient, and high-performance solution for modern distributed applications.


Here's how you can implement Event-Driven Architecture using Java and Apache Kafka effectively.


1. Understanding the Core Components

Before jumping into code, understand the key pieces:

  • Producers: Java applications that publish events (messages) to Kafka topics.
  • Topics: Named streams of records; categories or feeds to which events are written.
  • Consumers: Java applications that subscribe to topics and process events.
  • Kafka Broker: The server that manages topics and stores messages.
  • ZooKeeper / Kafka Raft (KRaft): Coordinates brokers (ZooKeeper in older versions; KRaft in newer ones).

Events typically represent business actions like OrderCreated, PaymentProcessed, or UserRegistered.
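For instance, an OrderCreated event can be modeled as a small immutable value type before being serialized to JSON. Below is a minimal sketch, assuming Java 16+ for records; the field names mirror the order example used later in this article and are not a fixed schema:

 import java.time.Instant;

// A minimal sketch of an OrderCreated event as an immutable Java record.
// Field names are illustrative; real projects usually define a versioned schema.
public record OrderCreatedEvent(
        String orderId,   // business identifier of the order
        String status,    // e.g. "CREATED"
        double amount,    // order total
        Instant timestamp // when the event occurred
) {}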


2. Setting Up Apache Kafka Locally

You'll need Kafka running. Use Docker for quick setup:

Implementing Event-Driven Architecture with Java and Apache Kafka
 # docker-compose.yml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

Run with:

 docker-compose up -d

Now your Kafka broker is accessible at localhost:9092.


3. Adding Kafka Dependencies in Java (Maven)

Use the official Kafka clients:

 <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>3.7.0</version>
</dependency>

For better productivity, consider Spring Boot with Spring Kafka:

 <dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>

But here we'll focus on raw Kafka clients to show core concepts.
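If automatic topic creation is disabled on your broker, you can create the orders topic programmatically before producing. Here is a minimal sketch using the AdminClient that ships with kafka-clients; one partition and replication factor 1 are assumptions suited to the single-broker setup above:

 import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import java.util.Collections;
import java.util.Properties;

public class TopicSetup {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // 1 partition, replication factor 1 -- fine for a local single-broker setup
            NewTopic orders = new NewTopic("orders", 1, (short) 1);
            admin.createTopics(Collections.singletonList(orders)).all().get();
            System.out.println("Topic 'orders' created");
        }
    }
}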


4. Creating a Kafka Producer in Java

A producer sends events to a topic:

 import org.apache.kafka.clients.producer.*;
import java.util.Properties;

public class OrderProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<>(props);

        String topic = "orders";
        String key = "order-123";
        String value = "{\"orderId\": \"123\", \"status\": \"CREATED\", \"amount\": 99.9}";

        ProducerRecord<String, String> record = new ProducerRecord<>(topic, key, value);

        producer.send(record, (metadata, exception) -> {
            if (exception != null) {
                System.err.println("Send failed: " exception.getMessage());
            } else {
                System.out.printf("Sent to %s partition %d offset %d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
            }
        });

        producer.flush();
        producer.close();
    }
}

This sends an OrderCreated event as JSON to the orders topic.

Tip: Use Avro, Protobuf, or JSON Schema for structured, versioned events in production.
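Hand-assembling JSON strings, as above, is brittle. Below is a minimal sketch of serializing an event object with Jackson instead; the jackson-databind dependency (2.12+ for record support) and the OrderEvent type are assumptions for illustration, not part of kafka-clients:

 import com.fasterxml.jackson.databind.ObjectMapper;

public class EventSerialization {
    // Hypothetical event type for this sketch
    public record OrderEvent(String orderId, String status, double amount) {}

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // Produces the same JSON payload the producer above builds by hand
        String json = mapper.writeValueAsString(new OrderEvent("123", "CREATED", 99.9));
        System.out.println(json); // {"orderId":"123","status":"CREATED","amount":99.9}
    }
}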


5. Creating a Kafka Consumer in Java

The consumer reads events from the same topic:

 import org.apache.kafka.clients.consumer.*;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class OrderConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "order-processing-group");
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        Consumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("orders"));

        try {
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("Received: key=%s, value=%s, topic=%s, partition=%d, offset=%d%n",
                            record.key(), record.value(), record.topic(), record.partition(), record.offset());

                    // Process business logic here
                    handleOrderEvent(record.value());
                }
            }
        } finally {
            consumer.close();
        }
    }

    private static void handleOrderEvent(String value) {
        // Parse JSON and react accordingly
        System.out.println("Processing order event: " value);
        // eg, update DB, trigger payment service, send email
    }
}

Note:

  • group.id: Enables consumer groups for scalability and failover.
  • auto.offset.reset=earliest: Starts reading from the beginning if no committed offset exists for the group.
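Note that with the defaults above, the consumer auto-commits offsets on a timer, so a crash mid-batch can skip or re-deliver messages. A minimal sketch of switching the OrderConsumer loop above to manual commits, one common strategy among several:

 // In the consumer properties, disable auto-commit:
props.put("enable.auto.commit", "false");

// Then commit explicitly once a polled batch has been processed:
while (true) {
    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
    for (ConsumerRecord<String, String> record : records) {
        handleOrderEvent(record.value());
    }
    if (!records.isEmpty()) {
        consumer.commitSync(); // blocks until offsets are committed to the broker
    }
}

commitSync() trades a little latency for the guarantee that offsets only advance after your logic succeeds; commitAsync() is the non-blocking alternative.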

6. Best Practices for EDA with Kafka & Java

Use meaningful topic names
e.g., user-signups, payments-failed, not topic1.

Structure event data consistently
Include metadata like event type, timestamp, and version:

 {
  "eventId": "abc-123",
  "eventType": "OrderCreated",
  "version": "1.0",
  "timestamp": "2025-04-05T10:00:00Z",
  "data": { "orderId": "123", "amount": 99.9 }
}

Make consumers idempotent
Since Kafka guarantees at-least-once delivery, duplicates may occur.
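A minimal sketch of deduplication by event ID, assuming each event carries a unique eventId field as in the JSON structure above (in production the seen-ID store would be a database table or cache, not an in-memory set):

 import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class IdempotentHandler {
    // In-memory stand-in for a durable store of already-processed event IDs
    private final Set<String> processedEventIds = ConcurrentHashMap.newKeySet();

    public void handle(String eventId, String payload) {
        // add() returns false if the ID was already present, i.e. a duplicate delivery
        if (!processedEventIds.add(eventId)) {
            System.out.println("Duplicate event " + eventId + ", skipping");
            return;
        }
        // Safe to apply side effects exactly once per eventId
        System.out.println("Processing " + payload);
    }
}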

Scale horizontally
Add more consumers in the same group; they'll split partitions among themselves automatically.

Monitor lag and throughput
Use tools like Kafka Manager, Confluent Control Center, or Prometheus with Grafana.

Handle errors gracefully
Don't crash on bad messages. Log, retry (with backoff), or send to a dead-letter topic.
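A minimal sketch of the dead-letter pattern inside the consumer's record loop; the orders-dlt topic name and the dlqProducer (a KafkaProducer<String, String> configured like the one in section 4) are assumptions for illustration:

 // Route poison messages to a dead-letter topic instead of crashing
for (ConsumerRecord<String, String> record : records) {
    try {
        handleOrderEvent(record.value());
    } catch (Exception e) {
        System.err.println("Failed at offset " + record.offset() + ": " + e.getMessage());
        // Forward the raw message for later inspection and reprocessing
        dlqProducer.send(new ProducerRecord<>("orders-dlt", record.key(), record.value()));
    }
}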


7. Extending with Spring Boot (Optional but Recommended)

With Spring Kafka, things get simpler:

 @Component
public class OrderEventListener {

    @KafkaListener(topics = "orders", groupId = "order-processing-group")
    public void listen(String message) {
        System.out.println("Received via Spring: " message);
        // Business logic
    }
}

And an auto-configured producer:

 @Autowired
private KafkaTemplate<String, String> kafkaTemplate;

public void sendOrderEvent(String key, String payload) {
    kafkaTemplate.send("orders", key, payload);
}

Much cleaner and integrates well with DI, logging, metrics, etc.


Final Thoughts

Implementing Event-Driven Architecture with Java and Kafka lets you build loosely coupled, scalable, and responsive systems. Start small—produce one event, consume it—and gradually expand to pipelines involving multiple services.

Key takeaways:

  • Kafka acts as the central nervous system.
  • Producers emit facts; consumers react.
  • Design events around business semantics.
  • Use async processing to decouple components.

It's not just about tech—it's about changing how your services think and talk to each other.

Basically, once you get the flow down, it scales beautifully.
