Apache Kafka has become one of the most popular distributed event streaming platforms for building scalable and high-performance applications. In modern microservice architectures, Kafka enables asynchronous communication between services, improves reliability, and handles millions of events efficiently.
What is Kafka?
Apache Kafka is an open-source distributed event streaming platform used for:
- Real-time data streaming
- Event-driven architecture
- Message brokering
- Log aggregation
- Data pipelines
Kafka works using three major components:
- Producer → Sends messages
- Consumer → Receives messages
- Broker → Kafka server managing topics and messages
Why Use Kafka with .NET Core?
Using Kafka in .NET applications provides several benefits:
- High throughput
- Fault tolerance
- Scalability
- Loose coupling between services
- Asynchronous processing
- Real-time event streaming
Typical use cases include:
- Order processing systems
- Notification services
- Payment systems
- Audit logging
- Microservices communication
Prerequisites
Before starting, make sure you have:
- .NET 8 SDK installed
- Docker installed
- Kafka running locally
- Basic knowledge of ASP.NET Core Web API
Step 1: Run Kafka Using Docker
Create a docker-compose.yml file:
```yaml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```
Run the containers:
```shell
docker-compose up -d
```
This will start:
- Zookeeper
- Kafka Broker
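Once the broker is up, you can create the topic used later in this tutorial ahead of time (brokers often auto-create topics on first use, but explicit creation lets you control partitions and replication). This assumes the Compose service is named `kafka` as in the file above:

```shell
docker-compose exec kafka kafka-topics --create \
  --topic order-topic \
  --bootstrap-server localhost:9092 \
  --partitions 1 \
  --replication-factor 1
```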
Step 2: Create ASP.NET Core Web API
Create a new project:
```shell
dotnet new webapi -n KafkaDemoAPI
```
Navigate to the project:
```shell
cd KafkaDemoAPI
```
Step 3: Install Kafka NuGet Package
Install the Kafka client package:
```shell
dotnet add package Confluent.Kafka
```
The official Kafka .NET client is maintained by Confluent.
Step 4: Create Kafka Producer Service
Create a folder named Services and add the producer class there.
KafkaProducerService.cs
```csharp
using Confluent.Kafka;
using System.Text.Json;

namespace KafkaDemoAPI.Services
{
    public class KafkaProducerService
    {
        private readonly ProducerConfig _config;

        public KafkaProducerService()
        {
            _config = new ProducerConfig
            {
                BootstrapServers = "localhost:9092"
            };
        }

        public async Task ProduceMessageAsync<T>(string topic, T message)
        {
            // Note: for production workloads, create one producer instance
            // and reuse it rather than building a new one per message.
            using var producer = new ProducerBuilder<Null, string>(_config).Build();

            var jsonMessage = JsonSerializer.Serialize(message);

            var result = await producer.ProduceAsync(
                topic,
                new Message<Null, string> { Value = jsonMessage });

            Console.WriteLine($"Message sent: {result.Value}");
        }
    }
}
```
Step 5: Register Service in Program.cs
```csharp
using KafkaDemoAPI.Services;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddSingleton<KafkaProducerService>();
builder.Services.AddControllers();

var app = builder.Build();

app.MapControllers();
app.Run();
```
Step 6: Create API Controller
Create Controllers/OrderController.cs:
```csharp
using KafkaDemoAPI.Services;
using Microsoft.AspNetCore.Mvc;

namespace KafkaDemoAPI.Controllers
{
    [ApiController]
    [Route("api/[controller]")]
    public class OrderController : ControllerBase
    {
        private readonly KafkaProducerService _producer;

        public OrderController(KafkaProducerService producer)
        {
            _producer = producer;
        }

        [HttpPost]
        public async Task<IActionResult> CreateOrder([FromBody] object order)
        {
            await _producer.ProduceMessageAsync("order-topic", order);
            return Ok(new { Message = "Order published to Kafka" });
        }
    }
}
```
Step 7: Create Kafka Consumer
Now create a background consumer service.
KafkaConsumerService.cs
```csharp
using Confluent.Kafka;

namespace KafkaDemoAPI.Services
{
    public class KafkaConsumerService : BackgroundService
    {
        private readonly ConsumerConfig _config;

        public KafkaConsumerService()
        {
            _config = new ConsumerConfig
            {
                BootstrapServers = "localhost:9092",
                GroupId = "order-group",
                AutoOffsetReset = AutoOffsetReset.Earliest
            };
        }

        protected override Task ExecuteAsync(CancellationToken stoppingToken)
        {
            // Run the blocking consume loop on a background thread
            // so it does not stall application startup.
            return Task.Run(() =>
            {
                using var consumer = new ConsumerBuilder<Ignore, string>(_config).Build();
                consumer.Subscribe("order-topic");

                try
                {
                    while (!stoppingToken.IsCancellationRequested)
                    {
                        var consumeResult = consumer.Consume(stoppingToken);
                        Console.WriteLine($"Received Message: {consumeResult.Message.Value}");
                    }
                }
                catch (OperationCanceledException)
                {
                    // Expected when the host shuts down.
                }
                finally
                {
                    // Leave the consumer group cleanly and commit final offsets.
                    consumer.Close();
                }
            }, stoppingToken);
        }
    }
}
```
Step 8: Register Consumer Service
Update Program.cs, registering the hosted service before the call to builder.Build():

```csharp
builder.Services.AddHostedService<KafkaConsumerService>();
```
Step 9: Run the Application
Start the API:
```shell
dotnet run
```
Send a POST request to /api/order with the following body:

```json
{
  "orderId": 101,
  "product": "Laptop",
  "price": 75000
}
```
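For example, using curl (this assumes the API is listening on http://localhost:5000; use the port printed by dotnet run):

```shell
curl -X POST http://localhost:5000/api/order \
  -H "Content-Type: application/json" \
  -d '{"orderId": 101, "product": "Laptop", "price": 75000}'
```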
You will see the message consumed in the console.
Kafka Message Flow
The workflow looks like this:
```
Client Request
      ↓
.NET Core API
      ↓
Kafka Producer
      ↓
Kafka Topic
      ↓
Kafka Consumer
      ↓
Background Processing
```
Best Practices for Kafka in .NET
1. Use Strongly Typed Models
Instead of accepting a raw object, define a DTO:

```csharp
public class OrderModel
{
    public int OrderId { get; set; }
    public string Product { get; set; } = string.Empty;
    public decimal Price { get; set; }
}
```
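With a typed model in place, the controller action from Step 6 can be sketched like this, swapping the object parameter for the DTO:

```csharp
[HttpPost]
public async Task<IActionResult> CreateOrder([FromBody] OrderModel order)
{
    // Model binding now validates the shape of the incoming JSON.
    await _producer.ProduceMessageAsync("order-topic", order);
    return Ok(new { Message = "Order published to Kafka" });
}
```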
2. Store Kafka Configurations in appsettings.json
```json
"Kafka": {
  "BootstrapServers": "localhost:9092",
  "Topic": "order-topic",
  "GroupId": "order-group"
}
```
3. Implement Retry Policies
Use retry mechanisms for transient failures.
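A minimal sketch of a manual retry with exponential backoff around the produce call (libraries such as Polly offer richer, reusable policies):

```csharp
// Retry transient (non-fatal) produce failures up to 3 times
// with exponential backoff between attempts.
var attempts = 0;
while (true)
{
    try
    {
        await producer.ProduceAsync(topic, message);
        break;
    }
    catch (ProduceException<Null, string> ex) when (!ex.Error.IsFatal && ++attempts < 3)
    {
        await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempts)));
    }
}
```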
4. Enable Logging and Monitoring
Monitor:
- consumer lag
- failed messages
- broker health
- throughput
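Confluent.Kafka also lets you hook client-level logs and errors directly when building a consumer or producer, which helps surface broker health problems early:

```csharp
using var consumer = new ConsumerBuilder<Ignore, string>(_config)
    .SetErrorHandler((_, error) =>
        Console.WriteLine($"Kafka error: {error.Reason}"))
    .SetLogHandler((_, log) =>
        Console.WriteLine($"Kafka log [{log.Level}]: {log.Message}"))
    .Build();
```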
Advantages of Kafka Architecture
| Feature | Benefit |
|---|---|
| Scalability | Handles huge traffic |
| Durability | Messages persisted safely |
| High Throughput | Millions of messages |
| Fault Tolerance | Distributed brokers |
| Loose Coupling | Better microservice architecture |
Common Real-World Kafka Use Cases
Companies use Kafka for:
- Order processing
- Banking transactions
- Real-time analytics
- Fraud detection
- Notification systems
- Event sourcing
- Activity tracking
Many large-scale platforms including Netflix, Uber, and LinkedIn use Kafka extensively.
Conclusion
Implementing Kafka in a .NET Core API is straightforward using the Confluent.Kafka package. Kafka enables highly scalable and asynchronous communication between services, making it ideal for modern distributed systems and microservices.
By integrating Kafka with .NET Core, you can build resilient and event-driven applications capable of handling enterprise-scale workloads efficiently.