Kafka + Avro: Messaging Done Right

Lessons from integrating Kafka with Avro schemas in a trading system

  • Kafka
  • Avro
  • Event-Driven
  • Backend

When I designed my order management system (OMS), I knew messaging would be the backbone. Orders, trades, and quotes all needed to flow asynchronously between services. I chose Kafka as the event bus and Avro as the serialization format, with Schema Registry to enforce contracts. This combination gave me both performance and discipline.

Why Kafka + Avro?

  • Kafka provides durability, replayability, and high throughput — essential for trading workflows where every event matters.
  • Avro enforces schemas without bloating payloads. Unlike JSON, it’s compact and supports schema evolution.
  • Schema Registry ensures producers and consumers agree on message formats, preventing silent failures.
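
To make the contract enforcement concrete, here is roughly what the producer wiring looks like in Java. This is a minimal sketch, assuming Confluent's Avro serializer; the broker and registry addresses are local-dev placeholders, not my actual config:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class ProducerFactory {
    // Addresses below are local-dev placeholders.
    public static KafkaProducer<String, Object> create() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // The Avro serializer contacts Schema Registry on first send: it
        // registers (or validates against) the record's schema, so an
        // incompatible producer fails fast instead of writing bad bytes.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put("schema.registry.url", "http://localhost:8081");
        return new KafkaProducer<>(props);
    }
}
```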

Topic Design

I mapped the trade lifecycle into three core topics:

  • quotes: { symbol, bid, ask, last, timestamp, venue }
  • orders: { orderId, symbol, side, qty, price, tif, timestamp }
  • trades: { tradeId, orderIds, symbol, qty, price, timestamp }

Each topic has its own Avro schema, versioned and stored in Schema Registry. This makes it possible to evolve fields (e.g., adding venue or orderType) without breaking consumers.
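
For illustration, the orders schema might look like the sketch below. The namespace is a placeholder, and the optional orderType field shows the evolution pattern: a new field carries a default, so consumers still reading the old schema version keep working:

```json
{
  "type": "record",
  "name": "Order",
  "namespace": "com.example.oms",
  "fields": [
    { "name": "orderId",   "type": "string" },
    { "name": "symbol",    "type": "string" },
    { "name": "side",      "type": { "type": "enum", "name": "Side", "symbols": ["BUY", "SELL"] } },
    { "name": "qty",       "type": "long" },
    { "name": "price",     "type": "double" },
    { "name": "tif",       "type": "string" },
    { "name": "timestamp", "type": { "type": "long", "logicalType": "timestamp-millis" } },
    { "name": "orderType", "type": ["null", "string"], "default": null }
  ]
}
```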

Integration Patterns

  • 📝 Order Service produces to orders.
  • 🛡️ Risk Service consumes orders and quotes, then produces validated orders.
  • ⚖️ Matching Engine consumes validated orders, produces trades.
  • 🗄️ Core Service consumes both orders and trades for persistence.
  • 📡 Market Data Service produces quotes at a configurable rate.

This flow keeps the services decoupled: each one knows about topics, not about the other services.
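
As a sketch of the consumer side of this flow, here is roughly what the Risk Service's poll loop looks like, assuming an Order DTO generated from the schema above (the group id is illustrative; setting specific.avro.reader is what makes the deserializer return generated DTOs instead of generic records):

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;

public class RiskServiceLoop {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "risk-service");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        props.put("schema.registry.url", "http://localhost:8081");
        // Deserialize into the codegen'd Order DTO rather than GenericRecord.
        props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);

        try (KafkaConsumer<String, Order> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            while (true) {
                for (ConsumerRecord<String, Order> rec : consumer.poll(Duration.ofMillis(500))) {
                    // Risk checks would run here; orders that pass get
                    // re-produced to a downstream topic for the matching engine.
                    System.out.printf("order %s %s %d @ %.2f%n",
                            rec.value().getOrderId(), rec.value().getSymbol(),
                            rec.value().getQty(), rec.value().getPrice());
                }
            }
        }
    }
}
```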

Things I Tried

  • ✅ Using Avro code generation to create DTOs directly from schemas, avoiding manual boilerplate.
  • 🔄 Testing schema evolution by adding optional fields and verifying backward compatibility.
  • 🐳 Running Kafka + Schema Registry in Docker Compose with shared networks for easy local orchestration (see the sketch after this list).
  • 📖 Documenting topic contracts in the README so recruiters and teammates can see the data flow at a glance.
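
The Compose file stays close to the stock Confluent examples. A trimmed sketch, with image tags, ports, and single-broker settings as illustrative values:

```yaml
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.5.0
    depends_on: [zookeeper]
    ports:
      - "9092:9092"   # host-facing listener for locally run services
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Two listeners: one for containers on the Compose network,
      # one advertised to clients on the host.
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1  # single-broker dev setup

  schema-registry:
    image: confluentinc/cp-schema-registry:7.5.0
    depends_on: [kafka]
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      # The registry reaches the broker over the shared Compose network.
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: kafka:29092
```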

Reflections

Working with Kafka and Avro taught me that contracts are as important as code. By treating schemas as first-class citizens, I reduced integration risk and made the system more future-proof. This approach also mirrors how real trading systems evolve — new fields, new asset classes, new venues — without breaking existing flows.