Messaging & Streaming

Kafka Producer

Publish events to a Kafka topic.

kafka_produce

Overview

Sends a message to Apache Kafka, the widely used distributed event-streaming platform that many organizations run as the backbone of their systems. Lets agents publish events (decisions, alerts, state changes) so downstream services can react in real time.

How it works

Publishes a string-valued message to a Kafka topic with optional key, headers, and partition. Accepts extra producer config (SASL, SSL, acks, compression) and enforces idempotent delivery with configurable retry and timeout controls.
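The delivery guarantees described above map onto standard Kafka producer settings. A minimal sketch of the equivalent configuration, using the standard Kafka client property names (the exact defaults and retry bounds SwarmAI applies are an assumption here):

```java
import java.util.Properties;

public class ProducerConfigSketch {

    // Builds a producer config reflecting the guarantees above:
    // idempotent delivery, a bounded retry count, and a delivery timeout.
    static Properties producerConfig(String bootstrapServers) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", bootstrapServers);
        props.setProperty("enable.idempotence", "true");   // no duplicates on retry
        props.setProperty("acks", "all");                  // required for idempotence
        props.setProperty("retries", "3");                 // illustrative retry bound
        props.setProperty("delivery.timeout.ms", "30000"); // overall send deadline
        // Extra producer config passed per call (SASL, SSL, compression)
        // would be merged in here, e.g.:
        // props.setProperty("security.protocol", "SASL_SSL");
        return props;
    }

    public static void main(String[] args) {
        Properties p = producerConfig("broker1:9092,broker2:9092");
        System.out.println(p.getProperty("enable.idempotence")); // prints "true"
    }
}
```

Note that `acks=all` is not optional once idempotence is enabled; the Kafka client rejects weaker acknowledgement settings in that mode.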

Example

When a user asks:

Publish a 'deployment-complete' event to our releases topic.

the agent calls the tool:

kafka_produce(topic="releases", key="webapp", value="{\"version\":\"2.3.1\",\"env\":\"prod\"}")

and gets back: a confirmation with the offset and partition the record landed in.

Configuration

Set these before calling the tool. Values marked required must be present or the tool call will fail.

KAFKA_BOOTSTRAP_SERVERS required

Comma-separated list of Kafka bootstrap servers. Also exposed as swarmai.tools.kafka.bootstrap-servers or the per-call 'bootstrap_servers' parameter.
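In a Spring application, the property form mentioned above can be supplied in application.yml. A minimal sketch (the property name is as stated above; the broker addresses are placeholders):

```yaml
# application.yml — equivalent to setting KAFKA_BOOTSTRAP_SERVERS
swarmai:
  tools:
    kafka:
      bootstrap-servers: broker1:9092,broker2:9092
```

The per-call 'bootstrap_servers' parameter, when provided, would take precedence over either of these static settings.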

Use it in a workflow

Wire this tool into a SwarmAI crew. Use the YAML DSL for declarative workflows, or the Java builder API when you want full programmatic control.

YAML DSL

# event-publisher.yaml
name: event-publisher-crew
process: SEQUENTIAL

agents:
  - id: publisher
    role: Event Publisher
    goal: Publish workflow events to the enterprise event bus
    tools:
      - kafka_produce

tasks:
  - id: event-publisher-task
    agent: publisher
    description: Publish a 'deployment-complete' event to the 'releases' topic with the build metadata as payload.

Java

import ai.intelliswarm.swarmai.agent.Agent;
import ai.intelliswarm.swarmai.task.Task;
import ai.intelliswarm.swarmai.swarm.Swarm;
import ai.intelliswarm.swarmai.swarm.SwarmOutput;
import ai.intelliswarm.swarmai.process.ProcessType;
import ai.intelliswarm.swarmai.tool.messaging.KafkaProducerTool;
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class EventPublisherCrew {

    @Autowired ChatClient chatClient;
    @Autowired KafkaProducerTool kafkaProducerTool;

    public SwarmOutput publishDeploymentEvent() {
        Agent publisher = Agent.builder()
            .role("Event Publisher")
            .goal("Publish workflow events to the enterprise event bus")
            .chatClient(chatClient)
            .tool(kafkaProducerTool)
            .build();

        Task publisherTask = Task.builder()
            .description("Publish a 'deployment-complete' event to the 'releases' topic with the build metadata as payload.")
            .agent(publisher)
            .build();

        return Swarm.builder()
            .agent(publisher)
            .task(publisherTask)
            .process(ProcessType.SEQUENTIAL)
            .build()
            .kickoff();
    }
}

What it's good for

Real scenarios where agents put this tool to work.

Emit agent-generated signals and alerts onto the enterprise event bus
Fan out workflow state changes to downstream consumers in real time
Bridge a swarm's decisions into existing Kafka-based microservices
Publish audit events whenever an agent updates a skill

Source

Implementation lives at swarmai-tools/src/main/java/ai/intelliswarm/swarmai/tool/messaging/KafkaProducerTool.java in the swarm-ai repository.

Open kafka_produce on GitHub →