This guide will walk you through installing Kanal and building your first streaming pipeline.

Installation

Option 1: Docker Compose

Create a docker-compose.yml file with your Kafka configuration:
services:
  kanal:
    image: streemlined/kanal:latest
    ports:
      - "8080:8080"
    environment:
      KAFKA_BOOTSTRAP_SERVERS: host.docker.internal:9092  # use your broker's address; localhost inside the container points at the Kanal container itself
      # Authentication (optional)
      KAFKA_SECURITY_PROTOCOL: SASL_PLAINTEXT
      KAFKA_SASL_MECHANISM: PLAIN
      KAFKA_SASL_JAAS_CONFIG: >
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="admin"
        password="secret";
Run Kanal with Docker Compose:
docker compose up -d

Option 2: Download the JAR

Download the latest release and run it directly:
# Download the latest release
curl -LO https://github.com/streemlined/kanal/releases/latest/download/kanal.jar
Before running Kanal, configure the connection to your Kafka cluster. Create an application.yml file:
kafka:
  bootstrap.servers: localhost:9092
  
  # Authentication (optional)
  security.protocol: SASL_PLAINTEXT
  sasl.mechanism: PLAIN
  sasl.jaas.config: >
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="secret";
Then run Kanal with the configuration file:
java -Dmicronaut.config.files=application.yml -jar kanal.jar

Access the Editor

Once Kanal is running, open your browser and navigate to:
http://localhost:8080/ui
You should see the visual pipeline editor.

Build Your First Pipeline

Let’s create a simple pipeline that reads from a Kafka topic, transforms the data, and writes to a database.
1. Add a Kafka Consumer

Drag a Kafka Consumer node onto the canvas from the left sidebar. Configure it:
  • Topic: orders
  • Schema Type: JSON (or AVRO_SR if using Schema Registry)
2. Add a Transform Node

Drag a Transform node onto the canvas and connect it to your Kafka Consumer. Add a JSONata expression to transform the data:
{
  "order_id": id,
  "customer_name": customer.name,
  "total_amount": items.price ~> $sum(),
  "processed_at": $now()
}
See Transformations for more on JSONata.
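For the sample order used later in this guide, the expression above would produce output along these lines (the processed_at timestamp is illustrative):

```
Input:
{"id": 1, "customer": {"name": "Alice"}, "items": [{"price": 10}, {"price": 20}]}

Output:
{
  "order_id": 1,
  "customer_name": "Alice",
  "total_amount": 30,
  "processed_at": "2024-01-15T10:30:00.000Z"
}
```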
3. Add a JDBC Sink

Drag a JDBC Sink node onto the canvas and connect it to the Transform node. Configure it:
  • Database: Select your configured database
  • Table: processed_orders
4. Run the Pipeline

Click the Play button in the toolbar to start the pipeline. Monitor the throughput and any errors in the metrics panel.

Verify It’s Working

You can verify your pipeline is working by:
  1. Producing test messages to your Kafka topic
  2. Checking the metrics in the Kanal UI
  3. Querying your database to see the processed records
# Produce a test message (using kafka-console-producer)
echo '{"id": 1, "customer": {"name": "Alice"}, "items": [{"price": 10}, {"price": 20}]}' | \
  kafka-console-producer --topic orders --bootstrap-server localhost:9092
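As a quick sanity check on the transform, you can compute the expected total_amount for that test message locally (a sketch assuming jq is installed; Kanal itself evaluates the JSONata):

```shell
# Sum the item prices from the test message, mirroring the
# items.price ~> $sum() step in the Transform node
echo '{"id": 1, "customer": {"name": "Alice"}, "items": [{"price": 10}, {"price": 20}]}' \
  | jq '[.items[].price] | add'
# prints 30 — the total_amount you should see in processed_orders
```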

What’s Next?