This feature is actively being developed and may change before release.
For automated testing in CI/CD pipelines, Kanal supports declarative test definitions using YAML. Define test scenarios with input messages and expected outputs to validate your pipeline logic automatically.

Test File Structure

Create a tests.yaml file with your test definitions:
name: Order Processing Tests
tests:
  - name: Basic order transformation
    steps:
      - send:
          node: kafka-consumer-1
          message:
            id: 1
            customer:
              name: "Alice"
            items:
              - price: 10
              - price: 20
      
      - assert:
          node: jdbc-sink-1
          output:
            order_id: 1
            customer_name: "Alice"
            total_amount: 30

  - name: High-value order routing
    steps:
      - send:
          node: kafka-consumer-1
          message:
            id: 2
            customer:
              name: "Bob"
              tier: "gold"
            items:
              - price: 500
              - price: 750
      
      - assert:
          node: branch-1
          output: high-value
          message:
            order_id: 2
            total_amount: 1250
Then reference this test file in your application.yml:
kafka:
  bootstrap.servers: localhost:9092

databases:
  default:
    connection.url: jdbc:postgresql://localhost:5432/mydb
    connection.user: postgres
    connection.password: secret

tests:
  file: tests.yaml        # Path to test definitions file
  run: true               # Enable test execution on startup

Test Steps

Each test consists of sequential steps. Two step types are available:

Send Step

Pushes a message to an input node (source stub):
- send:
    node: kafka-consumer-1    # Target source node ID
    message:                   # JSON message to send
      id: 123
      data: "test"

Assert Step

Validates the output at a specific node:
- assert:
    node: jdbc-sink-1         # Node to check
    output:                    # Expected output (partial match)
      order_id: 123
      status: "processed"
For branch nodes, you can assert which output was taken:
- assert:
    node: branch-1
    output: high-value        # Branch output name
    message:                   # Expected message on that branch
      tier: "premium"
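Because steps run in order, a single test can send several messages before asserting. The sketch below combines both step types; the node IDs follow the earlier examples, and the count option is described under Assertion Options further down:

- name: Two orders reach the sink
  steps:
    - send:
        node: kafka-consumer-1
        message:
          id: 10
          data: "first"
    - send:
        node: kafka-consumer-1
        message:
          id: 11
          data: "second"

    - assert:
        node: jdbc-sink-1
        count: 2              # Two output records expected after two sends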

Running Tests

Tests are automatically executed when Kanal starts if the tests.run configuration is enabled. Configure the test file location and execution options in application.yml or via environment variables.
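For reference, a slightly fuller tests block might look like the sketch below. The verbose option is an assumption here, inferred from the TESTS_VERBOSE environment variable used in the CI example further down; check your Kanal version for the exact option name.

tests:
  file: tests.yaml        # Path to test definitions file
  run: true               # Enable test execution on startup
  verbose: true           # Assumed option, mirroring TESTS_VERBOSE in the CI example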

From the Command Line

With tests.run enabled in application.yml, starting Kanal runs the test suite during startup:

java -jar kanal.jar

The same settings can also be supplied as environment variables (for example TESTS_FILE and TESTS_RUN), as shown in the CI example below.

In CI/CD

Add Kanal tests to your CI pipeline by configuring test execution:
- name: Run Kanal Pipeline Tests
  run: |
    java -jar kanal.jar
  env:
    KAFKA_BOOTSTRAP_SERVERS: localhost:9092
    DATABASES_DEFAULT_CONNECTION_URL: jdbc:postgresql://localhost:5432/testdb
    DATABASES_DEFAULT_CONNECTION_USER: postgres
    DATABASES_DEFAULT_CONNECTION_PASSWORD: secret
    TESTS_FILE: pipelines/tests.yaml
    TESTS_RUN: "true"
    TESTS_VERBOSE: "true"

Test Output

When test execution is enabled, Kanal prints a result summary at startup:

Running: Order Processing Tests
  ✓ Basic order transformation (23ms)
  ✓ High-value order routing (18ms)
  ✗ Edge case: empty order (45ms)
    Expected output at jdbc-sink-1:
      { "status": "rejected" }
    Actual:
      { "status": "error", "reason": "no items" }

Tests: 2 passed, 1 failed

Assertion Options

Option        Description
output        Expected output object (partial match by default)
exactMatch    Require exact match instead of partial
absent        Assert that no output was produced
count         Assert number of output records
# Partial match (default) - passes if output contains these fields
- assert:
    node: sink-1
    output:
      status: "ok"

# Exact match - output must match exactly
- assert:
    node: sink-1
    exactMatch: true
    output:
      status: "ok"
      timestamp: "2024-01-15T10:00:00Z"

# No output expected
- assert:
    node: sink-1
    absent: true

# Multiple outputs expected
- assert:
    node: sink-1
    count: 3
Tests run in isolation. Each test starts with a fresh pipeline state, so tests don’t affect each other.

Best Practices

Use Interactive Testing to build and debug your transformations. Once you’re confident in the logic, codify the scenarios as CI/CD tests in your tests.yaml file.
Include tests for edge cases such as the following (one is sketched after this list):
  • Empty arrays and null values
  • Missing optional fields
  • Boundary conditions (e.g., exactly at threshold values)
  • Invalid data that should be rejected
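As an illustration, an edge-case test for an empty order might look like the sketch below. The node IDs and the expected "rejected" status mirror the earlier examples and the Test Output section; adjust them to your pipeline.

- name: "Edge case: empty order"
  steps:
    - send:
        node: kafka-consumer-1
        message:
          id: 3
          customer:
            name: "Carol"       # Illustrative customer
          items: []             # Edge case: an order with no items
    - assert:
        node: jdbc-sink-1
        output:
          status: "rejected"    # Expected handling for empty orders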
Each test should verify one specific behavior. This makes failures easier to diagnose and tests more maintainable.
Name tests to describe the scenario being tested: “High-value orders route to priority queue” is better than “Test 1”.

Next Steps