This page documents every node available in Streemlined. Nodes are the building blocks of a pipeline and fall into four categories.
| Category | Nodes | Purpose |
|---|---|---|
| Sources | Kafka Consumer, CSV Source | Ingest data into the pipeline |
| Processors | Transform, Branch, Explode, Lookup, JDBC Request-Reply, Peek, Data Masking | Transform, route, enrich, and inspect data |
| Sinks | Kafka Producer, JDBC Sink, Generic Connect Sink | Output data to external systems |
| Utility | Comment | Annotate the canvas |
Sources
Source nodes ingest data into your pipeline from external systems. Every source has a single output port.
Kafka Consumer
Consumes records from one or more partitions of a Kafka topic.
| Property | Type | Required | Description |
|---|---|---|---|
cluster | string | Yes | Kafka cluster name (defined in configuration) |
topic | string | Yes | Kafka topic to consume from |
schemaType | enum | Yes | JSON, AVRO_SR, JSON_SR, or PROTO_SR |
schemaId | number | Conditional | Schema Registry ID — required for AVRO_SR, JSON_SR, and PROTO_SR |
propertiesText | string | No | Extra consumer properties (key=value, one per line) |
stubbed | boolean | No | When true, runs as a stub during interactive testing |
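As a sketch, a consumer reading Avro records might use properties along these lines (the cluster name, topic, schema ID, and extra properties are placeholders, and the JSON shape is illustrative rather than the exact saved pipeline format):
```json
{
  "cluster": "analytics-cluster",
  "topic": "orders",
  "schemaType": "AVRO_SR",
  "schemaId": 42,
  "propertiesText": "auto.offset.reset=earliest\nmax.poll.records=500"
}
```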
CSV Source
Reads records from a local CSV file — useful for development, testing, and seeding reference data.
| Property | Type | Required | Description |
|---|---|---|---|
filePath | string | Yes | Path to the CSV file |
separator | string | No | Field separator (default ,) |
skipHeader | boolean | No | Treat the first row as a header (default true) |
When skipHeader is false, the first row is read as data rather than as a header.
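For example, a semicolon-separated file could be read with settings like these (the path is a placeholder):
```json
{
  "filePath": "./data/customers.csv",
  "separator": ";",
  "skipHeader": true
}
```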
CSV Source is primarily intended for local development and testing. For production ingestion, prefer Kafka Consumer or a dedicated connector.
Processors
Processor nodes transform, route, filter, or enrich data as it flows through the pipeline. Each processor has at least one input and one output port.
Transform
Transforms records using JSONata expressions.
| Property | Type | Required | Description |
|---|---|---|---|
mapping | string | Yes | JSONata expression defining the transformation |
Use Transform to:
- Map between schemas
- Reshape data structures
- Compute derived fields
- Filter out unwanted fields
- Combine multiple fields
JSONata examples
Common patterns include renaming fields, flattening nested objects, conditional logic, and aggregating arrays.
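The sketches below illustrate each pattern; the field names (first_name, customer, total, items) are placeholders rather than part of any required schema.
Rename fields:
```jsonata
{
  "firstName": first_name,
  "lastName": last_name
}
```
Flatten nested objects:
```jsonata
{
  "customerId": customer.id,
  "city": customer.address.city
}
```
Conditional logic:
```jsonata
{
  "tier": total >= 1000 ? "gold" : "standard"
}
```
Aggregate arrays:
```jsonata
{
  "itemCount": $count(items),
  "orderTotal": $sum(items.price)
}
```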
Branch
Routes records to different outputs based on conditions.
| Property | Type | Required | Description |
|---|---|---|---|
branches | array | Yes | List of { id, label, condition } objects |
- id — Unique identifier
- label — Display name
- condition — JSONata expression that returns true or false
The node has one output port per branch, plus a default output. Records are evaluated against each condition in order. The first matching condition routes the record to that branch’s output. Records that match no condition go to the default output.
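As an illustration, a branches value that routes orders by total might look like this (the field name in the conditions is a placeholder; conditions are JSONata expressions stored as strings):
```json
[
  { "id": "high", "label": "High value", "condition": "total >= 1000" },
  { "id": "standard", "label": "Standard", "condition": "total >= 0 and total < 1000" }
]
```
A record whose total is negative matches neither condition and is routed to the default output.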
Explode
Expands an array field into individual records (a flatMap operation).
| Property | Type | Required | Description |
|---|---|---|---|
arrayToFlatMap | string | Yes | Name of the array field to explode |
Before / after example
One input record whose array field has two elements becomes two output records, one per element.
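A sketch of the idea, assuming sibling fields are carried over and the array field is replaced by each element in turn (field names are placeholders, and the exact output shape may differ):
Before (one record):
```json
{ "orderId": 1, "items": [ { "sku": "A" }, { "sku": "B" } ] }
```
After (two records):
```json
{ "orderId": 1, "items": { "sku": "A" } }
{ "orderId": 1, "items": { "sku": "B" } }
```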
Lookup
Enriches records by joining with cached reference data (stream-table join).
| Property | Type | Required | Description |
|---|---|---|---|
lookupKey | string | Yes | Expression evaluated on the incoming record to produce the join key |
cacheKey | string | Yes | Expression evaluated on the reference data to produce the cache key |
fieldName | string | Yes | Name for the enriched field added to the output |
lookupFailureBehavior | enum | No | REJECT (default) or CONTINUE |
- input (left) — Main data stream
- reference (top) — Reference data, cached in memory
- output — Enriched records
- reject — Records with no match (only when behavior is REJECT)
The reference input is loaded into an in-memory cache keyed by cacheKey. For each incoming record, lookupKey is evaluated and matched against the cache.
Choose CONTINUE if missing reference data is acceptable — the lookup field will be null. Choose REJECT to route unmatched records to a separate output for error handling or dead-letter queues.
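A configuration sketch joining orders to customer reference data (the expressions and field name are placeholders): each incoming record's customerId is matched against the id of the cached reference records, and the matching reference record is attached as customer.
```json
{
  "lookupKey": "customerId",
  "cacheKey": "id",
  "fieldName": "customer",
  "lookupFailureBehavior": "CONTINUE"
}
```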
JDBC Request-Reply
Enriches each record by executing a parameterized SQL query against a relational database.
| Property | Type | Required | Description |
|---|---|---|---|
query | string | Yes | SQL query with ? placeholders |
parameters | array | No | List of { expression, reconcileColumn } objects mapping record fields to query placeholders |
fieldName | string | No | Name for the result field (default jdbc_result) |
lookupFailureBehavior | enum | No | REJECT (default) or CONTINUE |
batchSupport | boolean | No | Enable batched query execution for throughput |
- output — Record enriched with the query result in fieldName
- reject — Records where the query returned no rows (only when behavior is REJECT)
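As an illustration, the configuration below looks up account details by ID (the table, column, and field names are placeholders):
```json
{
  "query": "SELECT status, credit_limit FROM accounts WHERE account_id = ?",
  "parameters": [
    { "expression": "accountId", "reconcileColumn": "account_id" }
  ],
  "fieldName": "account",
  "lookupFailureBehavior": "REJECT",
  "batchSupport": true
}
```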
Peek
Observes records without modifying them — useful for debugging and monitoring.
| Property | Type | Required | Description |
|---|---|---|---|
logLevel | enum | No | DEBUG, INFO, or WARN |
Use Peek to:
- Debug pipeline behavior during development
- Log records at specific points in the pipeline
- Inspect data shapes between processing steps
Data Masking
Masks sensitive fields in each record’s value — redaction, nulling, numeric jitter, or synthetic replacement via Datafaker expressions.
| Property | Type | Required | Description |
|---|---|---|---|
maskingRules | array | Yes | List of rules. Each rule has field, mask, and optionally varianceRange (for VARIANCE) or fakerExpression (for FAKER) |
schema | string | Conditional | JSON string of the output Connect/JSON schema — the editor includes this when exporting so the runner can attach the correct valueSchema |
Masking applies to each record’s value.
Output: Same record shape with masked fields applied; if there are no rules, or the value is missing, the record passes through unchanged.
Field paths use dot notation for nested structs. Append [] to a segment to apply the rule to every element of an array at that path (for example, items[].email masks email inside each item).
| mask | Behavior |
|---|---|
REDACT | Replace the value with *** |
NULL | Set the field to JSON null |
VARIANCE | Add a random delta in [-varianceRange, +varianceRange] to numeric fields (integers or floats). If varianceRange is omitted or null, it is treated as 0 |
FAKER | Replace with the result of the Datafaker expression. A blank or missing expression, or a failed evaluation, falls back to *** |
mask values are matched case-insensitively at runtime. Unknown mask values are logged and the field is left unchanged. VARIANCE on non-numeric fields is skipped with a warning.
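An illustrative maskingRules value covering all four mask types (the field names and the Datafaker expression are placeholders):
```json
[
  { "field": "customer.ssn", "mask": "REDACT" },
  { "field": "customer.phone", "mask": "NULL" },
  { "field": "salary", "mask": "VARIANCE", "varianceRange": 500 },
  { "field": "items[].email", "mask": "FAKER", "fakerExpression": "#{Internet.emailAddress}" }
]
```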
Sinks
Sink nodes write data from your pipeline to external systems. Every sink has a single input port and no outputs (terminal nodes).
Kafka Producer
Produces records to a Kafka topic.
| Property | Type | Required | Description |
|---|---|---|---|
cluster | string | Yes | Kafka cluster name (defined in configuration) |
topic | string | Yes | Kafka topic to produce to |
schemaType | enum | Yes | JSON, AVRO_SR, JSON_SR, or PROTO_SR |
schemaId | number | Conditional | Schema Registry ID — required for AVRO_SR, JSON_SR, and PROTO_SR |
keyExpression | string | No | JSONata expression for the record key |
propertiesText | string | No | Extra producer properties (key=value, one per line) |
stubbed | boolean | No | When true, runs as a stub during interactive testing |
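A sketch of a producer keyed by order ID (the cluster, topic, key expression, and extra properties are placeholders):
```json
{
  "cluster": "analytics-cluster",
  "topic": "orders-enriched",
  "schemaType": "JSON",
  "keyExpression": "orderId",
  "propertiesText": "acks=all\ncompression.type=lz4"
}
```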
JDBC Sink
Writes records to a relational database table.
| Property | Type | Required | Description |
|---|---|---|---|
table | string | Yes | Target table name |
mode | enum | No | INSERT, UPSERT, or UPDATE |
propertiesText | string | No | Extra connection properties (key=value, one per line) |
stubbed | boolean | No | When true, runs as a stub during interactive testing |
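For example, upserting into a target table might be configured like this (the table name is a placeholder):
```json
{
  "table": "orders_enriched",
  "mode": "UPSERT"
}
```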
Generic Connect Sink
Uses any Kafka Connect sink connector for output.
| Property | Type | Required | Description |
|---|---|---|---|
connectorClass | string | Yes | Fully qualified connector class name |
propertiesText | string | Yes | Connector configuration (key=value, one per line) |
stubbed | boolean | No | When true, runs as a stub during interactive testing |
To use a connector that is not bundled, place its JAR in the libs/ directory and configure it here.
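As a sketch, Kafka's file sink connector could be configured like this (the file path is a placeholder, and exactly which connector properties the runner expects in propertiesText is an assumption):
```json
{
  "connectorClass": "org.apache.kafka.connect.file.FileStreamSinkConnector",
  "propertiesText": "file=/tmp/pipeline-output.txt"
}
```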
Utility
Utility nodes help organize and document your pipeline but do not affect data processing.
Comment
Adds a text annotation to the pipeline canvas.
| Property | Type | Required | Description |
|---|---|---|---|
text | string | No | Comment text displayed on the canvas |
width | number | No | Box width in pixels (default 200) |
height | number | No | Box height in pixels (default 100) |
Use Comment to:
- Document the purpose of a pipeline section
- Leave notes for teammates
- Mark areas that need future work
Comments are saved as part of the pipeline definition but have no effect on execution.
Next Steps
- Transformations: Learn JSONata syntax and best practices
- Configuration: Configure Kafka clusters, databases, and more
- Interactive Testing: Test your pipeline with stubbed sources and sinks
- Core Concepts: Understand schemas, ports, and data flow