Expanso + Makoto Integration Concept

Note: This page explores how Makoto Levels (Makoto) attestation concepts could be implemented using a data pipeline platform like Expanso. This is a conceptual integration proposal demonstrating one possible approach. Other platforms and tools could provide similar capabilities—the concepts here are not tied to any specific vendor.

This document explores the potential for integrating Makoto attestation capabilities with Expanso, a distributed data pipeline platform. The goal is to show how pipeline primitives could be extended to support data provenance and DBOM generation. While we use Expanso as an example, these patterns could be implemented with any capable data pipeline or stream processing platform.

What is Expanso?

Expanso is a data pipeline platform for processing data at the edge. It uses a declarative YAML configuration with inputs, processors, and outputs.

Expanso and similar platforms offer powerful primitives for building data pipelines that can address a wide range of attestation and provenance needs. Example pipelines could include:

  • Cryptographic data signatures computed at the edge for tamper detection
  • Firmware attestation using TPM-backed hardware verification
  • Complete data lineage tracking from source through every transformation
  • Provenance metadata capturing who, what, when, and where for each data point
  • AI/ML integration for anomaly detection, quality scoring, and automated classification

These capabilities aren't unique to Expanso: platforms like Apache NiFi, Kafka Streams, Apache Flink, or custom solutions built on message queues could achieve similar results. We present Expanso as one illustrative example, not as a required choice.

  • Edge Processing — Run pipelines close to data sources
  • Bloblang — Powerful transformation language
  • 200+ Connectors — Kafka, MQTT, S3, databases, and more
  • PII Detection — Built-in data governance tools

Potential Integration Approaches

Makoto attestation could be added to Expanso pipelines through several approaches, ranging from simple Bloblang mappings to dedicated processor plugins.

Integration Options

  • Bloblang Mappings — Use existing mapping processors to add attestation metadata and compute content hashes. Works today with no changes to Expanso.
  • Command Processor — Shell out to external signing tools (e.g., cosign, sigstore) for L2 cryptographic signatures.
  • Custom Processor Plugin — Build a dedicated DPL processor that handles attestation generation, signing, and DBOM verification natively.
  • Platform Integration — Deep integration at the Expanso platform level for automatic DBOM generation for all pipeline outputs.
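Whichever option is chosen, the attestation metadata itself would look much the same. Below is a minimal Python sketch of the conceptual "dpl" block; the field names mirror the Bloblang examples later on this page, while the helper function name is our own invention:

```python
import hashlib
import json
from datetime import datetime, timezone

def make_origin_attestation(payload: bytes, source: str) -> dict:
    """Build a conceptual L1 origin attestation. Field names follow the
    Bloblang mapping examples in this document; this is a sketch, not a
    normative Makoto structure."""
    return {
        "type": "origin/v1",
        "level": 1,
        "source": source,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "content_hash": hashlib.sha256(payload).hexdigest(),
    }

record = json.dumps({"temperature": 21.5}).encode()
att = make_origin_attestation(record, "sensors/line-3/temperature")
```

Each integration option differs only in where this metadata gets attached, not in its shape.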

Conceptual Pipeline Examples

These examples show how Makoto attestation metadata could be added to Expanso pipelines using existing Bloblang capabilities. The YAML syntax is valid Expanso configuration, but the Makoto-specific fields are conceptual.

Concept: L1 Origin Attestation (DBOM)

Adding provenance metadata to incoming sensor data using Bloblang mappings

# Conceptual: Adding Makoto metadata via Bloblang
# This uses real Expanso syntax with conceptual DPL fields

input:
  mqtt:
    urls: ["mqtt://sensors.example.com:1883"]
    topics: ["sensors/+/temperature"]

pipeline:
  processors:
    # Add origin attestation metadata
    - mapping: |
        root = this
        root.dpl = {
          "type": "origin/v1",
          "level": 1,
          "source": meta("mqtt_topic"),
          "timestamp": now(),
          "content_hash": this.hash("sha256")
        }

output:
  aws_s3:
    bucket: "sensor-data-lake"
    path: 'data/${!timestamp_unix()}.json'
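A downstream consumer could verify such a record by recomputing the content hash. A hedged sketch, assuming the hash was taken over a canonical JSON serialization of the payload (everything except the "dpl" block); a real deployment would have to pin down one byte-exact representation:

```python
import hashlib
import json

def verify_content_hash(record: dict) -> bool:
    """Recompute sha256 over the payload (all fields except 'dpl') and
    compare it against the attested content_hash. Assumes canonical JSON
    with sorted keys and no extra whitespace."""
    payload = {k: v for k, v in record.items() if k != "dpl"}
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(canonical).hexdigest() == record["dpl"]["content_hash"]
```

Any mutation of the payload after attestation makes the check fail, which is the tamper-detection property the L1 example aims for.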

Concept: L2 Signed Transform

Adding cryptographic signatures via external signing tool

# Conceptual: L2 with external signing
# Uses command processor to invoke signing tool

input:
  kafka:
    addresses: ["kafka:9092"]
    topics: ["raw-data"]

pipeline:
  processors:
    # Transform data
    - mapping: |
        root = this
        root.user_id = this.user_id.hash("xxhash64").encode("hex")

    # Add transform attestation
    - mapping: |
        root.dpl = {
          "type": "transform/v1",
          "level": 2,
          "transforms": ["pii_hash"],
          "content_hash": this.hash("sha256")
        }

    # Sign with external tool (conceptual)
    - command:
        name: "cosign"
        args_mapping: '["sign-blob", "--yes", "-"]'

output:
  sql_insert:
    driver: "postgres"
    dsn: "${DATABASE_URL}"
    table: "processed_data"
    columns: ["payload"]
    args_mapping: 'root = [ content().string() ]'

What Would Be Needed

To make Makoto a first-class feature of Expanso, the following would need to be developed:

Makoto Processor Plugin

A native Expanso processor that generates Makoto-compliant attestations, handles key management, and supports all three provenance levels.
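For illustration, such a processor would likely chain attestations across pipeline stages by linking each transform attestation back to the hash of its input. A sketch under our own assumptions (the input_hash link field does not appear in the examples above and is hypothetical):

```python
import hashlib

def transform_attestation(input_hash: str, output: bytes, transforms: list) -> dict:
    """Conceptual L2 transform attestation that records which transforms
    ran and links back to the attested input via input_hash (our own
    hypothetical field, sketched here to show lineage chaining)."""
    return {
        "type": "transform/v1",
        "level": 2,
        "transforms": transforms,
        "input_hash": input_hash,
        "content_hash": hashlib.sha256(output).hexdigest(),
    }
```

Walking the input_hash links from a final record back to an origin attestation would reconstruct the full lineage described in the L3 concept.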

Signing Integration

Built-in support for Sigstore/cosign, Vault transit, or HSM-based signing without shelling out to external commands.

Attestation Storage

Output connector for attestation registries (Rekor, custom stores) with automatic sidecar attestation publishing.

Verification Input

Input processor that validates that incoming data carries valid DBOMs before admitting it into the pipeline.
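A minimal admission check might look like the following sketch; a production verifier would additionally recompute content hashes and validate signatures before letting a record through:

```python
def admit(record: dict) -> bool:
    """Gate sketch for a verification input: accept a record only if it
    carries a 'dpl' block with the required attestation fields. This
    checks structure only, not cryptographic validity."""
    dpl = record.get("dpl")
    if not isinstance(dpl, dict):
        return False
    required = {"type", "level", "content_hash"}
    return required <= dpl.keys()
```

Rejecting unattested records at the input boundary is what turns DBOMs from passive metadata into an enforced pipeline contract.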

Potential Use Cases

If Makoto were integrated with Expanso, it could enable DBOM generation and data provenance tracking for various industries:

IoT & Manufacturing

Track sensor data provenance from factory floor to analytics platform. Verify measurements haven't been tampered with.

Healthcare Data

HIPAA-compliant data sharing with attestations that prove transformations maintained integrity without exposing PHI.

Financial Services

Audit-ready transaction logs with unforgeable attestations for regulatory compliance.

Supply Chain

End-to-end visibility for product journey with cryptographic proof at each handoff.

Interested in This Integration?

This is a conceptual proposal. If you're interested in seeing Makoto support in Expanso, reach out to discuss collaboration opportunities.

Learn About Expanso · Read Makoto Spec