
Components

Expanso provides 200+ components for building powerful data pipelines. Browse, search, and filter components to find exactly what you need.

amqp_0_9
Input
Connects to an AMQP (0.9.1) queue. AMQP is a messaging protocol used by various message brokers, including RabbitMQ.
Services, Messaging
amqp_0_9
Output
Sends messages to an AMQP (0.9.1) exchange. AMQP is a messaging protocol used by various message brokers, including RabbitMQ.
Services, Messaging
amqp_1
Input
Reads messages from an AMQP (1.0) server.
Services, Messaging
amqp_1
Output
Sends messages to an AMQP (1.0) server.
Services, Messaging
archive
Processor
Archives all the messages of a batch into a single message according to the selected archive format.
Parsing, Utility
avro
Processor
Performs Avro-based operations on messages based on a schema.
Parsing, Transform
awk
Processor
Executes an AWK program on messages. This processor is very powerful as it offers a range of custom functions for querying and mutating message contents.
Mapping
aws_dynamodb
Output
Inserts items into or deletes items from a DynamoDB table.
Services, AWS, Cloud
aws_dynamodb_partiql
Processor
Experimental.
Integration, AWS, Cloud
aws_kinesis
Input
Receive messages from one or more Kinesis streams.
Services, AWS, Cloud
aws_kinesis
Output
Sends messages to a Kinesis stream.
Services, AWS, Cloud
aws_kinesis_firehose
Output
Sends messages to a Kinesis Firehose delivery stream.
Services, AWS, Cloud
aws_lambda
Processor
Invokes an AWS Lambda for each message. The contents of the message are the payload of the request, and the result of the invocation becomes the new contents of the message.
Integration, AWS, Cloud
aws_s3
Input
Downloads objects within an Amazon S3 bucket, optionally filtered by a prefix, either by walking the items in the bucket or by streaming upload notifications.
Services, AWS, Cloud
aws_s3
Processor
Performs an S3 GetObject operation using the `bucket` + `key` provided in the config and replaces the original message parts with the content retrieved from S3.
Services, AWS, Cloud
aws_s3
Output
Sends message parts as objects to an Amazon S3 bucket. Each object is uploaded with the path specified with the `path` field.
Services, AWS, Cloud
aws_sns
Output
Sends messages to an AWS SNS topic.
Services, AWS, Cloud
aws_sqs
Input
Consume messages from an AWS SQS URL.
Services, AWS, Cloud
aws_sqs
Output
Sends messages to an SQS queue.
Services, AWS, Cloud
azure_blob_storage
Input
Downloads objects within an Azure Blob Storage container, optionally filtered by a prefix.
Services, Azure, Cloud
azure_blob_storage
Output
Sends message parts as objects to an Azure Blob Storage Account container. Each object is uploaded with the filename specified with the `container` field.
Services, Azure, Cloud
azure_cosmosdb
Input
Executes a SQL query against Azure CosmosDB and creates a batch of messages from each page of items.
Azure, Cloud
azure_cosmosdb
Processor
Creates or updates messages as JSON documents in Azure CosmosDB.
Azure, Cloud
azure_cosmosdb
Output
Creates or updates messages as JSON documents in Azure CosmosDB.
Azure, Cloud
azure_queue_storage
Input
Dequeue objects from an Azure Storage Queue.
Services, Azure, Cloud
azure_queue_storage
Output
Sends messages to an Azure Storage Queue.
Services, Azure, Cloud
azure_table_storage
Input
Beta.
Services, Azure, Cloud
azure_table_storage
Output
Beta.
Services, Azure, Cloud
batched
Input
Consumes data from a child input and applies a batching policy to the stream.
Utility
beanstalkd
Input
Experimental.
Services
beanstalkd
Output
Experimental.
Services
bounds_check
Processor
Removes messages (and batches) that do not fit within certain size boundaries.
Utility
branch
Processor
The `branch` processor allows you to create a new request message via a Bloblang mapping, execute a list of processors on the request messages, and, finally, map the result back into the original message with another mapping.
Composition, Transform, Routing
broker
Input
Allows you to combine multiple inputs into a single stream of data, where each input will be read in parallel.
Utility
broker
Output
Allows you to route messages to multiple child outputs using a range of brokering patterns.
Utility
cache
Processor
Performs operations against a cache resource for each message, allowing you to store or retrieve data within message payloads.
Integration
cache
Output
Stores each message in a cache.
Services
cached
Processor
Cache the result of applying one or more processors to messages identified by a key. If the key already exists within the cache, the contents of the message are replaced with the cached result instead of the processors being applied.
Utility
cassandra
Input
Experimental.
Services, Database, NoSQL
cassandra
Output
Runs a query against a Cassandra database for each message in order to insert data.
Database, NoSQL
catch
Processor
Applies a list of child processors _only_ when a previous processing step has failed.
Composition
cockroachdb_changefeed
Input
Experimental.
Integration
command
Processor
Executes a command for each message.
Integration
compress
Processor
Compresses messages according to the selected algorithm. Supported compression algorithms: flate, gzip, lz4, pgzip, snappy, zlib.
Parsing
couchbase
Processor
Experimental.
Integration
csv
Input
Reads one or more CSV files as structured records following the format described in RFC 4180.
Local, Parsing, Transform
cypher
Input
Experimental.
Services
cypher
Output
Experimental.
Services
decompress
Processor
Decompresses messages according to the selected algorithm. Supported decompression algorithms: bzip2, flate, gzip, lz4, pgzip, snappy, zlib.
Parsing
dedupe
Processor
Deduplicates messages by storing a key value in a cache using the `add` operator. If the key already exists within the cache, the message is dropped.
Utility
discord
Input
Experimental.
Services, Social
discord
Output
Experimental.
Services, Social
drop
Output
Drops all messages.
Utility
drop_on
Output
Attempts to write messages to a child output, and if the write fails for one of a list of configurable reasons the message is dropped (acked) instead of being reattempted.
Utility
dynamic
Input
A special broker type where the inputs are identified by unique labels and can be created, changed and removed during runtime via a REST HTTP interface.
Utility
dynamic
Output
A special broker type where the outputs are identified by unique labels and can be created, changed and removed during runtime via a REST API.
Utility
elasticsearch_v2
Output
Publishes messages into an Elasticsearch index. If the index does not exist then it is created with a dynamic mapping.
Services, Database, Search
etcd
Input
Beta.
Services
fallback
Output
Attempts to send each message to a child output, starting from the first output on the list. If an output attempt fails then the next output in the list is attempted, and so on.
Utility
file
Input
Consumes data from files on disk, emitting messages according to a chosen codec.
Local, Files
file
Output
Writes messages to files on disk based on a chosen codec.
Local, Files
for_each
Processor
A processor that applies a list of child processors to messages of a batch as though they were each a batch of one message.
Composition
gcp_bigquery
Output
Sends messages as new rows to a Google Cloud BigQuery table.
GCP, Services, Cloud
gcp_bigquery_select
Input
Executes a `SELECT` query against BigQuery and creates a message for each row received.
Services, GCP, Cloud
gcp_bigquery_select
Processor
Experimental.
Integration, GCP, Cloud
gcp_bigquery_write_api
Output
Sends messages as new rows to a Google Cloud BigQuery table using the BigQuery Storage Write API.
GCP, Services, Cloud
gcp_cloud_storage
Input
Downloads objects within a Google Cloud Storage bucket, optionally filtered by a prefix.
Services, GCP, Cloud
gcp_cloud_storage
Output
Sends message parts as objects to a Google Cloud Storage bucket. Each object is uploaded with the path specified with the `path` field.
Services, GCP, Cloud
gcp_pubsub
Input
Consumes messages from a GCP Cloud Pub/Sub subscription.
Services, GCP, Cloud
gcp_pubsub
Output
Sends messages to a GCP Cloud Pub/Sub topic. Metadata from messages are sent as attributes.
Services, GCP, Cloud
gcp_spanner_cdc
Input
Beta.
Services, GCP, Cloud
generate
Input
Generates messages at a given interval using a Bloblang mapping executed without a context. This allows you to generate messages for testing your pipeline configs.
Utility
grok
Processor
Parses messages into a structured format by attempting to apply a list of Grok expressions; the first expression to result in at least one value replaces the original message.
Parsing
group_by
Processor
Splits a batch of messages into N batches, where each resulting batch contains a group of messages determined by a Bloblang query.
Composition
group_by_value
Processor
Splits a batch of messages into N batches, where each resulting batch contains a group of messages determined by a function interpolated string evaluated per message.
Composition
hdfs
Input
Reads files from an HDFS directory, where each discrete file will be consumed as a single message payload.
Services
hdfs
Output
Sends message parts as files to an HDFS directory.
Services
http
Processor
Performs an HTTP request using a message batch as the request body, and replaces the original message parts with the body of the response.
Integration, Network, HTTP
http_client
Input
Connects to a server and continuously performs requests for a single message.
Network, HTTP
http_client
Output
Sends messages to an HTTP server.
Network, HTTP
http_server
Input
Receive messages POSTed over HTTP(S). HTTP 2.0 is supported when using TLS, which is enabled when key and cert files are specified.
Network, HTTP
http_server
Output
Sets up an HTTP server that will send messages over HTTP(S) GET requests. HTTP 2.0 is supported when using TLS, which is enabled when key and cert files are specified.
Network, HTTP
inproc
Input
Reads messages directly from a matching `inproc` output within the same process, identified by a shared name.
Utility
inproc
Output
Writes messages directly to a matching `inproc` input within the same process, identified by a shared name.
Utility
insert_part
Processor
Insert a new message into a batch at an index. If the specified index is greater than the length of the existing batch it will be appended to the end.
Composition
javascript
Processor
Experimental.
Mapping
jmespath
Processor
Executes a JMESPath query on JSON documents and replaces the message with the resulting document.
Mapping
jq
Processor
Transforms and filters messages using jq queries.
Mapping
json_schema
Processor
Checks messages against a provided JSON Schema definition but does not change the payload under any circumstances. If a message does not match the schema it is flagged as failed and can be handled with standard error handling.
Mapping, Parsing, Transform
kafka
Input
Connects to Kafka brokers and consumes one or more topics.
Services, Messaging, Streaming
kafka
Output
The kafka output type writes a batch of messages to Kafka brokers and waits for acknowledgement before propagating it back to the input.
Services, Messaging, Streaming
kafka_franz
Input
A Kafka input using the Franz Kafka client library.
Services, Messaging, Streaming
kafka_franz
Output
A Kafka output using the Franz Kafka client library.
Services, Messaging, Streaming
log
Processor
Prints a log event for each message. Messages always remain unchanged. The log message can be set using function interpolations, which allow you to log the contents and metadata of messages.
Utility
mapping
Processor
Executes a Bloblang mapping on messages, creating a new document that replaces (or filters) the original message.
Mapping, Parsing, Transform
metric
Processor
Emit custom metrics by extracting values from messages.
Utility
mongodb
Input
Executes a query and creates a message for each document received.
Services, Database, NoSQL
mongodb
Processor
Performs operations against MongoDB for each message, allowing you to store or retrieve data within message payloads.
Services, Database, NoSQL
mongodb
Output
Inserts items into a MongoDB collection.
Services, Database, NoSQL
mqtt
Input
Subscribe to topics on MQTT brokers.
Services, Messaging, IoT
mqtt
Output
Pushes messages to an MQTT broker.
Services, Messaging, IoT
msgpack
Processor
Beta.
Parsing
mutation
Processor
Executes a Bloblang mapping and directly transforms the contents of messages, mutating (or deleting) them.
Mapping, Parsing
nanomsg
Input
Consumes messages via Nanomsg sockets (scalability protocols).
Network
nanomsg
Output
Send messages over a Nanomsg socket.
Network
nats
Input
Subscribe to a NATS subject.
Services, Messaging, Streaming
nats
Output
Publish to a NATS subject.
Services, Messaging, Streaming
nats_jetstream
Input
Reads messages from NATS JetStream subjects.
Services, Messaging, Streaming
nats_jetstream
Output
Write messages to a NATS JetStream subject.
Services, Messaging, Streaming
nats_kv
Input
Watches for updates in a NATS key-value bucket.
Services, Messaging, Streaming
nats_kv
Processor
Perform operations on a NATS key-value bucket.
Services, Messaging, Streaming
nats_kv
Output
Put messages in a NATS key-value bucket.
Services, Messaging, Streaming
nats_object_store
Input
Experimental.
Services, Messaging, Streaming
nats_object_store
Processor
Experimental.
Services, Messaging, Streaming
nats_object_store
Output
Experimental.
Services, Messaging, Streaming
nats_request_reply
Processor
Sends a message to a NATS subject and expects a reply back from a NATS subscriber acting as a responder.
Services, Messaging, Streaming
nlp_classify_text
Processor
Beta.
Machine Learning, NLP
nlp_classify_tokens
Processor
Beta.
Machine Learning, NLP
nlp_extract_features
Processor
Beta.
Machine Learning, NLP
nlp_zero_shot_classify
Processor
Beta.
Machine Learning, NLP
noop
Processor
Noop is a processor that does nothing; the message passes through unchanged. Why? Sometimes doing nothing is the braver option.
nsq
Input
Subscribe to an NSQ instance topic and channel.
Services
nsq
Output
Publish to an NSQ topic.
Services
opensearch
Output
Publishes messages into an OpenSearch index. If the index does not exist then it is created with a dynamic mapping.
Services
opensnowcat
Processor
Experimental.
Parsing
parallel
Processor
A processor that applies a list of child processors to messages of a batch as though they were each a batch of one message (similar to the `for_each` processor), except that the processors are applied in parallel across the batch.
Composition
parquet
Input
Experimental.
Local
parquet_decode
Processor
Experimental.
Parsing
parquet_encode
Processor
Experimental.
Parsing
parse_log
Processor
Parses common log formats into structured data. This is easier and often much faster than `grok`.
Parsing
processors
Processor
A processor grouping several sub-processors.
Composition
protobuf
Processor
Performs conversions to or from a protobuf message. This processor uses reflection, meaning conversions can be made directly from the target .proto files.
Parsing, Transform
pulsar
Input
Reads messages from an Apache Pulsar server.
Services, Messaging, Streaming
pulsar
Output
Write messages to an Apache Pulsar server.
Services, Messaging, Streaming
pusher
Output
Experimental.
Services
questdb
Output
Experimental.
Services
rate_limit
Processor
Throttles the throughput of a pipeline according to a specified `rate_limit` resource. Rate limits are shared across components and therefore apply globally.
Utility
read_until
Input
Reads messages from a child input until a consumed message passes a Bloblang query, at which point the input closes. It is also possible to configure the child input to restart after it closes.
Utility
redis
Processor
Performs actions against Redis that aren't possible using a `cache` processor. Actions are performed for each message and the message contents are replaced with the result.
Integration, Messaging, Caching
redis_hash
Output
Sets Redis hash objects using the HMSET command.
Services, Messaging, Caching
redis_list
Input
Pops messages from the beginning of a Redis list using the BLPop command.
Services, Messaging, Caching
redis_list
Output
Pushes messages onto the end of a Redis list (which is created if it doesn't already exist) using the RPUSH command.
Services, Messaging, Caching
redis_pubsub
Input
Consume from a Redis publish/subscribe channel using either the SUBSCRIBE or PSUBSCRIBE commands.
Services, Messaging, Caching
redis_pubsub
Output
Publishes messages through the Redis PubSub model. It is not possible to guarantee that messages have been received.
Services, Messaging, Caching
redis_scan
Input
Scans the set of keys in the currently selected database and gets their values, using the SCAN and GET commands.
Services, Messaging, Caching
redis_script
Processor
Performs actions against Redis using Lua scripts.
Integration, Messaging, Caching
redis_streams
Input
Pulls messages from Redis (v5.0+) streams with the XREADGROUP command. The `client_id` should be unique for each consumer of a group.
Services, Messaging, Caching
redis_streams
Output
Pushes messages to a Redis (v5.0+) Stream (which is created if it doesn't already exist) using the XADD command.
Services, Messaging, Caching
reject
Output
Rejects all messages, treating them as though the output destination failed to publish them.
Utility
reject_errored
Output
Rejects messages that have failed their processing steps, resulting in nack behaviour at the input level, otherwise sends them to a child output.
Utility
resource
Input
Resource is an input type that channels messages from a resource input, identified by its name.
Utility
resource
Processor
Resource is a processor type that runs a processor resource identified by its label.
Utility
resource
Output
Resource is an output type that channels messages to a resource output, identified by its name.
Utility
retry
Processor
Attempts to execute a series of child processors until success.
Composition
retry
Output
Attempts to write messages to a child output, and if the write fails for any reason the message is retried either until success or, if the retries or max elapsed time fields are set, until those limits are reached.
Utility
s2
Input
Beta.
Services
s2
Output
Beta.
Services
schema_registry_decode
Processor
Automatically decodes and validates messages with schemas from a Confluent Schema Registry service.
Parsing, Integration
schema_registry_encode
Processor
Automatically encodes and validates messages with schemas from a Confluent Schema Registry service.
Parsing, Integration
select_parts
Processor
Cherry-pick a set of messages from a batch by their index. Indexes larger than the number of messages are simply ignored.
Utility
sentry_capture
Processor
Experimental.
sequence
Input
Reads messages from a sequence of child inputs, starting with the first; once that input gracefully terminates it starts consuming from the next, and so on until all inputs have terminated.
Utility
sftp
Input
Beta.
Network, Files
sftp
Output
Beta.
Network, Files
sleep
Processor
Sleep for a period of time specified as a duration string for each message. This processor will interpolate functions within the `duration` field, so you can vary the delay per message.
Utility
snowflake_put
Output
Sends messages to Snowflake stages and, optionally, calls Snowpipe to load this data into one or more tables.
Services
socket
Input
Connects to a TCP or Unix socket and consumes a continuous stream of messages.
Network
socket
Output
Connects to a (TCP/UDP/Unix) server and sends a continuous stream of data, dividing messages according to the specified codec.
Network
socket_server
Input
Creates a server that receives a stream of messages over a TCP, UDP or Unix socket.
Network
split
Processor
Breaks message batches (synonymous with multiple part messages) into smaller batches. The size of the resulting batches is determined either by a discrete count or, if the `byte_size` field is set, a total size in bytes.
Utility
splunk_hec
Output
Writes messages to a Splunk HTTP Event Collector.
Services
sql_insert
Processor
Inserts rows into an SQL database for each message, and leaves the message unchanged.
Integration
sql_insert
Output
Inserts a row into an SQL database for each message.
Services
sql_raw
Input
Executes a select query and creates a message for each row received.
Services
sql_raw
Processor
Runs an arbitrary SQL query against a database and (optionally) returns the result as an array of objects, one for each row returned.
Integration
sql_raw
Output
Executes an arbitrary SQL query for each message.
Services
sql_select
Input
Executes a select query and creates a message for each row received.
Services
sql_select
Processor
Runs an SQL select query against a database and returns the result as an array of objects, one for each row returned, containing a key for each column selected.
Integration
stdin
Input
Consumes data piped to stdin, chopping it into individual messages according to the specified scanner.
Local
stdout
Output
Prints messages to stdout as a continuous stream of data.
Local
subprocess
Input
Executes a command, runs it as a subprocess, and consumes messages from it over stdout.
Utility
subprocess
Processor
Executes a command as a subprocess and, for each message, will pipe its contents to the stdin stream of the process followed by a newline.
Integration
subprocess
Output
Executes a command, runs it as a subprocess, and writes messages to it over stdin.
Utility
switch
Processor
Conditionally processes messages based on their contents.
Composition, Transform, Routing
switch
Output
The switch output type allows you to route messages to different outputs based on their contents.
Utility, Transform, Routing
sync_response
Processor
Adds the payload in its current state as a synchronous response to the input source, where it is dealt with according to that specific input type.
Utility
sync_response
Output
Returns the final message payload back to the input origin of the message, where it is dealt with according to that specific input type.
Utility
try
Processor
Executes a list of child processors on messages only if no prior processors have failed (or the errors have been cleared).
Composition
twitter_search
Input
Experimental.
Services, Social
unarchive
Processor
Unarchives messages according to the selected archive format into multiple messages within a batch.
Parsing, Utility
wasm
Processor
Experimental.
Utility
websocket
Input
Connects to a websocket server and continuously receives messages.
Network, Streaming
websocket
Output
Sends messages to an HTTP server via a websocket connection.
Network, Streaming
while
Processor
A processor that checks a Bloblang query against each batch of messages and executes child processors on them for as long as the query resolves to true.
Composition
workflow
Processor
Executes a topology of `branch` processors, performing them in parallel where possible.
Composition
xml
Processor
Beta.
Parsing, Transform
zmq4
Input
Consumes messages from a ZeroMQ socket.
Network
zmq4
Output
Writes messages to a ZeroMQ socket.
Network
zmq4n
Input
Consumes messages from a ZeroMQ socket.
Network
zmq4n
Output
Writes messages to a ZeroMQ socket.
Network

Component Categories

📥 Inputs

Receive data from various sources including:

  • Messaging Systems: Kafka, NATS, RabbitMQ, Redis
  • Cloud Storage: AWS S3, GCS, Azure Blob
  • Databases: PostgreSQL, MySQL, MongoDB
  • HTTP: Webhooks, REST APIs
  • Files: Local files, SFTP
  • And many more...
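
Configuration-wise, an input is the first block of a pipeline definition. The sketch below assumes Expanso follows the Benthos-style YAML layout its component names suggest; the broker address, topic, and consumer group are hypothetical placeholders:

```yaml
# A minimal input sketch (hypothetical values; Benthos-style syntax assumed).
input:
  kafka:
    addresses:
      - localhost:9092        # placeholder broker address
    topics:
      - orders                # placeholder topic
    consumer_group: expanso_demo
```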

⚙️ Processors

Transform and manipulate your data:

  • Mapping: Transform data with Bloblang
  • Filtering: Route and filter messages
  • Aggregation: Window, batch, and aggregate
  • Enrichment: Lookup and enrich data
  • Encoding: JSON, CSV, Avro, Protobuf
  • And many more...
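
For instance, a `mapping` processor runs a short Bloblang program against every message. A minimal sketch, again assuming Benthos-style syntax; the `amount` field is illustrative:

```yaml
# A minimal processor sketch: reshape each message with Bloblang.
pipeline:
  processors:
    - mapping: |
        # Keep the original document, then add a derived field.
        root = this
        root.amount_cents = (this.amount * 100).floor()
```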

📤 Outputs

Send data to destinations:

  • Messaging Systems: Kafka, NATS, RabbitMQ, Redis
  • Cloud Storage: AWS S3, GCS, Azure Blob
  • Databases: PostgreSQL, MySQL, Elasticsearch
  • HTTP: Webhooks, REST APIs
  • Observability: Prometheus, Datadog, New Relic
  • And many more...
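
Outputs are declared the same way. This sketch (Benthos-style syntax assumed; the bucket name is a placeholder) writes each message to S3 under an interpolated path:

```yaml
# A minimal output sketch: archive messages to S3.
output:
  aws_s3:
    bucket: my-archive-bucket                  # placeholder bucket
    path: 'logs/${! timestamp_unix() }.json'   # interpolated per message
```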

Quick Navigation

Use the categories above to browse components by type, or use the catalog filters to find components by:

  • Cloud Provider: AWS, GCP, Azure, Generic
  • Use Case: Streaming, Storage, Transform, Observability
  • Difficulty: Beginner, Intermediate, Advanced

Here are some of the most commonly used components:

Database Integration

Connect to databases for edge analytics, caching, and data synchronization.

Popular database integrations:

  • MySQL and PostgreSQL for cloud databases
  • SQLite for local caching, offline storage, and edge analytics
  • MongoDB for document storage
  • ClickHouse for time-series data
  • Snowflake and BigQuery for cloud data warehouses
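
As an illustration, the `sql_insert` output can persist messages to a local SQLite file for edge caching. A minimal sketch assuming Benthos-style config; the DSN, table, and column names are hypothetical:

```yaml
# A minimal sketch: insert each message into a local SQLite table.
output:
  sql_insert:
    driver: sqlite
    dsn: file:./cache.db        # placeholder local database file
    table: events
    columns: [ id, payload ]
    args_mapping: 'root = [ this.id, content().string() ]'
```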

Getting Started

New to Expanso? Start with the getting started guides in the documentation.


Frequently Asked Questions

How many components are available in Expanso?

Expanso provides 200+ pre-built components across three categories: 61 inputs for receiving data, 74 processors for transforming data, and 71 outputs for sending data to destinations. All components are maintained by the Expanso team; those marked beta or experimental are still stabilizing and may change.

Can I use custom components or extend existing ones?

While you can't add completely custom components, you can achieve most custom logic using the Mapping processor with the Bloblang language. Bloblang provides extensive functions for parsing, transforming, and routing data. For more complex needs, you can chain multiple components together, as sketched below, or contact support for enterprise custom component development.
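
For example, enrichment can be composed from existing components such as `branch` and `http`. A minimal sketch assuming Benthos-style syntax; the endpoint URL and field names are hypothetical:

```yaml
# A minimal sketch: enrich messages via an external HTTP API.
pipeline:
  processors:
    - branch:
        request_map: 'root.text = this.description'   # build the request body
        processors:
          - http:
              url: https://api.example.com/enrich     # placeholder endpoint
              verb: POST
        result_map: 'root.enrichment = this'          # merge the response back
```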

What's the difference between inputs, processors, and outputs?

Inputs receive data from sources (like Kafka, S3, or HTTP). Processors transform data in transit (filtering, mapping, enrichment). Outputs send data to destinations (databases, cloud storage, APIs). Every pipeline needs at least one input and one output, with zero or more processors in between.
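
Put together, a minimal pipeline looks like the sketch below (Benthos-style syntax assumed; the webhook path and added field are hypothetical):

```yaml
# A minimal end-to-end pipeline: webhook in, one mapping, stdout out.
input:
  http_server:
    path: /ingest               # placeholder webhook path

pipeline:
  processors:
    - mapping: |
        # Pass the document through and stamp an arrival time.
        root = this
        root.received_at = now()

output:
  stdout: {}
```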

Are all components available in the free tier?

Yes! All 200+ components are available in every tier, including the free tier. There are no feature restrictions on which components you can use, only limits on data volume and number of agents. Check our pricing page for tier details.

How do I find the right component for my use case?

Use the component catalog filters above to search by name, category, or use case. You can also browse by cloud provider (AWS, GCP, Azure) or difficulty level. The most popular components section highlights commonly used components. If you're unsure, check our use cases section for real-world examples.