
Log Processing Examples

Process logs at the edge to reduce bandwidth costs, improve performance, and ensure compliance. These examples demonstrate different approaches to log processing, from simple filtering to production-ready pipelines.

Available Examples

Filter by Severity

Basic log filtering that routes ERROR and WARN messages to separate destinations. Perfect for understanding the fundamentals of log routing.

View Example →
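The routing idea is simple enough to sketch directly. The snippet below is a minimal Python illustration, not the example's actual configuration format; the `ROUTES` table and destination names are placeholders.

```python
from collections import defaultdict

# Placeholder routing table: which severity goes to which destination.
ROUTES = {"ERROR": "alerts", "WARN": "alerts"}  # everything else falls through

def route(records):
    """Group log records by destination based on their severity level."""
    batches = defaultdict(list)
    for record in records:
        destination = ROUTES.get(record.get("level", "INFO"), "default")
        batches[destination].append(record)
    return batches

if __name__ == "__main__":
    sample = [
        {"level": "ERROR", "msg": "disk full"},
        {"level": "INFO", "msg": "heartbeat ok"},
    ]
    print(dict(route(sample)))  # {'alerts': [...], 'default': [...]}
```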

Enrich & Export to S3

Add metadata and lineage tracking to logs before batching and exporting to Amazon S3. Demonstrates enrichment patterns and efficient cloud uploads.

View Example →
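As a rough Python sketch of that flow using boto3: the bucket name, node ID, and pipeline label below are invented for illustration, and the batch is uploaded as a single gzipped JSON Lines object.

```python
import gzip
import json
import uuid

import boto3  # assumes AWS credentials are available in the environment

NODE_ID = "edge-node-01"        # placeholder node identity for lineage
BUCKET = "example-log-archive"  # placeholder bucket name

def enrich(record):
    """Attach node and pipeline metadata so each record's origin is traceable."""
    record["node_id"] = NODE_ID
    record["pipeline"] = "enrich-export-v1"
    return record

def export_batch(records):
    """Compress a batch of enriched records and upload it as one S3 object."""
    lines = "\n".join(json.dumps(enrich(r)) for r in records)
    key = f"logs/{uuid.uuid4()}.jsonl.gz"
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=gzip.compress(lines.encode("utf-8")),
    )
    return key
```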

Production Pipeline

Complete production-ready pipeline with HTTP input, JSON parsing, validation, enrichment, PII redaction, and multi-destination routing (Elasticsearch, S3, local backup).

View Example →
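A compressed sketch of those stages in Python follows. The required fields, the email-only redaction pattern, and the routing rule (errors and warnings additionally go to Elasticsearch) are assumptions made for illustration; enrichment is omitted here since the previous sketch covers it.

```python
import json
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")  # assumed PII pattern for illustration

def parse(raw: bytes) -> dict:
    """Parse one JSON log line received over the HTTP input."""
    return json.loads(raw)

def validate(record: dict) -> dict:
    """Reject records that lack the fields later stages depend on."""
    if "level" not in record or "msg" not in record:
        raise ValueError("missing required fields")
    return record

def redact(record: dict) -> dict:
    """Mask email addresses so PII never leaves the edge."""
    record["msg"] = EMAIL.sub("[REDACTED]", record["msg"])
    return record

def process(raw: bytes) -> list[tuple[str, dict]]:
    """Run one record through the stages and fan it out to its destinations."""
    record = redact(validate(parse(raw)))
    sinks = ["s3-archive", "local-backup"]  # every record is archived
    if record["level"] in ("ERROR", "WARN"):
        sinks.append("elasticsearch")       # assumed routing rule
    return [(sink, record) for sink in sinks]

if __name__ == "__main__":
    raw = b'{"level": "ERROR", "msg": "login failed for user@example.com"}'
    print(process(raw))
```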

Common Patterns

These examples demonstrate key patterns used in log processing:

  • Severity filtering - Route based on log level (ERROR, WARN, INFO, DEBUG)
  • Metadata enrichment - Add node ID, region, pipeline info for traceability
  • Multi-destination routing - Send to real-time search, archival storage, and backups
  • Batching - Optimize bandwidth and API costs by batching uploads (see the sketch after this list)
  • PII redaction - Remove sensitive data before logs leave the edge
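The batching pattern is worth seeing concretely, since the bandwidth and API-cost savings come from flushing on size or age rather than per record. Below is a small Python sketch; the thresholds are illustrative, not defaults of any particular tool.

```python
import time

class Batcher:
    """Accumulate records and flush when the batch is large enough or old enough."""

    def __init__(self, flush, max_records=500, max_age_seconds=30.0):
        self.flush = flush                # callable that uploads one batch
        self.max_records = max_records    # illustrative size threshold
        self.max_age = max_age_seconds    # illustrative age threshold
        self.buffer = []
        self.started = time.monotonic()

    def add(self, record):
        if not self.buffer:
            self.started = time.monotonic()
        self.buffer.append(record)
        too_big = len(self.buffer) >= self.max_records
        too_old = time.monotonic() - self.started >= self.max_age
        if too_big or too_old:
            self.flush(self.buffer)
            self.buffer = []

# Example: wire the batcher to the export_batch sketch above.
# batcher = Batcher(flush=export_batch)
```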