Build Reliable, Scalable Data Pipelines That Power Your Analytics
Expert ETL consultant in Switzerland specializing in modern data pipeline development with dbt, Apache Airflow, and cloud-native integration tools. I design and implement production-grade ETL/ELT workflows that transform raw data into analytics-ready assets. From legacy pipeline modernization to real-time streaming architectures, I deliver data integration solutions that are reliable, observable, and built to scale.
Modern ELT stack expertise (dbt, Airflow, Dagster, Prefect)
Real-time streaming pipelines with Kafka and Spark Streaming
Data quality frameworks with automated testing and validation
Pipeline monitoring and alerting for 99.9% uptime SLAs
Experience processing 50M+ records daily in production
Cloud-native pipelines on AWS, Azure, and GCP
Legacy ETL modernization (Informatica, SSIS, and Talend to a modern stack)
Cost-optimized batch and micro-batch processing
Based in Switzerland with experience in banking and pharma data
Design and build robust ETL/ELT pipelines using dbt, Airflow, and cloud-native services. I create modular, testable, and version-controlled data transformations that are easy to maintain, debug, and extend as your data needs evolve.
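The modular, testable style described above can be sketched in plain Python. The function and field names here are illustrative, not from a real client project:

```python
# Minimal sketch: each transformation is a small pure function,
# so every stage can be unit-tested and swapped in isolation.

def clean(rows):
    """Drop records missing a primary key -- one small, testable step."""
    return [r for r in rows if r.get("id") is not None]

def enrich(rows):
    """Add a derived column; pure functions are trivial to unit-test."""
    return [{**r, "amount_chf": r["amount_cents"] / 100} for r in rows]

def pipeline(rows):
    """Compose the steps explicitly, mirroring an Airflow/dbt DAG."""
    return enrich(clean(rows))

raw = [{"id": 1, "amount_cents": 1250}, {"id": None, "amount_cents": 99}]
result = pipeline(raw)
print(result)  # [{'id': 1, 'amount_cents': 1250, 'amount_chf': 12.5}]
```

In production the same shape maps onto Airflow tasks or dbt models; the point is that each step is independently versioned and testable.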
Connect disparate data sources — APIs, databases, SaaS platforms, files, and event streams — into a unified data platform. I implement CDC (Change Data Capture), incremental loading, and idempotent processing for reliable data integration at any scale.
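Incremental loading and idempotency work together: pull only rows changed since a watermark, then merge them by key so a retried run never duplicates data. A minimal sketch with invented table and column names (`orders`-style rows, `updated_at`, `id`):

```python
# Hypothetical illustration of watermark-based incremental loading
# combined with an idempotent upsert. ISO-8601 timestamps compare
# correctly as strings, which keeps the sketch dependency-free.

def extract_increment(source_rows, watermark):
    """Pull only rows changed since the last successful run."""
    return [r for r in source_rows if r["updated_at"] > watermark]

def upsert(target, rows, key="id"):
    """Merge by primary key: insert or overwrite, same result on every retry."""
    for r in rows:
        target[r[key]] = r
    return target

source = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00", "amount": 10},
    {"id": 2, "updated_at": "2024-01-03T00:00:00", "amount": 25},
]
target = {}
watermark = "2024-01-02T00:00:00"

changed = extract_increment(source, watermark)
upsert(target, changed)
upsert(target, changed)  # replaying the same batch is a no-op
print(len(target))  # 1 -- only the row newer than the watermark, loaded once
```

The same pattern appears in warehouses as a `MERGE` statement or a dbt incremental model.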
Set up and optimize dbt projects with best-practice folder structures, modular SQL models, automated testing, documentation generation, and CI/CD pipelines. Transform your data warehouse into a well-governed, self-documenting analytics platform.
Build real-time data pipelines using Apache Kafka, Spark Streaming, or cloud-native services like Kinesis and Pub/Sub. Implement event-driven architectures for use cases requiring sub-second latency, such as fraud detection and live dashboards.
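The core of event-driven fraud detection is processing each event the moment it arrives against a short sliding window of recent activity. A dependency-free sketch (the threshold, window, and field names are invented; production systems would consume from Kafka or Kinesis):

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # illustrative threshold, not a real rule
MAX_EVENTS = 3

recent = defaultdict(deque)  # card_id -> timestamps of recent events

def process(event):
    """Handle one event as it arrives; return True if it looks suspicious."""
    ts, card = event["ts"], event["card"]
    q = recent[card]
    q.append(ts)
    while q and ts - q[0] > WINDOW_SECONDS:  # evict events outside the window
        q.popleft()
    return len(q) > MAX_EVENTS

stream = [{"ts": t, "card": "c1"} for t in (0, 2, 4, 6, 30)]
flags = [process(e) for e in stream]
print(flags)  # [False, False, False, True, False]
```

Because state is keyed per card and per window, the same logic shards naturally across Kafka partitions.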
Implement comprehensive monitoring for your data pipelines including data freshness tracking, row count validation, schema drift detection, and automated alerting. Ensure data SLAs are met with dashboards that give full visibility into pipeline health.
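The three checks named above — freshness, row counts, and schema drift — each reduce to a simple comparison. A sketch with hypothetical table contracts and thresholds:

```python
from datetime import datetime, timedelta, timezone

# Expected schema is an invented example of a data contract.
EXPECTED_COLUMNS = {"id": "int", "amount": "float", "loaded_at": "timestamp"}

def check_freshness(last_loaded_at, max_age=timedelta(hours=2)):
    """Alert when the newest row is older than the agreed SLA."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_age

def check_schema(actual_columns):
    """Detect drift: columns added or removed relative to the contract."""
    expected, actual = set(EXPECTED_COLUMNS), set(actual_columns)
    return {"missing": sorted(expected - actual),
            "unexpected": sorted(actual - expected)}

def check_row_count(today, yesterday, tolerance=0.5):
    """Flag a sudden drop in volume versus the previous load."""
    return today >= yesterday * tolerance

drift = check_schema({"id": "int", "amount": "float", "note": "str"})
print(drift)  # {'missing': ['loaded_at'], 'unexpected': ['note']}
```

Wired into a scheduler, each failing check becomes an alert and a data point on a pipeline-health dashboard.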
Modernize legacy ETL systems (Informatica, SSIS, Talend, stored procedures) to cloud-native architectures. I assess existing pipelines, design the target architecture, and execute phased migrations with parallel validation to ensure zero data loss.
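The parallel-validation step works by running the legacy and the new pipeline side by side, then reconciling their outputs key by key before cutover. A sketch with illustrative data and key names:

```python
# Hypothetical reconciliation of two pipeline outputs during a migration.

def reconcile(legacy_rows, new_rows, key="id"):
    """Compare two pipeline outputs keyed by primary key."""
    legacy = {r[key]: r for r in legacy_rows}
    new = {r[key]: r for r in new_rows}
    report = {
        "only_in_legacy": sorted(legacy.keys() - new.keys()),
        "only_in_new": sorted(new.keys() - legacy.keys()),
        "mismatched": sorted(k for k in legacy.keys() & new.keys()
                             if legacy[k] != new[k]),
    }
    report["clean"] = not any(report.values())  # safe to cut over?
    return report

legacy_out = [{"id": 1, "total": 100}, {"id": 2, "total": 55}]
new_out = [{"id": 1, "total": 100}, {"id": 2, "total": 50}, {"id": 3, "total": 9}]
print(reconcile(legacy_out, new_out))
```

At scale the same comparison runs as warehouse SQL over row hashes, but the decision rule is identical: cut over only when the report is clean.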