
Consolidate, map and normalize data from multiple sources

Use cases:

  • Data integration
  • ETL
  • IoT
[Architecture diagram showing the data integration template pipeline]

About this template

With this project you can:

  • Migrate data from multiple source systems to Apache Iceberg
  • Standardize data formats across different input sources (see the sketch after this list)
  • Support both streaming and database change data capture (CDC)
  • Process data from IoT devices using MQTT and Telegraf
  • Maintain data consistency during migration
  • Track resource usage with built-in monitoring
  • Scale each component independently with containerized deployment
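
To make the standardization step concrete, here's a minimal sketch of what mapping two very different inputs onto one common schema might look like. It assumes a Telegraf-style JSON metric payload and a Debezium-style CDC change event; the target field names (source, entity_id, metric, value, event_time) are illustrative, not the template's actual schema.

```python
from datetime import datetime, timezone

# Illustrative common schema; the field names below are assumptions,
# not the template's actual schema.

def normalize_telegraf(payload: dict) -> dict:
    """Map a Telegraf/MQTT metric payload (Telegraf JSON output format)."""
    return {
        "source": "telegraf",
        "entity_id": payload["tags"].get("host", "unknown"),
        "metric": payload["name"],
        "value": payload["fields"].get("value"),
        "event_time": datetime.fromtimestamp(
            payload["timestamp"], tz=timezone.utc
        ).isoformat(),
    }


def normalize_cdc(event: dict) -> dict:
    """Map a Debezium-style CDC change event onto the same schema."""
    return {
        "source": event["source"]["table"],
        "entity_id": str(event["after"]["id"]),  # assumes the row has an "id" column
        "metric": event["op"],  # "c" = insert, "u" = update, "d" = delete
        "value": event["after"],
        "event_time": datetime.fromtimestamp(
            event["ts_ms"] / 1000, tz=timezone.utc
        ).isoformat(),
    }
```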

Using this template

To use this template, fork the project repo, sign up for a Quix account and create a project based on your new fork. For more details, see this guide to creating a project.

To write to S3, you’ll need to provide your AWS credentials as application secrets.
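
If you're wiring up your own sink, a quick way to confirm the credentials are in place is to read them from the environment, assuming each secret is bound to an environment variable. The variable names below (including S3_BUCKET) are placeholders, not names the template defines.

```python
import os

import boto3  # AWS SDK for Python

# Assumes each application secret is bound to an environment variable with
# these names; AWS_REGION and S3_BUCKET are placeholders for your own config.
s3 = boto3.client(
    "s3",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    region_name=os.environ.get("AWS_REGION", "us-east-1"),
)

# Fails fast if the credentials can't reach the target bucket.
s3.head_bucket(Bucket=os.environ["S3_BUCKET"])
```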

Once you’re set up, you can connect the pipeline to whatever data sources you like using one of our connectors. Then adapt the normalization logic to fit your data.
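
As a starting point, the normalization step could be a small Quix Streams application that reads raw messages from one topic, applies a mapping function, and writes the result to another. The topic names, consumer group, and field names below are placeholders; swap in your own and the mapping logic that fits your data.

```python
from quixstreams import Application

# Topic names and consumer group are placeholders; point them at the topics
# your connectors actually produce to.
app = Application(consumer_group="normalizer", auto_offset_reset="earliest")

raw = app.topic("raw-data", value_deserializer="json")
normalized = app.topic("normalized-data", value_serializer="json")


def normalize(record: dict) -> dict:
    # Source-specific mapping into a common schema; the field names here
    # are illustrative only.
    return {
        "entity_id": record.get("device_id"),
        "metric": record.get("metric"),
        "value": record.get("value"),
        "event_time": record.get("timestamp"),
    }


sdf = app.dataframe(raw)
sdf = sdf.apply(normalize)
sdf = sdf.to_topic(normalized)

if __name__ == "__main__":
    # Older Quix Streams releases expect app.run(sdf) instead.
    app.run()
```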

Interested in this use case?
If you'd like us to focus on building this template next, register your interest and let us know. You can also head over to the Quix Community Slack if you've got any questions.
Register interest