Cut Kinesis costs by using Quix to pre-process data and write it to BigQuery
AWS to BigQuery via Quix: 10x cheaper than Firehose.
- Ingest large volumes of data from your operational estate on AWS while saving dramatically on AWS costs.
- Process data using Quix Streams instead of Kinesis Data Analytics.
- Get data into BigQuery cheaply by using the Quix BigQuery connector instead of Kinesis Firehose.
Use cases:
Data integration
Created by:
Quix
Main project components
IoT Data Feeds
Collect sensor data from IoT devices and send it to AWS.
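A device-side sketch of this step, assuming a Python client using paho-mqtt 1.x with X.509 certificates; the endpoint, certificate paths, client ID, and topic name are hypothetical placeholders:

```python
# Minimal sketch: publish sensor readings to AWS IoT Core over MQTT with TLS.
# Endpoint, certificate paths, client ID, and topic are hypothetical.
import json
import ssl
import time

import paho.mqtt.client as mqtt

ENDPOINT = "your-ats-endpoint.iot.us-east-1.amazonaws.com"  # hypothetical endpoint

client = mqtt.Client(client_id="sensor-001")
client.tls_set(
    ca_certs="AmazonRootCA1.pem",    # Amazon root CA
    certfile="device.pem.crt",       # device certificate
    keyfile="private.pem.key",       # device private key
    tls_version=ssl.PROTOCOL_TLSv1_2,
)
client.connect(ENDPOINT, port=8883)
client.loop_start()

while True:
    reading = {"device_id": "sensor-001", "temperature": 21.5, "ts": time.time()}
    client.publish("sensors/telemetry", json.dumps(reading), qos=1)
    time.sleep(5)
```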
AWS IoT Core MQTT Broker
Receive and buffer MQTT data from remote devices.
AWS Kinesis Data Streams
Write data coming from the IoT Core into a Kinesis Data Stream.
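In practice this hop is configured as an IoT Core rule action rather than written as code. For illustration only, this boto3 sketch lands an equivalent record in the stream; the stream name is a hypothetical placeholder:

```python
# Illustrative only: IoT Core forwards to Kinesis via a rule action, but this
# boto3 call produces an equivalent record directly.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

record = {"device_id": "sensor-001", "temperature": 21.5}
kinesis.put_record(
    StreamName="iot-telemetry",               # hypothetical stream name
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=record["device_id"],         # keeps per-device ordering
)
```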
Kinesis Data Streams Connector (Source)
Read from a Kinesis Data Stream and write to a Kafka topic in Quix Cloud or Quix Edge.
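The connector packages this step for you; conceptually, it polls Kinesis shards and produces each record to Kafka. A simplified single-shard sketch using boto3 and confluent-kafka (a real connector also handles multiple shards, checkpointing, and resharding; broker, stream, and topic names are illustrative):

```python
import time

import boto3
from confluent_kafka import Producer

kinesis = boto3.client("kinesis", region_name="us-east-1")
producer = Producer({"bootstrap.servers": "localhost:9092"})  # hypothetical broker

STREAM = "iot-telemetry"  # hypothetical stream

# Read from the first shard only, starting at the latest record.
shard_id = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM, ShardId=shard_id, ShardIteratorType="LATEST"
)["ShardIterator"]

while True:
    resp = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for rec in resp["Records"]:
        # Forward the raw Kinesis payload to Kafka, preserving the partition key.
        producer.produce("sensor-data", value=rec["Data"], key=rec["PartitionKey"])
    producer.flush()
    iterator = resp["NextShardIterator"]
    time.sleep(1)  # stay under Kinesis GetRecords rate limits
```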
Data Transformation Service
Read from Kafka and calculate KPIs and performance metrics using Python-based statistical functions.
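A minimal sketch of such a service built with Quix Streams, assuming the quixstreams 2.x Python API; topic names, the broker address, and the KPI itself are illustrative:

```python
from quixstreams import Application

app = Application(broker_address="localhost:9092", consumer_group="kpi-service")
sensor_topic = app.topic("sensor-data", value_deserializer="json")
kpi_topic = app.topic("sensor-kpis", value_serializer="json")

sdf = app.dataframe(sensor_topic)

def add_kpis(row: dict) -> dict:
    # Illustrative KPI: flag readings outside an assumed operating range.
    row["out_of_range"] = not (10.0 <= row["temperature"] <= 35.0)
    return row

sdf = sdf.apply(add_kpis)
sdf = sdf.to_topic(kpi_topic)

if __name__ == "__main__":
    app.run()
```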
BigQuery Connector (Sink)
Read from Kafka and write processed metrics to Google BigQuery for long-term storage, analysis, and visualization.
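Again, the connector handles this for you; conceptually, the sink consumes from Kafka and streams rows into a table. A simplified sketch using confluent-kafka and google-cloud-bigquery (project, dataset, table, broker, and topic names are hypothetical):

```python
import json

from confluent_kafka import Consumer
from google.cloud import bigquery

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # hypothetical broker
    "group.id": "bigquery-sink",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["sensor-kpis"])

bq = bigquery.Client()
TABLE_ID = "my-project.telemetry.sensor_kpis"  # hypothetical table

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    row = json.loads(msg.value())
    # Stream the row into BigQuery; insert_rows_json returns a list of errors.
    errors = bq.insert_rows_json(TABLE_ID, [row])
    if errors:
        print("BigQuery insert errors:", errors)
```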
Using this template
- Easily read data from different AWS services and sink it to BigQuery
- Prepare data for BigQuery (data transformation, schema validation, downsampling, etc.); see the downsampling sketch after this list
- Move data between systems in a highly performant, cost-effective manner
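As an example of the downsampling mentioned above, a tumbling window can reduce raw readings to one-minute averages before they reach BigQuery. A sketch assuming the quixstreams 2.x windowing API and messages keyed per device; topic names are illustrative:

```python
from datetime import timedelta

from quixstreams import Application

app = Application(broker_address="localhost:9092", consumer_group="downsampler")
raw = app.topic("sensor-data", value_deserializer="json")
downsampled = app.topic("sensor-data-1min", value_serializer="json")

sdf = app.dataframe(raw)
sdf = sdf.apply(lambda row: row["temperature"])   # aggregate a single field
sdf = sdf.tumbling_window(timedelta(minutes=1)).mean().final()
sdf = sdf.to_topic(downsampled)                   # emits {"start", "end", "value"}

if __name__ == "__main__":
    app.run()
```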
Interested in this use case?
If you'd like us to focus on building this template next, register your interest and let us know. You can also head over to the Quix Community Slack if you've got any questions.
Register interest