Cut the Cost of Handling Industrial IoT Data in the Cloud
The technologies you choose to handle your data processing needs can have a big impact on performance, maintainability and cost.
In this template, we show you how to cut the cost of getting IoT data from AWS into BigQuery by as much as 10x:
- Ingest large volumes of data from your operational estate through AWS while dramatically reducing your AWS bill.
- Process, map and analyze data using Quix Streams instead of Kinesis Data Analytics.
- Get data into BigQuery cheaply by using the Quix BigQuery connector instead of Kinesis Firehose.

Main project components
IoT Data Feeds
Collects sensor data from IoT devices and sends it to AWS.
AWS IoT Core MQTT Broker
Receives and buffers MQTT data from remote devices.
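For context, a device-side feed can be as simple as an MQTT client publishing JSON readings to your AWS IoT Core endpoint. Here's a minimal sketch using the paho-mqtt 1.x client API; the endpoint, certificate paths, topic name, and reading fields are placeholders you'd swap for your own.

```python
import json
import ssl
import time

import paho.mqtt.client as mqtt

# Placeholders: substitute your own AWS IoT Core endpoint and device certificates.
ENDPOINT = "your-ats-endpoint.iot.eu-west-1.amazonaws.com"
TOPIC = "factory/machine-1/telemetry"

client = mqtt.Client()  # paho-mqtt 1.x API; 2.x requires a CallbackAPIVersion argument
# AWS IoT Core authenticates devices with mutual TLS (X.509 certificates).
client.tls_set(
    ca_certs="AmazonRootCA1.pem",
    certfile="device.pem.crt",
    keyfile="private.pem.key",
    tls_version=ssl.PROTOCOL_TLSv1_2,
)
client.connect(ENDPOINT, port=8883)
client.loop_start()

# Publish one reading per second; a real device would read from its sensors.
while True:
    reading = {"machine_id": "machine-1", "temperature": 71.3, "timestamp": time.time()}
    client.publish(TOPIC, json.dumps(reading), qos=1)
    time.sleep(1)
```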
AWS Kinesis Data Streams
Writes data coming from IoT Core into a Kinesis Data Stream via an AWS IoT topic rule.
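The IoT Core-to-Kinesis hop is configured with a topic rule rather than code you run yourself. As an illustration, the boto3 sketch below creates such a rule; the rule name, MQTT topic filter, stream name, and IAM role ARN are all assumptions you'd replace with your own values.

```python
import boto3

iot = boto3.client("iot")

# Route every message published under factory/+/telemetry into a Kinesis stream.
# Rule name, topic filter, stream name, and role ARN are placeholders.
iot.create_topic_rule(
    ruleName="telemetry_to_kinesis",
    topicRulePayload={
        "sql": "SELECT * FROM 'factory/+/telemetry'",
        "actions": [
            {
                "kinesis": {
                    "streamName": "iot-telemetry",
                    # IAM role that lets IoT Core write to the stream.
                    "roleArn": "arn:aws:iam::123456789012:role/iot-to-kinesis",
                    # Partition by MQTT topic so each device's data stays ordered.
                    "partitionKey": "${topic()}",
                }
            }
        ],
    },
)
```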
Kinesis Data Streams Connector (Source)
Reads from a Kinesis Data Stream and writes to a Kafka topic in Quix Cloud or Quix Edge.
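Under the hood, a source connector like this polls Kinesis shards and produces each record to Kafka. The single-shard sketch below, using boto3 and the Quix Streams producer, shows the idea; the real connector also handles multiple shards, checkpointing, and retries, and the stream, broker, and topic names here are placeholders.

```python
import time

import boto3
from quixstreams import Application

STREAM_NAME = "iot-telemetry"  # placeholder Kinesis stream

kinesis = boto3.client("kinesis")
app = Application(broker_address="localhost:9092")  # placeholder Kafka broker
topic = app.topic("sensor-data")

# Start at the tip of the first shard; a production connector would
# iterate over every shard and persist checkpoints between restarts.
shard_id = kinesis.describe_stream(StreamName=STREAM_NAME)["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM_NAME, ShardId=shard_id, ShardIteratorType="LATEST"
)["ShardIterator"]

with app.get_producer() as producer:
    while True:
        resp = kinesis.get_records(ShardIterator=iterator, Limit=100)
        for record in resp["Records"]:
            # Kinesis payloads arrive as bytes; forward them to Kafka as-is.
            producer.produce(topic=topic.name, key=record["PartitionKey"], value=record["Data"])
        iterator = resp["NextShardIterator"]
        if not resp["Records"]:
            time.sleep(0.5)  # back off when the shard is idle
```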
Data Transformation Service
Reads from Kafka and calculates KPIs and performance metrics using Python-based statistical functions.
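With Quix Streams, a transformation like this is plain Python over a streaming dataframe. As a stand-in for your own KPIs, the sketch below (assuming the Quix Streams 2.x API) averages a temperature field over a one-minute tumbling window; the topic names and field are illustrative.

```python
from datetime import timedelta

from quixstreams import Application

app = Application(broker_address="localhost:9092", consumer_group="kpi-service")
input_topic = app.topic("sensor-data")
output_topic = app.topic("sensor-kpis")

sdf = app.dataframe(input_topic)

# Reduce each message to the metric of interest, then average it over a
# one-minute tumbling window (windows are keyed by message key, i.e. device).
sdf = sdf.apply(lambda row: row["temperature"])
sdf = sdf.tumbling_window(timedelta(minutes=1)).mean().final()

sdf = sdf.to_topic(output_topic)

if __name__ == "__main__":
    app.run(sdf)
```

Because this is ordinary Python, you can swap the windowed mean for any statistical function your KPIs require.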
BigQuery Connector (Sink)
Reads from Kafka and writes processed metrics to Google BigQuery for long-term storage, analysis, and visualization.
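At its core, the sink consumes metric messages from Kafka and streams them into a BigQuery table. The stripped-down sketch below uses google-cloud-bigquery's insert_rows_json; the production connector adds batching, schema handling, and error recovery, and the table ID and topic name here are placeholders.

```python
import json

from google.cloud import bigquery
from quixstreams import Application

TABLE_ID = "my-project.iot.machine_kpis"  # placeholder BigQuery table

bq = bigquery.Client()
app = Application(broker_address="localhost:9092", consumer_group="bigquery-sink")
topic = app.topic("sensor-kpis")

with app.get_consumer() as consumer:
    consumer.subscribe([topic.name])
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        row = json.loads(msg.value())
        # Stream the row into BigQuery; insert_rows_json returns per-row errors.
        errors = bq.insert_rows_json(TABLE_ID, [row])
        if errors:
            print(f"BigQuery insert failed: {errors}")
```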
Using this template
- Easily read data from different AWS services and sink it to BigQuery
- Prepare data for BigQuery (data transformation, schema validation, downsampling, etc.)
- Move data between systems in a highly performant, cost-effective manner