This project shows how to stream data from Quix to a BigQuery database. It handles both parameter and event data.
## How to run
Create a Quix account or log in, then visit the Library to use this project.
- **Setup & deploy** on the library item deploys a pre-built container in Quix. Complete the environment variables to configure the container.
- **Edit code** on the library item forks the project to your own Git repo so you can customize it before deploying.
## Environment variables

The code sample uses the following environment variables:
- input: Name of the input topic to read from.
- PROJECT_ID: The BigQuery GCP Project ID.
- DATASET_ID: The target BigQuery dataset ID.
- DATASET_LOCATION: The location of the BigQuery dataset.
- SERVICE_ACCOUNT_JSON: The service account JSON string for the BigQuery GCP project. See the tutorial on how to create a service account.
- MAX_QUEUE_SIZE: The maximum queue size for sink ingestion.
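As a sketch of how these variables might be consumed at startup, assuming they are all set in the container's environment (the `load_config` helper is hypothetical, not part of the Quix sample):

```python
import json
import os

def load_config() -> dict:
    """Read the sink's configuration from environment variables.

    Hypothetical helper for illustration only; it shows how the
    variables listed above fit together, not the sample's actual code.
    """
    return {
        "input_topic": os.environ["input"],
        "project_id": os.environ["PROJECT_ID"],
        "dataset_id": os.environ["DATASET_ID"],
        "dataset_location": os.environ["DATASET_LOCATION"],
        # The service account credentials arrive as a JSON string.
        "service_account": json.loads(os.environ["SERVICE_ACCOUNT_JSON"]),
        # Bounds the in-memory buffer between the topic reader and BigQuery.
        "max_queue_size": int(os.environ.get("MAX_QUEUE_SIZE", "1000")),
    }

# Example values, for illustration only.
os.environ.update({
    "input": "telemetry",
    "PROJECT_ID": "my-gcp-project",
    "DATASET_ID": "quix_sink",
    "DATASET_LOCATION": "EU",
    "SERVICE_ACCOUNT_JSON": '{"type": "service_account"}',
    "MAX_QUEUE_SIZE": "500",
})
config = load_config()
print(config["max_queue_size"])  # → 500
```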
## Known limitations

- BigQuery does not immediately recognize schema changes, such as adding a new field, when inserting data via streaming.
- BigQuery does not allow deleting data when inserting it via streaming.
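The MAX_QUEUE_SIZE variable above bounds the buffer that sits between the topic reader and the BigQuery writer. A minimal standard-library sketch of that pattern, where a full queue blocks the reader and so applies backpressure (the reader/writer functions and batch size are illustrative, not the sample's actual code):

```python
import queue
import threading

# A bounded queue: when it is full, put() blocks, applying backpressure
# to the topic reader instead of growing memory without limit.
MAX_QUEUE_SIZE = 100  # would normally come from the MAX_QUEUE_SIZE env var
rows = queue.Queue(maxsize=MAX_QUEUE_SIZE)

def reader():
    """Stand-in for the code that reads parameter/event data from the topic."""
    for i in range(250):
        rows.put({"timestamp": i, "value": i * 0.1})  # blocks when the queue is full
    rows.put(None)  # sentinel: no more data

def writer(batches):
    """Stand-in for the code that streams batched rows into BigQuery."""
    batch = []
    while True:
        row = rows.get()
        if row is None:
            break
        batch.append(row)
        if len(batch) >= 50:  # flush in fixed-size batches
            batches.append(batch)
            batch = []
    if batch:
        batches.append(batch)

batches = []
t = threading.Thread(target=reader)
t.start()
writer(batches)
t.join()
print(len(batches))  # 250 rows in batches of 50 → 5 batches
```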
## Contribute

Submit forked projects to the Quix GitHub repo. Any new project that we accept will be attributed to you, and you'll receive $200 in Quix credit.
## Open source

This project is open source under the Apache 2.0 license and available in our GitHub repo.

Please star us and mention us on social media to show your appreciation.