Transform your data efficiently for optimal ingestion into databases, data lakes and warehouses
Source
Ingest data from any source, including popular streaming technologies like Apache Kafka, AWS MSK or AWS Kinesis. Use out-of-the-box connectors, or, when those aren’t enough, quickly customise one by forking the closest existing example. Building fully custom connectors is straightforward with Quix’s pure Python Source API.
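As a rough sketch of what a custom connector built on the Source API can look like: the example below emits simulated sensor readings into a streaming DataFrame. The payload shape, the broker address and the `SensorSource` name are illustrative assumptions, and exact base-class details may vary between Quix Streams versions.

```python
import random
import time

from quixstreams import Application
from quixstreams.sources import Source  # base class for custom Python sources


class SensorSource(Source):
    """Hypothetical custom source producing simulated sensor readings."""

    def run(self):
        # self.running flips to False when the Application shuts down
        while self.running:
            reading = {"sensor_id": "s-1", "temperature": 20 + random.random() * 5}
            msg = self.serialize(key=reading["sensor_id"], value=reading)
            self.produce(key=msg.key, value=msg.value)
            time.sleep(1)


app = Application(broker_address="localhost:9092")  # assumed local Kafka broker
sdf = app.dataframe(source=SensorSource(name="sensor-source"))
sdf.update(print)  # inspect incoming records as they arrive

if __name__ == "__main__":
    app.run()
```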
Transformation
Prepare your data with Quix Streams, an open source Python library for processing data with streaming DataFrames. Use built-in operators for aggregation, windowing, filtering, group-by, branching, merging and more. Enrich your data before loading it into Iceberg by connecting to caches and external systems.
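A minimal sketch of a few of these operators in Quix Streams, assuming a JSON `sensor-readings` topic with a numeric `temperature` field; topic names, fields and the broker address are illustrative, not a prescribed setup.

```python
from quixstreams import Application

app = Application(
    broker_address="localhost:9092",        # assumed local Kafka broker
    consumer_group="temperature-pipeline",
)

readings = app.topic("sensor-readings", value_deserializer="json")
averages = app.topic("temperature-averages", value_serializer="json")

sdf = app.dataframe(readings)

# Drop malformed records, then keep only the numeric field we want to aggregate
sdf = sdf.filter(lambda row: row.get("temperature") is not None)
sdf = sdf.apply(lambda row: row["temperature"])

# 1-minute tumbling-window average; .final() emits once per closed window
sdf = sdf.tumbling_window(duration_ms=60_000).mean().final()

# Publish the windowed results to a downstream topic
sdf = sdf.to_topic(averages)

if __name__ == "__main__":
    app.run()
```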
Destination
Sink data to cloud blob stores in Iceberg format, including AWS S3, GCS and Azure Blob Storage. Other databases, lake formats and warehouses are also supported. Quix sink connectors automatically handle backpressure and checkpointing, so no data is duplicated or lost and your destination is never overloaded.
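For destinations without an out-of-the-box connector, a custom sink can follow the same checkpointed, batch-oriented pattern. The sketch below is illustrative only: the `_upload_batch` helper is a placeholder for your own S3/GCS/Azure client call, and import paths may differ slightly between Quix Streams versions.

```python
import time

from quixstreams import Application
from quixstreams.sinks import BatchingSink, SinkBatch


class ObjectStoreSink(BatchingSink):
    """Hypothetical sink that writes each checkpointed batch to blob storage."""

    def write(self, batch: SinkBatch):
        rows = [{"timestamp": item.timestamp, "value": item.value} for item in batch]
        attempts = 3
        while attempts:
            try:
                # Placeholder for your own object-store upload logic
                return self._upload_batch(rows)
            except ConnectionError:
                attempts -= 1
                time.sleep(3)
        # If the destination stays overloaded, a sink can instead raise quixstreams'
        # SinkBackpressureError so the library pauses and retries the batch later.
        raise RuntimeError("failed to write batch to the object store")

    def _upload_batch(self, rows):
        ...  # placeholder: push `rows` to S3/GCS/Azure in your format of choice


app = Application(broker_address="localhost:9092", consumer_group="lake-writer")
events = app.topic("events", value_deserializer="json")

sdf = app.dataframe(events)
sdf.sink(ObjectStoreSink())  # records are flushed in batches at each checkpoint

if __name__ == "__main__":
    app.run()
```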