Blob storage connections

Connect your cluster to a bucket/container so Quix can enable Quix Lake or any other managed service that requires a Blob storage connection.

Connections list

One connection per cluster

Each cluster supports one blob storage connection.
You can configure different connections for different clusters.

Quix Lake at a glance

Summary - Quix Lake persists Kafka topic data as Avro/Parquet in your own bucket (S3, GCS, Azure Blob, MinIO), partitioned for fast discovery and full-fidelity Replay.

Why it exists - Preserve exact Kafka messages (timestamps, headers, partitions, offsets, gaps) with indexed metadata so Catalog, Replay, Sinks, and future services operate on open formats you control.

Key properties
  • Portable - open Avro & Parquet
  • Efficient - Hive-style partitions + Parquet metadata
  • Flexible - historical + live workflows
  • Replay - preserves order, partitions, timestamps, headers, gaps

Flow - Ingest (Avro) → Index (Parquet metadata) → Discover (Data Catalog & Metadata API) → Replay (full fidelity back to Kafka) → Use (explore, combine historical + live, run queries/export).
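
For readers unfamiliar with the term, Hive-style partitioning encodes partition keys directly into the object path so that readers can prune data without scanning it. The layout below is a generic, hypothetical illustration (the topic and date keys are made up); it is not a specification of Quix Lake's actual folder structure.

```
my-bucket/
  topic=orders/year=2025/month=06/day=12/part-00000.avro
  topic=orders/year=2025/month=06/day=13/part-00000.avro
```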

Learn more about Quix Lake →

Create a connection

  1. Go to Settings → Blob storage connections → Create.
  2. Pick a Cluster, set a Display name, choose a Provider, and fill in the provider fields.
  3. Click Test connection (see the next section).
  4. Click Save.

Test before saving

Testing connection

When you click Test connection, Quix runs a short round-trip check to make sure your details are correct and that the platform can both see and use your storage.

Here’s what happens:

  1. Connect - Quix creates a storage client using the details you entered.
  2. Upload - it writes a small temporary file into a tmp/ folder in your bucket or container.
  3. Check visibility - it confirms the file shows up in the storage listing.
  4. Query - it runs a simple check to ensure the file is discoverable for later Quix Lake operations.
  5. Clean up - the temporary file is deleted so your storage stays tidy.
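
For orientation, the sketch below approximates the same round trip with boto3 against S3-compatible storage. The bucket name, credentials, and object key are placeholders, not the values or client Quix uses internally.

```python
# Rough, hypothetical equivalent of the Test connection round trip (S3 shown).
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",         # placeholder
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY", # placeholder
    region_name="eu-west-1",                        # placeholder
)

bucket = "my-quix-lake-bucket"                      # placeholder
key = "tmp/quix-connection-test.txt"

# Connect + Upload: write a small temporary object under tmp/
s3.put_object(Bucket=bucket, Key=key, Body=b"connection test")

# Check visibility: confirm the object appears in the listing
listing = s3.list_objects_v2(Bucket=bucket, Prefix="tmp/")
assert any(obj["Key"] == key for obj in listing.get("Contents", []))

# Clean up: delete the temporary object
s3.delete_object(Bucket=bucket, Key=key)
print("Round trip succeeded")
```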

Success
Each step is shown in the dialog. Successful steps are marked with a ✓, and you’ll see confirmation when everything checks out.

Failure
If a step fails, you’ll see ✗ next to it along with the reason (for example, “Access denied” or “Wrong region”). This makes it easy to fix permissions or update your settings.

Access denied example

Providers

Amazon S3

  1. Log in to the AWS Management Console.
  2. Go to IAM.
  3. Open Users.
  4. Select an existing user or click Add user to create a new one.
  5. In the Permissions tab, attach a policy that allows access to your bucket (a scoped-policy sketch follows these steps).
  6. Open the Security credentials tab.
  7. Click Create access key.
  8. Copy the Access Key ID and Secret Access Key (the secret appears only once).
  9. Paste the information into the Quix S3 form.
  10. Click Test Connection.
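
The exact permissions Quix requires are not listed here, but a common starting point is a policy scoped to the single bucket used by the connection. The sketch below attaches such an inline policy with boto3; the user name, policy name, bucket, and action list are assumptions to adapt to your setup.

```python
# Hypothetical sketch: attach an inline policy scoped to one bucket to the IAM
# user whose access key will be pasted into Quix. Names and actions are placeholders.
import json
import boto3

iam = boto3.client("iam")
bucket = "my-quix-lake-bucket"  # placeholder

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # allow listing the bucket itself
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": f"arn:aws:s3:::{bucket}",
        },
        {   # allow reading, writing, and deleting objects in the bucket
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": f"arn:aws:s3:::{bucket}/*",
        },
    ],
}

iam.put_user_policy(
    UserName="quix-blob-storage",           # placeholder user
    PolicyName="quix-blob-storage-access",  # placeholder policy name
    PolicyDocument=json.dumps(policy),
)
```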

Google Cloud Storage

  1. Make sure you have Owner (or similar) permissions on the Google Cloud project where your bucket resides or will be created.
  2. Create a service account and grant it read/write access to the bucket (e.g., roles/storage.objectAdmin) or equivalent minimal object roles.
  3. In the Google Cloud Console, go to Storage → Settings.
  4. Select the Interoperability tab. If it is disabled, click Enable S3 interoperability.
  5. Under Access keys for service accounts, click Create key and assign one to the service account.
  6. Copy the Access key and Secret (the secret is shown only once).
  7. Paste the information into the Quix S3 connector form.
  8. Click Test Connection.
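
If you want to sanity-check the interoperability (HMAC) key before pasting it into Quix, it can be exercised through GCS's S3-compatible XML API. The sketch below uses boto3 with the storage.googleapis.com endpoint; the bucket name and key values are placeholders.

```python
# Hypothetical sketch: verify a GCS HMAC key via the S3-compatible endpoint.
import boto3

gcs = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",
    aws_access_key_id="YOUR_HMAC_ACCESS_KEY",  # placeholder
    aws_secret_access_key="YOUR_HMAC_SECRET",  # placeholder
)

# List a few objects to confirm the key can see the bucket
response = gcs.list_objects_v2(Bucket="my-quix-lake-bucket", MaxKeys=5)
for obj in response.get("Contents", []):
    print(obj["Key"])
```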

Azure Blob Storage

  1. Make sure your Azure user has at least the Storage Blob Data Contributor role (or higher).
  2. Open the Azure Portal and go to your Storage account.
  3. In the left menu, expand Security + networking and click Access keys.
  4. Note the Storage account name and copy the Key1 (or Key2) value.
  5. Paste the information into the Quix Azure Blob connector form.
  6. Click Test Connection.
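
To confirm the account name and key are valid before saving them in Quix, you can list containers with the azure-storage-blob SDK. The account and key values below are placeholders.

```python
# Hypothetical sketch: check an Azure storage account name and access key.
from azure.storage.blob import BlobServiceClient

account_name = "mystorageaccount"        # placeholder
account_key = "YOUR_KEY1_OR_KEY2_VALUE"  # placeholder

service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,
)

# Listing containers succeeds only if the name and key authenticate
for container in service.list_containers():
    print(container.name)
```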

MinIO

  1. Make sure your MinIO user or role can create and list access keys (e.g., consoleAdmin or a custom PBAC policy).
  2. Log in to the MinIO Console.
  3. Select Access keys in the left menu.
  4. Click Create access key to generate an Access Key and Secret Key.
  5. Copy the Access Key and Secret Key (the secret is shown only once).
  6. Paste the information into the Quix MinIO connector form.
  7. Click Test Connection.
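
MinIO exposes the S3 API, so a freshly created key can be checked with the same boto3 client pointed at your MinIO endpoint. The endpoint URL, bucket, and credentials below are placeholders.

```python
# Hypothetical sketch: verify a MinIO access key against your own endpoint.
import boto3

minio = boto3.client(
    "s3",
    endpoint_url="https://minio.example.com:9000",  # placeholder endpoint
    aws_access_key_id="YOUR_MINIO_ACCESS_KEY",      # placeholder
    aws_secret_access_key="YOUR_MINIO_SECRET_KEY",  # placeholder
)

# Listing one object confirms the key can reach the bucket
print(minio.list_objects_v2(Bucket="my-quix-lake-bucket", MaxKeys=1))
```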

Security & operations

  • Dedicated principals per connection (IAM user / Service Account / MinIO user)
  • Scope credentials to one bucket/container
  • Rotate keys regularly; store secrets securely
  • Consider server-side encryption and access logging
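
As one way to act on the last two points, the sketch below enables default server-side encryption and access logging on an S3 bucket with boto3; the bucket names are placeholders, and other providers offer equivalent settings in their consoles.

```python
# Hypothetical sketch: default encryption + access logging on an S3 bucket.
import boto3

s3 = boto3.client("s3")
bucket = "my-quix-lake-bucket"  # placeholder

# Encrypt new objects at rest with S3-managed keys (AES256) by default
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)

# Deliver access logs to a separate logging bucket
s3.put_bucket_logging(
    Bucket=bucket,
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "my-quix-lake-logs",  # placeholder
            "TargetPrefix": f"{bucket}/",
        }
    },
)
```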

See more