A KDQ Connection links a Workspace to one of your data sources. Each connection stores the credentials and configuration needed for KDQ to connect to a database, warehouse, or data platform and execute data quality checks against it.
Credentials are stored as Secrets within the Workspace — encrypted at rest and decrypted only at check runtime. They are never exposed in connection strings or logs.
## Prerequisites

Before adding a connection:

- A KDQ Workspace must already exist
- You must have Workspace Admin access (see Configuring KDQ Workspace Access)
- The source must already be profiled in K
## Configuring a Connection

1. Open the KDQ portal and navigate to your Workspace
2. Select the Secrets tab and add the required secret values for your data source (see connection string options below). Use a descriptive secret name (e.g. `SF_ACCOUNT`) and the corresponding secret value.
3. Select the Connections tab and click Add Connection
4. Complete the connection details:
   - Connection name: a clear, descriptive identifier
   - Connection string: constructed from your secret values (see table below)
   - Linked database: select the matching database already profiled in K
5. Click Save
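To make step 4 concrete, the sketch below shows one way a connection string could be assembled from Workspace secrets. The `{SECRET_NAME}` placeholder syntax, the helper function, and the PostgreSQL secret values are all illustrative assumptions, not the documented KDQ substitution format:

```python
# Hypothetical sketch: substituting Workspace secret values into a
# connection string template. The {PLACEHOLDER} syntax is an assumption,
# not the documented KDQ format.
import re

def resolve_connection_string(template: str, secrets: dict[str, str]) -> str:
    """Replace every {SECRET_NAME} placeholder with its secret value."""
    def lookup(match: re.Match) -> str:
        name = match.group(1)
        if name not in secrets:
            raise KeyError(f"Missing secret: {name}")
        return secrets[name]
    return re.sub(r"\{([A-Z0-9_]+)\}", lookup, template)

# Example with made-up PostgreSQL secret values
secrets = {
    "POSTGRES_USER": "kdq_user",
    "POSTGRES_PASSWORD": "s3cret",
    "POSTGRES_HOST": "db.example.com",
    "POSTGRES_PORT": "5432",
    "POSTGRES_DB": "analytics",
}
template = ("postgresql://{POSTGRES_USER}:{POSTGRES_PASSWORD}"
            "@{POSTGRES_HOST}:{POSTGRES_PORT}/{POSTGRES_DB}")
print(resolve_connection_string(template, secrets))
# postgresql://kdq_user:s3cret@db.example.com:5432/analytics
```

Because the secret values are substituted only at resolution time, the saved template never contains credentials, which matches the behaviour described above where secrets are decrypted only at check runtime.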
## Connection String Options

| Source | Secrets Required | Connection String | Notes |
|---|---|---|---|
| Snowflake | `SF_USERNAME`, `SF_ACCOUNT`, `SF_DATABASE`, `SF_KEY`, `SF_PASSPHRASE`, `SF_ROLE`, `SF_WAREHOUSE` | | Key pair only; the password-only option is deprecated |
| Azure SQL / SQL Server | `SQLSERVER_HOST`, `SQLSERVER_DATABASE`, `SQLSERVER_USERNAME`, `SQLSERVER_PASSWORD` | | |
| Azure Synapse | `SQLSERVER_HOST`, `SQLSERVER_DATABASE`, `SQLSERVER_USERNAME`, `SQLSERVER_PASSWORD` | | |
| PostgreSQL | `POSTGRES_USER`, `POSTGRES_PASSWORD`, `POSTGRES_HOST`, `POSTGRES_PORT`, `POSTGRES_DB` | | |
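A simple way to avoid a failing connection is to confirm that every secret the table lists for a source has been defined before saving. The sketch below is illustrative only; the source keys and the check are not part of any KDQ API, though the secret names are taken from the table:

```python
# Hypothetical sketch: checking that all secrets required for a source
# (per the table above) are defined in the Workspace. The source keys
# and this helper are illustrative, not part of the KDQ API.
REQUIRED_SECRETS = {
    "snowflake": ["SF_USERNAME", "SF_ACCOUNT", "SF_DATABASE", "SF_KEY",
                  "SF_PASSPHRASE", "SF_ROLE", "SF_WAREHOUSE"],
    "sqlserver": ["SQLSERVER_HOST", "SQLSERVER_DATABASE",
                  "SQLSERVER_USERNAME", "SQLSERVER_PASSWORD"],
    "postgresql": ["POSTGRES_USER", "POSTGRES_PASSWORD", "POSTGRES_HOST",
                   "POSTGRES_PORT", "POSTGRES_DB"],
}

def missing_secrets(source: str, defined: set[str]) -> list[str]:
    """Return the required secrets for a source that are not yet defined."""
    return [s for s in REQUIRED_SECRETS[source] if s not in defined]

# Example: two PostgreSQL secrets defined, three still missing
print(missing_secrets("postgresql", {"POSTGRES_USER", "POSTGRES_HOST"}))
# ['POSTGRES_PASSWORD', 'POSTGRES_PORT', 'POSTGRES_DB']
```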
💡 Next step: Once a connection is configured, create a Dataset to define which tables or queries to run checks against.