KDQ is designed to integrate with your existing workflow orchestration tooling. Once a Workspace is published, Jobs can be triggered by any external scheduler — such as Apache Airflow, CRON, Azure Data Factory, or dbt orchestration tools.
## Prerequisites

- A published KDQ Workspace with at least one Job configured
- The Workspace secret key (generated when the Workspace was first created; regenerable from Workspace settings if lost)
## Running Jobs via the Command Line

The checkpoint script is located within your published Workspace at:

```
/var/workspace/workspace_<workspace_id>/gx/run_checkpoints.py
```
### Run all tests in all jobs (full workspace)

```shell
python /<path-to-published-gx-folder>/run_checkpoints.py -i secrets.rc -s <workspace-secret>
```
### Run all tests in a specific job

```shell
python /<path-to-published-gx-folder>/run_checkpoints.py -i secrets.rc -s <workspace-secret> -j "Job-name"
```
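Because the script is invoked as a plain command, it can be scheduled with any external scheduler. As one example, a crontab entry that runs the full workspace nightly at 02:00 might look like the following sketch (the schedule and log path are illustrative, and the `<path-to-published-gx-folder>` and `<workspace-secret>` placeholders must be replaced with your real values):

```
# Run all KDQ checkpoints nightly at 02:00; append output to a log file
0 2 * * * python /<path-to-published-gx-folder>/run_checkpoints.py -i secrets.rc -s <workspace-secret> >> /var/log/kdq_checkpoints.log 2>&1
```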
## Parameters

| Parameter | Description | Required |
|---|---|---|
| `-i` | Path to the secrets file relative to the run script (usually `secrets.rc`) | Yes |
| `-s` | Workspace secret key | Yes |
| `-j` | Job name; use quotes if the name contains spaces | No |
## Workspace Secret Key

- The secret key is displayed when a Workspace is first created; save it at that point
- If lost, regenerate it from Workspace Settings → Regenerate Secret
⚠️ Important: After regenerating the secret key, you must republish the Workspace for the new key to take effect. Update any scheduler configuration that references the old key.
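To make key rotation easier, one approach is to keep the secret out of crontab files and shell history by injecting it through an environment variable populated from your scheduler's secret store. The variable name `KDQ_WORKSPACE_SECRET` below is a hypothetical convention, not something KDQ defines, and the script path is the placeholder from this guide:

```shell
# Hypothetical wrapper: read the workspace secret from an environment
# variable (set by your scheduler or secret store) rather than hard-coding
# it. "dummy-secret" is only a fallback for illustration.
KDQ_SECRET="${KDQ_WORKSPACE_SECRET:-dummy-secret}"

# Build the checkpoint command; the path is the placeholder from this guide.
CMD="python /<path-to-published-gx-folder>/run_checkpoints.py -i secrets.rc -s $KDQ_SECRET"

# Print the command for this sketch; in production, run it instead:
#   eval "$CMD"
echo "$CMD"
```

With this pattern, rotating the key only requires updating the secret store entry rather than editing every scheduler configuration by hand.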
💡 Next step: After configuring your scheduler, ensure KDQ Results are synced to K so quality outcomes are visible to all data consumers on the relevant Data Profile Pages.