K Knowledge Base

Integrating KDQ Jobs with your Scheduler

KDQ is designed to integrate with your existing workflow orchestration tooling. Once a Workspace is published, its Jobs can be triggered by any external scheduler, such as Apache Airflow, cron, Azure Data Factory, or dbt orchestration tools.


Prerequisites

  • A published KDQ Workspace with at least one Job configured

  • The Workspace secret key (generated when the Workspace was first created; regenerable from Workspace settings if lost)


Running Jobs via the Command Line

The checkpoint script is located within your published Workspace at:

/var/workspace/workspace_<workspace_id>/gx/run_checkpoints.py

Run all tests in all jobs (full workspace)

Bash
python /<path-to-published-gx-folder>/run_checkpoints.py -i secrets.rc -s <workspace-secret>

Run all tests in a specific job

Bash
python /<path-to-published-gx-folder>/run_checkpoints.py -i secrets.rc -s <workspace-secret> -j "Job-name"
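When calling the checkpoint script from a scheduler, it helps to wrap the two commands above in one small helper that passes the script's exit code back to the scheduler, so a failed run is reported as a failed task. The sketch below assumes this pattern; run_kdq_job, KDQ_WORKSPACE_DIR, KDQ_WORKSPACE_SECRET, and PYTHON_BIN are illustrative names, not part of KDQ.

```shell
# Hypothetical wrapper for scheduler tasks (names are illustrative, not KDQ's).
# Runs every job in the workspace, or a single job when a name is passed, and
# returns the checkpoint script's exit code so failures surface in the scheduler.
run_kdq_job() {
  local job_name="${1:-}"
  # KDQ_WORKSPACE_DIR points at the published gx folder; KDQ_WORKSPACE_SECRET
  # holds the workspace secret key (kept out of the crontab or DAG source).
  local cmd=("${PYTHON_BIN:-python}" "${KDQ_WORKSPACE_DIR}/run_checkpoints.py" \
             -i secrets.rc -s "${KDQ_WORKSPACE_SECRET}")
  if [ -n "${job_name}" ]; then
    cmd+=(-j "${job_name}")   # quote job names containing spaces, per the -j flag
  fi
  "${cmd[@]}"
}
```

Reading the secret from an environment variable, rather than hard-coding it in the scheduler definition, also means a regenerated key only has to be updated in one place.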

Parameters

| Parameter | Description | Required |
|-----------|-------------|----------|
| -i | Path to the secrets file, relative to the run script (usually secrets.rc) | Yes |
| -s | Workspace secret key | Yes |
| -j | Job name; use quotes if the name contains spaces | No |
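For a simple schedule without a full orchestrator, the full-workspace command can be run directly from cron. A sketch of a crontab entry, with an illustrative log location; the path and secret placeholders follow the commands above:

```shell
# Illustrative crontab entry: run all KDQ jobs nightly at 02:00 and append output
# to a log file. Note that cron does not load your shell profile, so the python
# interpreter on cron's PATH may differ from your interactive shell's.
0 2 * * * python /<path-to-published-gx-folder>/run_checkpoints.py -i secrets.rc -s <workspace-secret> >> /var/log/kdq_checkpoints.log 2>&1
```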


Workspace Secret Key

  • The secret key is displayed when a Workspace is first created — save it at that point

  • If lost, regenerate it from Workspace Settings → Regenerate Secret

⚠️ Important: After regenerating the secret key, you must republish the Workspace for the new key to take effect. Update any scheduler configuration that references the old key.
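One way to keep that update to a single place is to store the secret in a restricted environment file that scheduler scripts source, instead of embedding it in each crontab or DAG. A sketch under assumed conventions; the /etc/kdq path and KDQ_WORKSPACE_SECRET variable name are illustrative, not part of KDQ:

```shell
# Illustrative: hold the secret in a root-only env file that wrapper scripts
# source, so rotating the key means editing one file rather than every schedule.
echo 'export KDQ_WORKSPACE_SECRET=<new-workspace-secret>' > /etc/kdq/secret.env
chmod 600 /etc/kdq/secret.env
```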


💡 Next step: After configuring your scheduler, ensure KDQ Results are synced to K so quality outcomes are visible to all data consumers on the relevant Data Profile Pages.