SQL Server
This page will guide you through the setup of SQL Server in K using the Direct Connect method.
Integration details
Scope | Included | Comments |
---|---|---|
Metadata | YES | |
Lineage | YES | Requires logging to be enabled |
Usage | YES | |
Sensitive Data Scanner | NO | Sensitive data scanner does not currently support SQL Server |
Known limitations
Queries, macros and procedures must include fully qualified names in order to be correctly parsed.
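For example, the first statement below can be resolved for lineage, while the second cannot be reliably attributed to a database and schema (the database, schema and table names are illustrative only):
SELECT c.customer_id FROM sales_db.dbo.customers AS c;   -- fully qualified: database.schema.table
SELECT customer_id FROM customers;                        -- unqualified: cannot be reliably resolved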
Step 1) SQL Server Access
Setting up SQL Server for metadata extraction is a two-step process.
Step 1: Establish SQL Server Access
Note down the SQL Server connection details
Create a SQL Server user with read access to the following objects in each SQL Server database (a consolidated example script is shown at the end of this step).
INFORMATION_SCHEMA.ROUTINES
INFORMATION_SCHEMA.VIEWS
INFORMATION_SCHEMA.TABLE_CONSTRAINTS
INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE
INFORMATION_SCHEMA.TABLES
INFORMATION_SCHEMA.COLUMNS
sys.foreign_key_columns
sys.objects
sys.tables
sys.schemas
sys.columns
VIEW SERVER STATE permission on the server
Required for reading the Extended Events log
VIEW DEFINITION permission
All databases
USE master
GO
GRANT VIEW ANY DEFINITION TO Kadauser
Selected databases. Repeat for each database
USE <REPLACE WITH A DATABASE>
GO
GRANT VIEW ANY DEFINITION TO Kadauser
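For reference, the access setup above could be scripted along the following lines. This is a sketch only: the login name Kadauser matches the examples above, the password and database placeholders are values to replace with your own, and it complements (rather than replaces) the VIEW ANY DEFINITION grants shown above.
-- Create the extraction login and grant the server-level permission needed for the Extended Events log
USE master
GO
CREATE LOGIN Kadauser WITH PASSWORD = '<REPLACE WITH A STRONG PASSWORD>'
GO
GRANT VIEW SERVER STATE TO Kadauser
GO
-- Repeat for each database in scope: create the database user and make its metadata visible
USE <REPLACE WITH A DATABASE>
GO
CREATE USER Kadauser FOR LOGIN Kadauser
GO
GRANT VIEW DEFINITION TO Kadauser
GO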
Step 2: Set up Extended Events Logging
A SQL Server admin will need to set up an Extended Events session to capture query execution in SQL Server.
Some tuning of the logging parameters may be needed depending on the processing volume on your SQL Server instance.
Example script to set up Extended Events logging:
--Query To Create Extended Events Session
CREATE EVENT SESSION [KADA] ON SERVER
ADD EVENT sqlserver.sp_statement_completed (
    ACTION(package0.collect_system_time, package0.event_sequence, sqlos.task_time, sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_id, sqlserver.database_name, sqlserver.nt_username, sqlserver.query_hash, sqlserver.server_instance_name, sqlserver.server_principal_name, sqlserver.server_principal_sid, sqlserver.session_id, sqlserver.session_nt_username, sqlserver.transaction_id, sqlserver.username)
    WHERE (
        (
            [statement] LIKE '%CREATE %'
            OR [statement] LIKE '%DROP %'
            OR [statement] LIKE '%MERGE %'
            OR [statement] LIKE '%FROM %'
        )
        AND [sqlserver].[server_principal_name] <> N'USERS_TO_EXCLUDE'
        AND [sqlserver].[is_system] = (0)
        AND NOT [statement] LIKE 'Insert into % Values %'
        AND [sqlserver].[query_hash] <> (0)
    )
),
ADD EVENT sqlserver.sql_statement_completed (
    SET collect_statement = (1)
    ACTION(package0.collect_system_time, package0.event_sequence, sqlos.task_time, sqlserver.client_app_name, sqlserver.client_hostname, sqlserver.database_id, sqlserver.database_name, sqlserver.nt_username, sqlserver.query_hash, sqlserver.server_instance_name, sqlserver.server_principal_name, sqlserver.session_id, sqlserver.session_nt_username, sqlserver.transaction_id, sqlserver.username)
    WHERE (
        (
            [statement] LIKE '%CREATE %'
            OR [statement] LIKE '%DROP %'
            OR [statement] LIKE '%MERGE %'
            OR [statement] LIKE '%FROM %'
        )
        AND [sqlserver].[server_principal_name] <> N'USERS_TO_EXCLUDE'
        AND [sqlserver].[is_system] = (0)
        AND NOT [statement] LIKE 'Insert into % Values %'
        AND [sqlserver].[query_hash] <> (0)
    )
)
ADD TARGET package0.event_file (
    SET filename = N'G:\extended events\Extendedevents.xel', max_file_size = (20), max_rollover_files = (100)
)
WITH (MAX_MEMORY = 4096 KB, EVENT_RETENTION_MODE = ALLOW_MULTIPLE_EVENT_LOSS, MAX_DISPATCH_LATENCY = 30 SECONDS, MAX_EVENT_SIZE = 0 KB, MEMORY_PARTITION_MODE = NONE, TRACK_CAUSALITY = ON, STARTUP_STATE = ON)
GO
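-- Note: STARTUP_STATE = ON only starts the session automatically when SQL Server starts up.
-- To begin capturing events immediately after creating the session, it may need to be started manually:
ALTER EVENT SESSION [KADA] ON SERVER STATE = START
GO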
-- Check if the session is dropping events and see other data about the session
-- https://sqlperformance.com/2019/10/extended-events/understanding-event-loss-with-extended-events
SELECT
s.name,
s.total_regular_buffers,
s.regular_buffer_size,
s.total_large_buffers,
s.large_buffer_size,
s.dropped_event_count,
s.dropped_buffer_count,
s.largest_event_dropped_size
FROM sys.dm_xe_sessions AS s;
-- Also check log growth rate. Apply filters to remove noise.
-- some filters:
-- [sqlserver].[server_principal_name] = N'name of principal'
-- [sqlserver].[is_system] = (0)
-- [sqlserver].[client_app_name] = N'name of app'
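-- To confirm events are being captured, the event_file target can be read back directly.
-- The path below assumes the file target location configured in the example session above.
SELECT TOP 10 CAST(event_data AS XML) AS event_data
FROM sys.fn_xe_file_target_read_file(N'G:\extended events\Extendedevents*.xel', NULL, NULL, NULL);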
Step 2) Create the Source in K
Select Platform Settings in the side bar
In the pop-out side panel, under Integrations click on Sources
Click Add Source and select SQL Server
Select Direct Connect, add your SQL Server details and click Next
Fill in the Source Settings and click Next
Name: The name you wish to give your SQL Server
Host: Add the server location for the SQL Server instance
Version number: Set the SQL Server version (if you are unsure, see the version query after this list)
Extract Meta Only: Set this if Extended Events logging is not enabled
Add the Connection details and click Save & Next when connection is successful
Host: Add the SQL Server location
Username: Add the SQL Server User created in Step 1
Password: Add the User password created in Step 1
Test your connection and click Save
Return to the Sources page and locate the new SQL Server source that you created
Click on the clock icon to select Edit Schedule and set your preferred schedule for the SQL Server load
Note that schedule changes can take up to 15 minutes to propagate.
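If you are unsure of the version to enter in the source settings, a query such as the following, run on the SQL Server instance, will return it:
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel') AS ProductLevel,
       SERVERPROPERTY('Edition') AS Edition;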
Step 3) Manually run an ad hoc load to test the SQL Server setup
Next to your new Source, click on the Run manual load icon
Confirm how you want the source to be loaded
After the source load is triggered, a pop-up bar will appear that takes you to the Monitor tab of the Batch Manager page, where you can view the progress of source loads.
A manual source load will also require a manual run of the following jobs to load all metrics and indexes with the manually loaded metadata. Both can be found in the Batch Manager page.
DAILY
GATHER_METRICS_AND_STATS
Troubleshooting failed loads
If the job failed at the extraction step
Check the error. The permission check example at the end of this section can help confirm the K user's access. Contact KADA Support if required.
Rerun the source job
If the job failed at the load step, the failed directory in the landing folder will contain the file with issues.
Find the bad record and fix the file
Rerun the source job
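For failures at the extraction step, one quick check is to confirm the K user's effective permissions by impersonating it. This is a sketch only, run as an administrator; Kadauser is the user created in Step 1:
EXECUTE AS LOGIN = 'Kadauser';
SELECT * FROM fn_my_permissions(NULL, 'SERVER');   -- expect VIEW SERVER STATE and VIEW ANY DEFINITION
SELECT TOP 5 * FROM INFORMATION_SCHEMA.TABLES;     -- run in a target database to confirm metadata visibility
REVERT;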