Related post: The Foundations of AI Success Part I: Why Data Quality Metrics Are Critical for AI Solutions – Ross McNeely

Business Rule: Data Asset Quality Scoring

  • Each data asset can be evaluated using one or more quality metrics
  • Each quality metric can only score one data asset at a time
  • Some data assets may have multiple quality scores (from different metrics)
  • Some data assets might not have any quality scores yet

In simpler terms: Think of it like grading papers with different rubrics. A single paper (data asset) can be graded using multiple rubrics (quality metrics) like “accuracy,” “completeness,” or “timeliness.” However, each rubric can only be used to grade one paper at a time. So while a customer database might be scored for both accuracy and completeness, the “accuracy” metric is only evaluating that one database right now.

This creates a one-to-many relationship where one data asset can have multiple quality assessments, but each individual quality metric focuses on just one data asset at a time.
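
To make the relationship concrete, here is a minimal query sketch against the schema defined later in this post (table and column names come from that schema; nothing else is assumed). Each row of dq.quality_metrics points at exactly one data asset, while the same asset can appear on many rows:

SELECT da.asset_name, d.dimension_name, qm.quality_score
FROM dq.quality_metrics qm
INNER JOIN dq.data_assets da ON da.data_asset_id = qm.data_asset_id
INNER JOIN dq.dimensions d ON d.dimension_id = qm.dimension_id
ORDER BY da.asset_name, d.dimension_name;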


Business Rule: Data Quality Triggers

  • Data assets (like customer files, sales reports, or inventory lists) get quality scores
  • Each quality measurement focuses on just one data asset at a time
  • A single data asset can receive multiple quality scores from different measurements
  • Some data assets might not have any quality scores yet

In simpler terms: Imagine your customer database gets checked for quality. You might run separate tests for:

  • Accuracy (are addresses correct?)
  • Completeness (are phone numbers missing?)
  • Freshness (how recent is the data?)

Each test evaluates only your customer database, but your customer database gets three different quality scores. Meanwhile, your sales database might get its own separate accuracy test.

One quality check = one data asset, but one data asset can have many different quality checks run on it.
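
A quick way to see this against the schema below: count the quality checks recorded per asset. The LEFT JOIN keeps assets with no scores at all, matching the "no quality scores yet" case above. This is a sketch using the tables defined later in this post:

SELECT da.asset_name, COUNT(qm.metric_id) AS quality_checks
FROM dq.data_assets da
LEFT JOIN dq.quality_metrics qm ON qm.data_asset_id = da.data_asset_id
GROUP BY da.asset_name;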


Data Quality Organization

1. Data Assets and Domains

  • Data assets (like databases, files, or datasets) belong to business domains
  • Each data asset fits into only one domain (Finance, HR, Sales, etc.)
  • Multiple data assets can share the same domain

Example: Your “Employee Records” and “Payroll Database” both belong to the HR domain, while “Customer Orders” belongs to the Sales domain.
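
In schema terms (see the full schema below), this is a simple join from domains to assets; each asset carries exactly one data_domain_id:

SELECT dd.domain_name, da.asset_name
FROM dq.data_domains dd
INNER JOIN dq.data_assets da ON da.data_domain_id = dd.data_domain_id
ORDER BY dd.domain_name, da.asset_name;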

2. Data Assets and Fields

  • Data assets are made up of individual data fields (columns/attributes)
  • Each field belongs to only one data asset
  • A data asset typically contains many different fields

Example: Your “Customer Database” contains fields like Customer_Name, Email_Address, Phone_Number, and Purchase_History. These fields don’t appear in any other database.
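
Against the schema below, listing the fields of a single asset looks like this (the asset name is illustrative):

SELECT df.field_name, df.data_type, df.is_nullable
FROM dq.data_fields df
INNER JOIN dq.data_assets da ON da.data_asset_id = df.data_asset_id
WHERE da.asset_name = 'Customer Database';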

3. Data Fields and Quality Metrics

  • Individual data fields get evaluated for quality
  • Each quality check focuses on just one specific field
  • A single field can be tested with multiple quality metrics

Example: The Email_Address field might be checked for the following (see the rule-definition sketch after this list):

  • Format validation (is it a proper email format?)
  • Completeness (are there blank emails?)
  • Uniqueness (are there duplicate emails?)
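
Here is a minimal sketch of how those three checks could be registered in the dq.quality_rules table defined below. The asset name, field name, rule expressions, and created_by value are illustrative assumptions, not part of the schema itself; dimension_id values 8, 2, and 7 map to Validity, Completeness, and Uniqueness in the seed data.

INSERT INTO dq.quality_rules (rule_name, dimension_id, data_field_id, rule_type, rule_expression, severity, created_by)
SELECT r.rule_name, r.dimension_id, df.data_field_id, r.rule_type, r.rule_expression, 'High', 'dq_admin'
FROM dq.data_fields df
INNER JOIN dq.data_assets da ON da.data_asset_id = df.data_asset_id
CROSS APPLY (VALUES
    ('Email format check',       8, 'Format',     'Email_Address LIKE ''%_@__%.__%'''),
    ('Email completeness check', 2, 'Null Check', 'Email_Address IS NOT NULL'),
    ('Email uniqueness check',   7, 'Duplicate',  'COUNT(*) = 1 per Email_Address')
) AS r(rule_name, dimension_id, rule_type, rule_expression)
WHERE da.asset_name = 'Customer Database' AND df.field_name = 'Email_Address';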

Full Data Quality Schema

-- =====================================================
-- Data Quality Metrics Tracking Schema
-- Based on the Eight Pillars of Data Quality
-- =====================================================

-- Create schema for data quality objects
IF NOT EXISTS (SELECT * FROM sys.schemas WHERE name = 'dq')
BEGIN
    EXEC('CREATE SCHEMA dq')
END
GO

-- =====================================================
-- Core Reference Tables
-- =====================================================

-- Data Quality dimensions (The Eight Pillars)
CREATE TABLE dq.dimensions (
    dimension_id TINYINT PRIMARY KEY,
    dimension_name NVARCHAR(50) NOT NULL UNIQUE,
    description NVARCHAR(500) NOT NULL,
    is_active BIT NOT NULL DEFAULT 1,
    created_date DATETIME2 NOT NULL DEFAULT GETUTCDATE()
);

-- Insert the eight core dimensions
INSERT INTO dq.dimensions (dimension_id, dimension_name, description) VALUES
(1, 'Accuracy', 'Data correctly represents the real-world values or facts it is intended to describe'),
(2, 'Completeness', 'All required data fields and records are present without missing values or gaps'),
(3, 'Consistency', 'Data values and formats remain uniform across different systems, datasets, and time periods'),
(4, 'Integrity', 'Data maintains proper relationships and constraints between related fields and tables'),
(5, 'Reasonability', 'Data values fall within expected ranges and make logical sense given the context'),
(6, 'Timeliness', 'Data is available when needed and reflects the most current information for its intended use'),
(7, 'Uniqueness', 'Each data record appears only once without unwanted duplicates in the dataset'),
(8, 'Validity', 'Data conforms to defined formats, standards, and business rules for its specific field or domain');

-- Data Sources (Systems, Applications, APIs)
CREATE TABLE dq.data_sources (
    data_source_id INT IDENTITY(1,1) PRIMARY KEY,
    source_name NVARCHAR(100) NOT NULL UNIQUE,
    source_type NVARCHAR(50) NOT NULL, -- Database, API, File, Stream, etc.
    system_owner NVARCHAR(100),
    connection_string NVARCHAR(500),
    description NVARCHAR(500),
    is_active BIT NOT NULL DEFAULT 1,
    created_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    modified_date DATETIME2 NOT NULL DEFAULT GETUTCDATE()
);

-- Data Domains (Customer, Product, Order, etc.)
CREATE TABLE dq.data_domains (
    data_domain_id INT IDENTITY(1,1) PRIMARY KEY,
    domain_name NVARCHAR(100) NOT NULL UNIQUE,
    description NVARCHAR(500),
    business_owner NVARCHAR(100),
    technical_owner NVARCHAR(100),
    is_active BIT NOT NULL DEFAULT 1,
    created_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    modified_date DATETIME2 NOT NULL DEFAULT GETUTCDATE()
);

-- Data Assets (Tables, Views, Datasets)
CREATE TABLE dq.data_assets (
    data_asset_id INT IDENTITY(1,1) PRIMARY KEY,
    data_source_id INT NOT NULL,
    data_domain_id INT NOT NULL,
    asset_name NVARCHAR(100) NOT NULL,
    schema_name NVARCHAR(50),
    asset_type NVARCHAR(50) NOT NULL, -- Table, View, Dataset, File
    description NVARCHAR(500),
    record_count BIGINT,
    last_refresh_date DATETIME2,
    is_active BIT NOT NULL DEFAULT 1,
    created_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    modified_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    
    CONSTRAINT FK_data_assets_DataSource FOREIGN KEY (data_source_id) 
        REFERENCES dq.data_sources(data_source_id),
    CONSTRAINT FK_data_assets_DataDomain FOREIGN KEY (data_domain_id) 
        REFERENCES dq.data_domains(data_domain_id),
    CONSTRAINT UQ_data_assets_Source_Asset UNIQUE (data_source_id, asset_name)
);

-- Data Fields/Columns
CREATE TABLE dq.data_fields (
    data_field_id INT IDENTITY(1,1) PRIMARY KEY,
    data_asset_id INT NOT NULL,
    field_name NVARCHAR(100) NOT NULL,
    data_type NVARCHAR(50),
    max_length INT,
    is_nullable BIT,
    is_primary_key BIT NOT NULL DEFAULT 0,
    is_foreign_key BIT NOT NULL DEFAULT 0,
    business_rules NVARCHAR(1000),
    description NVARCHAR(500),
    is_active BIT NOT NULL DEFAULT 1,
    created_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    modified_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    
    CONSTRAINT FK_data_fields_DataAsset FOREIGN KEY (data_asset_id) 
        REFERENCES dq.data_assets(data_asset_id),
    CONSTRAINT UQ_data_fields_Asset_Field UNIQUE (data_asset_id, field_name)
);

-- =====================================================
-- Data Quality Rules and Thresholds
-- =====================================================

-- Quality Rules Definition
CREATE TABLE dq.quality_rules (
    quality_rule_id INT IDENTITY(1,1) PRIMARY KEY,
    rule_name NVARCHAR(100) NOT NULL UNIQUE,
    dimension_id TINYINT NOT NULL,
    data_asset_id INT NULL, -- NULL for global rules
    data_field_id INT NULL, -- NULL for asset-level rules
    rule_type NVARCHAR(50) NOT NULL, -- Range, Format, Lookup, Custom, etc.
    rule_expression NVARCHAR(2000) NOT NULL, -- SQL expression or validation logic
    expected_value NVARCHAR(500),
    severity NVARCHAR(20) NOT NULL DEFAULT 'Medium', -- Critical, High, Medium, Low
    description NVARCHAR(500),
    is_active BIT NOT NULL DEFAULT 1,
    created_by NVARCHAR(100) NOT NULL,
    created_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    modified_by NVARCHAR(100),
    modified_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    
    CONSTRAINT FK_quality_rules_Dimension FOREIGN KEY (dimension_id) 
        REFERENCES dq.dimensions(dimension_id),
    CONSTRAINT FK_quality_rules_DataAsset FOREIGN KEY (data_asset_id) 
        REFERENCES dq.data_assets(data_asset_id),
    CONSTRAINT FK_quality_rules_DataField FOREIGN KEY (data_field_id) 
        REFERENCES dq.data_fields(data_field_id),
    CONSTRAINT CK_quality_rules_severity CHECK (severity IN ('Critical', 'High', 'Medium', 'Low'))
);

-- Quality Thresholds
CREATE TABLE dq.quality_thresholds (
    threshold_id INT IDENTITY(1,1) PRIMARY KEY,
    data_asset_id INT NOT NULL,
    dimension_id TINYINT NOT NULL,
    min_acceptable_score DECIMAL(5,2) NOT NULL, -- 0.00 to 100.00
    target_score DECIMAL(5,2) NOT NULL,
    max_acceptable_score DECIMAL(5,2) NOT NULL DEFAULT 100.00,
    alert_threshold DECIMAL(5,2) NOT NULL, -- Trigger alerts below this score
    is_active BIT NOT NULL DEFAULT 1,
    created_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    modified_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    
    CONSTRAINT FK_quality_thresholds_DataAsset FOREIGN KEY (data_asset_id) 
        REFERENCES dq.data_assets(data_asset_id),
    CONSTRAINT FK_quality_thresholds_Dimension FOREIGN KEY (dimension_id) 
        REFERENCES dq.dimensions(dimension_id),
    CONSTRAINT UQ_quality_thresholds_Asset_Dimension UNIQUE (data_asset_id, dimension_id),
    CONSTRAINT CK_quality_thresholds_Scores CHECK (
        min_acceptable_score >= 0 AND min_acceptable_score <= 100 AND
        target_score >= 0 AND target_score <= 100 AND
        max_acceptable_score >= 0 AND max_acceptable_score <= 100 AND
        alert_threshold >= 0 AND alert_threshold <= 100 AND
        min_acceptable_score <= target_score AND
        target_score <= max_acceptable_score
    )
);
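
-- Example (illustrative sketch, not required by the schema): register Completeness
-- thresholds for a hypothetical 'CustomerMaster' asset; dimension_id 2 = Completeness.
-- The INSERT...SELECT inserts nothing if no such asset exists, so it is safe to keep here.
INSERT INTO dq.quality_thresholds
    (data_asset_id, dimension_id, min_acceptable_score, target_score, alert_threshold)
SELECT da.data_asset_id, 2, 90.00, 98.00, 92.00
FROM dq.data_assets da
WHERE da.asset_name = 'CustomerMaster';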

-- =====================================================
-- Data Quality Assessment and Monitoring
-- =====================================================

-- Quality Assessment Runs
CREATE TABLE dq.assessment_runs (
    assessment_run_id BIGINT IDENTITY(1,1) PRIMARY KEY,
    run_name NVARCHAR(100),
    run_type NVARCHAR(50) NOT NULL, -- Scheduled, Manual, Triggered
    data_asset_id INT NULL, -- NULL for multi-asset runs
    start_time DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    end_time DATETIME2,
    status NVARCHAR(20) NOT NULL DEFAULT 'Running', -- Running, Completed, Failed, Cancelled
    records_processed BIGINT,
    error_message NVARCHAR(2000),
    executed_by NVARCHAR(100),
    
    CONSTRAINT FK_assessment_runs_DataAsset FOREIGN KEY (data_asset_id) 
        REFERENCES dq.data_assets(data_asset_id),
    CONSTRAINT CK_assessment_runs_status CHECK (status IN ('Running', 'Completed', 'Failed', 'Cancelled'))
);

-- Quality Metrics (Main results table)
CREATE TABLE dq.quality_metrics (
    metric_id BIGINT IDENTITY(1,1) PRIMARY KEY,
    assessment_run_id BIGINT NOT NULL,
    data_asset_id INT NOT NULL,
    data_field_id INT NULL, -- NULL for asset-level metrics
    dimension_id TINYINT NOT NULL,
    quality_score DECIMAL(5,2) NOT NULL, -- 0.00 to 100.00
    records_evaluated BIGINT NOT NULL,
    records_passed BIGINT NOT NULL,
    records_failed BIGINT NOT NULL,
    failure_rate DECIMAL(5,2) NOT NULL,
    metric_details NVARCHAR(MAX), -- JSON with detailed breakdown
    measurement_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    
    CONSTRAINT FK_quality_metrics_AssessmentRun FOREIGN KEY (assessment_run_id) 
        REFERENCES dq.assessment_runs(assessment_run_id),
    CONSTRAINT FK_quality_metrics_DataAsset FOREIGN KEY (data_asset_id) 
        REFERENCES dq.data_assets(data_asset_id),
    CONSTRAINT FK_quality_metrics_DataField FOREIGN KEY (data_field_id) 
        REFERENCES dq.data_fields(data_field_id),
    CONSTRAINT FK_quality_metrics_Dimension FOREIGN KEY (dimension_id) 
        REFERENCES dq.dimensions(dimension_id),
    CONSTRAINT CK_quality_metrics_Score CHECK (quality_score >= 0 AND quality_score <= 100),
    CONSTRAINT CK_quality_metrics_Records CHECK (records_evaluated = records_passed + records_failed)
);

-- Rule Execution Results
CREATE TABLE dq.rule_results (
    rule_result_id BIGINT IDENTITY(1,1) PRIMARY KEY,
    assessment_run_id BIGINT NOT NULL,
    quality_rule_id INT NOT NULL,
    metric_id BIGINT NOT NULL,
    records_evaluated BIGINT NOT NULL,
    records_passed BIGINT NOT NULL,
    records_failed BIGINT NOT NULL,
    execution_time DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    execution_duration_ms INT,
    error_message NVARCHAR(1000),
    
    CONSTRAINT FK_rule_results_AssessmentRun FOREIGN KEY (assessment_run_id) 
        REFERENCES dq.assessment_runs(assessment_run_id),
    CONSTRAINT FK_rule_results_QualityRule FOREIGN KEY (quality_rule_id) 
        REFERENCES dq.quality_rules(quality_rule_id),
    CONSTRAINT FK_rule_results_Metric FOREIGN KEY (metric_id) 
        REFERENCES dq.quality_metrics(metric_id)
);

-- Data Quality Issues (Failed records details)
CREATE TABLE dq.quality_issues (
    issue_id BIGINT IDENTITY(1,1) PRIMARY KEY,
    assessment_run_id BIGINT NOT NULL,
    rule_result_id BIGINT NOT NULL,
    data_asset_id INT NOT NULL,
    data_field_id INT NULL,
    record_identifier NVARCHAR(500), -- Primary key or unique identifier of failed record
    issue_type NVARCHAR(50) NOT NULL, -- Missing, Invalid, Duplicate, Inconsistent, etc.
    issue_description NVARCHAR(1000),
    actual_value NVARCHAR(500),
    expected_value NVARCHAR(500),
    severity NVARCHAR(20) NOT NULL,
    status NVARCHAR(20) NOT NULL DEFAULT 'Open', -- Open, Acknowledged, Resolved, Ignored
    assigned_to NVARCHAR(100),
    detected_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    resolved_date DATETIME2,
    resolution NVARCHAR(1000),
    
    CONSTRAINT FK_quality_issues_AssessmentRun FOREIGN KEY (assessment_run_id) 
        REFERENCES dq.assessment_runs(assessment_run_id),
    CONSTRAINT FK_quality_issues_RuleResult FOREIGN KEY (rule_result_id) 
        REFERENCES dq.rule_results(rule_result_id),
    CONSTRAINT FK_quality_issues_DataAsset FOREIGN KEY (data_asset_id) 
        REFERENCES dq.data_assets(data_asset_id),
    CONSTRAINT FK_quality_issues_DataField FOREIGN KEY (data_field_id) 
        REFERENCES dq.data_fields(data_field_id),
    CONSTRAINT CK_quality_issues_severity CHECK (severity IN ('Critical', 'High', 'Medium', 'Low')),
    CONSTRAINT CK_quality_issues_status CHECK (status IN ('Open', 'Acknowledged', 'Resolved', 'Ignored'))
);

-- =====================================================
-- Data Quality Trends and History
-- =====================================================

-- Daily Quality Scores (for trending and dashboards)
CREATE TABLE dq.daily_quality_scores (
    score_id BIGINT IDENTITY(1,1) PRIMARY KEY,
    score_date DATE NOT NULL,
    data_asset_id INT NOT NULL,
    dimension_id TINYINT NOT NULL,
    avg_quality_score DECIMAL(5,2) NOT NULL,
    min_quality_score DECIMAL(5,2) NOT NULL,
    max_quality_score DECIMAL(5,2) NOT NULL,
    assessments_count INT NOT NULL,
    total_issues_count BIGINT NOT NULL,
    critical_issues_count BIGINT NOT NULL,
    high_issues_count BIGINT NOT NULL,
    created_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    
    CONSTRAINT FK_daily_quality_scores_DataAsset FOREIGN KEY (data_asset_id) 
        REFERENCES dq.data_assets(data_asset_id),
    CONSTRAINT FK_daily_quality_scores_Dimension FOREIGN KEY (dimension_id) 
        REFERENCES dq.dimensions(dimension_id),
    CONSTRAINT UQ_daily_quality_scores_Date_Asset_Dimension UNIQUE (score_date, data_asset_id, dimension_id)
);
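
-- Illustrative sketch (an assumption about how this table would be maintained, not part of
-- the original design): roll yesterday's dq.quality_metrics rows into daily scores.
-- Issue counts are left at 0 here; a fuller job would derive them from dq.quality_issues.
-- Running it twice for the same day would violate the unique constraint above.
INSERT INTO dq.daily_quality_scores
    (score_date, data_asset_id, dimension_id, avg_quality_score, min_quality_score,
     max_quality_score, assessments_count, total_issues_count, critical_issues_count, high_issues_count)
SELECT
    CAST(qm.measurement_date AS DATE),
    qm.data_asset_id,
    qm.dimension_id,
    AVG(qm.quality_score),
    MIN(qm.quality_score),
    MAX(qm.quality_score),
    COUNT(*),
    SUM(qm.records_failed),
    0,
    0
FROM dq.quality_metrics qm
WHERE CAST(qm.measurement_date AS DATE) = CAST(DATEADD(DAY, -1, GETUTCDATE()) AS DATE)
GROUP BY CAST(qm.measurement_date AS DATE), qm.data_asset_id, qm.dimension_id;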

-- Quality Alerts and Notifications
CREATE TABLE dq.quality_alerts (
    alert_id BIGINT IDENTITY(1,1) PRIMARY KEY,
    alert_type NVARCHAR(50) NOT NULL, -- Threshold, Trend, Anomaly
    data_asset_id INT NOT NULL,
    dimension_id TINYINT NULL,
    severity NVARCHAR(20) NOT NULL,
    alert_message NVARCHAR(1000) NOT NULL,
    current_score DECIMAL(5,2),
    threshold_score DECIMAL(5,2),
    trigger_metric_id BIGINT,
    is_active BIT NOT NULL DEFAULT 1,
    created_date DATETIME2 NOT NULL DEFAULT GETUTCDATE(),
    acknowledged_date DATETIME2,
    acknowledged_by NVARCHAR(100),
    resolved_date DATETIME2,
    resolved_by NVARCHAR(100),
    
    CONSTRAINT FK_quality_alerts_DataAsset FOREIGN KEY (data_asset_id) 
        REFERENCES dq.data_assets(data_asset_id),
    CONSTRAINT FK_quality_alerts_Dimension FOREIGN KEY (dimension_id) 
        REFERENCES dq.dimensions(dimension_id),
    CONSTRAINT FK_quality_alerts_TriggerMetric FOREIGN KEY (trigger_metric_id) 
        REFERENCES dq.quality_metrics(metric_id),
    CONSTRAINT CK_quality_alerts_severity CHECK (severity IN ('Critical', 'High', 'Medium', 'Low'))
);
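
-- Example (illustrative): latest metric per asset/dimension that has dropped below its
-- alert threshold; a monitoring job could turn these rows into dq.quality_alerts entries.
-- Returns an empty set until metrics and thresholds are loaded.
SELECT qm.data_asset_id, qm.dimension_id, qm.quality_score, qt.alert_threshold
FROM dq.quality_metrics qm
INNER JOIN dq.quality_thresholds qt
    ON qt.data_asset_id = qm.data_asset_id
   AND qt.dimension_id = qm.dimension_id
WHERE qm.quality_score < qt.alert_threshold
  AND qm.measurement_date = (SELECT MAX(m2.measurement_date)
                             FROM dq.quality_metrics m2
                             WHERE m2.data_asset_id = qm.data_asset_id
                               AND m2.dimension_id = qm.dimension_id);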

-- =====================================================
-- Indexes for Performance
-- =====================================================

-- Quality Metrics indexes
CREATE NONCLUSTERED INDEX IX_quality_metrics_DataAsset_Date 
    ON dq.quality_metrics (data_asset_id, measurement_date DESC);

CREATE NONCLUSTERED INDEX IX_quality_metrics_Dimension_Date 
    ON dq.quality_metrics (dimension_id, measurement_date DESC);

CREATE NONCLUSTERED INDEX IX_quality_metrics_Score 
    ON dq.quality_metrics (quality_score, measurement_date DESC);

-- Quality Issues indexes
CREATE NONCLUSTERED INDEX IX_quality_issues_DataAsset_status 
    ON dq.quality_issues (data_asset_id, status, detected_date DESC);

CREATE NONCLUSTERED INDEX IX_quality_issues_severity_status 
    ON dq.quality_issues (severity, status, detected_date DESC);

-- Daily Quality Scores indexes
CREATE NONCLUSTERED INDEX IX_daily_quality_scores_Asset_Date 
    ON dq.daily_quality_scores (data_asset_id, score_date DESC);

CREATE NONCLUSTERED INDEX IX_daily_quality_scores_Dimension_Date 
    ON dq.daily_quality_scores (dimension_id, score_date DESC);

-- Assessment Runs indexes
CREATE NONCLUSTERED INDEX IX_assessment_runs_DataAsset_Date 
    ON dq.assessment_runs (data_asset_id, start_time DESC);

CREATE NONCLUSTERED INDEX IX_assessment_runs_status_Date 
    ON dq.assessment_runs (status, start_time DESC);

-- =====================================================
-- Views for Common Queries
-- =====================================================

GO

-- Current Quality Scores by Asset and Dimension
CREATE VIEW dq.vw_current_quality_scores AS
WITH LatestScores AS (
    SELECT 
        data_asset_id,
        dimension_id,
        MAX(measurement_date) AS latest_measurement
    FROM dq.quality_metrics
    GROUP BY data_asset_id, dimension_id
)
SELECT 
    da.asset_name,
    ds.source_name,
    dd.domain_name,
    dim.dimension_name,
    qm.quality_score,
    qm.records_evaluated,
    qm.records_failed,
    qm.failure_rate,
    qm.measurement_date,
    qt.target_score,
    qt.alert_threshold,
    CASE 
        WHEN qm.quality_score >= qt.target_score THEN 'Meets Target'
        WHEN qm.quality_score >= qt.alert_threshold THEN 'Below Target'
        ELSE 'Critical'
    END AS ScoreStatus
FROM LatestScores ls
INNER JOIN dq.quality_metrics qm ON ls.data_asset_id = qm.data_asset_id 
    AND ls.dimension_id = qm.dimension_id 
    AND ls.latest_measurement = qm.measurement_date
INNER JOIN dq.data_assets da ON qm.data_asset_id = da.data_asset_id
INNER JOIN dq.data_sources ds ON da.data_source_id = ds.data_source_id
INNER JOIN dq.data_domains dd ON da.data_domain_id = dd.data_domain_id
INNER JOIN dq.dimensions dim ON qm.dimension_id = dim.dimension_id
LEFT JOIN dq.quality_thresholds qt ON da.data_asset_id = qt.data_asset_id 
    AND qm.dimension_id = qt.dimension_id;

GO

-- Active Quality Issues Summary
CREATE VIEW dq.vw_active_issues_summary AS
SELECT 
    da.asset_name,
    ds.source_name,
    dd.domain_name,
    qi.severity,
    COUNT(*) AS IssueCount,
    MIN(qi.detected_date) AS OldestIssue,
    MAX(qi.detected_date) AS NewestIssue
FROM dq.quality_issues qi
INNER JOIN dq.data_assets da ON qi.data_asset_id = da.data_asset_id
INNER JOIN dq.data_sources ds ON da.data_source_id = ds.data_source_id
INNER JOIN dq.data_domains dd ON da.data_domain_id = dd.data_domain_id
WHERE qi.status = 'Open'
GROUP BY da.asset_name, ds.source_name, dd.domain_name, qi.severity;

GO
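
-- Example usage (illustrative): dashboard-style queries against the two views above.
-- Both return empty sets until assets and metrics are loaded.
SELECT asset_name, dimension_name, quality_score, ScoreStatus
FROM dq.vw_current_quality_scores
ORDER BY quality_score ASC;

SELECT asset_name, severity, IssueCount
FROM dq.vw_active_issues_summary
ORDER BY IssueCount DESC;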

-- =====================================================
-- Sample Data for Testing
-- =====================================================

-- Sample Data Sources
INSERT INTO dq.data_sources (source_name, source_type, system_owner, description) VALUES
('CustomerCRM', 'Database', 'Sales Team', 'Main customer relationship management system'),
('OrderProcessing', 'Database', 'Operations Team', 'E-commerce order processing system'),
('ProductCatalog', 'API', 'Product Team', 'Product information API'),
('WebAnalytics', 'Stream', 'Marketing Team', 'Real-time web analytics data');

-- Sample Data Domains
INSERT INTO dq.data_domains (domain_name, description, business_owner, technical_owner) VALUES
('Customer', 'Customer master data and profiles', 'VP Sales', 'Data Engineering Team'),
('Product', 'Product catalog and inventory', 'VP Product', 'Data Engineering Team'),
('Order', 'Order and transaction data', 'VP Operations', 'Data Engineering Team'),
('Marketing', 'Campaign and analytics data', 'VP Marketing', 'Data Engineering Team');
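
-- Illustrative additions (assumptions beyond the original sample set): one sample asset
-- and a few of its fields, tied to the sample source and domain above by name.
INSERT INTO dq.data_assets (data_source_id, data_domain_id, asset_name, schema_name, asset_type, description)
SELECT ds.data_source_id, dd.data_domain_id, 'CustomerMaster', 'dbo', 'Table', 'Customer master table from the CRM'
FROM dq.data_sources ds
CROSS JOIN dq.data_domains dd
WHERE ds.source_name = 'CustomerCRM' AND dd.domain_name = 'Customer';

INSERT INTO dq.data_fields (data_asset_id, field_name, data_type, is_nullable)
SELECT da.data_asset_id, f.field_name, f.data_type, f.is_nullable
FROM dq.data_assets da
CROSS APPLY (VALUES
    ('Customer_Name', 'NVARCHAR(200)', 0),
    ('Email_Address', 'NVARCHAR(320)', 1),
    ('Phone_Number',  'NVARCHAR(50)',  1)
) AS f(field_name, data_type, is_nullable)
WHERE da.asset_name = 'CustomerMaster';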

PRINT 'Data Quality Metrics schema created successfully!';
PRINT 'Schema includes:';
PRINT '- Core reference tables for dimensions, sources, domains, assets, and fields';
PRINT '- Quality rules and thresholds configuration';
PRINT '- Assessment runs and metrics tracking';
PRINT '- Issue management and alerting';
PRINT '- Historical trending and daily scores';
PRINT '- Performance indexes and summary views';
PRINT '- Sample data for testing';
GO
