Test cases
The Test cases page allows users to manage all data validations, easily filter and find any available test case, and analyze results. All test cases are SQL-based validations with expected results, thresholds, and other properties.
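As a rough illustration of the idea (a minimal sketch, not DQM's internal implementation), a count-based SQL validation amounts to running a query and comparing its single returned value against the expected result. The table, query, and expected value below are hypothetical:

```python
import sqlite3

# Hypothetical sample data; DQM would run the query against a configured connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5), (3, 7.25)])

# The test case: a SQL query with an expected numeric result.
sql = "SELECT COUNT(*) FROM orders"
expected = 3

actual = conn.execute(sql).fetchone()[0]
status = "PASSED" if actual == expected else "FAILED"
print(status, actual)  # → PASSED 3
```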
You can filter test cases by:
Description/SQL - Search key matching the SQL query or the test case description
Connections - Filter by connections
Test suites - Filter by test suites
Status - Filter by the latest test case execution status
Severity - Filter by test case severity
Tags - Filter by test case tags
Labels - Filter by test case labels
Custom field - Filter by custom fields
Test case options
On the left side of the page, the user can choose between test case options:
Test case actions - Actions related to test cases
Statistics - Statistics based on the filtered test cases
Page settings - Overall page settings, based on customer preference
Actions
To change multiple test cases at once, you can perform mass actions by selecting all the tests you want to modify. Selection also supports Shift+Click to select multiple tests at a time.
Execute - Execute selected tests
Delete - Delete selected tests
Save changes - Save all changes made to test cases
More options - Extra actions for test cases
Import tests - Import test cases as JSON or CSV file
Export selected tests - Export selected tests into JSON, ZIP or CSV file
Duplicate tests - Create copies of selected test cases
Disable/enable tests - Change test status to enabled/disabled (disabled tests won't be executed with the test suite)
Test suite assign - Assign a test suite to all selected test cases
Connection assign - Assign a connection to all selected test cases
Severity assign - Assign a severity to all selected test cases
Set threshold - Set a threshold for all selected test cases
Reports - Test case reports based on user-selected filters
Business rules - Quick access to user defined business rules
Variables - Global variables
Dynamic rules - User defined rules
Test generation - Metadata based test generation
AI Assistant - Query generation from natural language by AI Assistant
Statistics
Execution chart - Overview of filtered test case execution statuses
Reports - Test case reports based on user-selected filters
Page settings
Execution timeout - SQL query timeout after which the execution is cancelled (if a connection has a lower timeout configured, it will be used instead)
Rows displayed - Maximum number of rows displayed on the page
Row status color - Use the test case execution status as the row background color for all test cases
Display changes - The difference between the previous and latest result is shown in a separate column with indicators and the percentage of change
Display labels - Test case labels are shown in a separate column
Automated relations - New test cases will have their relations calculated automatically from metadata
Create new test case
Click the "New test case" button
Fill in the test case info:
Description - Short test case description, visible in the list view. A longer description can be added with the button
Expected - Test case expected result, which can be either a numerical or a string value. The expected result can also be a combination of expected results
Test suite - Test suite which the test belongs to
Business rule - Business rules which the test belongs to
Severity - Test case severity: INFO, WARNING, or CRITICAL. Severity can be used to filter test suites and to ignore results in test suites
Threshold - For lesser-than/greater-than result tests, you can enable a percentage threshold. You can also choose the prediction model:
Linear - uses linear regression
Polynomial - uses polynomial regression
Previous - compares to previous available result
Average - compares to average of all previous results
Target - Target connection to execute the test against
Query window - Test case SQL query window
Click the "Save" button
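The prediction models listed above can be sketched roughly as follows; the least-squares fit, the sample history, and the 10% default threshold are illustrative assumptions, not DQM's exact algorithm (the polynomial model is analogous to the linear one with a higher-degree fit and is omitted for brevity):

```python
# History of previous execution results (illustrative numbers).
history = [100.0, 104.0, 98.0, 110.0, 107.0]

def predict(history, model="linear"):
    if model == "previous":
        return history[-1]
    if model == "average":
        return sum(history) / len(history)
    if model == "linear":
        # Ordinary least-squares fit y = a*x + b, extrapolated one step ahead.
        n = len(history)
        xs = range(n)
        x_mean = sum(xs) / n
        y_mean = sum(history) / n
        a = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history)) / \
            sum((x - x_mean) ** 2 for x in xs)
        b = y_mean - a * x_mean
        return a * n + b
    raise ValueError(model)

def within_threshold(actual, predicted, pct=10.0):
    # Pass when the new result deviates from the prediction by at most pct percent.
    return abs(actual - predicted) <= predicted * pct / 100.0

predicted = predict(history, "previous")   # 107.0
print(within_threshold(112.0, predicted))  # deviation ≈ 4.7% → True
```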
It is possible to compare data from different data sources by comparing SQL results. You can compare numeric values (SUMs, COUNTs, AVGs, etc.), strings, and data sets row by row and column by column. A sample of the missing data is displayed when comparing data sets.
Click the "New test case" button
Fill in the test case info:
Severity - Test case severity: INFO, WARNING, or CRITICAL. Severity can be used to filter test suites and to ignore results in test suites
Description - Short test case description, visible in the list view. A longer description can be added with the button
Expected - Expected comparison types
A=B - Both results are equal
A⊆B - A is a subset of B
B⊆A - B is a subset of A
A>B - A is greater than B
A<B - A is less than B
Test suite - Test suite which the test belongs to
Target A - Target connection to execute query A against
Target B - Target connection to execute query B against
Query window - Test case SQL query window
Click "Save" button
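A rough sketch of how the set-based comparison types above could be evaluated over two result sets, using two in-memory SQLite databases as stand-ins for Target A and Target B (the tables, data, and checks are illustrative assumptions):

```python
import sqlite3

# Two stand-in connections; in DQM these would be Target A and Target B.
conn_a = sqlite3.connect(":memory:")
conn_b = sqlite3.connect(":memory:")
for c in (conn_a, conn_b):
    c.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn_a.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ada"), (2, "Bo")])
conn_b.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ada"), (2, "Bo"), (3, "Cy")])

sql = "SELECT id, name FROM customers"
rows_a = set(conn_a.execute(sql).fetchall())
rows_b = set(conn_b.execute(sql).fetchall())

checks = {
    "A=B": rows_a == rows_b,
    "A⊆B": rows_a <= rows_b,
    "B⊆A": rows_b <= rows_a,
}
missing_from_a = rows_b - rows_a  # a sample of the missing data, as shown in the UI
print(checks["A⊆B"], missing_from_a)  # → True {(3, 'Cy')}
```

For A>B and A<B the two single numeric results would simply be compared with `>` and `<`.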
Example - Create a Test case
Test case settings
All test cases can be configured to fit user needs. Users can copy test links and view test case audit information.
Open a test
Click on the test case settings icon
Edit settings:
Created by - User who created the test case
Modified by - User who last modified the test case, with the modification date
Disabled - Enable to exclude test case from test suite execution
Save results - Enable to save failed execution results as a CSV file
Generating SQL
DQM has rule templates, which can be used to generate SQL with just a few clicks.
Open a test
Press the "Generate SQL" button
Edit the parameters:
Connection - Connection to generate SQL for
Parameters - Schema, object, and column selection to generate SQL for
Rule selection - Predefined SQL rule template for SQL generation
SQL auto-update - Enable to update the SQL query automatically when the metadata changes
Press "Generate SQL"
Test case comments
Test case comments are meant for collaboration between DQM users. Users can add and edit comments on test cases they have access to.
Open a test
Press the comments button
Add/edit comments section:
Editing comments - Click the pencil icon to edit a comment (non-admin users can only edit their own comments)
Adding a new comment - Add a new comment to the test case
Test case relations
Test case relations are automatically assigned for test cases that are generated from rule templates. Users can also manually assign relations to test cases.
Open a test
Press the relations button
Edit relations
Labels
Test case labels can be used to group test cases. These labels can be used for filtering tests or even executing certain tests via the DQM API.
Open a test
Add labels to the test case
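A purely hypothetical sketch of executing tests by label through an API: the host, endpoint path, parameter name, and authentication are all deployment-specific assumptions and are not documented here; consult your DQM instance's API reference for the real request shape.

```python
import urllib.parse

# Hypothetical values only — not a documented DQM endpoint.
base_url = "https://dqm.example.com/api"
label = "nightly"

# Build a request URL that would execute all tests carrying the given label.
url = f"{base_url}/execute?{urllib.parse.urlencode({'label': label})}"
print(url)  # → https://dqm.example.com/api/execute?label=nightly
```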
Execution graph
The execution graph shows all executions of a test case alongside a prediction graph.
On the execution graph you can see:
Executions - Time series data for the latest executions, which can be filtered by selecting result rows and pressing the button
Prediction graph - Prediction graph for the execution result history and the allowed error rate
Test case results
Count-based test cases and compare test cases have test case results, which can instantly show users the related results from the query.
For count-based tests, results are displayed as a table, where users can copy the query and save the results as a CSV file.
When "Save results" is enabled, every execution persists a CSV file with the results.
For compare tests, results are displayed in two comparison tables, where users can see which records are the closest match and export both result sets to CSV files.
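Exporting query results to CSV, as the "Save results" option does, can be sketched like this (the table and data are hypothetical; DQM's actual file layout may differ):

```python
import csv
import io
import sqlite3

# Hypothetical result set produced by a test case query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, v TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])
cursor = conn.execute("SELECT id, v FROM t")

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow([col[0] for col in cursor.description])  # header row from column names
writer.writerows(cursor.fetchall())                      # data rows
print(buf.getvalue())
```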