LiTech Data Observability Help

Test cases

The Test cases page allows users to manage all data validations, quickly filter and find any available test case, and analyze results. All test cases are SQL-based validations with expected results, thresholds, and other properties.
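The idea of an SQL-based validation with an expected result can be sketched as follows; this is an illustrative example using an in-memory SQLite database, and the table, columns, and data are assumptions for demonstration, not part of DQM itself:

```python
import sqlite3

# A minimal sketch of a count-based SQL validation. The schema and
# data are illustrative, not part of DQM.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 10.0), (2, 25.5), (3, None)])

# The validation query counts rows that violate a rule (NULL amounts).
actual = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE amount IS NULL"
).fetchone()[0]

expected = 0  # the test case's expected result
status = "PASSED" if actual == expected else "FAILED"
print(status, actual)
```

Here one row violates the rule, so the validation fails against the expected result of 0.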


You can filter test cases by:

  • Description/SQL - Search term matched against the SQL query or the test case description

  • Connections - Filter by connections

  • Test suites - Filter by test suites

  • Status - Filter by the latest execution status of the test case

  • Severity - Filter by test case severity

  • Tags - Filter by test case tags

  • Labels - Filter by test case labels

Page settings

Page settings allow users to configure the Test cases page.

Test case page settings
  • Execution timeout - SQL query timeout after which the execution is cancelled (if a connection has a lower timeout configured, that timeout is used instead)

  • Rows displayed - Maximum number of rows displayed on the page

  • Row status color - The latest test case status is used as the row background color for all test cases

  • Display changes - The difference between the previous and latest results is shown in a separate column with indicators and the percentage of change

  • Display labels - Test case labels are shown in a separate column

Create new test case

  1. Click the "New test case" button

  2. Fill in the test case info:

    Add query test case
    • Severity - Test case severity: INFO, WARNING, or CRITICAL. Severity can be used to filter test suites and to ignore results in test suites

    • Description - Short description of the test case, visible in the list view. A longer description can be added with the "Additional description" button

    • Expected - The expected result of the test case, which can be either a numerical or a string value. The expected result can also be a combination of expected results:

      Expected result
    • Test suite - The test suite the test belongs to

    • Threshold - For less-than/greater-than result tests you can enable a percentage threshold. You can also choose the prediction model:

      • Linear - uses linear regression

      • Polynomial - uses polynomial regression

      • Previous - compares to previous available result

      • Average - compares to average of all previous results

    • Target - Target connection to execute the test against

    • Query window - SQL query editor for the test case

  3. Click the "Save" button
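The percentage threshold and prediction models above can be sketched like this; the history, latest result, and 10% threshold are illustrative values, not DQM defaults, and only the simpler "Previous" and "Average" models are shown (Linear and Polynomial fit a regression over the history instead):

```python
# A sketch of a percentage threshold applied against a predicted value.
# All numbers are illustrative assumptions.
history = [100, 104, 98, 102]   # previous execution results, oldest first
latest = 95                     # newest result to validate

def predict(model, results):
    if model == "Previous":
        return float(results[-1])   # compare to the previous result
    if model == "Average":
        return sum(results) / len(results)  # average of all previous results
    raise ValueError(model)

threshold_pct = 10.0
predicted = predict("Average", history)
deviation_pct = abs(latest - predicted) / predicted * 100
status = "PASSED" if deviation_pct <= threshold_pct else "FAILED"
print(predicted, round(deviation_pct, 1), status)
```

With this history the average is 101.0, the latest result deviates by about 5.9%, and the test passes the 10% threshold.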

It is possible to compare data from different data sources by comparing SQL results. You can compare numeric values (SUMs, COUNTs, AVGs, etc.), strings, and data sets row by row and column by column. A sample of the missing data is displayed when comparing data sets.

  1. Click the "New test case" button

  2. Fill in the test case info:

    Add compare test case
    • Severity - Test case severity: INFO, WARNING, or CRITICAL. Severity can be used to filter test suites and to ignore results in test suites

    • Description - Short description of the test case, visible in the list view. A longer description can be added with the "Additional description" button

    • Expected - Expected comparison types:

      • A=B - Both results are equal

      • A⊆B - A is a subset of B

      • B⊆A - B is a subset of A

      • A>B - A is greater than B

      • A<B - A is less than B

    • Test suite - The test suite the test belongs to

    • Target A - Target connection to execute the test against for connection A

    • Target B - Target connection to execute the test against for connection B

    • Query window - SQL query editor for the test case

  3. Click the "Save" button
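The set-based comparison types above can be sketched on two query result sets represented as lists of row tuples; the data here is an illustrative assumption:

```python
# A sketch of the compare test case comparison types. Data is illustrative.
result_a = [(1, "alice"), (2, "bob")]
result_b = [(1, "alice"), (2, "bob"), (3, "carol")]

rows_a, rows_b = set(result_a), set(result_b)
checks = {
    "A=B": rows_a == rows_b,   # both results are equal
    "A subset of B": rows_a <= rows_b,
    "B subset of A": rows_b <= rows_a,
}
missing_from_a = rows_b - rows_a  # the sample of missing data on mismatch
print(checks, missing_from_a)
```

In this example A is a proper subset of B, so only the subset check passes, and the missing row is the sample a user would see.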

Example - Create a Test case

Test case settings

All test cases can be configured to fit user needs. Users can copy test links and view test case audit information.

  1. Open a test

  2. Click the test case settings icon

  3. Edit settings:

    Test case settings
    1. Created by - The user who created the test case

    2. Modified by - The user who last modified the test case, with the modification date

    3. Disabled - Enable to exclude the test case from test suite execution

    4. Save results - Enable to save failed execution results as a CSV file

Generating SQL

DQM has rule templates, which can be used to generate SQL with just a few clicks.

  1. Open a test

  2. Press the "Generate SQL" button

  3. Edit parameters:

    1. Connection - Connection to generate SQL for

    2. Parameters - Schema, object, and column selection to generate SQL for

    3. Rule selection - Predefined SQL rule template for SQL generation

    4. SQL auto-update - Enable to update the SQL query automatically when metadata changes

  4. Press "Generate SQL"

Test case comments

Test case comments are meant for collaboration between DQM users. Users can add and edit comments on test cases they have access to.

  1. Open a test

  2. Press the Comments button

  3. Add/edit comments section:

    1. Editing comments - Click the pencil icon to edit a comment (non-admin users can only edit their own comments)

    2. Adding a new comment - Add a new comment to the test case

Test case relations

Test case relations are automatically assigned for test cases that are generated from rule templates. Users can also manually assign relations to test cases.

  1. Open a test

  2. Press the relations button

  3. Edit relations:


Labels

Test case labels can be used to group test cases. These labels can be used for filtering tests or even for executing specific tests via the DQM API.

  1. Open a test

  2. Add labels to the test case

Execution graph

The execution graph shows all executions of a test case alongside the prediction graph.


On the execution graph you can see:

  1. Executions - Time series data for the latest executions, which can be filtered by selecting result rows and pressing the execution graph button

  2. Prediction graph - Prediction graph for the execution result history and the allowed error rate

Test case results

Count-based test cases and compare test cases have test case results, which instantly show users the relevant results from the query.

Test case graph

Results are displayed as a table, where users can copy the query and save the results as a CSV file.

When "Save results" is enabled, every execution persists a CSV file with the results.
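Persisting failed results as CSV can be sketched as follows; the schema, query, and in-memory buffer (standing in for a file on disk) are illustrative assumptions, not DQM internals:

```python
import csv
import io
import sqlite3

# A sketch in the spirit of the "Save results" setting: when a
# validation fails, write the offending rows out as CSV.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, None), (2, "b@example.com")])

cur = conn.execute("SELECT id, email FROM customers WHERE email IS NULL")
rows = cur.fetchall()

if rows:  # the validation failed, so persist the offending rows
    buf = io.StringIO()  # stand-in for a file on disk
    writer = csv.writer(buf)
    writer.writerow([col[0] for col in cur.description])  # header row
    writer.writerows(rows)
    print(buf.getvalue())
```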

Test case results
Test case compare results graph

Results are displayed as two comparison tables, where users can see which records are the closest match and export both result sets to CSV files.

Test case compare results

Mass actions

For efficiency you can also perform mass actions on tests by selecting all the tests you want to modify. Selection supports SHIFT + click for selecting multiple tests at a time.

  1. Choose test cases

  2. Choose action

  • Execute - Execute selected tests

  • Delete - Delete selected tests

  • More options - Extra actions for test cases

  • Import tests - Import test cases from a JSON file

  • Export selected tests - Export selected tests into a JSON file or a ZIP archive of JSON files

  • Duplicate tests - Create copies of selected test cases

  • Disable/enable tests - Change test status to enabled/disabled (disabled tests won't be executed with the test suite)

  • Testsuite assign - Assign a test suite to all selected test cases

  • Connection assign - Assign a connection to all selected test cases

  • Severity assign - Assign a severity to all selected test cases

  • Set threshold - Set a threshold for all selected test cases
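The JSON export/import round trip can be sketched like this; the field names are assumptions for illustration, not DQM's actual export schema:

```python
import json

# A sketch of round-tripping test cases through JSON export/import.
# Field names are illustrative assumptions.
selected = [
    {"description": "No NULL amounts", "severity": "CRITICAL",
     "expected": 0, "sql": "SELECT COUNT(*) FROM orders WHERE amount IS NULL"},
    {"description": "Orders row count", "severity": "WARNING",
     "expected": 1000, "sql": "SELECT COUNT(*) FROM orders"},
]

payload = json.dumps(selected, indent=2)  # export to a JSON document
restored = json.loads(payload)            # an import reverses the export
print(len(restored), restored[0]["severity"])
```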

Variables

Please refer to the Variables page.

Last modified: 27 March 2024