Data Quality Management

  • Business rule validation
  • ETL & data pipeline testing
  • Regression & unit testing
  • Data profiling & anomaly detection
  • Data reconciliation


"All-in-one data quality solution for our Vertica data warehouse and its source databases."
Starship Technologies
"Trust is hard to earn and easy to lose. LiTech makes it easier to earn and harder to lose."
"Convenient and systematic way to set up and manage your data quality queries."
Registrite ja Infosüsteemide Keskus
"An enthusiastic and motivated team to listen to the client's ideas and wishes."
Holm Bank
"Flexible and quick solution to discover and analyze data quality issues. The application is easy to install and use. Professional and friendly client support. Overall a positive experience with the pilot project."
Up to 50% of a data practitioner's time is spent on data quality issues

Save time

Creating data validations manually and sampling data are neither efficient nor reliable ways to test large datasets. Reduce manual effort and save time by automating data testing.

Detect errors

Continuous integration in development and constant change in data require systematic testing and monitoring to detect errors and ensure data quality.

Avoid delays

Poor data often forces additional development and wastes other resources. We help you prevent negative impact and delays in your processes by detecting errors in data early.

Features and attributes

Supported data sources

Is your database missing?
Let us know
  • Oracle
  • PostgreSQL
  • SQL Server
  • MySQL
  • MariaDB
  • Redshift
  • Vertica
  • Teradata
  • Snowflake
  • BigQuery
  • SAP IQ
  • SAP HANA
  • Cassandra
  • Databricks
  • RESTful API
  • CSV files

Data source comparison

Compare datasets from different data sources row-by-row using our built-in compare engine. This allows users to validate millions of rows to find errors caused by missing data.
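Row-by-row comparison keyed on a primary key can be sketched as follows. This is a minimal illustrative sketch, not the product's compare engine: it assumes each dataset has already been fetched as a list of row tuples whose first element is the key.

```python
# Minimal sketch of row-by-row dataset comparison. Assumes rows are
# tuples whose first element is the primary key; a real engine would
# stream and batch rows instead of holding them in memory.
def compare_datasets(source_rows, target_rows, key=lambda row: row[0]):
    """Return rows missing from the target and rows whose values differ."""
    target_by_key = {key(row): row for row in target_rows}
    missing, mismatched = [], []
    for row in source_rows:
        other = target_by_key.get(key(row))
        if other is None:
            missing.append(row)               # row never arrived in the target
        elif other != row:
            mismatched.append((row, other))   # row arrived but its values changed
    return missing, mismatched
```

Indexing the target by key makes each source-row lookup constant time, which is what makes validating millions of rows practical.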

Automated test creation

Generate test cases and profilings using reusable dynamic rules. Metadata based validations are automatically updated and can be generated in bulk to save time and reduce manual effort.

Data profiling

Detect sudden changes and deviations in data by systematically profiling objects and analysing the results. Automatically detect changes in values, possible errors and anomalies.
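The kind of statistics a profiling run collects can be sketched as below. This is an illustrative sketch under the assumption that a column's values have been fetched into memory; in practice profiling queries would run in-database.

```python
# Minimal sketch of profiling a single column: basic statistics that
# make sudden changes between runs visible when compared over time.
def profile_column(values):
    """Return summary statistics for one column of data."""
    non_null = [v for v in values if v is not None]
    return {
        "row_count": len(values),
        "null_ratio": 1 - len(non_null) / len(values) if values else 0.0,
        "distinct_count": len(set(non_null)),
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
    }
```

Comparing these numbers across scheduled runs is what turns a one-off profile into change detection: a jump in `null_ratio` or a collapse in `distinct_count` usually signals an upstream problem.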

Anomaly detection

Automatically analyse execution results to learn how your data changes over time. Find errors in datasets using result prediction based on execution history.
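One simple form of history-based detection is to flag a result that deviates too far from past executions. The sketch below is an assumption about the general technique (a z-score check over execution history), not the product's actual prediction model.

```python
import statistics

# Minimal sketch of history-based anomaly detection: flag the latest
# execution result if it deviates more than `threshold` standard
# deviations from the mean of past results.
def is_anomalous(history, latest, threshold=3.0):
    """Return True if `latest` is an outlier relative to `history`."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean  # perfectly stable history: any change is suspect
    return abs(latest - mean) / stdev > threshold
```

For example, if a table has loaded roughly 100 rows per day, a day with 500 rows is flagged while normal fluctuation passes silently.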

Scheduling & alerting

Schedule tests and profilings to automatically execute and alert responsible users by email or webhooks when errors in data are found.


API integration

Automate data quality management in data pipelines by using our API to execute data validation queries, get results and request other data from the DQM.
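A pipeline step driving such an API might look like the sketch below. The payload fields and response shape here are illustrative assumptions, not the product's documented API; only the request construction and result parsing are shown, with no network call.

```python
import json

# Hypothetical sketch of calling a DQM-style API from a pipeline step.
# Field names ("testIds", "results", "status") are assumptions for
# illustration, not the documented contract.
def build_execution_request(test_ids):
    """Build a JSON body asking the service to execute the given tests."""
    return json.dumps({"testIds": sorted(test_ids)})

def has_failures(response_body):
    """Parse an assumed execution-result response and report failures."""
    results = json.loads(response_body)["results"]
    return any(r["status"] == "FAILED" for r in results)
```

A pipeline would POST the built body to the DQM, then gate the next step on `has_failures`, so bad data stops the flow before it propagates downstream.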

Contact us
