
Data Quality Standards

Improve quality with David Bowman’s information management guidelines for data quality standards

This site is designed for Information Technology professionals who need to improve quality and require guidance and direction to help teams consistently ensure compliance with regulatory directives.

This site provides information management guidelines for measuring data and information quality so that teams can take the action necessary to support accurate decisions, monitor performance, and manage risk.

What are data quality standards?

Data quality metrics and measures form part of a consistent and controlled approach to the development and use of information, the management of data, and the assessment of internal controls. Continuous oversight of the quality of data and information is necessary to ensure:
  • Effectiveness and efficiency of operations and controls related to Management Information Systems and Data Management environments;
  • Reliability of financial and managerial information;
  • Safeguarding of resources related to data, reports, and information; and
  • Achievement of strategic goals that effectively balance risk and return.
Oversight of the quality of data and information should be guided by measures that are consistent, coherent, relevant, valid, secure, and accurate.

Data Quality Standards Objectives

The objective of data quality standards is to ensure that information management requirements for quality are met in order to:
  • Base decisions on fact;
  • Assist in prioritizing corrective action;
  • Assist in determining the source of quality problems;
  • Affirm or deny that solutions achieve or exceed intended goals; and
  • Provide clarity.
Once measurements have been taken, all metrics should be defined in the context of an appropriately approved method for assessing stability of the environment.

All metrics should be reported periodically, as established by legal regulations or business rules, or exceptionally when urgent or special causes exist.

Metric Management Data Quality Standards

All metrics used for information quality measurement and reporting should meet the following criteria:
  • All metrics that are routinely reported within an organization should be adequately controlled;
  • Recommended methods for controlling metrics include documenting them in a repository or lexicon with an agreed-upon set of metric attributes, establishing appropriate segregation of duties, and requiring approval; and
  • Each metric should have the following attributes documented:
Metric Attributes
  • Metric name should provide an identifier (abbreviation) or full English name of a given data element used for providing performance information or performance measurement;
  • Metric definition should specify the concise and complete description of the business purpose or function of the agreed-upon standard measure (metric) that is used for governance and oversight. This description may contain multiple specific contexts. For information quality metrics, these contexts may include either business processes or technical environments;
  • Calculation should specify the formula used to derive values and/or fields within a spreadsheet or database;
  • Data elements should provide context for the information presented, e.g. $ Amt Delinquent, % Outstanding;
  • Data source should specify the most immediate system, database or spreadsheet from which the data is obtained; and
  • Metric owner should specify who is accountable for the definition of the metric, ensuring the metric is current and consistent with regulatory guidance, and all changes made to the metric.
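The attribute list above can be sketched as a simple controlled repository record. This is a minimal illustration in Python; the class, field names, and sample metric are hypothetical, not taken from the source.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """One entry in a metric repository (lexicon), per the attributes above."""
    name: str             # identifier (abbreviation) or full English name
    definition: str       # business purpose of the agreed-upon standard measure
    calculation: str      # formula used to derive values
    data_elements: list   # context for the information, e.g. ["% Outstanding"]
    data_source: str      # most immediate system, database, or spreadsheet
    owner: str            # party accountable for the definition and all changes

repo: dict[str, Metric] = {}

def register(metric: Metric) -> None:
    """Controlled registration: reject duplicates so each metric stays unique."""
    if metric.name in repo:
        raise ValueError(f"metric {metric.name!r} already registered")
    repo[metric.name] = metric

register(Metric(
    name="PCT_OUTSTANDING",
    definition="Percentage of invoices outstanding past their due date",
    calculation="overdue_invoices / total_invoices * 100",
    data_elements=["% Outstanding"],
    data_source="AP ledger database",
    owner="Finance data steward",
))
```

In practice the repository would live in a governed data catalog rather than an in-memory dictionary; the point is that every metric carries all six attributes and passes through a single controlled registration path.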
Data Quality Standards and Information Quality

The measurement of information quality depends on the measurement of data quality.

Data quality standards objectives should show whether the information is meeting customer needs as follows:
  • The metric value falls between the upper and lower control limits specified by the business rules, requirements, design documentation, and application architecture for which the information is being used;
  • The information has supported decision-making over time (here the measure may be an indirect indicator, e.g. the number of repeat users of a given report); and
  • Data profiling activities have confirmed suitability of the data for future intended uses.
The default metric for the quality of data being measured is defects per million (dpm). The upper and lower control limits for defect ranges should be set by the specific requirements of the data’s purpose, in accord with the data quality attributes as identified below.
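The default dpm metric and its control-limit check can be sketched in a few lines. The function names and the sample limits below are illustrative assumptions; actual limits should come from the requirements of the data's purpose.

```python
def defects_per_million(defects: int, opportunities: int) -> float:
    """Default data quality metric: defects per million (dpm)."""
    if opportunities <= 0:
        raise ValueError("opportunities must be positive")
    return defects / opportunities * 1_000_000

def within_control_limits(dpm: float, lower: float, upper: float) -> bool:
    """Check a dpm value against control limits set by the data's purpose."""
    return lower <= dpm <= upper

# Hypothetical example: 42 defective records found in 250,000 checked.
dpm = defects_per_million(defects=42, opportunities=250_000)   # 168.0 dpm
print(within_control_limits(dpm, lower=0.0, upper=500.0))      # True
```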

Data Quality Attributes Suitable for Measurement
  • Accessibility, i.e. retrieved as needed by appropriate individuals, should measure the degree or percentage of regulated and controlled access that enables people and machines to use the data in accordance with business rules and in the best interests of the organization;
  • Accuracy, i.e. exact and precise, should measure the degree to which data agrees with an original, acknowledged authoritative source about a real-world object or event, such as a form, document, or unaltered electronic data received from outside the organization;
  • Completeness, i.e. fulfilling formal requirements and expectations with no gaps, should measure the degree to which all required data are identified, defined, present, and documented to satisfy business requirements;
  • Consistency, i.e. commonly defined and used across the enterprise, should measure the degree to which a set of data is equivalent in redundant or distributed databases; sometimes called “uniqueness” or “concurrency.” Where possible, there should be one occurrence of an element; where not possible, the element should be standardized and related across systems;
  • Timeliness, i.e. current and not outdated or obsolete, should measure the degree or percentage of system availability within specified periods. Timeliness should always be measured from the perspective of the consumer of the data;
  • Integrity, i.e. what was requested and expected, should measure the degree of continuity between the form or quality of the data as originally acquired and the form or quality of the data in operation;
  • Validity, i.e. obtained via an approval process, should measure the degree to which the data conforms to defined business rules, e.g. a constraint on access to the data based upon business rules for who should and does have access to create, read, update, or delete the data; and
  • Relevance, i.e. fits the intended use, should measure the percentage of data appropriate to the execution of a given business process or processes, as defined by user input and output specifications.
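Two of the attributes above, completeness and consistency, lend themselves to a short measurement sketch. The functions and the sample customer records below are hypothetical illustrations under the definitions given, not prescribed implementations.

```python
def completeness(records: list[dict], required_fields: list[str]) -> float:
    """Fraction of required field values that are present (non-null, non-empty)."""
    total = len(records) * len(required_fields)
    present = sum(
        1 for r in records for f in required_fields
        if r.get(f) not in (None, "")
    )
    return present / total if total else 1.0

def consistency(primary: list[dict], replica: list[dict], key: str) -> float:
    """Fraction of primary records whose copy in a redundant store is equivalent."""
    replica_by_key = {r[key]: r for r in replica}
    matches = sum(1 for r in primary if replica_by_key.get(r[key]) == r)
    return matches / len(primary) if primary else 1.0

# Hypothetical customer records: one is missing a required email value.
customers = [
    {"id": 1, "name": "Acme", "email": "ap@acme.example"},
    {"id": 2, "name": "Globex", "email": ""},
]
print(completeness(customers, ["name", "email"]))  # 0.75
```

Each result is a degree on a 0-1 scale, which converts directly into the dpm form described earlier (e.g. a completeness of 0.75 corresponds to 250,000 defects per million required values).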
Samples for Measurement

Data selected for data quality measurement and reporting should be prioritized based on both of the following criteria:
  • The cost and/or risk to the enterprise of the particular data’s current quality; and
  • Availability of statistically significant samples.
Measurements should be taken using tested and rigorous sampling methodologies in order to accomplish specific objectives set by the business customers of the data or information.
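One standard way to obtain a statistically significant sample is the textbook sample-size formula for estimating a proportion, paired with simple random sampling. This is a general statistical sketch under assumed parameters (95% confidence, worst-case proportion p = 0.5), not a methodology prescribed by the source.

```python
import math
import random

def sample_size(margin_of_error: float, z: float = 1.96, p: float = 0.5) -> int:
    """Minimum sample size to estimate a defect proportion p within
    margin_of_error, at the confidence level implied by z (1.96 ~ 95%)."""
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

def draw_sample(population: list, margin_of_error: float = 0.05, seed: int = 0) -> list:
    """Simple random sample of the computed size, without replacement."""
    n = min(sample_size(margin_of_error), len(population))
    return random.Random(seed).sample(population, n)

print(sample_size(0.05))  # 385 records for a ±5% margin at 95% confidence
```

For very large tables a stratified or systematic design may serve the business objective better than simple random sampling; the formula above is the floor, not the whole methodology.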

Data Quality Standards Guidelines
  • The organization should plan, acquire, implement, and control data and information for the sake of enabling information value and cost chains to produce the highest quality data at optimal speed and cost;
  • These measurements should be considered in-line measurements, i.e. a statistical process that observes data quality effects inside an information chain, in contrast to a static measurement, which is made on data already created and stored in a database;
  • Each information value and cost chain should begin with end-customer, shareholder, and regulatory satisfaction; 
  • Such satisfaction metrics taken outside the data quality management domain should be treated as primary indicators of data and information quality value;
  • Where possible, data and information quality metrics should be aligned with a corresponding set of defined process outputs; 
  • Measuring quality strictly within one organization or business area should be treated as one step in a process of managing data and information quality so that end-customers, shareholders, and regulators report satisfaction across organizational boundaries;
  • Data quality metrics should enable any information customer to specify quality requirements e.g. quality of data or information may degrade as it is handed from one process to another;
  • Requirements for the data quality scope and measures should be methodically captured from IT sponsors and information users;
  • Appropriate documentation should be developed and maintained by the business areas for all processes and procedures;
  • This documentation should be subject to the appropriate senior management review and approval; and
  • Accountabilities for each key role involved in data quality management should be defined and communicated to all stakeholders.
Summary

Data quality metrics and measures form part of a consistent and controlled approach to the development and use of information, the management of data, and the assessment of internal controls.

This site provides guidelines for measuring data and information quality so that teams can take the action necessary to support accurate decisions, monitor performance, and manage risk.