EC Data University – Data Quality Basics: Validity and Reliability


In this topic, you will deepen your understanding of high-quality data by learning what makes data valid and reliable.

Learning Objectives

  1. Examine the concepts of high-quality data, focusing on validity and reliability.
  2. Learn how the concepts of validity and reliability apply to Part C and Part B 619 data collection, analysis, and use.

Data Leadership Competencies addressed by this content

  • FD-4. Is knowledgeable about the various aspects of data quality, such as accuracy, completeness, and timeliness, and the processes that are needed to ensure high-quality data statewide.
  • FD-7. Is knowledgeable of policies and procedures required for the governance and management of Part C/Part B 619 data, including those related to the quality and integrity of the data, and the security of and access to those data.

Resources

For reference: Validity of the Child Outcomes Summary (COS) Process Data: An Overview of Findings from the ENHANCE Project
This joint ECTA Center/DaSy Center webinar offered an overview of final results from the ENHANCE project. Four studies conducted in eight states provided information about the implementation and validity of COS information for accountability and program improvement. Results were drawn from a provider survey, coding of videos of the COS process, a study examining relationships between the COS and assessment tool scores, and examination of statewide and national population data. Participants learned about the findings and implications for states using the COS process.

Reflection / Practice Activities

To ensure data quality, you first need evidence of validity and reliability. Consider your data: can you tell whether the data are valid and reliable? What evidence do you have? What else do you need?

Are the data collection procedures capable of producing valid and reliable data?

  • Identify the specific data elements
  • Define the data elements and communicate the agreed-upon definitions
  • Communicate how each data element is related to the performance indicator. Is it directly or indirectly related?
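The steps above amount to building a simple data dictionary. A minimal sketch follows; the element names, definitions, and indicator links are hypothetical illustrations, not prescribed Part C/619 elements:

```python
# Hypothetical data-dictionary entries: one agreed-upon definition per data
# element, each tied to the performance indicator it informs and flagged as
# directly or indirectly related.
data_dictionary = {
    "cos_entry_rating": {
        "definition": "Child Outcomes Summary rating (1-7) assigned at program entry",
        "indicator": "child outcomes",
        "relationship": "direct",
    },
    "entry_date": {
        "definition": "Date the child began receiving services",
        "indicator": "child outcomes",
        "relationship": "indirect",  # used to compute time in program, not reported itself
    },
}

# Communicating the dictionary is as simple as publishing these entries to
# everyone who collects or enters the data.
for element, info in data_dictionary.items():
    print(f"{element}: {info['definition']} ({info['relationship']} link to {info['indicator']})")
```

Keeping definitions in one shared structure like this makes it easy to circulate them and to spot elements whose link to an indicator is unclear.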

Do you have the resources to complete all the necessary steps to ensure the data you need to collect will be valid and reliable?

  • What are the best sources of the data (e.g., families, program administrators)?
  • Are there existing reliable sources of data?
  • How will the data be collected (e.g., data system, checklist, self-rating scale, behavioral observation, interviews)?
  • Are the data collection procedures efficient?
  • Do you have written procedures for data collection? Training?

Have you identified ways to improve data quality?

  • Have you provided communication and training to ensure data quality?
  • Are procedures in place to check for the completeness, accuracy, and timeliness of the data? Will these procedures be done at the state or local level?
  • How can you use edit checks or consistency checks to improve data entry and data quality?
  • When low-quality data are identified, how will you address them?
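Edit checks and consistency checks like those mentioned above can be sketched as a small routine run at data entry. This is a minimal illustration, assuming hypothetical record fields (a child ID, birth date, program entry date, and a 1-7 COS entry rating):

```python
from datetime import date

def edit_check(record):
    """Return a list of data-quality problems found in one record."""
    problems = []
    # Completeness check: required fields must be present.
    for field in ("child_id", "birth_date", "entry_date", "cos_entry_rating"):
        if record.get(field) is None:
            problems.append(f"missing required field: {field}")
    # Validity (edit) check: rating must fall in the allowed range.
    rating = record.get("cos_entry_rating")
    if rating is not None and rating not in range(1, 8):
        problems.append(f"cos_entry_rating out of range 1-7: {rating}")
    # Consistency check: entry date cannot precede birth date.
    birth, entry = record.get("birth_date"), record.get("entry_date")
    if birth and entry and entry < birth:
        problems.append("entry_date precedes birth_date")
    return problems

good = {"child_id": "A001", "birth_date": date(2021, 3, 2),
        "entry_date": date(2023, 9, 1), "cos_entry_rating": 5}
bad = {"child_id": "A002", "birth_date": date(2022, 6, 10),
       "entry_date": date(2021, 1, 5), "cos_entry_rating": 9}
```

Running such checks at the point of entry, whether in a data system or a simple script, catches errors when they are cheapest to correct, before the data roll up to state-level reporting.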

Adapted from Refining Your Evaluation Data Pathway – From Source to Use
