Measuring Total Data Quality
About this Course
By the end of this second course in the Total Data Quality Specialization, learners will be able to:

1. Learn various metrics for evaluating Total Data Quality (TDQ) at each stage of the TDQ framework.
2. Create a quality concept map that tracks relevant aspects of TDQ for a particular application or data source.
3. Think through relative trade-offs between quality aspects, relative costs, and practical constraints imposed by a particular project or study.
4. Identify relevant software and related tools for computing the various metrics.
5. Understand metrics that can be computed for both designed and found/organic data.
6. Apply the metrics to real data and interpret their resulting values from a TDQ perspective.

This specialization as a whole aims to explore the Total Data Quality framework in depth and provide learners with more information about the detailed evaluation of total data quality that needs to happen prior to data analysis. The goal is for learners to incorporate evaluations of data quality into their process as a critical component for all projects. We sincerely hope to disseminate knowledge about total data quality to all learners, such as data scientists and quantitative analysts, who have not had sufficient training in the initial steps of the data science process that focus on data collection and evaluation of data quality. We feel that extensive knowledge of data science techniques and statistical analysis procedures will not help a quantitative research study if the data collected or gathered are not of sufficiently high quality.

This specialization will focus on the essential first steps in any type of scientific investigation using data: generating or gathering data, understanding where the data come from, evaluating the quality of the data, and taking steps to maximize the quality of the data prior to performing any kind of statistical analysis or applying data science techniques to answer research questions. Given this focus, there will be little material on the analysis of data, which is covered in myriad existing Coursera specializations. The primary focus of this specialization will be on understanding and maximizing data quality prior to analysis.

Created by: University of Michigan
Related Online Courses
Programming and complexity thinking are key skills for approaching 21st century challenges. NetTango Builder is a tool that allows for the creation of blocks-based programming experiences based on...
Master the art of building scalable and efficient microservices using Java and the Spring framework in this Coursera specialization. Dive deep into the intricacies of Spring Boot and Spring Cloud,...
In this course, you will learn about the dynamic world of robotics, which blends engineering, electronics, and computer science to create innovations that enhance our daily lives. You'll learn to...
This course provides those involved in educating members of the health professions an asynchronous, interdisciplinary, and interactive way to obtain, expand, and improve their teaching skills....
This course is the second course in the Linear Algebra Specialization. In this course, we continue to develop the techniques and theory to study matrices as special linear transformations...