David Brooks' piece on the limitations of data in The New York Times hits on many of the real challenges of quantitative data collection and analysis that evaluators and consultants, like us here at Informing Change, face regularly. Numbers can miss nuance, big data are messy, and interpreters of data sometimes look for what they want to see rather than the overall picture the data actually paint. I want to remind data developers and consumers (that is, most of us!) of three critical data collection practices that address some of Brooks' concerns.
- Collect data that tell you what you need to know. Data often become distracting and messy because we collect information arbitrarily: because it happens to be available, or because we mistakenly think more is better. It is important to thoughtfully choose data points that will tell you whether you're moving toward your objective.
- Use a mix of data types to fully understand a program, issue or organization. First, let's define data: data are pieces of information. Quantitative data (numbers and statistics) typically explain "what." For example, 50% of students improved their test scores. Qualitative data (information collected from observations, interviews, focus groups or open-ended survey questions) explain "why." Was the improvement due to increased instruction, parent involvement and/or a more supportive learning environment? Together, these data types weave a much fuller understanding of any given issue, providing decision-makers with a more complete story than either could tell alone.
- Account for the complexity of the real world in data collection and analysis. Programs and organizations don't operate in a vacuum, and neither should data. Every study has limitations, which must be clearly communicated to stakeholders. But a good analyst and evaluation partner approaches the data ready to answer contextual questions and to engage with the messiness. Tools such as regression and statistical modeling, for example, are designed to help us explore the relationships among variables and, with careful study design, tease out cause and sequence.
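To make the regression idea above concrete, here is a minimal sketch of how an analyst might quantify the relationship between two program variables. The variable names and the numbers are entirely hypothetical, invented for illustration; real analyses would use actual program data and a fuller model.

```python
import numpy as np

# Hypothetical, illustrative data (not from any real program):
# hours of extra instruction per student vs. test-score gain.
hours = np.array([5, 10, 15, 20, 25, 30], dtype=float)
gain = np.array([2, 4, 7, 8, 11, 13], dtype=float)

# Ordinary least squares fit: gain ≈ slope * hours + intercept.
slope, intercept = np.polyfit(hours, gain, deg=1)

# Pearson correlation summarizes the strength of the linear relationship.
r = np.corrcoef(hours, gain)[0, 1]

print(f"slope={slope:.2f}, intercept={intercept:.2f}, r={r:.2f}")
```

A strong correlation here would describe an association, not prove causation; establishing that the instruction itself drove the gains would require the kind of careful design and contextual knowledge the paragraph above describes.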
In the article's final paragraph, Brooks reminds the reader that data, "like any tool," are "good at some things and not at others." Sure, fair enough. But let's not let that be an excuse for poor practice. The combination of good data and good process will go a long way toward helping consultants, researchers and decision-makers understand our work, make better decisions and pursue solid strategy.