Last month, we focused on three creative data strategies for demonstrating impact. The strategies are borrowed from organizations working in especially challenging contexts, serving and engaging high-risk populations.
This month, we focus on a more common challenge: the headcount as the sole data strategy for many organizations and schools. The headcount is an approach to data collection and analysis that focuses on “counting heads.” Organizations with this approach typically ask themselves questions like these:
- How many people did we serve with our program or service?
- What are the demographic variables of the program participants?
This scope, what we call a phase one approach to data collection and analysis, is important and serves organizations well during the early years. The two questions (above) support organizations in understanding who is and is not served through the program or service. However, we know that data has the potential to inform planning and actions beyond the scope of tracking who is being served.
Let’s start by asking, “What counts as proof of effectiveness?” What we call phase two moves organizations into a dynamic relationship with data and informs decision-making with an increasingly robust strategy. We provide a framework for exploring the effectiveness of programs and services, along with tools to help you apply this framework to your organization.
How do I know if my organization is focused on the headcount approach?
All effective programs have precise goals, and the data collected need to speak to those goals directly and clearly. For example, for a program that serves at-risk youth with a college readiness curriculum, the goals typically include preparing the youth for college and shepherding them into enrollment.
In this example, data must be collected on whether the youth attend college tours, enroll in college, participate in support programs on college campuses, complete college (or a set number of semesters or years), etc. Remember, the collected data cannot speak only to a phase one scope, as with the questions below.
- How many at-risk youth participated in our program?
- What does survey feedback from program participants say?
These phase one questions (above) inform organizations about how well they recruit program participants. Recruitment data points by themselves do not speak to the program’s effectiveness or community impact.
What counts as Dynamic Data?
We begin the discussion of a new framework with a series of questions that guide strategic data collection and analysis for understanding effectiveness and community impact.
- What new resources (e.g. knowledge, skills, programs, activities, etc.) are part of the lives of program participants (or program alumni) that were not present before participation in the program?
- How many times have program participants (or alumni) contributed back to the program and/or the organization?
- Which behavioral changes made by program participants during programming have been maintained across specific increments of time (e.g. 3 months post program, 6 months post program, etc.)?
- Can program alumni identify current or newly developed challenges in their lives that were not present during participation in the program? How does the program curriculum inform or support alumni with addressing the challenges?
- Where knowledge, skills, and/or behavioral changes could not be maintained post programming, why? What factors intersect in the lives of the program alumni that have interrupted the impact of your program or service?
Notice that the questions target an examination of the experiences of participants who have engaged in the program or service. The key question is, “What are the benefits for participants, both immediately and long-term?”
Next, your organization can focus on the collection of information and materials (data) that complement the five guiding questions (above).
- Pre- and post-tests are the most commonly used tool for collecting data on program participants. When using this approach, make sure the tests ask participants to apply knowledge and skills, not regurgitate information learned through the program curriculum. Application questions generally ask participants to respond to a scenario and apply knowledge or skills to the scenario.
- Whether administering pre- and post-tests or surveys, consider using oral transmission. Instead of giving program participants or alumni a piece of paper to fill out, have a staff member (or a program alum trained in data collection) pose the questions and jot down responses in a notepad or an electronic form. Orally transmitting a test or survey benefits data collection in multiple ways. Staff can repeat or restate prompts to ensure they are interpreted as intended. Response rates increase because the survey is not a nameless, faceless form sent out to email inboxes. Finally, the back-and-forth exchanges characteristic of oral transmission cultivate relationships between program participants/alumni and program staff.
- A local program asks participants at the start of the curriculum to develop a collage that represents their lives. At the end of the program, mirroring the pre- and post-test approach, participants make a second collage with the same instructions. The program can then facilitate an informal focus group in which participants discuss the differences they see between their two collages and speak to how the program has impacted their self-concept, their relationships, their goals, and their planned actions. In follow-up interviews and needs assessments with these same participants, now alumni, staff can inquire into how well the self-concept, relationships, goals, plans, etc. have been sustained post programming, and collect sample materials from the alumni to serve as complementary data (e.g. school transcripts, job descriptions, letters of recommendation, health records, etc.).
- An organization that focuses on alumni data collection can gather information on alumni whereabouts, jobs, schools, health histories, activities in the community, etc. Additionally, the organization can collect materials from alumni such as school transcripts, letters of recommendation from mentors, awards, etc. The information and materials can be analyzed systematically to pull out data points that speak back to how well the program goals have been met long-term. Events and activities, as well as social media and technology (as appropriate), can attract alumni and be an effective way to collect this information and these materials.
Aligning program goals and curriculum, and collecting and analyzing data based on that alignment, can be challenging. The following checklist assists organizations in considering which types of data points reflect phase two data collection and analysis. We can take inspiration from the tree featured at the top of this discussion. Notice where the tree has released limbs, leaving healed knots sprinkled along the sturdy trunk; some limbs have been let go, and other limbs have grown. It is important that we let go of what does not work to make room for more reliable, sturdy practices.