
This article is the third in a series we call Dynamic Definitions of Data. In part one, we examined data collection strategies for programs delivered to high-risk populations. In part two, we explored two dimensions of program data collection and analysis: "counting heads" (phase one) and "community impact" (phase two). We wrap up Dynamic Definitions of Data by tackling the fear of finding program failures, a fear that can be one of the biggest barriers to strategically engaging in outcomes assessment.

It is important to recognize that data is not dichotomous; it will neither prove your program right nor prove it wrong. How the findings from outcomes assessment are used is what sets a program and its leadership apart; no matter the results of assessment, reflection and action based on your findings position your organization as one that makes data-driven decisions. For example, the Urban Institute identifies four areas in which to use any assessment results: detecting needed program improvements, motivating and helping staff and volunteers, internal uses for operations, and reporting.

What if we discover we are not meeting the intended outcomes or community needs?

Many organizations fear not meeting the outcomes or community needs that are at the core of their missions. Even organizations that pride themselves on collecting programmatic data beyond "counting heads" sometimes shy away from exploring specific data points or program dynamics to avoid exposing potential failures.

Fear not.

Whatever is discovered will strengthen the program and benefit its participants. Even with negative results, program leadership can develop a strong set of decisions and actionable items, including decisions about program curriculum, staff training, and assessment techniques.

Transform negative assessment findings into data-driven decision making

The three examples featured in Table 1 illustrate some uses of program assessment results. These examples reflect program outcomes and indicators from three types of programs: Job Training & Employment, Youth Mentoring, and Arts & Culture.

For each program, we first list the intended program outcome and its corresponding indicator. Next, we present an unexpected finding or seeming failure. Finally, we offer examples of how these unexpected findings can advance the program; specifically, we propose ways to use the findings in the areas of program delivery, program design, collaboration in the community, and assessment design.

There’s only good to come

The possibilities are encouraging and can advance the programs in both the short and long term. We look forward to hearing about your use of these ideas for transforming seemingly negative assessment findings into meaningful action.

Table 1. Transform Negative Findings

Example 1: Job Training & Employment

Program Intended Outcome: Train and secure employment for an unemployed population. Indicator: count/% of job offers received by program alumni within a designated time period.

Unexpected Assessment Finding: A significant majority of program alumni are not offered jobs within the time allotted for this indicator (e.g., 60 days).

Use of Unexpected Findings:
Program Delivery: Consider whether the program curriculum was delivered in the most efficacious manner. Review materials from leaders in job training who work specifically with your target population. Develop new delivery strategies, whether or not they already exist elsewhere (e.g., program participants shadow prospective employers, the program advocates with local leadership, or the program establishes or supports an alumni network). This builds the program's reputation for high-quality, appropriate, and innovative approaches, positioning it as a leader in the area of job training.
Collaboration: Examine the factors that influence whether or not program alumni receive job offers. Consider whether the alumni who do not receive offers share common barriers in their lives and contexts (e.g., illness, criminal records, and/or housing and transportation barriers that impede job interviews). Develop partnerships and collaborations with appropriate organizations to address these factors.

Example 2: Youth Mentoring

Program Intended Outcome: Provide mentoring to at-risk youth. Indicator: count/% of program participants and alumni who reduce and/or eliminate alcohol/drug use within a designated time period.

Unexpected Assessment Finding: Data on program alumni alcohol/drug use cannot be collected because communication with alumni is rarely successful.

Use of Unexpected Findings:
Program Design: Incorporate program alumni into the program curriculum as much as possible (e.g., engage alumni as guest speakers, invite them to one-day volunteer activities, or train them in assessment data collection). This strengthens the relationship between the program and its alumni and creates opportunities for ongoing data collection.
Collaboration: Consider the factors that prevent staff from successfully collecting assessment data on the lives and well-being of program alumni (e.g., outdated contact information, transience among alumni, or a phone-call-only outreach strategy). Develop partnerships and collaborations with organizations and schools that serve this same group; the partnerships can cultivate a shared assessment plan for data collection. All programs that partner in this way benefit from a single data collection strategy for alumni outcomes: when a staff member from any partnering organization successfully connects with an alum, data is collected for all of the partners.

Example 3: Arts & Culture

Program Intended Outcome: Increase access to arts & culture programming. Indicator: count/% of community members who report that program events are sensitive to their cultural identification.

Unexpected Assessment Finding: A significant percentage of community members report insensitivity at program events (e.g., failure to use accurate names and/or use of stereotypes).

Use of Unexpected Findings:
Program Design: Change programming to reflect the concerns of the community, using focus groups and specialists (as appropriate) to redesign the program with due attention to sensitivity and empowerment. This bolsters the organization's reputation as responsive, as one that can be trusted to listen and learn.
Assessment Design: Report to multiple audiences (e.g., community participants, partners, funders, and staff) the significance of the assessment findings and the organization's commitment to dedicating time for ongoing program assessment. Also, seek feedback from specific audiences on their interpretations of future findings; including arts and culture consumers in the assessment process (e.g., early reviews of assessment findings in a focus group setting) increases transparency.
