
Notes from the January 27, 2012 Networking Meeting

Apr 11, 2012  Suzanne Marten, Facilitator, Center for Educational Options

Framing the Discussion and Reflecting on a Word: Evaluation

What does evaluation mean, anyway? It seems overwhelming. It can feel like a test or an external demand - which it can be, but it does not have to be that, or only that. It can feel like something that is done to you, not with you. The purpose of this Networking meeting is to help us think about evaluation as an ongoing process and tool that can inform everyday decision making in your programs. It can invite all your stakeholders - staff, kids, youth, and families - to get involved and give input.

Following this frame, the meeting was launched with a 'reflection on a word,' a protocol for surfacing our thinking and the multiple or layered meanings of the words and concepts we were about to explore. Participants 'free wrote' on the word or concept of evaluation for about five minutes, then shared their ideas first with a partner and then with the whole group.

While it is challenging to capture the richness and depth of the group's collective ideas, what follows are a few of the highlights and tensions that rose to the surface:

  • the word VALUE in EVALUATION
  • there is required evaluation and there is meaningful evaluation; sometimes they are not the same thing
  • it has consequences
  • it is tied to your mission, looking at what you actually do, defining success and what that looks like for YOU in YOUR program
  • it involves struggle - to develop openness and process
  • participants must be willing to make themselves vulnerable, to really reflect and consider impact
  • it involves asking questions - of youth, families, staff
  • it is a reflective process
    • that allows you to learn more about your program and its impact, what works and what doesn't
    • that involves staff, teens, kids, and families
    • who can learn about themselves as well
  • it requires resources - time, money, people power

The group noted the positive and negative connotations of, and connections to, the idea of evaluation, and the multiple layers of involvement over time. Various contributions drew attention to the way the process of evaluation can draw stakeholders in: while they help the program know itself and its impact better, they also get to know themselves better in the process.

A Case Study: Megan Demarkis and Harlem RBI

Megan Demarkis, program director at Harlem RBI, presented a brief history and evolution of evaluation in her program as a way for participants to consider how evaluation can be a process or tool for informing everyday decision making in OST programs.

Megan introduced herself as someone who had no experience in formal evaluation before participating in the 18-month Evaluation Institute sponsored by the Bowne Foundation. She described how she continues to use and develop what she learned. Sometimes she only has time to grab enough data to finish a report, but she is continually looking for ways to engage staff and complete the cycle of sharing data back so that it does inform program decisions and staff's sense of their work.

She stated that evaluation should:

  • give voice to youth, families, staff
  • solve problems - give more complete information about programs and their impact
  • engage staff - over time and within reason
  • drive quality - if everyone is engaged and the full cycle of evaluation is completed

She also discussed, from the Harlem RBI perspective: What do we evaluate?

  • Maximal program participation - attendance; how often and regularly are kids here?
  • High quality program delivery - how are our providers doing?
  • Content alignment - does what we do with elementary school age kids relate to middle and high school and on to college?
  • Achievement of outcomes - do kids achieve what we say they will achieve?

She noted that Harlem RBI is just beginning to look at content alignment and achievement of outcomes in a meaningful way. Developing a system and tools to evaluate your program takes time.

Megan described the evolution of evaluation at Harlem RBI in phases.

Phase 1: An outside evaluator conducted an evaluation of Harlem RBI. One of the most interesting findings was that kids come to the program because they like being part of a team and like their relationships with adults. They used this information to inform their practice - the program began to involve youth in hiring new staff, which gave youth a voice in choosing the adults they would be developing relationships with.

Phase 2: Megan and another staff member attended the Bowne Foundation's Evaluation Institute, an 18-month program. This helped integrate evaluation into the program as an internal process, rather than something an outside evaluator came in to conduct. Megan and her colleague really understood what they were looking at and what they were asking, and this allowed them to better tell the story of the impact of Harlem RBI's work.

Phase 3: The organization created the position of 'Director of Organizational Excellence.' In that role, Megan was able to track attendance more systematically, so that the program could consider enrollment patterns and attrition in its recruitment planning. She was also able to develop a tool for looking at practice with part-time staff. This new way of gathering information and looking at staff practice had an immediate effect: staff practice improved and turnover was reduced. But Megan still felt staff did not 'own' the process - she was not feeding the results of the evaluation back to them. Evaluation needs to start with staff involved in developing the evaluation tools and end with staff analyzing the collected data to inform the decisions they make about the program.

Phase 4: Harlem RBI has hired a full-time evaluator to help them look in more depth at what they are doing and how they can do it better. This has only just begun, but it promises to help them get better at completing the feedback loop of evaluation.

Over the course of this evolution Megan has learned many things. In her initial enthusiasm she wanted to evaluate everything in her program and ended up overwhelmed. It's better to start small - focus on evaluating one part of the program and build from there.

In Phases 2 and 3 Megan found that she could only really focus on Youth Participation. She learned that they needed to get better at data collection, and that meant involving their coaches and youth leaders to help. Since kids participate in both baseball or softball teams AND classes, they needed to track who comes to what, and when. When they were able to look at the patterns that emerged, they could be more strategic in targeting different age groups for recruitment. It also led them to ask new questions of their kids about why they come and why they don't.

Throughout this process Harlem RBI has also asked how they could use evaluation information to improve the performance of part-time staff and volunteers. The first year, they used a tool borrowed from Ramapo. However, there was no staff voice; it did not lead to solving problems or engagement, and it did not drive quality. During the second year they developed their own 5-page tool to observe staff, but it was too much, too top-down. They took that document to the staff themselves and got their input to revise the tool, and they also developed a protocol for review that included NOT putting the evaluation/observation in the staff person's file. It was truly a way of having dialogue to cultivate staff development. (See REAL KIDS Summer 2011 Observation Tool)

By the third year they had a 1-page tool with a protocol for observation of staff. Staff received training to introduce the tool and protocol, and an opportunity to make any changes they felt appropriate before it was used. In the training, the staff used the tool to observe the facilitators. The tool was then used for 15-minute observations of staff, with advance notice, followed by discussion about practice. The tool and protocol now feed into the end-of-year evaluation, which is a formal piece that goes into staff members' files. The process of evaluation has been demystified, and staff feel supported in growing and developing. (See attached Summer Learning Coach Evaluation)

Going forward in year four, Harlem RBI is planning to do peer-to-peer observation and feedback among staff members using their tool. They will collect the observation forms to see where they need to go with staff training and what else they can learn.

Finally Megan shared her insights and challenges in this evaluation process. She listed them as follows:

  • Just jump in and learn as you go.
  • Start small - you can go 'vertical' with a few items and get a lot out of the process, rather than trying to cover many topics broadly (horizontally) and being overwhelmed.
  • Keep it simple - for example, a one-page tool is much better than a five-page tool.
  • Get feedback from staff on any tools you plan to use before releasing them.
  • Report back - having the conversation to give feedback after observations allows you to drive quality program decisions.
  • Be persistent - don't give up on the process.

During the Q and A Megan touched on the following topics:

How do you rally the resources and people? It is really important to get buy-in from staff. Participating in evaluation looks good on a resume, and that can be an incentive. Many components of collecting data can be delegated to youth in your program, which gets them engaged in a different way. And you might be able to set up an internship program to get high school students, college students, or others looking for an internship experience to help you with some of the tasks.

How do you know what to focus on? You can think about which measures will help you look at concrete aspects of your program. You can also brainstorm with staff, get them engaged in prioritizing, and develop a focus that way. One way to do that is multi-voting: ask each staff member to vote for three of the items listed and tally the votes. The items with the most tallies are the 'winners,' or items to focus on.

How do you get youth engaged? You can ask them to complete a youth survey, like a 'customer satisfaction' survey. Asking about specific, concrete things helps them answer in a meaningful way. For example, you can ask kids what they think about a specific activity and what they would do to make it better. You can also conduct focus groups with youth. Hearing others share and talk can get their thinking going, and you will generate more information.

How do you share evaluation information or findings with participants? One way to share is through readable, accessible reports. But you can also ask staff to think ahead about what they noticed, what they want to know, and what questions they have. Then the report back can focus on responding to their questions and observations.

Ideas and Inspirations

Next, participants engaged in some brainstorming and planning to build on what Megan presented and adapt it to their programs. Using a worksheet (see attached Using Evaluation to Inform Everyday Program Decisions Worksheet) to prompt their thinking, participants worked for a brief time on their own, then shared and got feedback from the other participants in their group. In conclusion, volunteers shared what they were excited about and thinking about trying:

  • LAMP, a media literacy organization, is thinking about giving the process a media literacy slant. They are planning to apply the cycle of viewing or observing, and then making changes, to their work with viewing media.

  • Global Kids is thinking of using their alumnae network. They are planning to survey those who have graduated from their programs and are now in college about what they remember, what they liked, and so on. They are also thinking about how to use the self-assessment they do with their youth and their staff, and how to get a better return on their family surveys.

  • Fresh Youth Initiative is thinking about some of the data that they already have that does not get analyzed to solve problems or drive quality. They are hoping to take a fresh look at reading and academic records of kids as well as 'outcome progress reports.'

  • FYI is also thinking about asking kids and families to help them build questions into their program that would allow them to get before-and-after pictures.

  • Queens Community House is thinking about creating an evaluation that can work across multiple programs since they are a multipurpose organization. They have articulated some outcomes that deal with socialization skills and good decision-making. They are also working on triangulating their data between families, children, and staff.

Many participants noted how important it is to understand the many layered meanings of evaluation that they generated in the activity at the beginning of the session - evaluation as a process, for example, and the need to support participants in opening up and reflecting - and how these should inform how you implement evaluation in your program. We look forward to gathering again in May to hear back from participants on what they have tried and how it is going, and to consider some other facets of evaluation.

Next Meetings

The next Robert Bowne Foundation Networking meeting on Evaluation will be held on May 11, 2012.