VICUG-L Archives

Visually Impaired Computer Users' Group List

VICUG-L@LISTSERV.ICORS.ORG

Subject:
From: Kelly Pierce <[log in to unmask]>
Reply To: VICUG-L: Visually Impaired Computer Users' Group List
Date: Sun, 10 Jan 1999 02:44:55 -0600
Content-Type: TEXT/PLAIN
Parts/Attachments: TEXT/PLAIN (4892 lines)
This is an excellent tool for local groups to use in determining their
own effectiveness, or for evaluating the effectiveness of the service
providers in their communities.

kelly



From the file in portable document format:
http://www.horizon-research.com/stock.pdf

                           Pages 1-97 from

   A Practical Guide to
   Evaluating Your Own Programs

   by
   Sally L. Bond
   Sally E. Boyd
   Kathleen A. Rapp

   with
   Jacqueline B. Raphael
   Beverly A. Sizemore

   1997
   Horizon Research, Inc.
   111 Cloister Court, Suite 220
   Chapel Hill, NC 27514-2296

   Development of this manual, Taking Stock: A Practical Guide to
   Evaluating Your Own Programs, was made possible with the support of
   the DeWitt Wallace-Reader's Digest Fund, which sponsors Science
   Linkages in the Community (SLIC), a national program designed and
   coordinated by the American Association for the Advancement of
   Science (AAAS).

   Table of Contents

   Acknowledgments
   Introduction
   Chapter One: Why Use This Manual?

   Program Evaluation Basics
   Chapter Two: Why Evaluate?
   Chapter Three: Getting Started: Framing the Evaluation
     (Documenting Context and Needs)
   Chapter Four: What Are You Trying To Do?
     (Defining Goals and Objectives)
   Chapter Five: Finding the Right Mix
     (Using Quantitative and Qualitative Data)
   Chapter Six: Finding the Evidence
     (Strategies for Data Collection)
   Chapter Seven: Making Sense of the Evidence
     (Interpreting and Reporting Your Data)

   Examples Featuring Fictional CBOs
   Chapter Eight: Applying This Manual
     (How One CBO Did It)
   Chapter Nine: Applying This Manual in a Bigger Way
     (Expanding the Evaluation Design)

   Appendices: Sample Reports Using Evaluation Data
   Appendix A: Final Evaluation Report
   Appendix B: Proposal for Expanding a Program
   Appendix C: Annual Progress Report

   Glossary of Terms
   Index of Key Concepts

   Acknowledgments
   Taking Stock was prepared at the request of the American Association
   for the Advancement of
   Science (AAAS) for its Science Linkages in the Community (SLIC)
   initiative. The work was
   carried out with the financial support of the DeWitt Wallace-Reader's
   Digest Fund. We gratefully
   acknowledge the vision and support of these two organizations in the
   development of this manual
   and its companion workshop.

   For many years, AAAS has worked with national and local
   community-based organizations to
   design and deliver enrichment activities for use in out-of-school
   science programs. Beginning in
   1993, the SLIC initiative focused these efforts in three diverse U.S.
   cities (Chicago, IL; Rapid
   City, SD; and Rochester, NY). Recognizing that evaluation is essential
   for quality programming
   and continued funding, AAAS commissioned Horizon Research, Inc. to
   develop materials to
   guide community-based organizations through the evaluation of their
   informal science activities.
   However, in the course of writing the manual, it became clear that the
   need for a fundamental
   grounding in program evaluation is not limited to organizations that
   provide out-of-school science
   activities.

   And so, this manual evolved into what we hope is a more broadly useful
   guide to program
   evaluation. To the extent that we have succeeded, we owe much to our
   reviewers, who include
   members of the SLIC National Planning Council and their colleagues, as
   well as grantmakers from
   national and local foundations: DeAnna Beane, Association of
   Science-Technology Centers;
   Janet Carter, Bruner Foundation; Stacey Daniels, Kauffman Foundation;
   Catherine Didion,
   Association for Women in Science; Hyman Field, National Science
   Foundation; Sandra Garcia,
   National Council of La Raza; Maritza Guzman, DeWitt Wallace-Reader's
   Digest Fund; Barbara
   Kehrer, Marin Community Foundation; Roger Mitchell, National Science
   Foundation; Nancy
   Peter, Academy of Natural Sciences; Annie Storer, American Association
   of Museums; Ellen
   Wahl, Education Development Corporation; and Faedra Weiss, Girls, Inc.

   Other important "reality checks" were provided by AAAS and SLIC staff,
   most notably: Betsy
   Brauer, Rochester SLIC; Yolanda George, AAAS; Gerri Graves, AAAS;
   Michael Hyatt, Chicago
   SLIC; and Margie Rosario, Rapid City SLIC.

   Finally, we would like to thank Shirley Malcom, head of the AAAS
   Directorate for Education and
   Human Resources Programs, who recognized the need for this resource
   and provided us with the
   time and space to do it right.


   Chapter One
   WHY USE THIS MANUAL?
   * Do you want information that will help improve your organization's
     programs?
   * Are your sponsors asking about the quality and impact of the
     programs they fund?
   * Are you applying for a grant that requires an evaluation plan?

   If you answered "Yes" to any of these questions, then this manual can
   help. It is a practical guide to program evaluation written for
   community-based organizations (CBOs). It provides information that
   you can put to use now to help improve your programs.

   This manual focuses on internal evaluation, that is, program
   evaluation conducted in-house by CBO staff. We have taken this
   approach for one simple reason: many CBOs cannot afford to hire
   someone outside the organization to evaluate their programs, but they
   still need the kinds of information that evaluation can provide.

   The information in this manual should better prepare you to design
   and carry out a program evaluation. And because the field of
   evaluation is now putting greater emphasis on participatory
   evaluation (a middle ground between internal and external
   evaluation), you will be able to apply your knowledge either within
   your own organization or in working with an external evaluator. This
   manual will also help you recognize when you might need an external
   evaluator and the advantages of using these services, should your CBO
   be able to afford them at some point.

   Here are some assumptions that we made about you as we wrote this
   manual:

   * You care about kids and communities.
   * Your organization is committed to providing the best services
     possible.
   * You have some experience running or participating in a CBO program,
     so you have an idea of how to get things done.
   * You want to evaluate a program, not the people who run it or
     participate in it.
   These shared qualities aside, we realize that CBOs come in all shapes
   and sizes. Some have full-time staff and annual program budgets
   exceeding $100,000; others spend less than $5,000 per program and
   rely almost entirely on volunteers. Community-based organizations
   also range widely in their goals, from teaching new information or
   skills, to strengthening families, to enhancing students' educational
   and career options.


   This manual is designed to help a wide variety of
   organizations, whatever your goals or resources.

   What's In This Manual?
   Chapters 2 through 7 include basic information on evaluation concepts and
   techniques. Ideally, everyone
   who picks up this manual will read these chapters for some background
   in program evaluation.

   * Chapter 2 talks about what evaluation can do for your programs and
     describes two types of evaluation: formative and summative.

   * Chapter 3 discusses the importance of documenting needs and
     context, and identifies some important first steps in planning your
     evaluation.

   * In Chapter 4, we distinguish between program goals, objectives,
     indicators, and outcomes, and discuss their role in evaluation.

   * Chapter 5 talks about using quantitative and qualitative data to
     evaluate progress and impact.

   * Chapter 6 describes how to collect information to evaluate your
     programs through document review, observations, interviews, and
     surveys.

   * Chapter 7 provides tips for organizing, interpreting, and reporting
     the evaluation data that you collect.

   Overview of the Evaluation Process

   [Flowchart showing four stages, each with its activities:]

   Framing the Evaluation
   * Identifying needs
   * Documenting context
   * Taking stock of available resources
   * Designing program strategies

   Defining Goals and Objectives
   * Choosing goals that are consistent with needs
   * Defining objectives
   * Generating evaluation questions
   * Selecting indicators and outcomes

   Finding the Evidence
   * Expanding the evaluation plan
   * Looking at records and documents
   * Observing program activities
   * Interviewing people
   * Conducting surveys

   Making Sense of the Evidence
   * Looking for themes
   * Interpreting data
   * Putting it together
   * Reporting your results

   The remaining chapters of this manual show how to apply this
   information, with examples of how
   evaluations might differ for programs with varying levels of
   resources. Chapter 8 takes you
   through a simple evaluation of a small program in a fictional CBO.
   Chapter 9 describes how the
   same CBO enlarged the evaluation when the program was expanded. We
   have also included
   sample evaluation plans and instruments that can be adapted for use in
   your own programs.


   The appendices include examples of three types of reports that
   present evaluation information:

   * Appendix A is an example of a final evaluation report that
     describes the impact of the small-scale program described in
     Chapter 8.

   * Appendix B illustrates a proposal for expanding the scope of this
     program as described in Chapter 9.

   * Appendix C models an annual progress report that describes the
     formative evaluation of the multi-year program described in
     Chapter 9.

   A Glossary of Terms is included at the end of the manual. Throughout
   the manual, words and
   terms that are shown in bold italics are defined in this glossary.

   Finally, we have tried to make this manual accessible to a wide range
   of audiences. As an overview, it takes a relatively traditional
   approach to evaluation, providing information on fundamental concepts
   and activities and how these can be applied. However, in practice the
   field of evaluation is far more complex than we have described it
   here. Using this guide as a basic introduction, we recommend the
   following resources to help you expand your knowledge and
   understanding of program evaluation.

   Assess for Success: Needs Assessment and Evaluation Guide, © 1991
   Girls Incorporated
   30 East 33rd Street
   New York, NY 10016

   Leadership Is: Evaluation with Power, © 1995
   by Sandra Trice Gray
   Independent Sector
   1828 L Street, NW
   Washington, DC 20036

   Measuring Program Outcomes: A Practical Approach, © 1996
   United Way of America
   701 North Fairfax Street
   Alexandria, VA 22314

   User-Friendly Handbook for Project Evaluation: Science, Mathematics,
   Engineering and
   Technology Education
   by Floraline Stevens, Frances Lawrenz, Laure Sharp
   National Science Foundation
   4201 Wilson Blvd.
   Arlington, VA 22230



   Chapter Two
   WHY EVALUATE?
   To evaluate something means literally to look at, and judge, its
   quality or value. A CBO might
   evaluate individual employees, its programs, or the organization as a
   whole. When you evaluate a
   person's performance, you try to find out how well she carries out her
   responsibilities. When you
   evaluate a program, you want to know how far the program went in
   achieving its goals and
   objectives. And when you evaluate an organization, you ask how well it
   operates to achieve its
   larger organizational mission. Evaluation involves the collection of
   information that helps you to
   make these judgments fairly.

   This manual focuses exclusively on program
   evaluation. Why is program evaluation so
   important?

   * First, it generates information that can help you to improve your
     programs.

   * Second, it can demonstrate to funders and others the impact of your
     programs.

   In the past, evaluation was often used only to measure performance.
   Based on information gathered in a final, summative evaluation,
   further funding decisions were made. Programs were continued or
   discontinued depending on the results of the evaluation.

   Luckily, program staff and funders have begun to expand their view of
   evaluation and appreciate
   its potential for program improvement. Through ongoing, formative
   evaluation, you and your
   sponsors can gain insight into how well your program is performing and
   what adjustments may be
   necessary to keep it on track.

   More about Formative Evaluation
   Formative evaluation can help you determine how your program is doing
   while it is in progress, or
   taking form. The information you collect can help you make changes in
   your program and correct
   problems before it's too late! Formative evaluation can also help you
   identify issues of interest
   that you might not have thought about when planning your program. And,
   it can help shape and
   refine your data collection activities.


   Formative Evaluation
   (Provides information as a program takes form)

   * Monitors progress toward objectives
   * Provides information to improve programs
   * Helps identify issues of interest
   * Helps refine data collection activities
   * Helps clarify program strengths and limitations

   Information from a variety of sources (such as participants,
   instructors, and parents) can tell you how a program is progressing.
   For example: Do students like the program? Are staff and participants
   satisfied with the activities? What changes are needed to improve the
   program?

   The people involved with your programs should be consulted during the
   evaluation planning stage, and as often as your resources permit
   during program implementation. Let participants know that their
   opinions are important, and provide them with opportunities to share
   their views. With their input, you can improve your programs and
   increase the likelihood that you will achieve positive results. Even
   programs that have been successful for a long period of time benefit
   from suggestions and comments. This formative evaluation feedback can
   help good programs become even better.

   Pinpointing Problem Areas:
   Getting Formative Feedback

   Youth Action Today! was in the third year of providing
   three-day summer camps for middle school students and
   their high school mentors. Interest in the camp had
   steadily increased among sixth and seventh graders, with
   enrollment rising each year. But pre-registration this spring
   showed fewer eighth graders were signing up. Thinking
   fast, program staff met with several small groups of eighth
   graders, who had attended the camp when they were
   younger, to see if they knew what the problem was.
   Students told the staff that word was out that camp activities were
   "babyish" and that the camp wasn't "cool" enough for older kids. With
   this feedback, program staff revamped the eighth grade activities to
   provide more opportunities for interacting with the high school
   mentors. In addition, they engaged in a publicity campaign through
   eighth grade teachers and parents to talk about how the camp would be
   different this year and more appealing. Their efforts paid off as
   eighth grade registration increased for the day camp.


   More about Summative Evaluation
   Summative evaluation differs from formative evaluation in two
   important ways: purpose and timing. Ongoing, formative evaluation
   helps monitor progress as programs are occurring. Summative
   evaluation occurs when you are summing up what you have achieved.
   This can occur at the end of the program, or at appropriate "break
   points" during the implementation of an ongoing or multi-year
   program.

   Planning for Summative Evaluation
   What are you trying to achieve? What do you want
   your participants to know or be able to do when
   they have finished your program (that is, what are
   your goals and objectives)?

   How will you know whether or not you have
   achieved what you intended? What evidence will
   convince you? What evidence will convince your
   funder?

   Summative evaluation helps you determine if you achieved what you and
   your sponsor set out to
   do. To understand what your program achieves, however, you have to
   know where you began.
   This is why it helps to collect baseline information before, or very
   soon after, a program begins.
   Baseline questions might include:

   * How serious is a particular problem or need among children who will
     participate in your program?

   * What behaviors, interests, or skills do the children have at the
     start of the program?

   The amount of baseline information you collect will depend on your
   level of resources. For example, you may be limited to asking
   participants about their attitudes or behaviors. Or you may have the
   resources to gain a fuller picture by also asking parents and
   teachers about participants' needs, interests, and skills.

   Collecting summative information allows you to find out how well the
   program achieved what it
   set out to do. Have children's skills or interest levels increased
   because of the program? What
   parts of the program appear to have contributed most (or least) to the
   participants' success? If
   you did not achieve what you intended, how do you account for this?
   What should you do differently next time?


   Summative Evaluation
   (Provides information for summing up at the end of a program)

   Baseline Information:   Participant skills, behaviors, and attitudes
                           before the program
   Summative Information:  Participant skills, behaviors, and attitudes
                           after the program

   In this chapter, we have distinguished between formative and summative
   evaluation in terms of
   tracking progress and gauging impact. Both kinds of information are
   important for improving
   programs, for determining whether programs are successful, and for
   illustrating the success of
   programs to others.

   While it is important to grasp the difference between formative and
   summative evaluation, it is
   equally important to think of these activities as part of an ongoing
   evaluation process, not as
   distinct categories. Data collected while the program is in progress
   can be used in the summative
   evaluation to gauge impact. Similarly, information collected at the
   end of a program can be used
   in a formative way for designing an improved or expanded program or
   new programs with similar
   goals.

   Why Evaluate?

   To generate information that can help you to improve your programs
   by:

   * Monitoring progress toward program objectives
   * Identifying issues of importance to program participants
   * Refining data collection activities

   To demonstrate the impact of your programs to funders and other
   potential supporters by:

   * Assessing progress toward program goals
   * Documenting the quality of your programs and describing the effects
     on participants
   * Quantifying the amount of change experienced by program
     participants

   Now that we have discussed the main reasons for doing evaluation, we
   can begin to explore the program design and evaluation process. The
   first step, identifying needs and documenting context, is described
   in Chapter 3.




   Chapter Three
   GETTING STARTED: FRAMING THE EVALUATION
   Documenting Context and Needs

   [Flowchart of the four evaluation stages, with "Framing the
   Evaluation" highlighted. Activities for this stage:]
   * Identifying needs
   * Documenting context
   * Taking stock of available resources
   * Designing program strategies

   Evaluation planning should begin at the same time you are thinking
   about the design of your
   program. But how do you get started? What do you need to think about
   in the early stages of
   program and evaluation planning?

   You start the process by clarifying what needs you are trying to
   address, who your audience will
   be, and the setting, or context, in which your program will operate.

   Early Program and Evaluation Planning

   * What needs are you trying to address?
   * How are these needs best identified?
   * Who is your targeted audience?
   * What factors will influence levels of participation and program
     success?

   Setting the Stage for Evaluation: Documenting Context and Needs
   Documentation is an important piece of the evaluation puzzle. It
   involves describing (rather than assessing) conditions, events, or
   people to help gain a better understanding of the context in which a
   program occurs. For example, what are the socioeconomic and
   demographic characteristics of the community and the targeted
   audience? How might these factors, and others, affect how you
   implement your program?

   Knowing the finer details of context is also crucial for program and
   evaluation design. For example, lack of transportation may deter
   students from staying after school for a tutoring program,


   which in turn will influence program success. In a case like this,
   program planning would include
   working with school administrators to arrange for a later bus
   departure, or rescheduling sessions
   earlier in the day.

   Initial documentation activities often focus on the identification of
   needs, or needs assessment.
   Information gathered before a program is planned or implemented can
   help staff to identify needs,
   identify appropriate target populations for program services, and
   customize their program design
   to meet specific needs. Collecting this kind of information can also
   help you justify your program
   to your community and to potential funders.

   There are many ways to document needs. You can attend community and
   church meetings to learn about the concerns of neighborhood
   residents. You can informally survey human services personnel to find
   out what needs they see in your community. And you can conduct
   interviews or focus groups with parents, teachers, or students in
   your community. Identifying and documenting the needs identified by
   people who live and/or work in your community helps to lay the
   groundwork for program and evaluation design.

   Thinking Like an Evaluator

   As an experienced program designer, you know what questions to
   consider next:

   * What strategies will enable me to address the needs I've
     identified?

   * What resources do I have to work with, including funds, personnel
     (paid and volunteer), and in-kind contributions of facilities and
     equipment?

   * Given the level of resources available to me, which of the possible
     strategies can I implement well?

   Now, in order to design a good evaluation plan, you need to start
   thinking like an evaluator. In order to do that, you must translate
   the needs you've identified into realistic goals and objectives. This
   is the subject of Chapter Four.

   Needs Assessment and Baseline Data

   There's an important connection between summative evaluation and the
   documentation of context and needs. When we described summative
   evaluation in Chapter Two, we talked about the importance of
   comparing baseline data (information gathered prior to program
   implementation) with data collected at various break points during,
   or in the final phase of, a program. Data collected for needs
   assessment purposes may also be used as baseline data.

   Once you have collected data which adequately describe the context
   and needs of your target population at the beginning of your program,
   you can plan to collect the same kinds of descriptive information at
   the end of your program. One way to evaluate the effectiveness or
   impact of your program is then to compare baseline and summative
   data. What has changed as a result of your efforts?
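
   To make this comparison concrete, here is a minimal sketch in Python
   of how baseline and summative figures for the same measures might be
   tabulated. The measures and numbers are invented for illustration;
   they are not from the manual.

   # Sketch: comparing baseline (needs assessment) data with summative
   # data collected at the end of the program. All values below are
   # hypothetical examples.

   TOTAL_PARTICIPANTS = 40

   baseline = {
       "students reading at grade level": 12,
       "students reporting interest in science": 15,
   }

   summative = {
       "students reading at grade level": 21,
       "students reporting interest in science": 28,
   }

   for measure, before in baseline.items():
       after = summative[measure]
       print(f"{measure}: {before}/{TOTAL_PARTICIPANTS} at baseline, "
             f"{after}/{TOTAL_PARTICIPANTS} at program's end "
             f"(change: {after - before:+d})")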




   Chapter Four
   WHAT ARE YOU TRYING TO DO?
   Defining Goals and Objectives

   [Flowchart of the four evaluation stages, with "Defining Goals and
   Objectives" highlighted. Activities for this stage:]
   * Choosing goals that are consistent with needs
   * Defining objectives
   * Generating evaluation questions
   * Selecting indicators and outcomes

   One of the most important evaluation questions you can ask is, "What
   do I expect to accomplish
   through this program?" Another way to phrase this is:

   "What are my goals and objectives?"
   The answer to this question will influence how you design your program
   and your evaluation.
   If you were to look up the words "goal" and "objective" in the
   dictionary, you might find them used to define each other. But in
   program design and evaluation, the terms goal and objective are used
   for different things. A goal is what you hope to accomplish when your
   program is completed; it is broader than an objective. An objective,
   on the other hand, refers to a more specific event or action that has
   to occur before you can achieve your goal.

   Given the complexity of the problems that CBO programs typically
   address, it is important to be
   realistic about which part(s) of a long-term goal or problem you can
   successfully tackle through a
   single program.

   What is Realistic? Breaking Down Goals
   CBO program goals are sometimes as broad and ambitious as the
   organization's mission, or reason for existing. For example, your
   organization's mission may be to prepare the youth of your community
   for future employment. There are many ways that you might accomplish
   this mission: through educational programs, leadership development
   programs, or job skills programs. Community members or potential
   participants may have ideas about appropriate strategies. But how do
   you decide on a plan for a specific program? One way to identify
   possible


   objectives is to think about your goal as a problem to be solved. As
   you break the problem down,
   you can see that there are many possible objectives that must be
   achieved in order to truly
   accomplish your goals.

   For example, given that your mission is to prepare youth for future
   employment, you might choose to pursue the following goal:

   "Prepare youth to enter science- and mathematics-related fields"

   What kinds of experiences would help to prepare children for careers
   in these fields? Here are some ideas:

   * Elementary school students need exposure to good science and
     mathematics enrichment activities in order to develop their
     interest in these subjects and to enhance what they learn in
     school.

   * Middle school students need to spend time with role models or
     mentors who can advise them on appropriate ways to prepare for a
     specific field and provide them with some meaningful experiences in
     that field.

   * Middle and high school students need to experience high-quality
     tutoring in key areas, such as Algebra and Chemistry, which are
     useful in many science- and mathematics-related fields.

   * High school students need access to appropriate guidance services
     to help them identify post-secondary programs that suit their needs
     and interests in science and mathematics.

   The objectives of your career preparation program will then be to
   provide one or more of these
   experiences or services to the youth you serve. It is important to
   remember, however, that these
   objectives represent just a few of the options a CBO might use to
   address this particular goal, and
   that other objectives might be equally valid. In other words, there
   is no fixed set of "correct" objectives for meeting a selected goal.

   Being Realistic: Separating Goals from Objectives

   Let's say the goal of your program is to reduce the school drop-out
   rate. This goal could be addressed in many different ways. Based on
   your experience and the resources available to you, you and your
   colleagues decide that a realistic objective for this program is to
   provide mentors for middle and high school students who are at risk
   of school failure.

   You believe that achieving your objective (providing students with
   positive, one-on-one relationships with caring peers or adults) will
   decrease participants' tendency to engage in self-destructive
   behaviors, and will stimulate their interest in school: first steps
   toward addressing your long-term goal of reducing the drop-out rate.
   With your objective in mind, you design program activities that you
   feel will support positive mentoring relationships.


   [Figure: Relationship between mission, goal, and objectives. The
   organizational mission sits at the top; a program goal supports the
   mission; program objectives (Objective 1, Objective 2) support the
   goal.]

   Working Out An Evaluation Plan
   Now that you have identified your goals and objectives, you can begin
   framing formative evaluation questions in terms of progress toward
   your objectives, and summative evaluation questions in terms of
   impact on your goals.

   Using the example of the program to prepare youth for future
   employment in science- and mathematics-related fields, your
   objectives are (1) to provide elementary age students with high
   quality science and mathematics activities outside of school, (2) to
   develop their interest in science and mathematics, and (3) to build
   on the science and mathematics that these students are learning in
   school. What evaluation questions will help you determine if you are
   making progress toward these three objectives? Using a chart like the
   one that follows might help you visualize how the evaluation design
   will take shape.


   Developing an Evaluation Plan

   Mission: To prepare the youth of our community for future employment

   Goal: To prepare youth to enter science- and mathematics-related
   career fields

   Objectives:
   a) To expose elementary students to good science and mathematics
      activities
   b) To develop students' interest in science and mathematics
   c) To enhance the science and mathematics that students learn in
      school

   Sample Formative Questions (related to objectives):

   * What do students think of the mathematics and science activities
     that we provide?

   * How do students demonstrate genuine interest in science and
     mathematics?

   * How are students using the science and mathematics they learn in
     school as they participate in our activities?

   Sample Summative Questions (related to goals):

   * How does students' interest in science- and mathematics-related
     careers compare before and after the program?

   * What steps have students taken on their own to find out more about
     science- and mathematics-related careers?

   You will undoubtedly come up with many evaluation questions as you
   try to develop a similar plan for your own programs. Some of your
   questions will be very specific, like "Did students appear to be
   interested in the nature hike?" Other questions will be more general,
   like the ones in the preceding box. Whatever your questions are,
   grouping them in terms of your goals and objectives will help you to
   organize your thoughts and to identify gaps in your evaluation plan.
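
   Purely as an illustration (the structure and field names here are
   ours, not the manual's), such a grouping can be kept in a simple
   nested structure, as in this minimal Python sketch:

   # Sketch: an evaluation plan held as a nested structure that mirrors
   # the "Developing an Evaluation Plan" chart above. Field names are
   # illustrative, not from the manual.

   plan = {
       "goal": "Prepare youth to enter science- and mathematics-related "
               "career fields",
       "objectives": {
           "a": "Expose elementary students to good science and "
                "mathematics activities",
           "b": "Develop students' interest in science and mathematics",
           "c": "Enhance the science and mathematics that students "
                "learn in school",
       },
       # Formative questions, grouped by the objective they track.
       "formative_questions": {
           "a": ["What do students think of the mathematics and science "
                 "activities that we provide?"],
           "b": ["How do students demonstrate genuine interest in "
                 "science and mathematics?"],
           "c": [],  # a gap: no question yet for objective (c)
       },
       # Summative questions relate back to the goal.
       "summative_questions": [
           "How does students' interest in science- and "
           "mathematics-related careers compare before and after "
           "the program?",
       ],
   }

   # Grouping questions this way makes gaps easy to spot:
   for key, objective in plan["objectives"].items():
       if not plan["formative_questions"].get(key):
           print(f"No formative question yet for objective ({key}): "
                 f"{objective}")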

   How Will You Know When You Get There? Measuring Progress and Impact
   Thinking through your evaluation questions in terms of the goals and
   objectives you have defined provides the foundation for your
   evaluation plan. The next step is equally important: deciding what
   kinds of evidence will convince you and your funders that your
   program is a success. What do you expect to see, hear, or measure if
   your program is successful at achieving your objectives and
   ultimately your goals?

   In the formative evaluation stage, while a program is in progress, we
   look for intermediate indicators: what you expect to see if you are
   progressing toward your objectives. In the early career preparation
   program described above, intermediate indicators might include:

   * Parents reporting that the students talk enthusiastically about
     program activities while at home.

   * Students asking questions that indicate they are linking science
     and mathematics concepts with their everyday lives.

   * Science and mathematics teachers reporting that students refer to
     program experiences during classroom discussions.

   In the summative stage of the evaluation, when the program is
   completed, we look for evidence of final program outcomes. These are
   the changes you expect to see if your program has actually achieved
   its goals. Once again using the early career preparation program as
   the example, you might expect outcomes such as the following:

   * When asked to list jobs that interest them, more students mention a
     science- or mathematics-related field after the program than when
     asked this question at the beginning of the program.

   * Over the course of the program, at least half of the participants
     check out library books related to science and mathematics
     professions.

   [Figure: Relationship between mission, goals, objectives, indicators,
   and outcomes. Under the organizational mission, each program pursues
   a goal (Goal 1, Goal 2); each goal is supported by objectives
   (Objectives 1-3); each objective has intermediate indicators; and
   each goal leads to final program outcomes.]

   The figure above illustrates the interrelationships between
   organizational mission, program goals,
   objectives, indicators, and outcomes. In Chapter Five, we briefly set
   aside our discussion of the
   evaluation process in order to explore in more depth the different
   kinds of information that can be
   used to define indicators and outcomes.




   Chapter Five
   FINDING THE RIGHT MIX
   Using Quantitative and Qualitative Data

   How will you know whether you are achieving your objectives and
   making progress toward your goals? What counts as evidence of
   progress and impact? Though simplifying a bit, it's convenient to
   think of measuring progress and impact in terms of quantitative and
   qualitative data.

   What are Quantitative Data?
   Information that is measured and expressed with numbers can provide
   quantitative data. For example, attendance records can show the
   number of persons who participate over a period of time; surveys can
   show the percent of participants who respond to a question in a
   certain way. These quantitative data can be used in a variety of
   ways. To name just a few, they can be presented as numbers or
   percents, as ranges or averages, and in tables or graphs. They can
   also be used to compare different groups of participants: girls and
   boys, students of different socioeconomic or ethnic backgrounds, or
   students in your program with non-participants.

   To illustrate different ways to present quantitative data, let's go
   back to the mentoring/dropout prevention program that we first
   described in the box in Chapter Four. In this example, the 15 middle
   school students (7 girls and 8 boys) and 25 high school student
   participants (10 girls and 15 boys) were asked to fill out a
   questionnaire at the end of the school year. The following tables and
   graphs illustrate several ways to present the same questionnaire
   results.

   As numbers, combining the results for all of the program
   participants:

   End-of-Year Survey

   Response on Questionnaire                      Number responding
                                                  Agree/Strongly Agree

   I look forward to meetings with my mentor.              38
   I think my mentor cares about me personally.            38
   I understand my school work better when my
     mentor helps me.                                      23

   Total Number of Participants                            40

   As percentages, separating middle school from high school:

   End-of-Year Survey

                                                  Percentage responding
                                                  Agree/Strongly Agree
   Response on Questionnaire                   Middle School  High School

   I look forward to meetings with my mentor.      100            92
   I think my mentor cares about me personally.     87           100
   I understand my school work better when my
     mentor helps me.                               67            52

   Total Number of Participants                     15            25


   You might also choose to present some of the information graphically
   to help make a point that might be difficult to see in a table. Here,
   the graph shows that the boys responded quite differently from the
   girls to one specific question:

   [Figure: bar graph titled "Students Reporting They Understood School
   Work Better with the Mentor's Help," comparing girls and boys at the
   middle school and high school levels, with a vertical axis running
   from 0% to 100%.]

   Notice how each of these examples has highlighted a different aspect
   or detail in the questionnaire results. We went from looking at the
   results for all participants, to comparing results for middle and
   high school participants, and finally to comparing results for boys
   and girls at the middle and high school levels.
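
   If you keep raw tallies, percentages like these take only a few lines
   of arithmetic. Here is a minimal Python sketch of ours; the per-group
   counts below are inferred from the totals and percentages reported in
   the tables above, and are not stated separately in the manual.

   # Sketch: turning raw survey tallies into the percentages shown
   # above. Group sizes come from the mentoring program example
   # (15 middle school and 25 high school participants).

   group_sizes = {"Middle School": 15, "High School": 25}

   # Number responding Agree/Strongly Agree to each questionnaire item.
   agree_counts = {
       "I look forward to meetings with my mentor.":
           {"Middle School": 15, "High School": 23},
       "I think my mentor cares about me personally.":
           {"Middle School": 13, "High School": 25},
       "I understand my school work better when my mentor helps me.":
           {"Middle School": 10, "High School": 13},
   }

   for item, counts in agree_counts.items():
       total = sum(counts.values())
       print(f"{item}  (all participants agreeing: {total})")
       for group, n in counts.items():
           pct = round(100 * n / group_sizes[group])
           print(f"  {group}: {pct}%")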

   What are Qualitative Data?
   Evaluators also look at progress and impact in terms of qualitative
   data, where changes are more
   often expressed in words rather than numbers. Qualitative data are
   usually collected by document
   review, observations, and interviews. Open-ended questions on surveys
   can also generate
   qualitative data.

   Qualitative data can provide rich descriptions of program activities,
   context, and participants' behaviors. For example, we can assess the
   impact of the mentoring/dropout prevention program on students'
   relationships with their mentors by describing how well the
   student-mentor pairs interact before and after the program.


   Example of Qualitative Data: Observations of Program Activities

   Student behaviors during the first week of a program

   At a "Get Acquainted" bowling party, student/mentor pairs grouped
   themselves into two pairs per alley. In some cases, the youths spent
   most of the time talking together, not mingling with the adults. In
   two cases, the youths left the bowling area to play video games.
   Several adults appeared hesitant to break into the youthful
   conversations; in most cases, the adults sat and conversed
   separately.

   Several of the youths bowled a game or two with their mentor, but
   appeared uncomfortable with the adult, and uneasy about approaching
   other youths who were engaged in conversations. These students seemed
   bored and distracted.

   Student behaviors during the last week of a program

   At a "Welcome Summer" picnic, students and mentors appeared quite
   comfortable with each other. Most students chose to sit near their
   mentors at picnic tables. All the students appeared at ease talking
   with their mentors, and in many cases, talking to other adults
   sitting nearby. No one appeared bored or hesitant to join in
   conversation.

   After eating, mixed groups of adults and students played volleyball
   and softball, with everyone actively participating. Interactions were
   relaxed and enthusiastic. Students and mentors appeared to enjoy the
   opportunity to be together.

   Qualitative data can also be expressed in numbers. For example,
   interview responses can be
   tallied to report the number of participants who responded in a
   particular way. Similarly, in the
   example above, the observer could report the number of students in the
   entire group who were
   actively engaged in the activity.
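
   As a small illustration (ours, not the manual's), tallying coded
   responses takes just a few lines with Python's standard library; the
   response categories below are invented examples:

   from collections import Counter

   # Sketch: tallying interview responses that have been coded by hand.
   # The categories are invented, not from the manual.
   responses = [
       "more confident", "more confident", "no change",
       "more confident", "enjoys school more", "no change",
       "enjoys school more", "more confident",
   ]

   tally = Counter(responses)
   for answer, count in tally.most_common():
       print(f"{count} participants said: {answer}")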

   Seeing Quantitative and Qualitative Data as Indicators and Outcomes
   To further illustrate quantitative and qualitative data, let's return
   to the mentoring program discussed earlier. The goal of the program
   is to reduce the school drop-out rate. The objective is to provide
   positive role models and mentors for at-risk middle and high school
   students.

   Formative Evaluation: While your program is underway, how will you
   know that you are building mentoring relationships that are having a
   positive impact on students' behavior?

   The number of students who engage in weekly activities with their
   mentors is one possible quantitative, intermediate indicator. Using
   this information, you might reason that steady or increased
   participation means that students enjoy the activities and find the
   new relationships rewarding.


   Fewer disciplinary reports involving participating students mid-way
   through the program might also suggest progress.

   A change in students' behavior, as reported through teacher
   interviews, is a possible qualitative, intermediate indicator.
   Teachers might note that participating students are less hostile and
   more motivated since the program began. These qualitative data might
   suggest a change in students' attitudes toward themselves and others
   in authority.

   Summative evaluation: How will you know that building positive
   mentoring relationships has helped produce behavior conducive to
   students staying in school?

   As baseline data, you compiled data on the number of disciplinary
   reports and suspensions among your participants before the program
   began. Your summative data (the same data for participants at the end
   of each year of your program) might show a leveling off or decline in
   these numbers. This would be a quantitative, final program outcome.

   Your observations, or parents' and teachers' descriptions of
   students' behavior, both before and after the program, can provide
   summative qualitative data. A description of behavior in and out of
   school that provides evidence of more interest and motivation is a
   possible qualitative, final program outcome.

   Program to Reduce the Drop-out Rate

                  Quantitative Outcomes        Qualitative Outcomes

   Intermediate   Number of students who       Quality of students'
   Indicators     engage in activities with    interactions with others
                  mentors stays the same or    shows improvement during
                  increases over course of     program.
                  program.

   Final          Number of suspensions/       Quality of students'
   Outcomes       discipline reports           interactions in and out
                  decreases among              of school consistently
                  participants by program's    improves by program's
                  end.                         end.

   The following figure summarizes where we are now in the evaluation
   design process. In the next chapter, we resume our discussion of the
   evaluation process by focusing on methods for collecting quantitative
   and qualitative data.


   [Figure: a summary flowchart of the evaluation design process.
   What do you want to accomplish? -> Goals, Objectives
   How do you plan to accomplish it? -> Strategies, Activities
   How will you know whether you've accomplished it? -> Outcomes,
   Indicators
   Types of data to "measure" progress and impact -> Quantitative,
   Qualitative]

   A Final Word About Quantitative and Qualitative Data
   Collecting both quantitative and qualitative data in your formative
   and summative evaluation is
   important, but is not always possible. For example, many positive
   outcomes do not have tests or
   scales associated with them, so a number cannot be assigned to measure
   progress or success. In
   these cases, qualitative data may prove more useful, since they allow
   you to describe outcomes
   with words. Qualitative data can also be highly useful for clarifying
   what you think is important,
   and for discovering new issues that you might have overlooked in your
   initial evaluation design.

   On the other hand, collecting and using qualitative data is often
   time-consuming and labor-intensive.
   As a general rule, you will want to use the measures (quantitative or
   qualitative) that
   are most feasible in terms of your skills and resources, and most
   convincing to you and your
   sponsors.




   Chapter Six
   FINDING THE EVIDENCE
   Strategies for Data Collection

   [Flowchart of the four evaluation stages, with "Finding the
   Evidence" highlighted. Activities for this stage:]
   * Expanding the evaluation plan
   * Looking at records and documents
   * Observing program activities
   * Interviewing people
   * Conducting surveys

   So far, you have defined goals and objectives for your program, and
   you have thought about the
   kind of evidence you need to measure progress and impact. You would
   like to collect some
   baseline data to compare with the summative data you collect at the
   end of the program. And you
   know that you want to collect both quantitative and qualitative data
   as evidence for your
   intermediate indicators and final program outcomes. But how do you
   actually get the
   information that you need?

   Measuring progress and impact basically means collecting and
   interpreting information. Before
   you decide how to collect this information, it is important to have a
   clear idea of what you are
   trying to learn. While it may be tempting to try and capture every
   facet of change occurring
   among youth in your program, being clear on the purpose of your
   evaluation can help keep data
   collection more manageable. For example, if you are trying to measure
   problem-solving abilities,
   your questionnaire does not need to ask students about their attitudes
   towards mathematics.

   Be clear about what you want to find out.
   Sticking to these areas of interest and avoiding
   unnecessary data collection will keep your
   evaluation focused.


   At this stage in designing your evaluation, think about your program
   activities, possible sources of information (e.g., students, parents,
   and teachers) about how well these activities are working, and
   different ways to collect information from each of these sources.

   There are four basic ways to collect evaluation data: document review,
   observations, interviews,
   and surveys. Using a combination of these methods will help you to
   check your findings. And
   your evaluation will be more convincing if you can refer to more than
   one information source and
   method of data collection (such as interviewing students and surveying
   parents) to support your
   statements or conclusions.

   What Records and Documents Can Tell You
   Written documents and records can reveal things about people's
   behavior and about the context
   in which your program occurs. Such records may already exist somewhere
   or you may create
   customized records to meet your evaluation needs. In either case,
   records and documents can
   provide you with some fairly reliable information about program
   participants, and about the
   evolution of a particular issue or program over time.

   Creating your own records can be a cheap and easy way to collect
   information and to make sure
   that you get the information you want about your participants and the
   impact of your program.

   Examples of Records and Documents

   Existing Records/Documents:
   * School attendance records
   * Report cards
   * Extracurricular activity records
   * Arrest records

   Created Records/Documents:
   * Program attendance sheets
   * Participant information sheets
   * Library checkout lists
   * Participant journals or portfolios

   How might a CBO use specially-created forms? Simple forms completed
   on the first day of the program can provide vital information about
   participants, including name, race or ethnicity, gender, and age.
   This demographic information is important to determine if the program
   served the intended target audience (for example, middle school
   girls).

   An attendance sheet is another easily-created form that can help
   measure program success; information from these forms may indicate
   steady or growing participation, suggesting program popularity. A
   program aimed at improving attitudes toward science and mathematics
   might devise a form to keep track of the number of
   science/mathematics-related library books checked out by program
   participants. An increase in the number of books checked out may
   indicate growing interest in and appreciation for science and
   mathematics.
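
   As one sketch of how such a created record might be analyzed (the
   weeks and names below are invented, and Python is our choice of tool,
   not the manual's), attendance counts and week-to-week changes can be
   computed directly from the sheet:

   # Sketch: tallying a program attendance sheet to see whether
   # participation is steady or growing. Names and weeks are invented.

   attendance = {
       "Week 1": ["Ana", "Ben", "Cora", "Dev"],
       "Week 2": ["Ana", "Ben", "Cora", "Dev", "Eli"],
       "Week 3": ["Ana", "Cora", "Dev", "Eli", "Fay", "Gus"],
   }

   previous = None
   for week, names in attendance.items():
       count = len(names)
       trend = "" if previous is None else f" ({count - previous:+d})"
       print(f"{week}: {count} participants{trend}")
       previous = count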


   Existing records can also provide useful evaluation information. For
   example, school records of student participation in extracurricular
   activities may indicate increased motivation and interest. But be
   aware that you may not always get permission to look at the documents
   that interest you. Access may require the cooperation of people
   outside your organization, and getting permission can often be
   tricky. This is often a problem with report cards. Singling out and
   checking program participants' records (from the hundreds on file at
   a school) can also be time-consuming.

   Given these obstacles, you might be able to get the same information
   with a more ingenious strategy. While access to report cards through
   the schools may be difficult to attain, it might be relatively simple
   to get parental permission for students to bring in their report
   cards, and to encourage participants to do so with small incentives
   such as inexpensive or donated prizes. In general, however, because
   accessibility varies tremendously, it is a good idea to inquire about
   the availability of certain records before you decide to rely on them
   in your evaluation.

   Considering Different Types of Records

   Existing Records
     Advantages:
     * May provide good information about student behaviors
     Disadvantages:
     * May be difficult to access
     * Require permission of others
     * Time-consuming to match with participants

   Created Records
     Advantages:
     * Can be customized to suit the program
     * Simple forms require little expertise to create or use
     Disadvantages:
     * Require accurate and regular record-keeping by staff

   Creating records or using existing documents can be fairly
   straightforward. In addition, the analysis of records may simply
   involve tallying the results. But records and documents provide only
   a piece of the evaluation picture. They are indirect measures; that
   is, they only suggest possible conclusions because they tend to be
   related to certain kinds of attitudes and behaviors. For example,
   increased attendance at CBO programs suggests that the popularity of
   the program is growing. However, higher attendance rates could also
   mean that children are using the program to avoid doing something
   else that they like even less. It is always best to supplement the
   picture with other kinds of direct evidence. This may include letting
   participants tell you whether or not they like the program, or
   observing them to see if they appear to be engaged and enjoying
   themselves.

   Be Creative!
   You can sometimes be quite creative in using records to suit your
   needs. For example, researchers studying the impact of a new
   elementary school music program consulted the school nurse's records
   of "emergency" student visits before, during, and after the new
   program was implemented. They found that visits decreased during the
   program, and used this information to support their contention that
   students enjoyed the new program better than the previous one.

   Why Watch? What Observations Can Tell You
   There is no substitute for taking a firsthand look at your program.
   Observing children engaged in activities or sitting in on staff
   meetings can provide useful information for answering both formative
   and summative evaluation questions. By observing, you also can see
   what is or is not working, how the program is developing, and the
   appropriateness of activities for participants. In short,
   observations can yield a wealth of information about your program.

   What skills do observers need?
   The most important qualities required are the
   ability to take in what is seen, heard, and felt in an
   event, and to report those impressions and details
   clearly in writing. Someone with good attention
   and writing skills is more likely to assemble a
   useful observation report than someone who
   struggles with these tasks.

   As an observer, it is essential to have a clear idea of what you are
   looking for. Within these
   guidelines, however, it is also important to just look before you
   begin looking for something, and
   that means leaving behind any preconceived notions about what you
   think you might see. Your
   observation guidelines may be very general at the beginning of the
   program, but will narrow in
   focus over time as you decide what evidence is most crucial for your
   evaluation.

   Think about your objectives and desired outcomes. What behaviors would
   support your claim
   that the program has changed students' motivation, attitudes, or
   skills? With observations,
   "actions speak louder than words." For example, while students might
   say they like science better
   because of a program, it is even more convincing when an observer
   reports that students are
   actually asking more or better questions about science-related topics.
   Similarly, it is easy for
   participants to say their self-esteem has increased. But seeing
   differences in the way a student
   dresses or interacts with others can support statements about the
   program's influence on students'
   self-image. Tasks that are designed to gauge changes in students'
   behavior or skills, and that are
   completed by participants during an observation session, can also
   provide excellent evaluation
   data.

   Most observers write notes while they are watching, describing what
   participants and staff say or do during the observed event. For
   example, students working in a small group might talk excitedly while
   working out the solution to a problem. Recording their comments can
   provide a valuable testimonial to the benefits of cooperative
   learning. Audiotapes, videotapes, or photographs may prove useful in
   capturing the essence of observed events, providing that you have
   permission from participants to use these tools.

   While you are observing, be attentive and open to discovering behaviors, both verbal and
   nonverbal, that suggest the presence or lack of student motivation. Interactions between
   children, between instructors and children, and between children and the materials are all
   available to the observer's eye.

   Despite their strengths, observations alone are not sufficient evidence for convincing others
   that a program has caused lasting change. For instance, observations of students working with
   each other during a twenty-minute activity do not necessarily mean that students are more
   inclined to work cooperatively in general. Again, it is always important to look for several
   sources of evidence that support whatever changes you think have occurred in participants.

   Observing With an Evaluator's Eye
   Imagine you are sitting in the back of a room where ten students are
   taking
   turns reading aloud from a book about a science-related topic. The instructor takes frequent
   breaks to ask questions and stimulate discussion. If you are looking for indicators of student
   interest in science, you will consider:

     How many students are participating in the discussion? What
   are they saying?

     How do students look? Are they distracted or bored, or are
   they listening with interest?

     How much personal experience do the students bring into their
   responses?

     How excited do they seem about the subject? What do they
   say?

   Who Should Observe?
   Activities can be observed by someone involved with the
   program or by someone without a role in the activity. An
   "outsider" gathers details during the event, while a partici-pant-
   observer who is part of the process (for example, an
   assistant instructor) writes down observations afterwards.
   Outsiders can be more objective, but insiders have the ad-vantage
   of really knowing the issues and the ability to pro-vide
   immediate feedback. For example, program staff may
   wonder how students with reading difficulties are faring in
   the program's laboratory projects. The program director
   could ask teachers and assistants to pay particular attention
   to this issue and report on their observations at the next staff
   meeting.


   What's the Word on the Street? Conducting Interviews
   Interviewing participants, program staff, parents, classroom teachers,
   and others is a great way to
   get information about the impact of your program. As with
   observations, being clear and focused
   about the information you want is critical. There are many questions
   that can be asked; the
   evaluator's challenge is to ask just the half dozen or so that best
   meet the needs of the evaluation.

   It is also important to get a range of perspectives. For example,
   interviewing only staff members
   about program impact presents only one point of view and can result in
   a biased interpretation of
   program outcomes; getting students' and parents' views can give you a
   more complete picture of
   what your program did or did not accomplish.

   Interviews offer a wide range of formats: they can be formal or informal, structured or
   unstructured, individual or in groups, in-person or by telephone. Given the limited resources
   that most CBOs have, structured interviews that follow a prepared set of questions may work
   best. An interview guide, or protocol, can be quite simple. In cases where it is important to
   do so, a protocol is helpful in making sure that each person is asked to respond to the
   same questions.

   If you are working with inexperienced interviewers, short, specific, and very structured
   interview guidelines can help ensure that you get the information you want. In addition to
   this formal interview format, some informal interviews may occur as well. For example, you
   might ask a few students what they think about an activity while you are observing the group.
   These spontaneous comments can yield excellent insights and information for formative and
   summative evaluation purposes.

   Since interviews require people to reveal their thoughts, it is important to ensure a good fit
   between interviewer and participants. For example, having an instructor interview students
   about how they liked the class may not yield reliable results because children may feel the
   need to give a positive response. In this case, someone not associated with program delivery
   would be a better choice. Assuring respondents of individual confidentiality and respecting
   that confidentiality can also help ensure that people are candid with their answers.

   Interviewing Children
   Students sometimes act reserved with an adult interviewer
   and may require a certain amount of "probing" to get at key
   issues or to get a better understanding of what they mean.
   For example:

   Interview question:
   "What did you like best about the program?"
   Student: "Everything was great."
   Probe #1: "What one thing stood out?"
   Student: "The food was really good."
   Probe #2: "What about with the program activities?"
   Student: "Well, I really liked working in groups."
   Probe #3: "How come?"
   Student: "It just made you feel like everybody was working
   together, and like you weren't alone, and you
   could feel good about what you did in the
   group."

   In this example, it took three probes to find out what the student really liked best and why.
   This is the kind of information you want, so be prepared to follow up until you get an
   answer to your question.


   Interviewers should be objective, non-threatening, and knowledgeable about the program, and
   should be able to communicate and listen well.

   Group interviews, or focus groups, are a good way to talk to more
   people in a shorter amount of
   time. It takes a skilled interviewer to keep the group on track,
   however, and to make sure that
   everyone gets involved in the discussion. Restricting a group to 8-10 people is a good idea,
   as is limiting the people in your group to those who have similar experiences, such as
   teachers only or students only.

   To capture the important points that emerge from an interview,
   interviewers usually take notes and/or tape-record (if the person or group is willing). In
   either case, it's important to try to get
   the exact words people use about key points. These direct quotes can
   provide powerful data
   about program impact. Summaries of what people say are also useful for
   illustrating program
   impact in evaluation reports.

   Interviewing people can be time-consuming and labor-intensive, but the rich detail that comes
   from interviews can make it all worthwhile. Interviews can provide in-depth information about
   behaviors, attitudes, values, knowledge, and skills before, during, and after a program.
   Interviews can also help clarify and expand what you learn through document review and direct
   observations. And because interviews can provide such rich data, it is possible to get enough
   detailed information about a program by interviewing a sample or subset of participants,
   instead of all participants.

   Tips for Interviewing
     Make the interview setting as friendly and as comfortable as possible.

     Use your own words to sound more natural and conversational, even as you use an interview
   guide with set questions.

     Be patient. Allow people to think and answer in their own time.

     Try not to give verbal or facial cues in reaction to people's responses; doing so might
   lead their answers or make them think they said something wrong.

     At the end of the interview, give people a chance to add miscellaneous comments or ask you
   any questions they might have.


   Making Numbers Count: Conducting Surveys
   A survey is a method of collecting information by mail, by phone, or in person. Surveying
   involves a series of steps, including selecting a sample, collecting the information, and
   following up
   with non-respondents. A questionnaire is the instrument (written
   questions) used to collect
   information as part of a survey.

   Responses to multiple-choice items on questionnaires can be tallied to provide numbers and
   percentages that are powerful quantitative evaluation data. While people can be surveyed by
   mail or phone, community-based organizations might more frequently choose to have participants
   complete a written questionnaire in person during program events. With a
   captive audience, you will
   likely get a better response rate, which can yield more accurate
   information about the group as a
   whole.

   Questionnaires can be especially useful in evaluation if the same set
   of questions is asked at the
   beginning of a program (for baseline information) and again at the end
   of the program (to
   measure impact).
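
   If a computer is available, the tallying arithmetic is easy to automate. The following is a
   minimal sketch in Python, with invented response values; it counts how many participants chose
   each rating and converts the counts to percentages for a baseline and an end-of-program
   administration of the same question.

      from collections import Counter

      # Invented responses to the same 1-5 rating question, asked at the
      # start of the program (baseline) and again at the end (impact).
      baseline = [3, 2, 4, 3, 3, 5, 2, 4, 3, 4]
      end_of_program = [4, 5, 4, 3, 5, 5, 4, 4, 5, 3]

      def tally(responses):
          """Count each answer category and convert counts to percentages."""
          counts = Counter(responses)
          total = len(responses)
          return {choice: (counts[choice], 100 * counts[choice] / total)
                  for choice in sorted(counts)}

      for label, data in [("Baseline", baseline), ("End of program", end_of_program)]:
          print(label)
          for choice, (count, percent) in tally(data).items():
              print(f"  rating {choice}: {count} responses ({percent:.0f}%)")

   Comparing the two sets of percentages gives a simple before-and-after picture of the group.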

   For programs with a large number of participants, surveying a sample
   of the group may be more
   cost-effective than surveying everyone in the program. However, you
   need to be careful to
   choose a sample that is representative of the entire group. For
   example, if attendance at a
   particular event is low, then surveying only those participants who
   come to the event may lead to
   biased results. Everyone who attended may have thoroughly enjoyed the
   activity, while the rest
   of the people who were invited chose not to attend because the
   activity did not seem very
   interesting or worthwhile. Talking to non-participants will help you
   to more accurately evaluate
   your program activities.
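
   If you keep a roster of everyone who was invited, not just those who attended, a simple
   random draw from that roster gives each person an equal chance of selection, which helps the
   sample represent the whole group. Here is a minimal Python sketch; the roster names are
   invented.

      import random

      # Invented roster of every family invited to the program's events,
      # including those who never attended.
      invited = [f"family_{i}" for i in range(1, 101)]

      random.seed(42)  # fix the seed so the same draw can be reproduced
      sample = random.sample(invited, 20)  # simple random sample of 20 families
      print(sorted(sample))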

   Surveys can include several kinds of questions. Closed-ended questions
   resemble items on a
   multiple-choice test; they provide a selection of possible answers
   from which to choose. People
   who complete the questionnaire are asked to select the answer that
   best matches their beliefs or
   feelings. In the following questionnaire, items 1 and 4 are examples
   of closed-ended questions.
   Question 1 gives the participant five options for describing his or
   her reaction to the program.
   Question 4 provides the participant with several options each for
   describing their gender, grade
   level, and race/ ethnicity. Notice that the answers to question 4
   provide important contextual or
   demographic information about the participants.

   Open-ended questions, on the other hand, provide no answer categories.
   Rather, they allow
   participants to respond to a question in their own words. For example,
   question 3 asks
   participants to write out specific suggestions for future programs.
   Notice that question 3 is
   carefully worded to discourage a simple "yes" or "no" answer.


   Family Science and Math Nights [Excerpt from Participant Survey]
   Please discuss these questions within your family and mark answers
   agreed upon
   by the family.

   1. Using the following scale, how would you rate the activities you
   experienced
   this evening on the whole? (Circle one response.)

   1 = Really Boring
   2 = Boring
   3 = No Opinion
   4 = Fun
   5 = A Lot of Fun

   2. How many Science and Math Nights have you attended? ______
   3. What suggestions do you have for making future Science and Math
   Nights
   better?

   4. Which word or phrase in each column best describes you?

      Gender:       ____ Girl    ____ Boy
      Grade Level:  ____ 1st-5th grade    ____ 6th-8th grade    ____ 9th-12th grade
      Race:         ____ African American    ____ Hispanic    ____ White
                    ____ Native American    ____ Asian/Pacific Islander
                    ____ Other ______________


   Developing good surveys requires a certain level of expertise that some community-based
   organizations may lack. This does not mean that using questionnaires in your evaluation is out
   of reach. Here are some tips you can use to develop a questionnaire or adapt one that someone
   else has created for a similar purpose.

     Keep your questionnaire short, ideally no more than a page or two. Remember, someone will
   have to tally or read and analyze all of those responses.

     Keep it simple, with short questions and clear answer categories.

     Make it easy to use, and participants will be more likely to complete it.

     Make it anonymous, and participants will probably be more honest.

     Use language appropriate for the audience. The younger the student, the simpler the
   questions and answer categories need to be.

   A Final Word about Data Collection
   There are always tradeoffs to consider when selecting data collection methods for your
   evaluation. Some tradeoffs involve time and the level of effort needed to collect and analyze
   certain
   kinds of data. For example, conducting individual interviews takes
   longer than interviewing a
   group of people all at once, but potentially sensitive questions
   should not be asked in a group
   setting. Interviews in general require more staff time than having
   participants fill out a survey.
   On the analysis side, counting closed-ended responses to a question
   generally takes less time than
   reading the same number of open-ended responses and drawing out the
   major themes to be
   summarized.

   Another tradeoff involves using program staff to conduct evaluation
   activities as opposed to
   hiring someone from outside of your organization. Hiring an external evaluator obviously
   involves some expenditure, which you are trying to avoid by using this manual! However, there
   are at least two good reasons to consider using an external evaluator.
   First, participants are not
   always comfortable saying critical things about a program to the
   people who are directly involved

   Tips for Developing Questionnaires
   Wording Matters!

   How you word your questions can influence the response you
   get. Be precise in your language to help the respondent
   understand what information you are requesting. For
   example, an open-ended question that asks participants
   how many Science and Math Nights they have attended
   might yield a variety of responses such as, "a lot," "four,"
   "can't remember," or "most of them." In this case, to help
   jog memories and get more accurate information, it might
   be better to provide the dates of the sessions and the major
   activity that occurred, and ask respondents to check which
   ones they attended.

   With questionnaire items, it's also important to avoid leading
   the respondent in a particular direction with your questions
   or answer categories. For example, a closed-ended item with mostly positive answer choices
   ("Okay," "Fun," "Great") does not give participants suitable options for expressing a
   negative opinion.


   in it. And second, funders often perceive external evaluators as more
   impartial and objective
   about programs than are the people who run them. You may be able to
   deal with these issues by finding a staff member who is not directly involved in your program
   to interview program participants, or recruiting volunteers who have some experience doing
   interviews and observations.

   Additional Pointers for Data Collection
     Set aside 5-10 percent of staff time for evaluation activities and 5-10 percent of the
   program budget for additional evaluation expenses.

     Be realistic and stay focused on the information
   needed to answer your specific evaluation questions.

     Look for volunteers with any additional expertise you
   need.

   Now that you have collected all this information, what are you going
   to do with it? Interpreting
   and reporting your data is the subject of Chapter 7.




   Chapter Seven
   MAKING SENSE OF THE EVIDENCE
   Interpreting and Reporting Your Data

     Looking for themes
     Interpreting data
     Putting it together
     Reporting your results

   [Chapter graphic: the evaluation sequence Framing the Evaluation, Defining Goals and
   Objectives, Finding the Evidence, and Making Sense of the Evidence]

   One thing is for certain: all of the formative and summative data that you collect can
   quickly add up, even for a small program. What does it all tell you? How can you
   use it to judge your
   programs? How can you present it to your board, your funders, the
   community, and others who
   might have a stake in your efforts?

   Looking for Themes
   As part of the documentation and formative evaluation, you will have accumulated some
   important information that can help you make sense of things. Reviewing the data
   periodically as it
   accumulates has several advantages: it helps you to begin to identify
   themes; it makes the analysis
   process less intimidating than if you wait until all of the data have
   been collected; and most
   importantly, it enables you to use the results to improve your
   program.

   Your first step in data analysis will be to look for recurring themes.
   As you review data from
   documents, observations, interviews, and surveys, some ideas will
   occur more often than others.
   Learning to recognize these patterns, and the relevancy of this
   information as it emerges in each
   of these formats, is crucial to your evaluation. These key themes are
   what you must capture in
   your evaluation report.
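
   One low-tech way to capture recurring themes is to tag each interview note, observation, or
   open-ended survey answer with one or more theme labels as you read, and then count how often
   each label appears. The minimal Python sketch below illustrates the counting step; the notes
   and theme labels are invented.

      from collections import Counter

      # Invented theme tags assigned while reading notes; each note may
      # carry several tags.
      coded_notes = [
          ["teamwork", "enjoyment"],
          ["teamwork", "science interest"],
          ["enjoyment"],
          ["science interest", "teamwork"],
      ]

      # Count how many notes mention each theme.
      theme_counts = Counter(tag for note in coded_notes for tag in note)
      for theme, count in theme_counts.most_common():
          print(f"{theme}: mentioned in {count} notes")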

   What is the most important thing to remember when interpreting and
   reporting your data? The
   intermediate indicators and final program outcomes that you defined at
   the beginning of your
   program! Framing your thinking and your results in terms of these can
   help you to understand
   and present your data clearly.


   Be Flexible
   In your review of formative data, you may discover key issues other than the ones you
   originally thought to look at when you designed your evaluation. It is important to be
   flexible enough to explore these unexpected issues, within the limits of your resources. Be
   sure to note new ideas, different patterns or themes, and questions that need further
   investigation. Interview or observation guides and surveys can be adjusted over time in
   response to what you learn through the review and interpretation of your formative data.

   Putting It Together
   Once you have taken the trouble to collect data from a variety of
   sources (students, staff, parents,
   or others), it is important to look at all of these perspectives
   together to get a full picture of your
   program. The various pieces of the evaluation (formative and summative) and each data
   collection activity (document review, observations, interviews, and surveys) all add up to
   tell you about the quality and success of your program. Looking at all of this evidence
   together and considering it in terms of your objectives will enable you to say with some
   accuracy whether or not your program achieved what you intended.

   Looking At It All Together

   [Diagram: student questionnaires; interviews with students, teachers, and parents;
   observations of students in program events, classes, and in other interactions; and records
   of participating students who volunteer for leadership roles all feed into one program
   objective: to increase the leadership skills of student participants.]

   Learning As You Go
   During the summer camps for middle school students and their mentors, Youth Action Today!
   found that parental support and involvement was particularly strong this year. Unlike
   previous years, program staff actually had the luxury of selecting volunteers from a pool of
   over twenty parents who agreed to help. The staff originally planned to survey all parents as
   part of their evaluation. However, when they noticed the increase in parental support this
   year, they changed their evaluation plan to include interviews. The staff decided to conduct
   interviews with a sample of parents to get more in-depth information on what prompted their
   involvement in the program this year.


   The amount of time that you can devote to this process will depend on
   the level of resources your
   CBO has. For example, a small CBO may just do a quick review of
   interview notes to get the
   main points; a CBO with extensive resources and staff might do a more in-depth analysis:
   summarizing each interview in writing, developing charts that compare the responses of
   different groups of people, and writing up common themes that emerge from the interviews.

   Working With What You've Got, Again
   In some cases, interpreting the data you collect may require some additional expertise. For
   example, science or mathematics content may play a central role in some program activities;
   having
   knowledge in these areas may help with the analysis of student
   misconceptions about certain
   topics.

   In a case like this, you might want to discuss your observations or
   share observation notes with
   someone who has this expertise and can help shed light on your
   descriptions of student questions
   or discussions. (Better yet, have these persons do the observations.)
   In a larger CBO, there may
   be individuals on staff who can help. If you do not have this
   expertise on staff, you might look to
   your CBO's board members or volunteers who may bring these skills to
   your organization.

   Telling the Story: How to Report Your Evaluation Results
   Interpreting your evaluation data for in-house use can be done
   informally, but making it available
   and useful to others requires a more polished product. Formal evaluation reports can provide
   information to your board members, the community, and your funders about the program's
   progress and success. Portions of these reports can also be a valuable public relations tool.
   When distributed to newspapers or other media, this information can increase community
   awareness and support for your organization's programs.

   Here are several things you will want to include in your evaluation
   report:
     The objectives of your program and your targeted audience
     What data you collected for your evaluation and how it was collected
     The evaluation results in terms of program goals and objectives
     A plan for using the evaluation to improve the program

   In addition to these pieces, you will want to include a description of
   the context in which your
   program occurs. This might consist of a brief summary of needs assessment data, the
   demographic and socioeconomic characteristics of the community and your program participants,
   and
   documentation of the level of impact (such as the number of young
   people served compared to
   the number of youth in the community). Your report should also
   highlight tactics you used to
   attract your targeted audience, as well as other strategies to ensure
   that your program was well-implemented.


   Presenting your data simply and concisely can help your audience get a clear and accurate
   picture of your program. For example, it is unlikely that you would include long excerpts
   from interviews in your report (although these might be included in an appendix). Instead,
   pick a few powerful, short quotes that really make your point and sprinkle them throughout
   your summary or analysis of other data. Another strategy is to include a brief description of
   a particularly effective program activity.

   Blending your qualitative data, such as quotes from interviews or
   descriptions from observations,
   with your quantitative data from surveys is a useful way to report
   your evaluation results. Simple
   charts, tables, and graphs that show how many students participated, or what percent
   demonstrated changes after the program, can help illustrate the impact of your program. Take
   a look at
   Appendix A for an example of a full evaluation report that uses these
   strategies.

   Tips for Telling Your Program's Story
     Know your audience: a report for a funder will look different from an in-house summary.

     Leave the jargon at home: be straightforward and clearly state your major findings.

     Blend the presentation of quantitative and qualitative data. Quotes from relevant persons
   interspersed with tables and graphs illustrating quantitative data (numbers or percents) make
   the report more readable and strengthen your summary of the data.

     Be honest: your report will be considerably more credible if you note both the strengths
   and weaknesses of your program.




   Chapter Eight
   APPLYING THIS MANUAL
   How One CBO Did It
   In earlier chapters, we discussed the various pieces that make up
   program evaluation. Now we
   are going to pull it all together in a way that lets you see how a CBO
   might choose to evaluate a
   program and what an evaluation looks like from start to finish. The
   organization and program
   are small, and as a result, so is the evaluation. Below is a snapshot
   of our fictional CBO and
   program to help you compare it to your own in terms of staff, budget,
   and other resources.

   Trash for Cash
   Youth and Communities Alive! (YACA) is a small community-based
   organization located in an
   inner-city housing project. With a total operating budget of $50,000-$100,000 a year, YACA's
   individual program budgets range from $500 to $10,000. Programs typically target
   low-income
   African American and Latino youth and are funded by churches and
   community organizations.
   Program activities often take place at nearby locations such as the
   housing project's TV lounge
   and the playgrounds scattered throughout the community. Program staff
   at YACA include a
   part-time director, some paid and volunteer assistants, and volunteer
   program coordinators.

   YACA's director, Mrs. Alvarez, recently received funding from a local church for a program
   designed to address two concerns expressed by community members at local meetings: cleaning
   up the neighborhood and providing constructive activities for youth to serve as an
   alternative to the street. The program was called "Trash for Cash."

   Trash for Cash (TFC) included a number of activities. Most TFC
   sessions began with a brief lesson
   taught by Mrs. Alvarez and a volunteer on the importance of recycling
   or other environmental
   topics. Over the course of the school year, seven guest speakers from
   the community made
   presentations about conservation, waste management, water quality, recycling, and other
   related issues. Subsequent sessions with program staff reviewed what students
   had learned in these
   presentations, and how the information applied to their own lives.

   In addition to these lessons, participating youth were given a central
   role in all of the clean-up
   and recycling activities. In doing this, YACA staff hoped to develop a
   sense of neighborhood
   pride and ownership among the youth. Students organized a weekly
   community collection of
   trash and recyclable cans and bottles, and encouraged recycling in
   their homes. They also kept
   track of the pounds of recyclables collected, using mathematics skills
   to weigh and record
   amounts and measure progress toward their 1,000-pound goal. Students
   also kept accounting
   records of incoming money for exchanged recyclables, and outgoing
   expenses for trash bags,
   refreshments, and other minor outlays.

   Reaching the 1,000-pound goal in recyclable materials entitled
   participants who attended at
   least half of the clean-up sessions to a free ticket to an NBA
   basketball game. The TFC program
   budget of $2,000 covered staff time for Mrs. Alvarez, supplies, a
   small honorarium paid to each
   guest speaker, and the cost of the NBA tickets.


   Trash for Cash
   Target Audience: High school students
   Main Strategy: Weekly after-school sessions
   No. of Participants: 25
   Duration: One academic year
   Cost: $2,000

   Framing the Evaluation
   Creating a program to match community needs was the first step for
   YACA. To do that, Mrs.
   Alvarez first considered the priorities identified by community
   members, and the population most
   targeted for these needs.

   Needs Assessment
   Identified Needs:   Constructive youth activities
     Cleaner community environment

   Target Population:   High school students

   Mrs. Alvarez also consulted her board of directors, a broad spectrum of community
   representatives including school and agency staff, parents, and two students. Board
   discussions about
   community needs, as well as youth's needs and prospects for the
   future, helped focus program
   goals and objectives. As a result of this dialogue, Mrs. Alvarez added
   an academic enrichment
   component to the program which included everyday applications of
   science and mathematics, and
   an expanded view of what science is and what scientists do.

   Defining Goals and Objectives
   YACA pinpointed the major goal and several objectives for the Trash
   for Cash program.
   Goal: Improve youths' future options in the community and in school
   Objectives:
   1. To develop a sense of ownership and pride in the community among participating youths

   2. To expand students' awareness of science and mathematics
   applications in
   everyday life

   3. To clean up the neighborhood


   Recognizing the limitations of her staff and resources, Mrs. Alvarez
   was determined to keep the
   evaluation focused. This meant asking formative and summative
   questions that were specifically
   designed to provide information on the stated objectives.

   Evaluation Questions Matched to Program Objectives

   Objective 1: To develop students' sense of ownership and pride in the community
     Formative Questions:
       What did YACA do to promote the program and attract students to participate?
       To what extent do students show interest in the activities and take initiative for
   recycling efforts?
     Summative Questions:
       What changes have occurred in students' attitudes and level of interest in the community?
       To what extent do students exhibit knowledge of the importance of community involvement?

   Objective 2: To expand students' awareness of science and mathematics applications in
   everyday life
     Formative Questions:
       In after-school TFC sessions, how do students exhibit an understanding of the relevancy
   of the topics presented?
       What connections do students make between discussion topics and their own experiences?
     Summative Questions:
       To what extent do students exhibit an understanding of the importance of recycling and
   other science-related topics, and the relevancy of these issues to themselves and the
   community?

   Objective 3: To clean up the neighborhood
     Formative Questions:
       How is the neighborhood appearance changing as students progress toward their clean-up
   goal?
     Summative Questions:
       How do neighborhood areas targeted for clean-up compare before and after the program?

   Finding the Evidence
   What information would help YACA to answer these questions? Again
   reflecting back to her
   level of resources, Mrs. Alvarez thought about her options. In making decisions about data
   collection, she considered not only her available resources, but also what
   evidence was adequate for
   determining if the program achieved its objectives.

     Documentation of program strategies to reach target audience. To demonstrate that YACA
   tried to reach a broad spectrum of students, program staff developed and documented outreach
   strategies used to recruit participants, including school visits, and discussions with
   students, teachers, parents, and agency staff.


   Participant information sheets also gathered information about the age, gender, and
   race/ethnicity of participants.

     Attendance sheets. Mrs. Alvarez considered this essential to determine if the program was
   meeting attendance goals. If attendance dropped off, this might signal the need for changes
   in the program or in program logistics. Similarly, attendance sheets could tell staff if
   particular groups of students (for example, girls or boys) were attending less often so that
   staff could adapt program strategies accordingly (see the sketch after this list).

     Student journals or student interviews or student questionnaires.
   Any one of
   these might help tell Mrs. Alvarez if students liked the program. She
   decided
   against interviews because they were too labor-intensive. For the same
   reason, she
   decided not to do student journals. She settled on a short
   questionnaire at the
   end of the program with four questions that asked students what they
   liked best
   and least about the program, what they had learned, and how they would
   rate the
   program.

     Observations of after-school sessions. Mrs. Alvarez thought it was
   important to
   try to document changes in student attitudes toward science and their
   awareness of
   the relevancy of science. To do this, she recruited two members of her
   board with
   teaching experience to observe and report on sessions at the beginning
   and at the
   end of the program.

     Tallying the recyclables. This was essential for knowing whether or not students were
   progressing toward their 1,000-pound goal, and presumably, whether or not the neighborhood
   was getting cleaned up.

     Before and after pictures of designated "ugly" spots in the
   community. Mrs.
   Alvarez liked this idea a lot, thinking that "a picture is worth a
   thousand words."
   She could go out with the students on the first and last day of the after-school sessions to
   take the pictures. It seemed like a good way to get participants involved firsthand, and a
   quick and easy way to collect data, too.
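
   As promised above, here is a minimal Python sketch of how attendance records can reveal
   whether particular groups are attending less often; the student records and session count are
   invented.

      # Invented attendance log: one record per student, with the number of
      # weekly sessions attended out of the total held so far.
      sessions_held = 12
      attendance = [
          {"name": "student_1", "gender": "girl", "attended": 10},
          {"name": "student_2", "gender": "boy",  "attended": 4},
          {"name": "student_3", "gender": "girl", "attended": 11},
          {"name": "student_4", "gender": "boy",  "attended": 6},
      ]

      # Average attendance rate for each group, so staff can spot a group
      # that is attending less often and adapt program strategies.
      for group in ("girl", "boy"):
          members = [s for s in attendance if s["gender"] == group]
          rate = sum(s["attended"] for s in members) / (len(members) * sessions_held)
          print(f"{group}s: average attendance {rate:.0%}")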

   Interpreting and Reporting the Data
   In the end, Mrs. Alvarez was pleased with her simple evaluation. While
   it did not give her a lot of
   information about the program directly from the students, the attendance records kept her
   informed about their level of participation. For example, when attendance slipped in the
   fall, she
   asked some of the participants if there was a problem with the
   program. Discovering that TFC
   sessions conflicted with some students' tutoring sessions, she
   adjusted the schedule. With this
   change in logistics, the program was able to meet its goal for weekly
   attendance.

   Observations by board members revealed some changes in students' level of interest and
   participation in discussions, with more students actively participating at the end of the
   program than in earlier observations. In addition, students' comments seemed to demonstrate a
   greater awareness of the relevancy of science. For example, observers noted that many of the
   participants voluntarily made connections between the discussion topic and their own personal
   experiences.


   Student questionnaires provided evidence that supported observations.
   Students reported that
   they liked working together to improve the neighborhood, had learned
   about the importance of
   recycling, and had gained an expanded view of what science is and how
   it relates to their lives.

   Tallying recyclables kept students involved in the process as they
   watched the group move toward
   their 1,000-pound goal, and also gave them a chance to use mathematics
   skills. According to
   Mrs. Alvarez, the pictures she and her students took were the best
   part of the evaluation, pro-viding
   "hard" evidence that the neighborhood was cleaner.

   There is one thing that Mrs. Alvarez would have changed in her evaluation design: she would
   have recruited volunteers to help her tally the survey results. Four questions per
   questionnaire didn't seem like much, but given all of her other responsibilities, tallying
   the responses from 25 participants was too much to do. She still thought the survey was
   important; it was her only source of data that came directly from the students and that
   provided information on how the program had affected them. In hindsight, she would have lined
   up several board members as volunteers to assist.

   The evaluation of Trash for Cash showed that the program had a
   positive impact on participating
   students and the community. With churches emphasizing community
   involvement and schools
   highlighting environmental awareness, Mrs. Alvarez was reluctant to
   say that her program was the
   sole cause of these changes. However, the evidence collected in the evaluation demonstrated
   that Trash for Cash had successfully met its objectives, and it is likely that the program
   contributed to the positive outcomes.

   How can Mrs. Alvarez best present the evaluation results to showcase
   the program's success to
   her board and her funders? Take a look at a final evaluation report
   for Trash for Cash in
   Appendix A.

   Sample Data Collection Instruments for Trash for Cash
     Participant Information Sheet
     Attendance Sheet
     Student Questionnaire
     Tally Sheet for Recyclables


   Participant Information Sheet
   Participant Name     Age     Male/Female     Race/Ethnicity


   Attendance Sheet
   Date     Participant Name*     Present (✓)     Absent (✓)

   * Once names have been recorded, multiple copies of the attendance sheet can be made to use
   at each session.


   Student Questionnaire
   1. How did you like the program? (Circle one.)
   4 = Great! 3 = Good 2 = Boring 1 = Really Boring!

   2. What did you like best about the program?

   3. What did you like least about the program?
   4. What was the most important thing you learned in the program?
   Thanks for Filling This Out!


   Tally Sheet for Recyclables
   Date     Weight of Cans     Weight of Bottles     Amount Received Today     Total-to-Date
   Received for Recyclables
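
   To show the arithmetic behind the tally sheet, here is a minimal Python sketch with invented
   weekly weights; it keeps the running total and reports progress toward the program's
   1,000-pound goal.

      # Invented weekly tally-sheet entries: (weight of cans, weight of
      # bottles) in pounds, in session order.
      weekly_weights = [(12.5, 30.0), (18.0, 41.5), (15.5, 38.0)]

      GOAL = 1000  # the program's 1,000-pound goal

      total = 0.0
      for week, (cans, bottles) in enumerate(weekly_weights, start=1):
          total += cans + bottles
          print(f"Week {week}: {cans + bottles:.1f} lbs collected, "
                f"{total:.1f} lbs to date ({total / GOAL:.0%} of goal)")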




   Chapter Nine
   APPLYING THIS MANUAL IN A BIGGER WAY
   Expanding the Evaluation Design
   In Chapter Eight, we saw how one CBO designed an effective evaluation
   matched to the limited
   resources and staff available for the program. How might Mrs. Alvarez
   plan an evaluation for a
   larger program with more resources? This chapter looks at what she
   might do differently in her
   evaluation of an expanded Trash for Cash Program. Below is a
   description of the new program
   run by our fictional CBO, Youth and Communities Alive! (YACA). See
   Appendix B for YACA's
   proposal to expand the program.

   More Trash for Cash
   After seeing the positive results in the neighborhood's appearance and
   observing an increased
   interest among youth in community improvement, Mrs. Alvarez wrote a
   proposal to expand the
   program (see Appendix B). The More Trash for Cash (MTFC) program
   increased the number of
   youth served and lasted two years. Youth and Communities Alive!
   received a total of $20,000
   over two years from the United Way and a local foundation for the More
   Trash for Cash program.

   More Trash for Cash included several new features. Mrs. Alvarez
   increased the amount of science
   instruction in the after-school sessions. Each session began with
   hands-on activities that engaged
   students in thoughtful investigations into various environmental
   topics. Two high school science
   teachers were recruited to teach some sessions, as was a professor
   from a nearby university. With
   a larger program budget, Alvarez was able to pay the instructors a
   stipend. In addition, she lined
   up more guest speakers and arranged for two field trips each year.

   The expanded program included a new group of 20 middle school students
   and 25 high school
   students each year. Five high school students who had participated in
   the original program
   came back as program assistants in the first year; during the second
   year, five new high school
   students were recruited to fill these positions. The older students took on leadership roles,
   including mentoring the new students and helping Mrs. Alvarez and two volunteers with program
   coordination. Each of the student assistants was paid a small stipend for their
   work. Mrs. Alvarez also
   hired a program assistant to work 8 hours a week.

   Program activities were similar to the original Trash for Cash; during year one, students
   selected new "ugly" spots for clean-up. Students were given their choice of incentives for
   reaching a new goal of 1,500 pounds of recyclables each year: NBA basketball game tickets, a
   ride on a local paddle-wheel river boat, or tickets to a performance by an inner-city youth
   theater group. In
   addition, during the second year of the program, greater emphasis was
   placed on community
   awareness and involvement. Several of the high school students made
   presentations at
   community meetings and talked to local businesses about recycling and
   MTFC's efforts.


   More Trash for Cash
   Target Audience: Middle and high school students
   Main Strategy: Weekly after-school sessions
   No. of Participants (each year): 20 middle school students, 25 high school students, and
   5 "veteran" high school students
   Duration: Two academic years
   Cost: $20,000

   Framing the Evaluation
   Mrs. Alvarez was ahead of the game here. From the original Trash for
   Cash program, she had
   identified both the needs and the targeted population. However, with the new program, she
   decided to add middle school students to her target audience.

   Needs Assessment
   Identified Needs:   Constructive youth activities
     Cleaner community environment

   Target Population:   Middle and high school students

   Defining Goals and Objectives
   The More Trash for Cash program sought to address the same goal as the original program: to
   improve youths' options in the community and in school. Mrs. Alvarez also wanted to keep the
   same focus on building a sense of ownership in the community and on the clean-up efforts.
   However, she wanted to expand the academic enrichment component to emphasize skills and
   knowledge in science. In addition, she added a fourth objective related to community
   involvement, to increase the likelihood that the program would be sustained. With these
   changes, the objectives for the More Trash for Cash program looked like this:

   1. To develop a sense of ownership and pride in the community among participating youth

   2. To develop students' science skills and knowledge, and their
   awareness of
   science and mathematics applications in everyday life

   3. To clean up the neighborhood
   4. To increase community awareness and involvement in clean-up efforts



   Mrs. Alvarez used her evaluation design from the original program as a
   basis for the More Trash
   for Cash evaluation. For the new program objectives, she developed a
   set of evaluation questions
   that would provide both formative and summative information.

   Expanding the Evaluation Design

   Expanded Objective: To develop students' science skills and knowledge, and their awareness
   of science and mathematics applications in everyday life
     Evaluation Questions:
       What opportunities are students given to increase their knowledge and skills in science?
       How effective are hands-on activities in engaging students?
       How do students demonstrate greater understanding of scientific topics and issues, and
   the relevancy of these topics?
       What changes occur in students' skills (observing, measuring, recording, hypothesizing,
   drawing conclusions) over the course of the program?

   Expanded Objective: To increase community awareness and involvement in clean-up efforts
     Evaluation Questions:
       What strategies are used to increase awareness?
       How aware are parents and community members of clean-up efforts?
       How do parents, businesses, and community members support clean-up efforts?
       What evidence suggests that clean-up efforts will persist beyond the program?

   The next step for Mrs. Alvarez was to define intermediate indicators
   and final program
   outcomes. What would she accept as proof that the program was of high
   quality and that the
   objectives had been achieved, and how could these outcomes be stated
   explicitly?


   Indicators and Outcomes for the More Trash for Cash Program
   Objectives
   1. To develop a sense of ownership and pride in the community among
   participating
   youth

   2. To develop students' science skills and knowledge, and their
   awareness of science
   and mathematics applications in everyday life

   3. To clean up the neighborhood
   4. To increase community awareness and involvement in clean-up efforts

   Intermediate Indicators
     Number of students who attend after-school sessions and collect trash stays the same or
   increases over course of program. (Obj. 1)

     Students demonstrate greater leadership in activities during the year: take initiative in
   organizing/doing activities. (Obj. 1)

     Number of students who actively participate in discussions, link
   science
   with personal experiences increases during the year. (Obj. 2)

     Students exhibit greater understanding of science-related topics by
   asking
   more high level questions; demonstrate improvements in skills through
   hands-on science activities. (Obj. 2)

     Pounds of recyclables collected increases during school year. (Obj. 3)
     Amount of trash in designated "ugly" spots in the community
   decreases
   during the year. (Obj. 3)

     Community expresses awareness of clean-up at neighborhood meetings; number of businesses
   that actively support recycling increases. (Obj. 4)

   Final Outcomes
     Seventy-five percent of the students attend at least half of the weekly sessions. (Obj. 1)

     At least three-quarters of the students express awareness of the importance of community
   involvement. (Obj. 1)

     At least three-quarters of the students express an understanding of the relevancy of
   science, and demonstrate improved skills and attitudes toward science. (Obj. 2)

     At least 1,500 pounds of recyclables are collected by end of each school year. (Obj. 3)

     Neighborhood "ugly" spots are cleaned up by end of each year. (Obj. 3)

     Community actively supports clean-up; number of businesses involved in recycling increases
   by 50 percent by end of program. (Obj. 4)
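
   As an illustration only, the first of these outcomes could be checked with a few lines of
   Python once the year's attendance sheets are totaled; the per-student counts below are
   invented.

      # Invented totals: number of weekly sessions each student attended,
      # out of the sessions held during the year.
      sessions_held = 30
      sessions_attended = [28, 15, 30, 9, 22, 16, 12, 25, 30, 14]

      # Share of students who attended at least half of the sessions.
      met = [n for n in sessions_attended if n >= sessions_held / 2]
      share = len(met) / len(sessions_attended)

      print(f"{share:.0%} of students attended at least half of the sessions")
      print("Outcome met" if share >= 0.75 else "Outcome not met")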


   Finding the Evidence
   Mrs. Alvarez wanted to get a better feel for the data collection
   activities to make sure that her
   strategies would yield information about the chosen indicators and
   outcomes and that she was
   being realistic in her plans. It was one thing to list everything they would do to collect
   information; it would be more difficult to pin down when these activities would occur and how
   often.
   Mrs. Alvarez again wanted to be sure to collect both qualitative and
   quantitative data. She also
   knew that she would need this information each year of the program to
   provide data about each
   group of student participants.

   In planning the data collection activities, Mrs. Alvarez immediately
   fell into
   the "starting big" trap. She thought about conducting student focus
   groups twice each month to see how students liked the program. She
   thought monthly student questionnaires could also help gauge interest
   in
   the program, as well as impact. Survey forms could be short and simple
   and provide regular feedback to staff. Even so, she realized, it would
   be a
   lot to read and tally every month. And someone would have to
   facilitate
   student discussion groups and report the information.

   Mrs. Alvarez knew she had to cut back. Instead of the frequent questionnaires and focus
   groups, she decided to ask instructors to set aside 10-15 minutes of class time every other
   month to let students talk about the program. The class could be separated into several
   smaller groups to allow better participation. Students would talk about the program among
   themselves; one student would be designated as the recorder to report the major themes from
   each group in writing. The high school program assistants could help facilitate the group
   discussions.

   Mrs. Alvarez liked this strategy because it avoided the issue of
   students
   telling instructors what they did or didn't like, and enabled them to
   talk
   about their progress or where they needed help. Rotating the role of recorder each month
   would provide students with an additional opportunity for participation and leadership. To
   help focus their discussions,
   Mrs.
   Alvarez would develop a guide for them to write down their responses.

   For each of her outcomes, Mrs. Alvarez went through this process. How can we collect the
   information? Who will do it? What will it involve? How can it be streamlined to reduce the
   burden on both staff and participants?

   In thinking about all of this, Alvarez realized that each data
   collection activity involved not only
   collecting the data, but also preliminary and follow-up work as well.
   For example:

     She would have to develop questionnaires, distribute them, make sure
   they
   were completed and returned, and tally the results.

     Volunteers who did observations would need a simple guide to tell
   them what
   to look for.


     Student discussion groups would need a guide as well.

   All of this quickly added up to a lot of work, an added incentive to streamline data
   collection activities. After some hard thinking, Alvarez came up with a data collection plan
   that she thought was manageable, but one that would also provide useful formative information
   and convincing summative data.

   Refining the Data Collection Plan

   Data Collection Activity                            Schedule
   Before and after photographs of neighborhood       At beginning and at end of each year of
                                                       the program
   Attendance records                                  Weekly
   Tally of recyclables                                Weekly
   Observations of after-school sessions; informal
   interviews with staff and students as part of
   observations                                        Once per semester
   Student group discussions                           Twice per semester
   Participant survey                                  At the end of each year of the program
   Documentation of student presentations to
   businesses and community groups; observations
   of community meetings                               As they occur
   Community survey (optional)                         At the end of the second year of the
                                                       program

   Mrs. Alvarez planned to look at community awareness at neighborhood
   meetings as one way to
   gauge the impact of student presentations on recycling. If awareness
   was high, she would try to
   support her observations with a survey of community members at the end
   of the second year of
   the program.

   At this point, Mrs. Alvarez realized she had a lot of pieces of paper
   floating around with different
   ideas for the evaluation. All of these had helped her to plan the
   evaluation, but now she wanted
   to see it all together objectives, evaluation questions, indicators,
   outcomes, and data collection
   activities. What she came up with helped her to see the big picture,
   and to make sure she was
   answering the right questions. She thought of it as her evaluation
   road map.

   The Road Map: More Trash for Cash Evaluation Design

   Objective 1: To develop a sense of ownership and pride in the community among participating
   youth
     Evaluation Questions:
       a) What did YACA do to promote the program and attract students to participate?
       b) To what extent do students show interest in the activities and take initiative for
   recycling efforts?
       c) What changes have occurred in students' attitudes and level of interest in the
   community?
       d) To what extent do students exhibit knowledge of the importance of community
   involvement?
     Intermediate Indicators:
       Number of students who attend after-school sessions and collect trash stays the same or
   increases over course of program.
       In observations, students demonstrate greater leadership in activities during the year:
   take initiative in organizing/doing activities.
     Final Outcomes:
       Seventy-five percent of the students attend at least half of the weekly sessions.
       On surveys, at least three-quarters of the students express awareness of the importance
   of community involvement.

   Objective 2: To develop students' science skills and knowledge, and their awareness of
   science and mathematics applications in everyday life
     Evaluation Questions:
       a) What opportunities are students given to increase their knowledge and skills in
   science?
       b) How effective are hands-on activities in engaging students?
       c) To what extent do students demonstrate greater understanding of scientific topics and
   issues, and the relevancy of these topics?
       d) What changes occur in students' skills (observing, measuring, recording,
   hypothesizing, drawing conclusions) over the course of the program?
       e) What connections do students make between discussion topics and their own
   experiences?
     Intermediate Indicators:
       Number of students who actively participate in discussions, link science with personal
   experiences increases during the year.
       In group discussions and observations, students exhibit greater understanding of
   science-related topics by asking more high-level questions; demonstrate improvements in
   skills through hands-on science activities.
     Final Outcomes:
       On surveys, at least three-quarters of the students express an understanding of the
   relevancy of science.
       In observations, at least three-quarters of students demonstrate improved skills and
   attitudes toward science.

   Objective 3: To clean up the neighborhood
     Evaluation Questions:
       a) How is the neighborhood appearance changing as students progress toward their
   clean-up goal?
       b) How do neighborhood areas targeted for clean-up compare before and after the program?
     Intermediate Indicators:
       Weekly tallies show that pounds of recyclables collected increases during school year.
       Informal interviews with students reveal amount of trash in designated "ugly" spots
   decreases during the year.
     Final Outcomes:
       Goal of 1,500 pounds reached; one hundred percent of the students achieve goal of free
   tickets.
       Before and after photographs of neighborhood show differences.

   Objective 4: To increase community awareness and involvement in clean-up efforts
     Evaluation Questions:
       a) What strategies are used to increase awareness?
       b) How aware are parents and community members of clean-up efforts?
       c) To what extent do parents, businesses, and community members support clean-up
   efforts?
       d) What evidence suggests that clean-up efforts will persist beyond the program?
     Intermediate Indicators:
       In informal interviews and observations at community meetings, parents and others
   express awareness of program.
       Number of businesses that actively support recycling increases.
     Final Outcomes:
       On community survey, at least 50 percent of community members express awareness of and
   support for recycling.
       Number of businesses involved in recycling increases by 50 percent by end of program.


   Interpreting and Reporting the Data
   How did the evaluation turn out? Let's take a look at the information gathered, how it was
   interpreted to measure progress and impact, and what changes program staff made to improve
   the program, based on the evaluation data.

   Objective 1
   To develop a sense of ownership and pride in the
   community among participating youth

   Mrs. Alvarez considered the level of student participation each week
   as one indicator of program
   success. During the first year, weekly attendance records revealed
   that participation decreased
   from September to October. Student discussion groups held in October
   were a timely way to get
   some information about what students liked and disliked about the
   program, and their suggestions
   for improvement.

   Mrs. Alvarez learned from the students who were still attending
   that the absentees had tutoring activities scheduled on Thursdays.
   Once she changed the collection day to Wednesdays, attendance
   improved. Forms filled out in student discussion groups in
   December, February, and April indicated that participants liked the
   program more and more as the year progressed; they expressed
   excitement about getting closer to their 1,500-pound goal and about
   the neighborhood's "new look."

   Student surveys at the end of each program year gave participants
   an opportunity to talk about how the program had affected them. One
   question ("What did you like best about the program?") elicited
   comments relating to the positive experiences provided by the
   program. Over half the participants said that cleaning up their
   neighborhood had made them "feel good." Students also liked being
   part of a group and working together toward a common goal. Some
   said this was the first time they had ever "been a leader." When
   asked about the most important thing they learned, students wrote
   about the value of working together to accomplish something.
   Finally, students liked the recognition they received, which made
   them feel important and, in the words of one student, feel "like I
   have something I can give to the community."

   Objective 2
   To develop students' science skills and
   knowledge, and their awareness of science and
   mathematics applications in everyday life

   Mrs. Alvarez learned from student discussion groups that some of the
   participants were having
   difficulty with hands-on activities that required mathematics skills.
   To remedy this, she decided to


   have students work in teams of three, and mixed students with higher
   and lower mathematics
   skills. Data from student discussion groups revealed that this
   solution helped many of the students improve their skills.

   Observations by Mrs. Alvarez and a community volunteer once a
   semester also provided opportunities for observing student interest
   and skills, and for talking informally with participants. In her
   observation notes, Mrs. Alvarez repeatedly cited examples of
   students observing, measuring, recording, and drawing conclusions,
   and of students helping one another with these tasks. Mrs. Alvarez
   also noted in her observations changes in students who appeared to
   be "mathematics-shy" at the beginning of the year, but who now
   participated fully in the activities. Other students' enthusiasm
   and participation had remained steady.

   At least once a month, instructors took some class time to discuss
   with students what they were
   learning about the environment, including the sources of pollution and
   the challenges involved in
   recycling. Students noted that although they understood most of the
   scientific concepts discussed
   in after-school sessions, a few of the speakers had "talked over their
   head." This was useful
   information for lining up future speakers and making sure they were
   briefed on speaking at a level
   that was appropriate for an adolescent audience.

   Classroom discussions became more lively during the year as
   students took more interest in the program and the topics discussed
   by guest speakers. Data from student discussion groups supported
   observations of high levels of student interest in science-related
   topics, and an increase in the number of students who related
   topics to their personal experiences. Finally, on questionnaires,
   almost two-thirds of the students said that the science activities
   were their favorite program activity; slightly more than two-thirds
   said that the most important thing they had learned was that,
   working together, their actions could make a difference in the
   community.

   Objective 3
   To clean up the neighborhood

   Each year of the program, five areas in the neighborhood were
   identified for clean-up. Mrs.
   Alvarez decided that taking photographs of these targeted sites at the
   beginning of the school year
   would provide good baseline data for the summative evaluation. Both
   years, the before and after pictures showed that a great deal of
   progress had been made toward cleaning up the neighborhood.

   Weekly tally sheets recorded by students and checked by instructors
   kept participants and staff
   aware of how the program was progressing toward its goal of 1,500
   pounds of recyclables. Year-end
   results revealed that this goal was achieved each year, and tickets
   for the community events
   were awarded to all of the students.
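
   For programs that want to automate this kind of bookkeeping, here
   is a minimal sketch (in Python, with made-up weekly figures) of how
   weekly tally-sheet entries could be summed and checked against a
   collection goal like the 1,500 pounds described above:

      # Hypothetical weekly tally-sheet totals, in pounds; real values
      # would come from the students' weekly tally sheets.
      weekly_pounds = [58, 62, 71, 49, 66, 157, 60]

      GOAL = 1500  # pounds of recyclables targeted for the year

      running_total = 0
      for week, pounds in enumerate(weekly_pounds, start=1):
          running_total += pounds
          print(f"Week {week}: {pounds} lb collected, {running_total} lb to date")

      if running_total >= GOAL:
          print("Goal reached!")
      else:
          print(f"{GOAL - running_total} lb still needed.")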


   Objective 4
   To increase community awareness and
   involvement in clean-up efforts

   YACA documented its MTFC community outreach strategies, including
   the number of presentations made by students to community groups
   and businesses. Observations of neighborhood meetings and informal
   interviews with parents and community members at these meetings
   revealed that people noticed some changes in the way the community
   looked, even though some were unaware of the MTFC program.

   Based on the high level of awareness demonstrated by persons
   attending community meetings during the first year of the program,
   Mrs. Alvarez decided to go ahead with the survey of community
   members. Students conducted a door-to-door survey in March of the
   second year of the program. Using a guide designed by Mrs. Alvarez,
   25 student teams surveyed six households each, for a total of 150
   community members. Two volunteers helped tally the results. The
   surveys revealed that the majority of community members surveyed
   had noticed the change in community appearance and would be willing
   to participate in a recycling program.
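
   (Tallying a survey like this one can also be done with a short
   computer program. Below is a minimal sketch, in Python and with
   hypothetical records, of how the two closed-ended questions on the
   community survey might be counted; the actual tally in this example
   was done by hand by volunteers.)

      # One record per household, holding answers to questions 1a and
      # 2a of the community survey (hypothetical data).
      surveys = [
          {"noticed_change": True,  "willing_to_help": True},
          {"noticed_change": True,  "willing_to_help": False},
          {"noticed_change": False, "willing_to_help": True},
          # ... one record for each of the 150 households surveyed
      ]

      total = len(surveys)
      noticed = sum(1 for s in surveys if s["noticed_change"])
      willing = sum(1 for s in surveys if s["willing_to_help"])

      print(f"Noticed a change: {noticed} of {total} ({100 * noticed / total:.0f}%)")
      print(f"Willing to help: {willing} of {total} ({100 * willing / total:.0f}%)")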

   Telling the More Trash for Cash Story: Presenting the Evaluation
   Results
   The evaluation of More Trash for Cash showed that the program had a
   positive impact on the
   neighborhood, the participating students, and the community. A
   progress report for the first year
   of More Trash for Cash can be found in Appendix C.

   Sample Data Collection Instruments for More Trash for Cash
     Student Group Discussion Guide
     Session Observation Guide
     Survey for Community Members

   (See Chapter Eight for the following instruments)
     Participant Information Sheet
     Attendance Sheet
     Student Questionnaire
     Tally Sheet for Recyclables


   Student Group Discussion Guide
   Please talk about the following questions and decide as a group on the
   most
   appropriate answer. The group "recorder" should write in your
   responses.

   1. How do you like the More Trash for Cash Program? (Circle one.)
   4 = Great! 3 = Good 2 = Boring 1 = Really Boring!

   2. What do you like best about the program?

   3. What is the most important thing you have learned in the program so
   far?
   4. What suggestions do you have for making the program better?

   Thanks for Filling This Out!


   Session Observation Guide
   1. Are students:
   interested?
   enthusiastic?
   bored?
   distracted?

   2. What kinds of questions do students ask?

   3. Do students demonstrate an understanding of the topics?
   4. How do students work together?


   Survey for Community Members
   1. a) Have you noticed any changes in how the community looks?
   ____ Yes ____ No
   b) If yes, what has been the most noticeable difference?

   2. a) Would you be willing to help save recyclables for a community
   recycling program?
   ____ Yes ____ No
   b) How would you be willing to help? Check all that apply.
   ___ Will save bottles and cans
   ___ Will help with clean-up efforts
   ___ Will volunteer for program sessions
   ___ Will make presentations
   ___ Other (please explain):

   3. Have you heard about a program called "More Trash for Cash"? If
   yes, what can you tell me
   about it? (If they haven't heard about the program, you can describe
   it to them.)

   Thanks for Filling This Out!



   Appendix A
   FINAL EVALUATION REPORT


   "Trash For Cash" Final Report
   Written by Maria Alvarez
   Director of Youth and Communities Alive!

   Submitted to the Central United Methodist Church

   "The Trash for Cash program really helped me come out of myself. I
   didn't know I could be a leader, but now I know I can."
   16-year-old female participant

   Youth and Communities Alive! (YACA) is a small community-based
   organization dedicated to
   serving low-income minority youth. Last year, YACA received $2,000
   from the Central United
   Methodist Church to run the Trash for Cash (TFC) program. The program
   targeted high school
   students and lasted one academic year. TFC had three main objectives:

   1. To develop a sense of ownership and pride in the community among
   participating youths

   2. To expand students' awareness of science and mathematics
   applications in
   everyday life

   3. To clean up the neighborhood

   We wanted to reach a broad spectrum of students, especially those who
   might not participate in
   an after-school program. To recruit participants, we made
   presentations in the schools, and met
   with students, teachers, parents, and agency staff to get referrals.
   We wanted to try to get both
   African American and Latino youth from the neighborhood. In all we had
   14 girls and 11 boys.
   Thirteen were African American, 8 were Latino, and 4 were white.

   A total of 25 high school students participated in the TFC program,
   which included weekly collection of trash in the community during
   after-school sessions. Students collected recyclables and kept
   track of the number of pounds of recyclables that they turned in
   for cash at the local recycling center. Their goal was 1,000 pounds
   of recyclables, which would make them eligible for tickets to an
   NBA game.

   Most TFC sessions began with a brief lesson about the importance of
   recycling or other environmental topics. Over the course of the
   school year, seven guest speakers from the community visited and
   made presentations about recycling, waste management for the city,
   water treatment, and other related issues.
   We had two questions that we wanted the evaluation to answer:
     What changes have occurred in the students' interest in the
   community and their awareness of the relevancy of science and
   mathematics?

     To what extent did the program result in a cleaner neighborhood?

   Keeping track of attendance helped us determine student interest in
   program activities. Student
   attendance at our weekly after-school sessions was generally high
   throughout the year, especially
   after the meeting day was changed to enable those with a conflict to
   come. We were pleased that
   the average weekly attendance was 18 students. By the end of the
   school year, all 25 students
   had participated in at least half of the weekly sessions. Three
   students had participated in every
   weekly session throughout the entire school year! Their continued
   participation in the program
   indicated to us that students were interested in the program's
   activities.

   The brief lessons that started most TFC sessions focused on
   environmental topics and seemed to
   interest most of the students. Some said that this was the first time
   they really understood why
   recycling was important to the community and not just a hassle. In
   addition to learning about
   science-related topics, students used practical mathematical skills
   to tally and weigh the recyclables they collected. By the end of
   the year, students who had had difficulty with these tasks were
   actively participating in the activities.

   At the end of the TFC program, we asked students to fill out
   questionnaires telling us what they
   liked best and what they had learned. From the responses on this
   survey, we think the program
   had a positive effect on the students. Three-quarters of the students
   wrote that they learned you
   could work together to accomplish a goal. Some students mentioned that
   they learned to use new
   skills. Almost half said they had learned how science plays a part in
   everyday life.

   What Students Said They Liked Best About
   the Trash for Cash Program

   Response on Questionnaire                            Percentage
   Getting recognition                                      48
   Working together                                         40
   Making a contribution to the community                   28
   Achieving their goal and getting free NBA tickets        28

   Total number of participants: 25

   Many students said they especially liked getting recognition from
   the community for their efforts; it made them feel important. In
   the words of one student, "I feel like I have something I can give
   to the community." Students also liked working together and helping
   to improve the community. Over half the participants said that
   cleaning up their neighborhood made them feel good.

   To see if we had an impact on the community, we took pictures of five
   areas in the neighborhood
   at the beginning of the school year and again in the spring. These
   pictures were posted on the
   wall of the YACA center for staff, participants, and community members
   to see, and to help raise
   awareness about the program.

   The photographs taken after the program showed that the places where
   our students worked were
   much cleaner than before. The students were very excited when the
   community paper, The
   Central City Weekly, published our before and after pictures of the
   Adams Street playground.
   This publicity brought the students a great deal of pride in what they
   were doing.

   In May, we achieved our goal of collecting 1,000 pounds of
   recyclables. We were very pleased
   that all of our students were eligible for free tickets to the NBA
   game (because they all attended
   at least half of the weekly TFC sessions). We had our basketball night
   on May 25 and everyone
   had a lot of fun. We used money collected from recycling for a pizza
   party before going to the
   game.

   We believe that our program accomplished what it set out to do: to
   clean up the neighborhood, increase students' community
   involvement, and expand their awareness of the relevancy of science
   and mathematics. As one student said, "TFC has been a great thing
   for me and for this neighborhood."

   Trash for Cash Program Final Budget

   Budget Item                                   Budget     Spent
   Salary for Maria Alvarez                      $  825    $  925
   Tickets to NBA Game
     ($30 x 25 participants)                        800       750
   Supplies*                                        250       150
   Honoraria for guest speakers
     ($25 x 7 speakers)                              125       175
   TOTAL                                         $2,000    $2,000

   * Note that some supplies, snacks, and a pizza party were paid for
   with the money earned from recycling. This enabled YACA to pay
   honoraria to 7 guest speakers rather than the 5 originally budgeted
   for.


   Appendix B
   PROPOSAL FOR EXPANDING A PROGRAM


   "More Trash For Cash" Program Proposal
   Written by Maria Alvarez
   Director of Youth and Communities Alive!
   Submitted to the Central City United Way and the Tri-Cities Community
   Foundation

   Youth and Communities Alive! (YACA) is a small community-based
   organization dedicated to serving low-income minority youth. Last
   year, YACA received a $2,000 grant from the Central United
   Methodist Church for a new program called "Trash for Cash" (TFC).
   In its first year, TFC had a great deal of success in achieving its
   objectives of cleaning up the neighborhood, developing students'
   sense of pride in the community, and increasing their awareness of
   the relevancy of science and math. In the words of one participant:

   "The Trash for Cash program really helped me come out of myself. I
   didn't know I
   could be a leader, but now I know I can."
   16-year-old female participant

   We very much hope to build on our successes and continue TFC.
   However, based on our experience last year, we believe the program
   would have a much greater impact on our community if program
   activities were expanded to include more science instruction, more
   guest speakers, and field trips. We also see the importance of
   including middle school students in this program and continuing to
   include high school students to serve as positive role models for
   the younger children. We are applying to new sponsors because the
   Central United Methodist Church does not have funds available for
   an expanded program.

   Trash for Cash: A Success Story
   Trash for Cash (TFC) targeted high school students and lasted one
   academic year. The program
   had three main objectives:

   1. To develop a sense of ownership and pride in the community among
   participating youths

   2. To expand students' awareness of science and mathematics
   applications in
   everyday life

   3. To clean up the neighborhood

   We wanted to reach a broad spectrum of students, especially those
   who might not usually participate in an after-school program. To
   recruit participants, we made presentations in the schools, and met
   with students, teachers, parents, and agency staff to get
   referrals. We wanted to try to get both African American and Latino
   youth from the neighborhood. In all we had 14 girls and 11 boys.
   Thirteen were African American, 8 were Latino, and 4 were white.

   Most TFC sessions began with a brief lesson about the reasons for
   recycling and conservation. Seven guest speakers made presentations
   about various topics related to the environment. In addition,
   students were given primary responsibility for organizing weekly
   community clean-ups and keeping track of the recyclables collected.
   Achieving the 1,000-pound goal set for the year entitled students
   to tickets to an NBA game.

   To see if TFC achieved its objectives, we looked at students' level of
   interest and participation in
   program activities, and their awareness of the relevancy of science in
   their own lives. Attendance
   sheets, observations, and student surveys helped us get this
   information. We also looked at
   changes in the community "ugly" spots chosen for our clean-up efforts,
   using before and after
   photographs and weekly tallies of recyclables.

   We achieved our attendance goal of 75 percent of the participants
   attending at least one-half of
   the weekly sessions. Observations by board members revealed changes
   in students' level of interest and participation in discussions,
   with more students actively
   participating at the end of the
   program than in earlier observations. In addition, students' comments
   seemed to demonstrate a
   greater awareness of the relevancy of science. For example, observers
   noted that many of the
   participants voluntarily made connections between the discussion topic
   and their own personal
   experiences. Student questionnaires provided evidence that supported
   observations. Students
   reported that they liked working together to improve the neighborhood,
   had learned about the
   importance of recycling, and had gained an expanded view of what
   science is and how it relates to
   their lives. Some students said this was the first time they really
   understood why recycling was
   important.

   Students successfully met their goal of 1,000 pounds of recyclables,
   and all received tickets to an
   NBA game. The photographs we took at the end of the year offered
   real proof that our program made a difference: the areas were much
   cleaner, and the students could see the results of their work.

   Building on Success: "More Trash for Cash"
   We propose to build on the TFC success story by continuing and
   improving the program based on
   what we learned last year. The expanded two-year program is called
   "More Trash for Cash." We plan to put more emphasis on academic
   achievement, offering higher quality science experiences to a
   larger number and wider range of students than the original TFC
   program did. We will continue to
   develop the students' sense of pride and ownership in the community
   through weekly after-school
   community clean-up efforts, and in the process, improve the appearance
   of the neighborhood. In
   addition, we hope to increase community awareness of environmental
   issues and recycling. We
   plan to involve parents in clean-up efforts and drum up support for
   recycling among
   neighborhood businesses and community members. High school students
   will make presentations
   about "More Trash for Cash" and recycling at various community
   meetings. Our success in the
   area of public awareness will have a lasting impact on this community.



   Each year of the program, we will work toward collecting at least
   1,500 pounds in recyclables.
   When this goal is reached, participants who have attended at least
   half of the weekly sessions will
   be eligible to receive their choice of tickets to an NBA basketball
   game, a ride on a paddle-wheel
   river boat, or tickets to a performance by the Central Youth Theater.

   The "More Trash for Cash" program will include improved science
   instruction by enlisting the
   help of science educators. A real understanding of environmental
   issues will be gained through
   meaningful hands-on science activities. Joyce Edwards, a biology
   teacher from Franklin High
   School, and Park Central Middle School teacher, Ed Masterson, have
   each agreed to provide biweekly environmental science activities
   during the school year.

   In addition, Dr. Andrea Tybola, an environmental science professor at
   Western State College, has
   agreed to offer her expertise to "More Trash for Cash." She will work
   with the two teachers to
   coordinate the science lessons offered throughout the year. Dr.
   Tybola also has extensive contacts in the environmental community,
   and will help us to bring in high quality guest speakers, including
   a colleague from Western State's Civil Engineering Department who
   will speak to the students about waste water treatment, and a
   colleague with the park service who will discuss the effects of
   pollution on the city's parks. Dr. Tybola's influence will also
   help us coordinate meaningful field trips to sites including the
   city's waste water treatment plant and the Orange Island Biological
   Research Park. These environmental education experiences will be
   invaluable to our students and will prepare them to share their
   knowledge with other community members.

   "More Trash for Cash" will also build leadership skills among high
   school students. A small cadre
   of participants from last year's TLC program will return to serve as
   program assistants for "More
   Trash for Cash." These five students will assist instructors as
   necessary and will help younger
   students with science activities. During the program's second year,
   high school participants from
   year one will be selected to fill these roles. Each year, we expect to
   work with 30 high school
   students (including the five program assistants). The older students
   will serve as positive role
   models for the 20 middle school students that we expect will
   participate each year in "More Trash
   for Cash."

   Monitoring Progress and Evaluating Impact
   In order to keep the program on track and to learn about the impact
   of "More Trash for Cash," we have designed an evaluation that will
   provide both formative and summative data. The following questions
   will guide the evaluation:

     What changes occur in students' interest in community
   involvement, their awareness of real life applications of science
   and mathematics, and their knowledge and skills in science?

     To what extent did the program result in a cleaner neighborhood?
     To what extent is the community aware and supportive of clean-up
   efforts?

   Like last year, we will monitor attendance at the weekly sessions of
   "More Trash for Cash." We
   will also continue to tally the amount of recyclables collected, and
   take before and after pictures


   of selected neighborhood areas targeted for clean-up. All participants
   will be asked to fill out a
   brief survey at the end of the program to answer questions about what
   they liked best and what
   was the most important thing they learned during the program.

   In addition to these activities, we will set aside 15 minutes twice
   each semester for the students to
   discuss in small groups what they think about the program. Students
   will record the major issues
   that come up in these discussions; we will use this information for
   formative evaluation purposes
   and make changes to the program as necessary. Observations of program
   activities and informal
   interviews with participants once per semester will enrich our
   understanding of the impact of the
   program on the participants.

   We plan to document the impact of "More Trash for Cash" on the
   community by attending
   meetings of various community organizations, keeping track of the
   number of businesses actively
   involved in recycling, and possibly conducting a community survey at
   the end of the second year
   of the program.

   Conclusion
   "More Trash for Cash" will offer quality weekly science experiences
   for our neighborhood's middle and high school students: exciting,
   constructive activities that provide an alternative to the many
   negative influences in this neighborhood. The YACA staff feel
   confident that we will be
   successful at implementing this expanded program. We have been running
   enrichment programs
   for the children in our community for the past ten years. More
   specifically, we have already had
   success at running the "Trash for Cash Program" and we learned from
   that experience. We know
   what works and what doesn't work, and we know what our community
   needs. The "More Trash
   for Cash" program proposed here will expand on the ideas that we have
   already seen work with
   students in this community. The students will benefit immensely
   from this program, learning science and mathematics skills that
   will help them throughout their lives and coming to understand the
   importance of protecting the environment and recycling. In addition,
   this program will enable the
   participants to share their positive experiences and their
   environmental knowledge with others, to
   the benefit of the entire community.


   More Trash for Cash Program Proposed Year One Budget

   Budget Item                                   Estimated Cost
   Salaries                                           $ 3,250
     Program Director: $1,500
     Part-Time Program Assistant: $750
     High School Students: 5 @ $200 each
   Awards: Tickets to Community Events
     ($30 x 50 participants)                            1,500
   Stipends for Instructors                             2,000
     Teachers: 2 @ $500 each
     College Professor @ $1,000
   Field Trips (3)*                                     1,500
   Supplies*                                            1,000
   Honoraria for guest speakers
     ($50 x 15 speakers)                                  750
   TOTAL                                              $10,000

   * Note that we expect funds received for recyclables collected by
   students during the program will cover additional expenses related
   to field trips and supplies.


   Appendix C
   ANNUAL PROGRESS REPORT


   "More Trash For Cash" Year One Report
   Written by Maria Alvarez
   Director of Youth and Communities Alive!
   Submitted to the Central City United Way and the Tri-Cities Community
   Foundation

   "I never liked science in school. The More Trash for Cash program
   showed me
   how fun science really is. Plus we got to go to neat places that I had
   never seen
   before. Now I plan to study hard and be a biologist when I grow up."
   12-year-old male participant

   "I've always been kind of shy, I guess. Who would have thought I could
   be a
   leader? But with the More Trash for Cash group I have made
   presentations to the
   PTA and the Ministers' Alliance. It's fun and it's a good cause,
   because we are
   making the neighborhood better."
   17-year-old female participant

   Youth and Communities Alive! (YACA) is a small community-based
   organization dedicated to serving low-income minority youth. YACA
   received $7,000 from the Central City United Way and $3,000 from
   the Tri-Cities Community Foundation for the first year of the "More
   Trash for Cash" (MTFC) program. The program is expected to continue
   at the same funding level for another year. This report summarizes
   changes made to the program based on formative evaluation data, and
   describes the impact of the program evident after year one.

   MTFC Program Activities
   A total of 30 high school students and 20 middle school students
   participated in the MTFC program this past year. Participants
   included 28 girls and 22 boys; 32 were African American, 12 were
   Latino, and 6 were white.

   Each MTFC session began with hands-on activities that engaged
   students in thoughtful investigations into various environmental
   topics. High school and middle school students participated in
   different (but usually related) activities appropriate for their
   grade levels, although on several occasions, we mixed the two
   levels. Activities were planned and presented by our science
   instruction team comprised of high school teacher Joyce Edwards and
   middle school teacher Ed Masterson, and coordinated by Western
   State College faculty member, Dr. Andrea Tybola.
   Activities included weekly collection of trash in the community during
   after-school sessions.
   Students collected recyclables and kept track of the number of pounds
   of recyclables that they


   turned in at a community recycling center. A goal of 1,500 pounds of
   recyclables was set for the
   year.

   In addition to these weekly activities, twice each month guest
   speakers talked to our students about topics ranging from backyard
   bird feeders to global warming. Altogether, 15 community speakers
   visited during MTFC sessions. Most presentations were brief and
   tied in with hands-on activities in order to keep interest levels
   high.

   Two field trips were held this year. In September, we visited the
   waste water treatment plant in
   East Bay. In late April, we hiked through the Orange Island Biological
   Research Park where Dr.
   Evan Felden explained various environmental studies underway and the
   children participated in
   water sampling and testing activities.

   After achieving our goal of collecting 1,500 pounds of recyclables,
   we allowed the children to select the community event that they
   wanted to attend. This year's MTFC culminated with these exciting
   events, when each of our 50 participants attended the NBA
   basketball game or the Central Youth Theater dance performance, or
   took a ride on the River Queen paddle-wheel boat.

   The Evaluation Design
   MTFC has four main objectives that are addressed in the evaluation
   design:
   1. To develop a sense of ownership and pride in the community among
   participating youth

   2. To develop students' science skills and knowledge, and their
   awareness of
   science and mathematics applications in everyday life

   3. To clean up the neighborhood
   4. To increase community awareness and involvement in clean-up efforts

   The evaluation activities for the year were guided by three major
   questions.
     What changes occur in students' interest in community
   involvement, their awareness of real life applications of science
   and mathematics, and their knowledge and skills in science?

     To what extent did the program result in a cleaner neighborhood?
     To what extent is the community aware and supportive of clean-up
   efforts?

   We answered the first question by keeping track of attendance,
   providing participants with opportunities to talk about the program
   several times during the year, and with a year-end questionnaire.
   Observations of program activities were conducted several times
   during the year. To monitor our progress in cleaning up the
   neighborhood, we took before and after photographs at several
   neighborhood sites and kept a weekly tally of the amount of
   recyclables collected by the participants. To gauge community
   support for the MTFC clean-up, program staff attended meetings of
   neighborhood organizations, conducted informal interviews with
   parents and other community members, and documented the number of
   presentations made by our students to community groups and the
   number of businesses actively recycling.

   Changes in the Community
   We took photographs of several sites in the neighborhood at the
   beginning of the school year.
   These were places that needed our children! All of these photographs
   showed a great deal of
   trash. For example, the 2nd Street bridge overpass was piled four feet
   high in one corner with
   miscellaneous trash including hubcaps, newspapers, and even a
   refrigerator door. The Jackson
   Reservoir photo showed Styrofoam cups washed up on the shore and lots
   of soda cans.

   Our students went out in the neighborhood and cleaned it up. Each
   week, we would divide up
   into five clean-up crews and get out there and pick up trash! We
   averaged forty trash bags full of
   non-recyclable trash cleaned up from our community each week. We kept
   recyclables separate so
   that we could tally them and take them to the recycling center. Our
   MTFC students picked up an
   average of 62 pounds of recyclables every week. The week after New
   Year's, we collected a
   record 157 pounds of recyclables!

   We believe that MTFC is having a positive impact on this community.
   Many people see our clean-up crews out working and congratulate the
   children on their efforts. Our "after" photos show how good our
   neighborhood can look with just a little muscle power. We posted
   all the before and after photos on the wall of the YACA center for
   staff, participants, and community members to see, and to help
   raise awareness about the program. The Community Weekly ran a story
   about the MTFC students and included before and after photos. The
   children really got a boost from this publicity.

   Our high school students made five presentations to different
   community groups and businesses. Our observations at community
   meetings show that people are starting to notice that the
   neighborhood looks better. However, at this point, adult members of
   the community are not themselves participating in the clean-up
   efforts.

   Changes in Students
   Our evaluation information shows that MTFC has had a great effect
   on the students that participate. Attendance has been high,
   although weekly participation in MTFC sessions dropped between
   September and October. During a student discussion group in
   October, we learned that our scheduled Thursday sessions conflicted
   with other extracurricular activities, particularly for the high
   school students. We changed our meeting time to Wednesday
   afternoons and found that attendance improved.

   After we changed the meeting time, student attendance at weekly
   after-school sessions remained generally high through the rest of
   the academic year. We averaged 37 attendees per weekly session,
   with even higher attendance (42 on average) on days with guest
   speakers. In fact, some participants brought siblings and friends
   to MTFC sessions, so there were often even more children involved
   than the numbers indicated in this report (we did not include the
   unregistered attendees in our evaluation). We encouraged this,
   because the more children participating in the neighborhood
   clean-ups, the better.

   The continued high level of participation in the program indicated to
   us that the students were
   interested in our activities. At the end of the school year, 48 of the
   50 registered participants had
   stayed with the program and each had participated in at least half of
   the weekly sessions. Twenty-two
   students attended at least 23 of the 27 weekly MTFC sessions.
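
   (For groups that keep attendance on a computer, figures like these
   are easy to derive. Below is a minimal sketch, in Python and with
   hypothetical entries, of how the per-session average and the
   half-or-more count could be computed from an attendance sheet:)

      SESSIONS = 27  # weekly MTFC sessions held during the year

      # Sessions attended by each registered participant (hypothetical
      # entries; a real sheet would list all 50 participants).
      attendance = {"student_01": 27, "student_02": 14, "student_03": 23}

      average_per_session = sum(attendance.values()) / SESSIONS
      half_or_more = sum(1 for n in attendance.values() if n >= SESSIONS / 2)

      print(f"Average attendees per weekly session: {average_per_session:.1f}")
      print(f"Attended at least half of the sessions: {half_or_more}")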

   We think the program had a positive effect on the students. When asked
   what they had learned,
   two-thirds of the students wrote that they learned you could work
   together to accomplish a goal.
   Others mentioned that they saw science "in action," learned to use new
   skills, learned interesting
   things from guest speakers, and became more aware of their
   neighborhood's trash problem.
   Ninety percent of the students rated the MTFC program as "Great!" Said
   one student:

   "I couldn't believe that we would clean up Bailey Avenue one week and
   then I
   went by there the next day and there was already more garbage on the
   street.
   I just couldn't believe it. I tell everybody I'm with not to litter."
   12-year-old female participant

   What Students Said Was the Most Important Thing They Learned
   from the More Trash for Cash Program

   Response on Questionnaire                      Percentage of Responses
   Learned that working together, you can make a difference      69
   Learned or improved math/science skills                       54
   Learned interesting things from speakers                      48
   Became more aware of the neighborhood's trash problem         29

   Total number of participants responding: 48

   Observations of MTFC sessions in October and May showed that
   participating children made great strides in developing skills used
   for scientific investigation. Early in the school year, only a few
   of the students were actively involved in observing, measuring,
   recording, and drawing conclusions. By the end of the year, the
   majority were contributing to these efforts.

   Early in the year, we learned from student discussion groups that
   many of the younger children were frustrated with some of the tasks
   that required mathematics skills. The program staff discussed these
   problems and we decided to have the students work together in
   teams. Each team included students with different levels of
   mathematics skills and at least one high school student


   assistant. This solved the problem, as evidenced by student comments
   later in the school year.
   Said one participant:

   "I didn't know how to multiply big numbers before. But Janeesha helped
   me
   learn how. Now I help our team do our tally every week because I know
   I'm
   going to go to that basketball game!"
   11-year-old male participant

   By the end of the school year, most of the students seemed quite
   confident in their ability to do
   these everyday mathematical tasks; those who had been "math-shy" at
   the beginning now actively
   participated. Middle school students seemed especially intrigued by
   the activities focusing on
   weight and volume. Said one participant:

   "I couldn't understand at first how we could collect a whole bag full
   of plastic
   milk jugs and it only weighed two pounds! A whole bag that I could
   hardly
   carry by myself!! And then Charles showed off because his little bag
   of alu-minum
   cans weighed 2.2 pounds! It took a while to understand that!"
   11-year-old female participant

   On several occasions, we mixed middle and high school students in work
   groups. It was a good
   way for the students to help each other with hands-on science
   activities. We found that mixing
   the groups enhanced everyone's experience. The younger children loved
   working with the big
   kids, and the high school students enjoyed the excitement of the
   younger ones. According to one
   high school student:

   "I didn't really want to deal with the little kids at first. But I
   actually found
   that they were cool to work with and really funny."
   15-year-old female participant

   The MTFC hands-on science activities that kicked off each weekly
   session were a great hit with all the students. For most of these
   children, MTFC was their first brush with "real" science: the first
   time they saw that science really mattered in their daily lives.
   The guest speakers and field trips complemented and reinforced the
   concepts we investigated in the activities. On the questionnaire at
   the end of the school year, 50 percent of the students said that
   they liked the field trips best of all the program activities.

   We feel that the participants gained a real understanding of the
   importance of recycling and the
   human impact on the environment. During observations and informal
   student interviews, students
   frequently commented on various environmental issues that they were
   newly aware of, and
   discussed different ways that they could personally help clean up the
   planet.

   When asked what they liked best about the program, most high school
   students mentioned the satisfaction they gained from improving
   their community and "making a difference." In contrast, a majority
   of the younger participants enjoyed the recognition that they
   gained from the program: the NBA tickets, the newspaper story, and
   having their efforts displayed at the YACA Center.


   [Bar chart: What Participants Liked Best About "More Trash for
   Cash." Shows the percentage of respondents choosing "Improving the
   Community" versus "Recognition for their Efforts," plotted
   separately for high school (28 participants) and middle school (20
   participants) students.]

   Where Do We Go From Here?
   We believe that the MTFC program is making progress toward our
   objectives to develop students' interest in community involvement,
   their awareness of real life applications of science and math, and
   their knowledge and skills in science; to clean up the
   neighborhood; and to increase community awareness about clean-up
   efforts. We plan to continue weekly neighborhood clean-up efforts.
   We know from surveys and informal interviews that the students are
   enjoying the clean-up activities, the hands-on science activities,
   and the field trips and guest speakers. We plan to do similar MTFC
   activities next year.

   One area that was not as successful as we had hoped was getting the
   community actively involved. We are going to work harder to make
   the community aware of environmental issues, recycling, and the
   efforts of the MTFC students. During a student discussion group
   this past May, several high school students commented that they
   really wanted to make others in the community more aware of MTFC
   efforts. These students have an action plan for getting the word
   out. They will work together to put on more presentations for
   community groups to spread the word about MTFC and drum up more
   support for the program. We think these activities will also
   enhance our students' leadership abilities as they take an active
   role in talking to adults in the community about the importance of
   environmental action.

   Many of our students have been "spreading the word" about recycling
   with their families and friends, but we want to organize more
   family activities to get parents truly involved. We hope to
   schedule some community clean-up days on weekends and post flyers
   so that community members know they are welcome to join in.


   To gauge community awareness and support for MTFC efforts, we plan to
   conduct a door-to-door
   neighborhood survey during the second year of the program. We plan to
   ask community
   members if they have noticed changes in the neighborhood, if they
   would like to participate in the
   clean-up, and if they have heard of MTFC.

   We think expanding awareness of MTFC in our community will have a huge
   impact on this
   neighborhood. The students will gain self-confidence from making
   presentations and being
   leaders in these activities, community members will become more aware
   of environmental issues,
   and the neighborhood itself will be improved if more people
   participate in recycling and clean-up
   activities. We hope that by spreading the word throughout the
   community, the MTFC program
   will have a lasting impact on this neighborhood.

   More Trash for Cash Program Year One Budget

   Budget Item                                   Budget      Spent
   Salaries                                     $ 3,250    $ 3,250
     Program Director: $1,500
     Part-time Program Assistant: $750
     High School Students: 5 @ $200
   Awards: Tickets to Community Events            1,500      1,250
   Stipends for Instructors                       2,000      2,000
     Teachers: 2 @ $500 each
     College Professor @ $1,000
   Field Trips (3)*                               1,500      1,500
   Supplies*                                      1,000      1,250
   Honoraria for guest speakers
     ($50 x 15 speakers)                             750        750
   TOTAL                                        $10,000    $10,000

   * Note that some supplies and additional field trip expenses were
   paid for with funds received for recyclables collected by students
   during the program.


   GLOSSARY OF TERMS
   Evaluators do not always agree about how to use evaluation terms.
   This can lead to some confusion when you are first exploring the
   field. Some terms, like questionnaire and sample, are very specific
   and therefore are used consistently from one evaluator to another.
   Other terms, like formative and summative evaluation, can vary in
   subtle ways. We have simplified our use of these terms in order to
   give you an easy introduction to the key concepts of evaluation.

   You will undoubtedly come across other definitions or uses of some
   terms when you read other sources and talk to other evaluators. For
   the time being, however, here is a summary of how we have used key
   evaluation terms in this manual.

   baseline information Documentation of people, conditions, or events
   before a program begins. Provides evaluator with data to compare to
   information collected during and at the end of a program to gauge
   impact.

   biased Influenced in a particular direction. Evaluation data may be
   biased if it presents only a single point of view, as opposed to a
   variety of perspectives (e.g., participants, staff, community
   members). Similarly, asking only the most active participants to
   rate a program may bias the results and prevent you from learning
   why less active participants choose not to take part in program
   activities.

   CBO Community-based organization. This
   manual is written primarily for CBOs that
   offer science and mathematics programs for
   young people.

   closed-ended question Survey questions that provide respondents
   with a selection of possible answers (agree/disagree/no opinion;
   yes/no/don't know) and ask them to select the answer that best
   matches their beliefs or feelings. Responses can be tallied to
   provide quantitative data.

   data analysis The systematic examination and interpretation of
   information gathered through various data collection strategies,
   including document review, observations, interviews, and surveys.
   For most CBO program evaluations, data analysis is best focused
   around program objectives, intermediate indicators, and final
   outcomes.

   data collection The accumulation of information for evaluation
   through document review, observations, interviews, surveys, or
   other strategies.

   demographic information Descriptive data that includes
   race/ethnicity, gender, age, grade level, socioeconomic status, and
   similar kinds of information. Can help in the analysis of program
   impact on different groups of participants, and in proving that you
   reached the audience your program targeted.

   direct quote Words, sentences or paragraphs taken directly from a
   person or group, through observations, interviews, or surveys.
   These excerpts use the respondent's exact words as opposed to
   paraphrasing or summarizing.

   document review The examination of records or documents that reveal
   information about the context in which a program occurs, about
   people's behavior, and about other conditions or events. Evaluators
   can make use of existing records (e.g., report cards) or develop
   forms especially for the evaluation (e.g., participant journals,
   attendance sheets).

   external evaluation Activities undertaken by
   a person or group outside the organization to
   determine the success of a program.

   final program outcome Changes you expect
   to see, hear, or measure which can tell you if
   your program achieves the goals for which it
   was designed.

   focus group An interview conducted with a small group of people. We
   find that focus groups often work best when participation is
   limited to 8 to 10 people. A focus group enables the evaluator to
   get in-depth information from a group of people in a short amount
   of time.

   formal interview A conversation in which the evaluator obtains
   information from a respondent or group of respondents by asking a
   set of specific questions.

   formative evaluation Data collection activities and analysis that
   occur over the course of program implementation. A process used to
   determine whether or not a program is working: What progress is
   being made toward program objectives? How do we use feedback
   information to improve the program, refine data collection
   activities, and identify problems or issues of importance that were
   not evident before a program began?

   goal The end: what CBOs hope programs will accomplish in the long
   run.

   informal interview A spontaneous conversation between evaluator and
   respondent. The interviewer uses no guidelines or protocol;
   questions are guided by the context of the situation.

   intermediate indicator The kinds of
   progress you expect to see if your
   program is moving toward achieving its
   objectives.

   internal evaluation An examination of
   program activities conducted in-house by
   CBO staff.

   interview A conversation in which the evaluator obtains information
   from a respondent or group of respondents. Interviews can be formal
   or informal; structured, semi-structured, or unstructured;
   individual or in focus groups; in person or by telephone.

   needs assessment Information collected
   before a program is planned or
   implemented to help staff identify needs
   and target audiences, and to develop
   appropriate strategies. Sometimes
   referred to as front-end evaluation.

   objective A means to achieving a goal;
   what CBOs hope their program will
   achieve.


   observation In-person, firsthand examination
   of program participants and activities.

   open-ended question Survey and interview
   questions that allow people to respond in their
   own words. No answer categories are
   provided on the questionnaire or in the
   interview protocol. Questions are worded to
   discourage simple "yes" or "no" answers.

   organizational mission The reason why a
   CBO exists. Program goals are often closely
   related to an organization's mission.

   participatory evaluation The involvement of program staff in the
   design and implementation of an evaluation conducted by a person or
   group external to the organization.

   probe Follow-up questions asked during an
   interview to help get at key issues and clarify
   what the respondent means. Probes may be
   included in the interview guide or protocol to
   help obtain the information needed for the
   evaluation.

   program evaluation Data collection and
   analysis which enables program staff to
   improve program activities while they are in
   progress and to measure the degree to which
   a program ultimately achieves its goals.

   protocol A set of questions used as a guide
   for conducting observations or interviews to
   help ensure that the appropriate information is
   collected from each respondent.

   qualitative data Information typically gathered through document
   review, observations, and interviews. Often expressed in words as
   opposed to numbers, although some qualitative data may lend itself
   to tallying and numerical presentation.

   quantitative data Information measured and expressed with numbers,
   typically gathered through surveys. Can be presented in a variety
   of ways, including numbers or percents, ranges or averages, tables,
   and graphs.

   questionnaire The written instrument used to collect information as
   part of a survey. Can include closed- and open-ended questions, and
   questions that obtain demographic information about the respondent.

   response rate The number of people who respond to a questionnaire,
   as compared with the number of people who received the
   questionnaire (for example, 40 completed questionnaires returned
   out of 50 distributed is an 80 percent response rate). Evaluators
   often follow up with non-respondents to raise the response rate and
   obtain more accurate results.

   sample A subset (of people, documents, or things) that is similar
   in characteristics to the larger group from which it is selected.
   In evaluating large programs, CBOs might interview a sample of
   participants or review a sample of meeting notes instead of
   interviewing all participants or reading all meeting minutes.

   summative evaluation Data collection activities and analysis which
   help determine how successful a program has been at achieving its
   goals. These activities generally occur toward the end of a
   program, or at appropriate breakpoints in multi-year or ongoing
   programs.

   survey A method of collecting information by mail, phone, or in
   person. Surveys involve a series of steps including selecting a
   sample, collecting information, following up with non-respondents,
   then organizing and analyzing data.

----------
End of Document


VICUG-L is the Visually Impaired Computer User Group List.
To join or leave the list, send a message to
[log in to unmask]  In the body of the message, simply type
"subscribe vicug-l" or "unsubscribe vicug-l" without the quotations.
 VICUG-L is archived on the World Wide Web at
http://maelstrom.stjohns.edu/archives/vicug-l.html

