Lab Section Overview:

Practice, work through, and apply various aspects of the evaluation process using the labs in this section: Creating a Logic Model, Formulating and Prioritizing Evaluation Questions, Human Subjects and Informed Consent, Identifying Evaluation Types, Identifying Evaluation Methods, Planning for and Collecting Data, Analyzing and Interpreting Data, and Reporting and Using Findings.

Each Lab Contains:

  • Key Information: a recap of content you read about in Evaluation 101 that you will need and use to complete the lab.
  • Think About: questions or items to think through on the lab topic in preparation for completing the lab activity.
  • Activity: instructions on and guided practice for trying out and applying what you have learned on the lab topic.
  • Resources: links to Web pages or other sources containing information to help with the lab work.

Important Information:

You will need to use information about a project or program to complete the brainstorm and activity for each lab. You may choose to read a case study (or sections of multiple case studies) and use the project or program information contained in the case to complete the lab. Alternatively, you might use an actual, potential, or imagined project or program from your work or life that you are evaluating, may evaluate, or could evaluate.

Lab 1: Building a Logic Model

A logic model is a visual way of showing how you believe your program will work to solve a problem (that is, your theory of how the program will work).

Key Information:

In a logic model, you describe and depict the relationship among these program factors:

Inputs / Resources

What materials, money, staff, and other assets are available and necessary for the program’s operation?

Actions / Activities

What needs to be done to achieve the program goals?

Outputs

What are the tangible results produced by program actions/activities?

Outcomes

What are the specific consequences of the program, or changes in program participants’ learning, knowledge, attitudes, behaviors, skills, conditions, status, etc.?

Impact

What are the fundamental changes in communities, organizations, etc. that occur as a result of the program activities?
Question 1

Think about a program or project that you would be interested in evaluating (or use one of the Case Studies you read). What are the goals or objectives of this project or program?

Question 2

What data and information already exist? What are the sources?

Question 3
Is a logic model needed for this project? Why?
Question 4

If you want to develop a logic model, start by thinking about the impact you are trying to achieve and then think backwards about what will be needed to achieve this desired impact. (Refer to the Key Information section of this page for definitions or details about each of the following.)

  • What is the desired overall impact (or long-term outcome) of the project?
  • What are the project outcomes that will lead to this impact?
  • What are the outputs (or tangible results) of the program activities that will lead to the program outcomes?
  • What actions and/or activities are needed to achieve these program results?
  • What inputs/resources are needed to complete the project activities?

Outcome: Create a logic model for a program or project that you would be interested in evaluating (or for one of the Case Studies you read).

Note: To complete this activity, use whichever type of tools you prefer: computer-based, or tangible materials such as paper, scissors, a marker, and glue. Tangible materials let you move things around and discuss with others if you’re working with a team, but you may be more used to a computer for generating information displays.

Part 1

Download (and print) the Kellogg Logic Model Development template.

Part 2

In Think About, you chose a program or project that you would be interested in evaluating. Using the template and your work in Think About, jot down the project/program factors for this program or project. (If there aren’t enough boxes or space to list everything, add additional boxes or create your own sheet.) Make sure you list the program elements:

  • Impact (list this at the top of the page)
  • Outcomes
  • Outputs
  • Actions/Activities
  • Inputs/Resources

Part 3

Review the information you entered on the template. Remove or delete any information that doesn’t apply, or merge listings that seem similar.

Part 4

Does the order in which you have placed items/factors make sense? If you have time, rework the graphic and create a new document to better show the relationships among factors. (Remember, you could do this electronically or physically with paper, scissors, a marker, and glue.)

  1. At the top of a new document, type or write “Logic Model for…” followed by the program name. Add the impact from the original template below this heading.
  2. List the heading names from the template on the new graphic: (from left to right) inputs, activities, outputs, and then outcomes.
  3. Now, copy or cut out the program factor boxes from your original template, and arrange these boxes (or program factors) on the new graphic. Start with each outcome and work backwards to inputs/resources. (What output is needed for this outcome? Which activity or activities are needed for this output? And so on…)
  4. Once the arrangement of the boxes seems solid, glue down each box (or save the document on the computer), and then add lines or arrows between related or dependent factors.
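If you build the model electronically, the backwards-mapping in step 3 can also be sketched as a small data structure. The sketch below, with entirely hypothetical factor names, shows one way to record which factors depend on which, and to trace an outcome back to its inputs:

```python
# Sketch: a logic model as a dependency map (all factor names are hypothetical).
# Each factor maps to the factors that must be in place to produce it,
# mirroring the "work backwards from each outcome" step above.
depends_on = {
    "Improved reading scores (outcome)": ["Tutoring sessions held (output)"],
    "Tutoring sessions held (output)": ["Run weekly tutoring (activity)"],
    "Run weekly tutoring (activity)": ["Volunteer tutors (input)", "Books (input)"],
}

def trace_back(factor, chain=None):
    """Return every factor needed, working backwards to inputs/resources."""
    chain = chain or []
    chain.append(factor)
    for prerequisite in depends_on.get(factor, []):
        trace_back(prerequisite, chain)
    return chain

print(trace_back("Improved reading scores (outcome)"))
```

This mirrors the paper-and-glue exercise: starting from an outcome, the trace lists the output, activity, and inputs that must connect to it with lines or arrows.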

Part 5

View the example logic models, including two models created using Case Study 3 (model 1 and model 2). How does the logic model you created compare to those? Examine the type of information that was put in each category. Note any needed changes or adjustments.

W. K. Kellogg Foundation’s Logic Model Development Guide

Everything You Wanted to Know About Logic Models but Were Afraid to Ask

Harvard’s Logic Model Development Worksheet

Web-based course on developing logic models

PowerPoint presentation with information on developing logic models

Lab 2: Formulating and Prioritizing Evaluation Questions

Evaluation questions must be clear and measurable, and together they should address all or most aspects of the project.

Key Information:

Use of the Logic Model

Components of a logic model can be helpful when framing evaluation questions. The main questions would be about outcomes and implementation.
It may also be important to ask questions about inputs and outputs.
Phases and Stages
Questions will depend on the phase of project development and the purpose of the evaluation.
Stakeholder Input
Meet with stakeholders to brainstorm and formulate questions.
Prioritize and select evaluation questions that are most critical, informative, and realistic to answer.
Question 1

Think about the program or project that you considered in Lab 1: Logic Model, or think about another program or project that you would be interested in evaluating. (You can use one of the Case Studies you read as an example.) What is/are the desired outcome(s) of the program? (Refer back to the outcome or outcomes you listed on the logic model you created in the Lab 1: Logic Models activity.)

Question 2

Identify the program stakeholders – or people with an interest in the effectiveness of the project or the evaluation results. (Project staff, funders, and participants? Community leaders, partners, or others?)

Question 3

Determine a few potential questions about the project or program, and then for each question, reflect on the following:

  • Which stakeholders would be interested in the question?
  • How important is the question to stakeholders?
  • Is data collection required to answer the question you posed or do the data already exist?
  • What resources are required to answer the question? Are these resources available? (If you are using a project for which you created a logic model in Lab 1: Logic Models, refer back to the resources listed on the logic model.)
  • What timeframe is required to answer the question?

Outcome: Generate and prioritize evaluation questions for the project.

Part 1

Refer back to the logic model you created in Lab 1: Logic Models, or use one of the examples listed on the Activity page of Lab 1: Logic Models. List as many questions as you can that would be of interest about project implementation and impact. For example:

Formative questions about implementation could be created to determine…

  • whether or not program developers are doing what they said they would do. (Refer to the activities section of the logic model.)
  • how well the program is going. (Refer to the outputs section of the logic model.)

Summative questions about impact could be created to determine…

  • if the program actions/activities are making a difference or having an impact. (Refer to the outcomes/impact of the logic model.)

Use each component of the logic model to frame questions: Inputs, Activities, Outputs, Outcomes, Impact. For example, for “Inputs,” you might ask whether the funding for a particular aspect of the program or project was adequate. For “Outputs,” you might ask how many parents attended the “Parent Resource Night” designed to increase parent involvement in school programs.

Part 2

Use the bulleted questions in Think About question 3 and determine the priority (high, medium, or low) of each evaluation question you created. Write the priority level next to each question. Remove or erase any questions that could or should be eliminated.
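If you are working electronically, the ranking in this part can be kept in a simple sorted list. A minimal sketch, with hypothetical questions and priority labels:

```python
# Sketch: ranking evaluation questions by priority (the questions and
# priority assignments below are hypothetical examples).
questions = [
    ("Are activities being delivered as planned?", "high"),
    ("How satisfied are partner organizations?", "low"),
    ("Did participant knowledge improve?", "high"),
    ("Were materials distributed on time?", "medium"),
]

rank = {"high": 0, "medium": 1, "low": 2}

# Sort so the most critical questions come first; drop any marked "low",
# just as the instructions suggest removing questions that should be eliminated.
prioritized = sorted((q for q in questions if q[1] != "low"),
                     key=lambda q: rank[q[1]])
for text, level in prioritized:
    print(f"[{level}] {text}")
```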

Part 3

Is each question you created clear and measurable? If not, revise the question to make it more specific. Are there ways you can improve or eliminate any of the questions you developed?

Part 4

How do you think the program will turn out? Make some guesses about how program implementation will go and what impact the program will have.

Chapter 5, Overview of the Design Process for Mixed Method Evaluations from The National Science Foundation's User-Friendly Handbook for Mixed Method Evaluations:
Link to the entire handbook:

Evaluation Questions that Specify Impact:

Lab 3: Informed Consent and Human Subjects

If your study involves human subjects, you are responsible for protecting and promoting the well-being of these study participants.

Key Information:
Federal Requirements
According to federal policy, human subjects are “living individual(s) about whom an investigator (whether professional or student) conducting research obtains (1) data through intervention or interaction with the individual, or (2) identifiable private information."
Federal laws require that research studies using human subjects obtain informed consent from these participants. Informed consent means that a participant voluntarily makes an informed decision to be involved in the study.
Informed Consent
In order for a potential participant to make an informed decision about whether or not to participate in the study or data collection, you must provide the participant with information to make sure they understand the study – its purpose, potential risks and benefits, alternatives to participation, and how the information collected will be used.
Oftentimes, a participant signs a written consent form to document that consent is given.
Participants must understand that study participation is voluntary, and they can choose not to participate in or withdraw from the study at any time.
If children are involved in the study, you must obtain informed consent from the child’s parent or legal guardian.


Question 1

Who are the participants in this program? From or about whom will data be collected?

Question 2

Does the data collection involve intervention or interaction with these participants?

Question 3
Will you be able to identify participants from the data collected or will collected data include private or personal information about the participants?

Outcome: For the projects described in the scenarios under the two activities below, determine what procedures were or need to be undertaken to protect human subjects.

Activity 1:

To complete this activity, follow these steps: 

Part 1

Read the scenario below. Then answer this question:

  • What did the researchers do to protect and promote the well-being of the study participants? Write down your reflections.


In the Lab Schools project, our data collection efforts primarily focus on students. We are collecting data on students' assessment scores, their grades in school, demographic characteristics, and their perceptions about their academic abilities and experiences. In addition to obtaining student and parent consent for every student, we ensure anonymity of individual responses. For instance, we only use identifying information to link individual student records (that is, we want to be able to know which survey responses, which assessment scores, and which set of grades all belong to Johnny Smith). In addition, we never use any identifying information (such as student names, student ID numbers, or teacher names) when reporting the eventual results.

Part 2

View two example consent forms: example 1 and example 2. Then adapt that consent form to be appropriate for the study described above or for the project or program you used in Lab 1: Logic Models or Lab 2: Framing Questions.


Activity 2:

To complete this activity, follow the steps below.

Part 1

Read the scenario below. Go back to the questions under Think About and answer them according to the information in the scenario.


A research firm was asked to undertake an evaluation of a training program for day care center workers. In initial negotiations, the stakeholders expressed a strong desire for an impact evaluation based on program goals. The goals include those that address the knowledge and learning of the workers, their ability to implement what they learn from the training, as well as impact on children in the care of the workers. Children in the day care centers range from 2 to 6 years of age.

Part 2

Read the Key Information presented at the beginning of this lab. Based on that information, make a list of the things someone would need to do in the study described in the scenario to protect human subjects.

Human Subjects

Office for Human Research Protections (OHRP)

Informed Consent / Consent Forms

Consent Form Wizard:

University of Michigan site on informed consent:

Lab 4: Identifying Evaluation Types

There are different types of evaluation, and different terms are used to describe them. Two common terms for major types of evaluation are formative and summative. It’s important to know what type of evaluation is appropriate for the questions you want to answer.

Key Information:
Formative Evaluations
Focus on the "development and improvement of a project" or project progress.
Usually take place at the beginning of the project and occur more than once.
Include questions such as:
  • Are components of the project being carried out as intended? If not, what has changed and why?
  • Is the project moving according to the projected timeline?
  • What is working well? What are the challenges?
  • What needs to be done to ensure progress occurs according to plan?
Summative Evaluations
Focus on the outcomes and impact of a project after a project has been completed.
Include questions such as:
  • Were the project’s goals met?
  • What components of the project were the most effective?
  • Are teachers incorporating new strategies into classroom practices?
  • What have students learned? How does their achievement compare to previous students? To others at the same grade level? In other schools, districts?
Question 1

When would you consider doing only a formative evaluation and not a summative evaluation?

Question 2

Who would be important people to consult or get information from when conducting a formative evaluation?

Question 3

If you decide to do both a formative and summative evaluation, you need to think about whether and when you will make changes to the program based on the data you collect from the formative portion of the evaluation. If you make changes to the program, you also have to think about adjusting your summative evaluation timeline so that you collect impact data on the program as it was modified. How would you do that?

Question 4

When do you think the data collection for a summative evaluation would occur? At the endpoint only or at appropriate “break points?”

Question 5
Why would it be important to collect “baseline” information for a summative evaluation?

Outcome: Select appropriate evaluation approaches for example evaluation scenarios.

Part 1

Think about the project or program you considered if you did Lab 1: Logic Models or Lab 2: Framing Questions, or review the four scenarios below to determine which type of evaluation—formative, summative, or a combination—would be most appropriate in each case. Remember that formative data are often used to make modifications to the project (course corrections), and summative data are used to examine impact and to make decisions about whether to continue the program. Which approach seems most appropriate? Why? Are there other needs in the project that can’t strictly be addressed with one of these approaches?


#1. Two years ago the Carlton Senior Citizens Association initiated a Community Safety Project. The project involves service personnel visiting the homes of elderly people and giving advice about safety. Follow-up visits are designed to check on the implementation of the advice given. The project is well managed by the director of the association. The project committee wants a study that will determine whether the project has been effective.

#2. A government department was in the process of developing a workplace basic education program, in which classes in reading, writing, and oral communication could be provided during work time for employees. An important objective was to develop a program that would be responsive to the changing needs of industry, while at the same time acknowledging the need for a basic education for working people. They needed to locate workplaces where the need for such programs was greatest and to use the most effective teaching strategies. They also wanted to make sure a plan was in place for assessing the effectiveness of the program.

#3. Policy-makers in a state department of education and the arts were concerned about the content and practice in the teaching of music in public schools. While most schools offered music programs, some officials believed that the curriculum was unresponsive to the needs of the large majority of students. An evaluation study was commissioned to investigate the accuracy of these perceptions and to provide guidelines for a revision of music curriculum policy.

#4. A research firm was asked to undertake an evaluation of a training program for day care center workers. In initial negotiations, the stakeholders expressed a strong desire for an impact evaluation based on program goals. The goals include those that address the knowledge and learning of the workers, their ability to implement what they learn from the training, as well as impact on children in the care of the workers. Children in the day care centers range from 2 to 6 years of age.

Part 2

Refer back to Lab 2: Framing Questions. Select one of the scenarios above and write a few formative or summative questions that would need to be answered to conduct the study and meet its intended purpose.

NW Regional Educational Laboratory’s information on formative evaluations:

NW Regional Educational Laboratory's information on summative evaluations:

Intro to Evaluation – Types of Evaluation:

Formative vs. Summative Evaluation:

Lab 5: Identifying Evaluation Methods

One of the most complex—but interesting—aspects of developing an evaluation plan is deciding what methods to use because there are so many options. The challenge is to select the right combination to answer the questions you have about the project or program.

Key Information:
Approaches to data collection
Data collection methods can be categorized as:
  • Quantitative (numbers): surveys, assessments, grades, etc.
  • Qualitative (words): interviews, focus groups, open-ended questions, etc.
  • “Mixed Methods” -- a combination of the two
Differences in approaches
Keep in mind that it is difficult to make hard and fast distinctions between qualitative and quantitative methods, but in general:
  • Quantitative approaches tend to be less time-intensive, easier to administer to a large number of participants, and, in turn, more generalizable.
  • Qualitative approaches tend to be more time-intensive and more difficult to use with a large number of participants, but provide more in-depth information.
Question 1
For each of the evaluation questions you have identified for a project, what data already exist that could be used to answer those questions, and what is the advisability of using those data (e.g., completeness, age, representativeness, etc.)?
Question 2

What additional information is needed to answer the questions? What are the potential sources for those data?

Question 3

Can the questions be answered using quantitative data (e.g., achievement, survey, or other count data)? What form are those data in? What additional data should be collected? How often should the data be collected?

Question 4

What additional qualitative data (e.g. interviews, focus groups) would complement what is already available or any quantitative data that you might collect?

Question 5
What resources are available to support the collection of data (e.g., staff time, expertise for developing/analyzing surveys, appropriate respondents, etc.)?

Outcome: For a project you select, develop evaluation questions and plot on a matrix along with selected evaluation methods.

To complete this activity, follow these steps:

Part 1

Select a project—either one of interest to you that you perhaps used in an earlier lab, or one of the scenarios from the Lab 4: Evaluation Types Activity.

Part 2

Identify what you think some evaluation questions should be and enter them on the matrix (you may want to review Evaluation 101: Framing Questions or the Case Studies).

Part 3

Next, identify appropriate evaluation methods for answering the questions, plot them across the top of the matrix, and make X’s in the appropriate boxes. Use the chart below as a reference for methods, and refer back to Evaluation 101: Select Methods or the Case Studies as needed.

Remember that it is important to keep in mind the available resources/cost-effectiveness and how realistic and credible each technique would be.

Quantitative Methods
  • Surveys or questionnaires
  • Statewide assessment data
  • Teacher content assessments
  • District assessments
  • School or classroom assessment or student work or grades
  • Non-cognitive data such as attendance and discipline referrals

Qualitative Methods
  • In-depth interviews
  • Focus groups
  • Concept maps
  • Student work
  • Teacher observations
  • Open-ended questionnaires or reflection forms
  • Teacher journals
  • Lesson plans

Lab 6: Planning for and Collecting Data

A key to conducting a successful evaluation is the development of a careful plan for collecting and analyzing the data and for having discussions about the interpretation of the findings.

Key Information:
Importance of Planning
It is essential to develop a careful plan for data collection so that you are able to collect data in an efficient and timely manner, within the resources available.
An important part of planning is to establish an appropriate timeline for data collection. Having baseline data from the beginning of the project is often important for assessing change or impact. You also want to give the project enough time to gain traction before assessing progress or impact.
Set aside time for trying out or piloting any instruments you plan on using—even if it means simply trying them out with two or three people who would be representative respondents. Even after many reviews of a questionnaire, for example, you don’t know how the questions will work until someone tries to answer them. If possible, have the person “think aloud” with you as they go through it.
Be Realistic and Efficient
Set aside 5-10 percent of staff time for evaluation activities and 10-15 percent of the program budget for evaluation activities.
Stay focused on the information needed to answer your specific evaluation questions.
While it is unquestionably necessary to be mindful of your time and resources, it is also important to try to “triangulate” your sources for answers to specific evaluation questions, i.e., it’s not advisable to have only one source of information for any one question. For example, it is a good idea to compare survey results with responses to similar questions from interviews with several people.
Question 1

Who needs to be involved? What points of view need to be represented?

Question 2

What multiple ways can you use to gather information related to your specific evaluation questions? 

Question 3

What are key points in time when you need to collect data? When will the project begin? Is there an end point or interim points when you need to be sure to collect data? 

Question 4

How will you select individuals from whom you want to gather information? How can you obtain their buy-in?

Question 5
What preparation activities do you need to plan for, e.g., obtaining consent forms, piloting instruments, getting support of administrators?

Outcome: For a project you select, develop a plan and timeline for data collection activities.

To complete this activity, follow these steps: 

Part 1

Select a project—either one of interest to you that you perhaps used in an earlier lab, or one of the scenarios from the Lab 4: Evaluation Types Activity.

Part 2

If you created a logic model for this project under Lab 1: Logic Models, refer back to that. If not, either create a logic model or simply list the activities that you will need to accomplish as steps in the evaluation. If you use the logic model, look at the list of activities for the project. Note whether they are in sequential order and then create a list of evaluation steps that would follow that order.

Part 3

Think about key data collection points for the evaluation. Plot the activities and respondents on a timeline. Click here to view an example.

Part 4

For each activity, think of the length of time and participants needed. Be sure to plot project start and end dates. Make sure all of the work falls within those dates.
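The check in this part, that every data collection activity falls within the project start and end dates, can also be done programmatically. A minimal sketch, in which all dates and activity names are hypothetical:

```python
from datetime import date

# Sketch: checking that each data-collection activity falls inside the
# project window (all dates and activity names below are hypothetical).
project_start = date(2024, 9, 1)
project_end = date(2025, 6, 30)

activities = [
    ("Baseline survey", date(2024, 9, 15), date(2024, 10, 1)),
    ("Mid-year interviews", date(2025, 1, 10), date(2025, 1, 31)),
    ("Final assessment", date(2025, 6, 1), date(2025, 6, 20)),
]

def check_window(name, start, end):
    """Flag any activity scheduled outside the project start/end dates."""
    ok = project_start <= start and end <= project_end
    return f"{name}: {'ok' if ok else 'OUTSIDE project dates'}"

for activity in activities:
    print(check_window(*activity))
```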

Example Timelines

Project STEP Timeline:

Designing Instruments

Evaluation Tools:

Online Evaluation Resource Library (OERL) online collection of evaluation instruments:

Lab 7: Analyzing and Interpreting Data

Analyses of both qualitative and quantitative data can yield a rich pool of information, but pulling it out of the raw data requires that you carefully follow a few basic steps.

Key Information:
Data Analysis Plan
After choosing the approach (qualitative or quantitative) for data analysis, use the program logic model and evaluation questions to develop a data analysis plan. Make sure your analysis plan will lead to answers to the evaluation questions.
Quantitative and Qualitative Approaches
Data analysis approaches differ depending on the type of evaluation or data collected—quantitative or qualitative.

Quantitative Data Analysis
  • Coding non-numerical survey data – assigning numbers to each non-numerical response option for each survey question
  • Creating a coding key to explain the coding of each question (1 = “yes”, 0 = “no”; 1 = “male”, 0 = “female”; or for grades, 1 = “F”, 2 = “D”, 3 = “C”, 4 = “B”, 5 = “A”)
  • Entering data (numbers) into a spreadsheet or database program (Excel, Access, SPSS, or SAS)
  • Checking data for errors (such as, by running frequency reports in SPSS or SAS)
  • Running data reports (frequencies, averages, etc)
  • Comparing the data (among participant groups or before and after data from a single participant)
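The quantitative steps above can be sketched in a few lines of Python. The survey responses and coding key below are made up for illustration:

```python
from collections import Counter

# Sketch of the quantitative steps: code non-numerical responses, then run
# a simple frequency report and average (all data here are hypothetical).
coding_key = {"yes": 1, "no": 0}
raw_responses = ["yes", "no", "yes", "yes", "no"]

coded = [coding_key[r] for r in raw_responses]   # coding non-numerical data
frequencies = Counter(coded)                     # frequency report
average = sum(coded) / len(coded)                # averages

print(frequencies)   # Counter({1: 3, 0: 2})
print(average)       # 0.6
```

In practice the same coding, frequency, and averaging steps would be done in a spreadsheet or statistics package such as Excel, SPSS, or SAS.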

Qualitative Data Analysis
  • Checking raw data
  • Coding data – labeling ideas by category or topic
  • Grouping data by code to identify patterns or themes in the data
  • Identifying themes to answer evaluation questions
  • Identifying amount of evidence required for a finding based on project's size and scope
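To make the qualitative steps concrete, here is a deliberately crude sketch of coding and grouping transcript text by keyword. The codes, keywords, and quotes are hypothetical, and real qualitative coding relies on careful human reading, not keyword matching:

```python
# Sketch: a crude keyword-based pass at coding transcript text, mirroring
# the steps above (codes, keywords, and quotes are hypothetical; genuine
# qualitative coding is done by careful human judgment).
codebook = {
    "engagement": ["participate", "involved", "attend"],
    "barriers": ["time", "transportation", "cost"],
}

transcript = [
    "I wanted to participate but transportation was a problem.",
    "The cost made it hard for families to attend.",
]

# Group data by code to identify patterns or themes.
coded = {code: [] for code in codebook}
for line in transcript:
    for code, keywords in codebook.items():
        if any(k in line.lower() for k in keywords):
            coded[code].append(line)

for code, lines in coded.items():
    print(f"{code}: {len(lines)} excerpt(s)")
```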
After Analysis
Bring all of the sources of data together and try to make sense of, or interpret, your findings. See what you have learned, whether it is what you expected, and how you can use the information to answer your evaluation questions.
Question 1

How much data needs to be analyzed? What time, resources, and expertise are available to accomplish the task?

Question 2

How do the quantitative data relate to the qualitative? For example, are changes in the classroom congruent with teachers’ comments on those changes?

Question 3

What trends or patterns show up in the quantitative data, e.g., student scores, scaled items on questionnaires, or observation data?

Question 4

What trends or patterns do you see in the qualitative data, e.g., interviews and focus groups?

Question 5

What do the data tell you relative to your evaluation questions?

Outcome: Use sample data sets to practice some review and interpretation tasks.

To complete the activity, follow these steps:

Quantitative Data:

Part 1

Review the sample coded survey protocol. Remember that coding a survey means assigning numbers to each non-numerical response option for each survey question.

Part 2

Print out the survey protocol. Using the sample coded survey protocol as a guide, code (assign numbers to) each non-numerical response option for each survey question on the survey protocol.

Part 3

Then review the coding of the survey that we did. How does yours compare? There’s no “right” way to numerically code survey protocol question responses, as long as each response for a single question has a different number. Also, it is helpful to have consistent numbering for the same responses across questions, such as “1” for all “yes” responses and “0” for all “no” responses. Since coding varies, we’ll use the coding from our survey protocol for the remainder of this exercise – download and print the coded survey protocol.

Part 4

Download the data entry spreadsheet Excel file. Add up the data for the responses on each item. Look for findings in the data—things that seem to stand out, e.g., particularly low or high results for any one item, differences between the gender or age of respondents, etc. 
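The tallying and subgroup comparison described in this part can be sketched as follows. The rows below are hypothetical and are not taken from the lab's spreadsheet:

```python
# Sketch: summing item responses and comparing subgroups, as in Part 4
# (the rows below are hypothetical, not from the lab's data file).
rows = [
    {"gender": "F", "item1": 1, "item2": 0},
    {"gender": "M", "item1": 1, "item2": 1},
    {"gender": "F", "item1": 0, "item2": 0},
    {"gender": "M", "item1": 1, "item2": 1},
]

# Add up the data for the responses on each item (1 = "yes", 0 = "no").
totals = {item: sum(r[item] for r in rows) for item in ("item1", "item2")}

# Compare item1 by gender to spot differences between respondent groups.
by_gender = {}
for r in rows:
    by_gender.setdefault(r["gender"], []).append(r["item1"])
means = {g: sum(v) / len(v) for g, v in by_gender.items()}

print(totals)   # {'item1': 3, 'item2': 2}
print(means)    # {'F': 0.5, 'M': 1.0}
```

A result that stands out, such as the gap between the two group means here, is the kind of finding the instructions ask you to note and discuss.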

Part 5

If you are working with a group, discuss your observations of the results.

Qualitative Data:

Part 1

Read through these three interview transcripts - pieces of raw qualitative data.

Part 2

Think about how you would code this data. Would you create codes according to category, setting details, types of situation observed, events, processes, etc.? Come up with a list of codes or labels for recurring ideas or topics.

Part 3

Gather highlighters of varying colors. Assign and mark a color for each code on the list.   

Part 4

Print the three interview transcripts. Using the different color highlighters, highlight text from the interview that applies to codes you created, making sure to highlight using the appropriate code color. 

Part 5

Using the code list as a guide, choose one color/code and read through all text highlighted in that color. What themes or recurring ideas do you notice? Discuss your observations with others.  

Data Analysis and Interpretation

Analyzing, interpreting, and reporting basic research results:

Qualitative Data Analysis

An example interview coding scheme:

Quantitative Data Analysis

Choosing the correct statistical test tutorial:

Information on basic statistics:

An example Excel workbook that shows how frequencies and means are calculated:

Lab 8: Reporting and Using Findings

All of the formative and summative data you collect can quickly mount up. What does it all tell you? How can you use it to judge your programs? How can you present it to your board, funders, community, and others who have a stake in the program?

Key Information:
Reporting Formats
A good evaluation report is clear and concise, provides adequate evidence for claims, and includes enough explanation to make sure readers understand the interpretation of the data.
It can be useful to develop several methods of presenting your findings, such as written reports, press releases, flyers or posters, and a meeting with stakeholders.
When reporting findings, keep program stakeholders in mind and focus on the information or findings that are important to each of these invested groups.
Develop an action plan based on your findings to implement program changes or improvements. Changes or improvements may focus on the content of the program, format, delivery, staffing, follow-up strategies, activities, setting, resources, etc.
When making decisions about “next steps,” remember to rank steps by importance and consider available resources to determine the feasibility of each possible step.
Question 1

Who are the stakeholders?  What types of information do you think these stakeholders need and want to know as a result of the study?

Question 2

What were the findings from the data?  What are some of the most important findings?  Are these findings equally important to all stakeholders?

Question 3

Do you think that the findings from the study are valid?  Why or why not?  (Remember: if different sources of data suggest the same or similar findings, it strengthens the case that the findings are valid.)

Question 4

What actions or decisions are suggested by the findings?  Think about the importance and feasibility of each of these possible actions or decisions.  Make sure to consider such factors as time and resource constraints.  Based on these factors, which actions or decisions would you recommend?  What are some next steps in implementing?

Question 5
Is more information needed to answer the evaluation questions or confirm the findings from the study?

Outcome: Review the findings, decisions, and recommendations reported in the Case Studies and assess whether the data support them.

To complete the activities, follow these steps:

Part 1

Review the Reporting/Taking Action sections of the three Case Studies on this site. For each of them, read carefully what the findings were.

Part 2

Review any decisions or recommendations that were made in the Case Study. Go back to the findings and determine what data they had to support the actions and whether you think it was sufficient.

Data Reporting

Reporting Methods:

Example Reports

Online Evaluation Resource Library (OERL) online collection of evaluation reports:

Sample ROCKMAN ET AL reports:
Quality Teaching for English Language Learners
Bill Nye the Science Guy
The Buddy System Project

Using the Findings

Using Evaluation Findings for Decision Making:

Using Evaluation Findings: