Case Study 1: Learning Spanish at a Distance
When Sharon Miller and Pam Thompson met for lunch one day, they didn’t know that their conversation would spark an idea for a computer-based program that would both meet a state mandate and become an exciting new approach to learning a foreign language. Pam asked Sharon if she knew about the new state regulation requiring every middle school to offer a foreign language. “Students aren’t required to take a course, but the school has to offer it,” she explained. “I don’t know how we’ll do it across the state,” she added with regret in her voice. “We don’t have enough foreign language teachers to staff the limited number of classes we have now.” Since Sharon was in charge of all online learning courses in the state, she began to think about how it might be done.
According to the U.S. Census Bureau, West Virginia had a population of 1,816,856 (2005 estimate), of which only 2.2 percent of those ages 5 and older could speak a language other than English. Nationally, about 17.6 percent of the population speaks a language other than English. In this regard, West Virginia ranked last in the nation.
What was the problem faced by the State Department of Education?

The West Virginia Board of Education issued Policy 2510, requiring that foreign language courses be offered in grades seven and eight. Beginning in fall 2002, all counties in West Virginia were required to offer two years of foreign language to students in grades seven and eight. Completion of Spanish 1A (7th grade) and Spanish 1B (8th grade) would give students one year of high school credit for Spanish I.

West Virginia faced a critical shortage of licensed foreign language teachers, especially Spanish teachers. The West Virginia Department of Education (WVDE) recognized that it had a serious challenge to confront in order to meet the mandate and provide foreign languages in schools throughout the state.


What could be done?

The West Virginia Virtual School (WVVS) was created by legislation on July 1, 2000, to offer high-quality educational courses via Internet technology to students regardless of school location and size. Staff from the Virtual School met with foreign language staff and discussed the possibility of creating a virtual program that would combine computer-based resources with lead teachers who spoke Spanish and would talk with the students on a regular basis. They all thought it was worth trying. The WVVS applied for and received a grant from the U.S. Department of Education to develop a middle school Spanish course delivered via the Internet.

How did they go about it?

The goal: design and deliver a hybrid, Web-based course—West Virginia Virtual School Spanish 1A and 1B—to meet the needs of schools and counties that could not provide foreign language courses for 7th and 8th graders in a face-to-face learning environment.

West Virginia partnered with the Florida Virtual School to develop and deliver Spanish courses. Twenty-seven middle schools in WV used the courses to meet West Virginia State Board Policy 2510.

How does the program work?

The middle school virtual foreign language program is a modified version of the one-year high school Spanish I. The Virtual School (VS) Spanish program includes a 3-member instructional team made up of a lead teacher, an adjunct teacher, and a classroom facilitator. To meet the developmental needs of middle school students, the high school program was expanded from one year to two years and enriched to include many in-class activities as well as online activities. The new blended-delivery program is a hybrid of a face-to-face classroom and an online course. Students use paper-based modules as well as online content modules for instruction and practice.

In the classroom:

  • Students engage in online activities via the VS Internet site and Wimba voice tools.
  • Students complete a variety of paper-based activities from their module notebooks.
  • Students submit work and quizzes electronically, and correspond via email with their virtual amigos.
  • Once a week, students interact with the lead teacher via telephone.

Outside the classroom:

  • Students complete assignments and communicate with facilitators, adjunct teachers, and lead teachers via email.

In order to get a clear picture of how the program worked and the factors that affected implementation, the researchers developed a logic model to portray these elements. This helped them to think about all the factors that would affect implementation and success and to think about the questions they wanted to answer by conducting an evaluation of the program.

For detailed information about developing a logic model, go to Evaluation 101: Prepare and to Lab 1: Logic Models.

How would they know if the project was successful?

When a Request for Proposals (RFP) issued by the U.S. Department of Education was brought to the attention of the West Virginia Department of Education, staff recognized an opportunity both to conduct a study of the Virtual School Spanish Program and to develop state capacity for carrying out evaluations. The RFP was, in fact, directed at technology-based programs and focused on the goal of building capacity. This case study describes the resulting project, funded by the U.S. Department of Education and awarded to the West Virginia Department of Education. Referred to as the Educational Development for Planning and Conducting Evaluations (ED PACE) study, it was designed to assess the implementation and impact of the program. Other collaborators in the research study were Rockman et al (San Francisco, CA) and The EdVenture Group (Morgantown, WV).

For more information on developing a plan, go to Evaluation 101: Design Plan.

What did WVDE and the researchers want/need to know?

As a first step, the researchers talked with WVDE staff about the potential outcomes of the program and what they hoped it would accomplish. Clearly, they wanted middle school students to learn Spanish and to be prepared to continue with 2nd- and 3rd-year Spanish in high school.

The researchers included these evaluation questions in their proposal. They were later refined to focus on many more elements of implementation.

  1. Does participation in a Distance Spanish Course (DSC) affect rural students’ achievement?
  2. How does the effect on students’ achievement of participation in a DSC compare to the effect of participation in a Spanish class with an on-site, face-to-face teacher (“face-to-face class”)?
  3. Does participation in a DSC affect rural students’ enrollment in more challenging high school courses? Does it affect their college enrollment?

For more information on formulating evaluation questions, go to Evaluation 101: Design Plan - Framing Questions and to Lab 2: Framing Questions.

What type of evaluation would they conduct?

In this case, the researchers knew that the major questions were about outcomes—whether students would learn Spanish using this distance learning model. That was the summative part of the evaluation. This would require one set of methods (discussed below). But they also thought about the importance of looking at implementation of the program—how it’s carried out—in order to understand (contextualize) the achievement results. In other words, they thought it would be important to know what factors of implementation seemed to be associated with whatever outcomes were attained in the end. This was identified as the formative part of the evaluation because it would give them information about those factors of implementation. With so much to keep in mind, they decided to use some particular evaluation tools to ensure that they were “covering all the bases” as they worked out the evaluation plan.

For information on types of evaluations, go to Evaluation 101: Formative/Summative and to Lab 4: Evaluation Types.

What tools did they use to develop a comprehensive plan and to organize data collection?

Since these researchers were conducting a major study of a very complicated program over a three-year period, they made decisions each year about the major constructs (big ideas, topics, or program components) they wanted to study and then framed questions for each of those constructs. For the study of this Virtual School Spanish Program (the name it was eventually given), the constructs were things like “the instructional team,” “program delivery,” and “classroom activities.” The first tool they developed was a chart to show the constructs and factors of implementation that would be examined under each of those constructs. This helped them to think through what they needed to look at as they considered implementation.

The researchers then used the logic model and chart of constructs to refine a comprehensive list of questions that they thought would be the ones that WVDE and other state educators would want to have answers to.

Armed with the list of questions and a thorough knowledge of the program, they could begin to think about the methods and sources of data they would need to collect.

The researchers also needed to think about protection of human subjects in research, so they reviewed guidelines and developed the necessary consent forms.

For information on the need to obtain informed consent and protecting human subjects, go to Lab 3: Consent and Human Subjects.

What data would they need to collect?

To conduct the summative part of the study, they knew they could obtain student achievement data from the test that was administered to all students in the state. They realized, though, that a true test of whether students learned Spanish would require a Spanish outcome measure. After a search for an existing instrument did not turn up an up-to-date, standardized test of Spanish I based on current standards, they decided to develop one. They also decided to use an instrument developed by the Center for Applied Linguistics to assess students’ oral proficiency in Spanish.

To evaluate implementation, the researchers reasoned that they would need to conduct some observations of the program in operation. They also decided they would need to ask people involved in implementing the program about the process of implementing it through a combination of surveys and interviews. They also wanted to ask both students and parents about their reactions to the program. This meant that their evaluation would involve the collection of both quantitative and qualitative data—it would be a mixed-methods study.

For information on evaluation methods, go to Evaluation 101: Select Methods and to Lab 5: Evaluation Methods.

Again, they saw a need to organize the information, and they developed a matrix that displayed constructs and research questions. They also indicated the sources for data on each of the questions.

How would they collect the data?

The instructional team for the Virtual School Spanish class includes three members: the facilitator in the classroom, the lead teacher who conducts a weekly telephone call in Spanish with each class, and the adjunct teacher who grades all student papers and tests and gives feedback to the students. The researchers wanted to collect information from all members of the team. There were only three lead teachers, so it made sense to conduct interviews rather than surveys. The facilitators and adjunct teachers were both fairly large groups, so the researchers decided to administer surveys to them, in an online format, since the respondents were used to using computers for communication. Students would also respond to surveys online as part of their daily program. For parents, the researchers knew they would have to use paper-and-pencil versions and send the surveys home with the students.

The observations the research team wanted to do presented a different kind of challenge. They knew they would need to develop an instrument that would let them look at a whole range of activities implemented by different people in different classrooms with different numbers of students, and they would need to be consistent about what they observed and how they compared what they saw from site to site. To refine the instrument, they conducted exploratory observations to see which kinds of activities seemed most important and what “mattered” or made a difference in the program. They categorized these constructs and determined that they would have to use a time-interval observation protocol to quantify what they observed.
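
To make the time-interval idea concrete, here is a minimal sketch (in Python) of how interval-by-interval observation codes can be tallied into frequencies that are comparable across sites. The behavior codes and records below are invented for illustration; the actual ED PACE protocol categories are not reproduced here.

```python
# A hypothetical illustration of time-interval observation coding.
from collections import Counter

# One record per observation interval: (interval_index, behavior_code)
observations = [
    (1, "facilitator_gives_directions"),
    (2, "students_hear_spanish"),
    (3, "students_hear_spanish"),
    (4, "facilitator_asks_question"),
    (5, "students_write_on_computer"),
]

counts = Counter(code for _, code in observations)
total_intervals = len(observations)

# Express tallies as the share of intervals in which each behavior occurred,
# so observations of different lengths can be compared across sites.
for code, n in counts.most_common():
    print(f"{code}: {n}/{total_intervals} intervals ({n / total_intervals:.0%})")
```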

For a detailed copy of the ED PACE observation protocol, click here.

For information on data collection procedures, go to Lab 6: Data Collection.

Each year of the study, the researchers developed a design that would give them additional information (in some cases deeper information) about the project. This was always discussed with the staff from WVDE, and then they developed and reviewed instruments. They also discussed which sites would receive which instruments—virtual sites, targeted sites, or face-to-face sites. Virtual sites were all those that participated in the Virtual Spanish Program. Targeted sites were ones that became case studies during the second year—when additional data were collected on those sites to provide a more in-depth look. Face-to-face sites were those that had a classroom teacher in a traditional setting and served as comparison sites. For the third year, those decisions were organized in a table.

How would they analyze the data?

The first task was to score the outcome measures—the tests of students’ ability to read and write as well as speak and understand spoken Spanish. For the multiple-choice written assessment, scoring was straightforward, but the oral assessment required examiners trained both to administer the test and to score students’ performance in Spanish. Assessing students’ writing ability required the development of a rubric that identified the critical elements to be examined and scored.
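
As an illustration of rubric-based scoring, the sketch below shows one way critical elements might be enumerated and summed. The criteria, point maxima, and sample ratings are hypothetical; they are not the rubric the researchers actually developed.

```python
# A hypothetical sketch of rubric-based scoring for a writing assessment.
RUBRIC_MAX_POINTS = {
    "vocabulary_use": 4,
    "grammar_accuracy": 4,
    "task_completion": 4,
    "comprehensibility": 4,
}

def score_response(ratings: dict) -> int:
    """Sum the rater-assigned points, capping each criterion at its maximum."""
    return sum(min(ratings.get(criterion, 0), max_points)
               for criterion, max_points in RUBRIC_MAX_POINTS.items())

sample_ratings = {"vocabulary_use": 3, "grammar_accuracy": 2,
                  "task_completion": 4, "comprehensibility": 3}
print(score_response(sample_ratings), "out of", sum(RUBRIC_MAX_POINTS.values()))
```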

The qualitative data for the study included interviews and open-ended items on the surveys. These long narrative files were coded for themes and patterns that emerged, and the results were integrated with other information gained from observations and surveys.

Quantitative data included forced-choice items on the surveys and data from the observation protocol—reflecting frequencies of occurrences of behaviors by instructional team members and by students.

Data analysis of the implementation findings fell into four categories:

  1. Descriptive analyses of all items on the student, parent, facilitator, and adjunct surveys, and of observation indicators and other open-ended, qualitative data, to identify preliminary findings and themes.
  2. Psychometric analyses of the different measures associated with the various constructs (listed in this chart), to determine their reliability and validity as measures of those constructs.
  3. Examination of the variation in student academic and non-academic outcomes, and in implementation outcomes and processes, to identify factors that might differentiate sites, or students within sites, from one another.
  4. Use of statistical models to assess the strength of relationships among implementation processes and outcomes, intermediate academic and non-academic outcomes, and learning outcomes.
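
To make the psychometric analyses in category 2 concrete, here is a minimal sketch of one common reliability estimate, Cronbach’s alpha, computed for a hypothetical survey scale. The respondents and the 4-item “engagement” scale are invented; this illustrates the kind of analysis described, not the study’s actual computation.

```python
# Estimating a scale's internal-consistency reliability with Cronbach's alpha.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array with one row per respondent, one column per item."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

responses = np.array([  # rows: respondents; columns: items scored 1-5
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```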

The linkage analyses, those designed to examine relationships between factors of implementation and outcomes, revealed a great deal about which activities and other aspects of implementation affected student outcomes. Researchers used their “hunches” about these relationships, gathered during the first year of the study, to determine which linkage analyses to conduct.
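
As an illustration, a linkage analysis at its simplest might regress an outcome on an implementation factor. The sketch below uses invented data and hypothetical variable names (“minutes of Spanish heard,” an oral-proficiency score); the study’s actual statistical models were more elaborate.

```python
# A minimal sketch of a single linkage analysis: a simple linear regression
# relating one implementation factor to one outcome (all data invented).
from scipy import stats

minutes_of_spanish_heard = [5, 12, 8, 20, 15, 25, 10, 18]   # per class session
oral_proficiency_scores = [52, 61, 55, 74, 66, 80, 58, 70]  # hypothetical test

result = stats.linregress(minutes_of_spanish_heard, oral_proficiency_scores)
print(f"slope = {result.slope:.2f} points per additional minute of Spanish")
print(f"r = {result.rvalue:.2f}, p = {result.pvalue:.4f}")
```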

For more information about data analysis, go to Evaluation 101: Analyze Data and to Lab 7: Data Analysis.

What did they find out?

With all of the data they had available once the study was completed, the researchers were able to report about many findings. These findings were related to the original logic model they developed and to key constructs they wanted to examine in depth. Since the evaluation design evolved over the course of the study, the findings were a bit different each year. Those discussed here were summarized from three years of the study.

  1. Data collected over 3 years showed that students in the Virtual Spanish classes learned Spanish.
    • Students in virtual classes performed as well as those in face-to-face classes on the Spanish Assessment.
    • Students in the virtual Spanish class maintained a relatively high level of achievement over 3 years of assessment. For specific results on Spanish outcome performance by students, click here.
  2. Students who participated in the Virtual Spanish program had positive attitudes toward Spanish and toward learning, developed strong work habits, and felt more prepared for high school and beyond.
    • More than 75% of the students surveyed said they learned a lot in Virtual Spanish.
    • Approximately 90% said they:
      • liked learning a foreign language
      • thought learning a foreign language is important
      • wanted to continue Spanish in high school
    • Students thought speaking Spanish would:
      • prepare them for more advanced classes in high school
      • help them in college
      • help them function in a workplace where others speak Spanish
      • help them live in a more diverse world
  3. High school Spanish II teachers said that students who took the middle-school Spanish I Virtual course did well in Spanish II, often outperforming students who took Spanish I in a high school face-to-face class. Teachers said that Virtual Spanish students excelled in:
    • Language proficiency.
    • Attitudes toward class and work habits.
    • Technology skills.

Who did they need to report to? What decisions could be made?

The researchers found that the program worked best when…

The 3-year study allowed researchers to visit all the sites; collect observation, interview, and assessment data; and examine the key elements of the Virtual Spanish model—facilitation by a three-member instructional team, technology, feedback and communication, use of Spanish, and site support. Analysis of the data helped the researchers identify factors that characterized effective implementations and that were statistically associated with students’ achievement and engagement. They found that the program worked best under the conditions discussed below.

  1. Students in Virtual Spanish learned more Spanish and were more engaged when …
    • there was more interaction with the instructional team and when team members made connections to other subjects and provided scaffolding.
    • the facilitator was actively involved in the learning process and guided students smoothly through the daily lessons by maintaining flow, giving directions, reviewing activities, and asking questions.
  2. In classes where students heard more Spanish, from either the facilitator or the lead teacher, they:
    • performed better on the test of oral proficiency,
    • were more engaged,
    • valued foreign language more, and
    • wanted to continue Spanish in high school.
  3. In classes where students received more feedback on their learning and had high-quality, frequent communication with the instructional team, students tended to…
    • learn more Spanish and be more engaged,
    • value learning a foreign language, and
    • want to continue Spanish II in high school.
  4. In classes where technology worked well, and students had access to the necessary tools (e.g., enough headsets and microphones)…
    • students learned more Spanish,
    • students were more engaged,
    • facilitators provided more instructional support and feedback to students, and
    • communication and interaction between facilitator and students were more frequent and of higher quality.
  5. In classes where students wrote more on the computer—filling in blanks, writing words or phrases, or composing open-ended responses—they had higher Spanish achievement and oral proficiency.
  6. Listening to Spanish via technology (CDs, Wimba tools) was positively related to Spanish achievement; listening to others—facilitators, lead teachers (via telephone), and peers—was associated with higher writing and oral proficiency.
  7. When there was a high level of school support—support from administrators, support from other teachers, an appropriate time in the school schedule, an appropriate class space—students tended to…
    • be more engaged and learn more Spanish,
    • value learning a foreign language, and
    • want to continue Spanish II in high school.

The researchers were able to make specific recommendations based on the findings…

Based on a detailed analysis of the extensive data collected over the three years of the study, including observations, questionnaires, and assessments, the researchers saw opportunities for the West Virginia Department of Education to:

Enhance the existing program:

  • The program should take full advantage of the existing technology to provide more opportunities for students to hear Spanish, and explore other new technologies that expose them to as much Spanish as possible. This is especially useful, given students’ preference for technology-based activities.
  • Students would appreciate and benefit from more site visits from adjuncts, giving them a more frequent opportunity to speak to and interact face-to-face with their Spanish teacher.
  • Facilitators in the virtual classrooms, especially new facilitators, could benefit from training and support that includes examples of best practices and effective classroom scaffolding. Facilitators should be encouraged to learn along with students and use Spanish as much as they are able to.
Extend the program to reach more students:

  • The program has thus far attracted higher-achieving students, as do most middle-school foreign-language programs. Administrators and facilitators have also found that students who work well independently and take responsibility for schoolwork are more likely to succeed. To extend the benefits of the program to a broader population, program leaders could consider offering schools guidance in identifying, attracting, preparing, and supporting students who fall into only one or neither of those categories.

Replicate the existing program to provide other languages to middle school students.

  • The success of the Spanish program suggests that it could easily be replicated with different languages.

Replicate the existing program to provide a Virtual Spanish II course.

  • Given the success of the current program in teaching students Spanish and instilling an interest in foreign language and other cultures, it is vitally important that the state make every effort to allow students to continue. Because of the shortage of foreign language teachers, some Virtual Spanish students may not be able to continue or may not pick up Spanish II until their sophomore or junior year. By offering the next level via a program patterned after this one, the state could ensure that students continue to develop important skills in language, technology, and life.