Case Study 3: Providing Web-based Support for Teachers
White Plains County School District

Eric Johnson leaned back in his chair and tossed his pen onto his desk. He sighed as he watched the pen roll across the list of new teachers coming into the White Plains School District again this year. As superintendent of the White Plains District, Johnson had watched teachers come and go in previous years and could tell it was just the start of another rough year. He and his staff could barely keep up with all the novice teachers they were putting into the classrooms. His students, in particular, were not faring well with the slew of teachers cycling through the schools year after year. Something had to be done.

What was the problem faced by the school district?

In 2003, the White Plains School District:

  • was the sixth-largest school district in the state
  • served over 15,000 students in 32 locations
  • served a student population that was extremely challenging to educate because of its social and economic circumstances
  • was among the poorest districts in the nation, with two-thirds of its enrolled students eligible for free or reduced-price lunch
  • had low student achievement, high dropout and low graduation rates, and high turnover among its new teachers
  • had almost 10% of students in grade 5 or higher considered at risk by West Virginia State Standards
  • reported a graduation rate of just 58% for the class of 2003
With 10% of teachers retiring each year and an additional 7% leaving for other reasons, White Plains was not building a very stable teaching force. In 1999-2000, White Plains hired about 100 new teachers. Within their first year of teaching, almost 20% of those new hires left the district. An additional 17% of hires left the district after working just two to five years. In order to fill the vacancies, White Plains was hiring at increasing rates each year. In 2001-2002, almost 150 new teachers were brought in to replace those who had left and to fill newly created positions within the district. Notably, the highest teacher attrition was at the schools with the lowest academic achievement. It was clear to Johnson that student achievement in White Plains could not improve unless the district did something to retain its teachers.
Since the early 1990s, White Plains had sustained a successful face-to-face teacher mentorship program for new teachers. With the influx of new hires into the district each year, it became increasingly difficult for the limited group of mentors to provide the breadth of support needed by teachers with new certifications, new roles, and new challenges. In 2001, the district began looking at other ways of providing economical, efficient, and scalable support for its new hires.

What could be done?

Johnson consulted with the Director of the Department of Technology, who had been tracking the attrition rates. They agreed that something had to be done about the district’s poor academic achievement and teacher attrition, but acknowledged that such a change could only happen if it were a district-wide initiative. Together, they identified opportunities for collaboration and support in changing the way White Plains worked. Since the Department of Technology had been handling much of the district’s teacher training needs for several years, the group looked to technology as a way of providing future support for the new teachers. They decided to develop a web-based portal that would support teachers by giving them access to information, mentors, and their peers. (For a general description of a portal, see Exhibit 1.)

How would they go about it?

The Team

White Plains embarked on a strategic initiative to distribute resources and provide new teacher support via this newly conceived Web-based portal. A portal project team was created to shoulder the day-to-day responsibility for the portal’s development and to outline how teachers might use its features.

The vision for the portal was to enable the development of communities of practice and learning by providing

    1. convenient and comprehensive access to district forms and documents,
    2. a vehicle for teachers to share in the building of professional knowledge, and
    3. opportunities for teachers to build collegial supportive networks and relationships.
   
The rationale was premised on the idea that increased teacher retention and professional development would ultimately lead to increased student achievement. White Plains assumed that the high attrition rate was largely due to teachers feeling isolated and not supported as a community. By taking the community building approach, White Plains believed that teachers would come together, communicate, learn, and support one another. It was assumed that if teachers felt connected with each other and to the district and were actively engaged in knowledge building, they could in turn provide higher quality teaching to their students. The portal would offer new teachers a central location to find district news, documents, and guidelines. Teachers would have round-the-clock peer and mentor support via cadre chat rooms, one-on-one chats, and discussion forums.

How would the project work?

The project team had many discussions and meetings with vendors to determine how the portal would work. To clarify what they were planning and what outcomes they foresaw, they developed a logic model laying out the project’s inputs and activities alongside its intended outcomes.

For detailed information about developing a logic model, go to Evaluation 101: Prepare and to Lab 1: Logic Models.

For examples of logic models developed to portray the project, see example 1 and example 2.
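As a rough illustration only, the sketch below shows what a logic model for the portal project might contain, expressed as a simple Python data structure. The specific inputs, activities, outputs, and outcomes listed are assumptions drawn loosely from the case narrative, not the team’s actual model.

```python
# A minimal sketch of a logic model for the Teacher Support Portal,
# expressed as a plain dictionary. The entries are illustrative
# assumptions based on the case narrative, not the district's model.
logic_model = {
    "inputs": [
        "Department of Technology staff time",
        "Existing face-to-face mentorship program",
        "Portal software and hosting",
    ],
    "activities": [
        "Develop the web-based portal (documents, chat, discussion forums)",
        "Train pilot teachers on portal features",
        "Facilitate mentor and peer discussion groups",
    ],
    "outputs": [
        "Number of teachers trained",
        "Number of portal logins and document downloads",
        "Number of active discussion threads",
    ],
    "short_term_outcomes": [
        "Teachers report feeling connected and supported",
        "Teachers locate district forms and guidelines quickly",
    ],
    "long_term_outcomes": [
        "Improved new-teacher retention",
        "Improved student achievement",
    ],
}

if __name__ == "__main__":
    # Print the model as a simple outline.
    for component, items in logic_model.items():
        print(component.replace("_", " ").title())
        for item in items:
            print(f"  - {item}")
```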

By December 2001, many ideas for addressing teacher attrition had been put forward. White Plains had settled on a vision and set the steps in motion with the development of the Teacher Support Portal. If all went well, the portal would give new teachers what they needed to be successful and to remain in the district.
   

How would they know if the project was successful?

Once the portal was up and running, the project team wanted to collect information about how teachers were using it, what they identified as benefits, and how it could be improved, so that they could make changes before it was put into full-scale use.

For information on types of evaluation, go to Evaluation 101: Formative/Summative and to Lab 4: Evaluation Types.


What did they want/need to know?

In one team meeting, the project team divided into two groups. One group listed questions they wanted to answer as a pilot group of teachers started using the site. The other group identified questions about what they thought would indicate that the portal was successful.

Group 1 identified the following questions:

  1. What are teachers’ initial reactions as they begin using the portal?
  2. What information do they report getting from the portal?
  3. How often do teachers return to the portal?
  4. Which features of the portal are most useful to teachers?
  5. What barriers do teachers identify to using the portal?

Group 2 listed these questions:

  1. How many teachers use the portal on a regular basis?
  2. What outcomes do teachers report as benefits of using the portal?
  3. What additional features have teachers requested to be part of the portal?

For more information on formulating evaluation questions, go to Evaluation 101: Design Plan - Framing Questions and to Lab 2: Framing Questions.

For information on the need to obtain informed consent and protecting human subjects, go to Lab 3: Consent and Human Subjects.

What data would they need to collect?

For this first version of the portal, the team wanted to collect information on how the initial group of pilot teachers would use the portal. This data would be used to inform further development of the portal as it was released to additional users across the district. Data collection techniques included web metrics, observations, and surveys.

How would they collect the data?

Since this site was a portal, with a unique ID for each user, White Plains could track teachers as they logged on to the portal for the first time, see who was logging on repeatedly, see what areas teachers were visiting, and track teachers’ participation in online discussion boards, chat groups, and email communications.
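To make that kind of tracking concrete, here is a minimal sketch of how login and page-view records could be tallied into simple web metrics. The log format, field names, and values are hypothetical; the case does not describe the portal’s actual reporting tools.

```python
from collections import Counter

# Hypothetical access log: one record per page view, with the login date.
# Field names and values are illustrative assumptions, not real portal data.
access_log = [
    {"user": "t001", "date": "2002-03-04", "area": "district_forms"},
    {"user": "t001", "date": "2002-03-11", "area": "discussion_forum"},
    {"user": "t002", "date": "2002-03-04", "area": "chat"},
    {"user": "t003", "date": "2002-03-05", "area": "discussion_forum"},
    {"user": "t003", "date": "2002-03-12", "area": "district_forms"},
]

# Distinct login days per teacher: more than one counts as a repeat visitor.
login_days = {}
for record in access_log:
    login_days.setdefault(record["user"], set()).add(record["date"])

repeat_users = sorted(u for u, days in login_days.items() if len(days) > 1)
area_visits = Counter(record["area"] for record in access_log)

print("Teachers who logged on:", len(login_days))
print("Repeat visitors:", repeat_users)
print("Visits by portal area:", area_visits.most_common())
```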

Content analysis rubrics were developed to analyze online chats and discussions to determine which groups had sustained and meaningful communications.
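A content analysis rubric is typically applied by human raters; the sketch below simply shows how rater-assigned scores might be tallied by group to flag which cadres had sustained, meaningful exchanges. The rubric dimensions, group names, and scores are illustrative assumptions, not the team’s actual rubric.

```python
from statistics import mean

# Hypothetical rubric scores (0-3) assigned by raters to sampled discussion
# threads, keyed by group. Dimensions and values are illustrative only.
rubric_scores = {
    "new_teacher_cafe": [
        {"sustained": 3, "meaningful": 2},
        {"sustained": 2, "meaningful": 3},
    ],
    "grade5_cadre": [
        {"sustained": 1, "meaningful": 1},
        {"sustained": 0, "meaningful": 1},
    ],
}

for group, threads in rubric_scores.items():
    averages = {dim: mean(t[dim] for t in threads)
                for dim in ("sustained", "meaningful")}
    print(group, averages)
```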

Training sessions were also scheduled for pilot teachers to introduce them to the portal and the different features it offered. At each training session, teachers were asked to fill out a pre- and post-training questionnaire that collected background information and gauged interest in and appeal of the portal.

For more information on evaluation methods and data collection, go to Evaluation 101: Select Methods, and to Lab 5: Evaluation Methods and Lab 6: Data Collection.

How would they analyze the data?

The project team used both quantitative and qualitative techniques to analyze and interpret the data. Web metrics and some of the survey data were analyzed quantitatively, and observations and remaining survey items were analyzed qualitatively. Teacher trainings continued through the summer of 2002 and, at the end of the summer, the project team got together and went through all the data collected since the launch of the portal.
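As an example of the quantitative side of that work, the following sketch computes average pre- and post-training interest ratings and the mean change per teacher. The 1-to-5 scale and the values shown are assumptions for illustration, not the district’s survey data.

```python
from statistics import mean

# Hypothetical pre- and post-training interest ratings on a 1-5 scale,
# keyed by teacher ID. Values are illustrative, not the team's data.
pre_ratings = {"t001": 2, "t002": 3, "t003": 4, "t004": 2}
post_ratings = {"t001": 4, "t002": 4, "t003": 5, "t004": 3}

changes = [post_ratings[t] - pre_ratings[t] for t in pre_ratings]

print("Mean pre-training interest:", mean(pre_ratings.values()))
print("Mean post-training interest:", mean(post_ratings.values()))
print("Mean change per teacher:", mean(changes))
```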

The team found that, while teachers who visited the portal thought that it was a useful tool to link them to the school and to provide venues for community development, many pilot teachers were not aware of what the portal was or how to access it.

They also found that most teachers who used the portal would log on initially during a training session, but not log on again to investigate its offerings further.

Of the visitors who were repeat users, many complained that they could not find things they were looking for (e.g., district forms, student rosters, etc.) and had to contact their school office or the district to locate hard copies of the materials.

In terms of community development, the project team found that some chat groups and discussion threads were more active than others. The team looked further into teacher demographics and assignments and found that teachers who were assigned to participate in specific chat and discussion groups were less active than those who chose the cadre they wanted to participate in. Those who voluntarily joined the groups had more frequent and more meaningful conversations. Teachers who actively participated in online discussions reported feeling more connected to their peers and to their community.
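The comparison between assigned and self-selected groups could be summarized along the lines of the sketch below, which averages posts per member by how teachers joined their groups. The group names and post counts are hypothetical.

```python
from statistics import mean

# Hypothetical discussion-group activity: posts per member over the pilot
# period, grouped by how teachers joined. All values are illustrative.
group_activity = {
    "assigned": {"grade5_cadre": [1, 0, 2, 1], "science_cadre": [0, 1, 1]},
    "voluntary": {"new_teacher_cafe": [6, 4, 9, 5], "classroom_mgmt": [7, 3, 8]},
}

for join_type, groups in group_activity.items():
    posts = [p for members in groups.values() for p in members]
    print(f"{join_type}: mean posts per member = {mean(posts):.1f}")
```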

For more information about data analysis, go to Evaluation 101: Analyze Data and to Lab 7: Data Analysis.

What did they find out? What decisions could be made?

Since the portal was still in development, the project team took the data they collected and began making revisions just before the new school year started.

The first action the team took was to do away with assigned discussion groups. They allowed teachers to join existing groups or create new ones as they saw a need. Training sessions to introduce the portal continued as the new school year started. The portal team sought buy-in from the district by providing data on the usefulness of the portal. The district then distributed announcements about the new portal and advertised upcoming training sessions. A content management system was set up so that administrative staff could easily add resources and documents to the portal for teachers to download.


With each training session, the portal team continued to administer pre- and post-training surveys. They also continued collecting Web metrics and added focus groups and teacher interviews to gather more feedback for the next portal iteration. This feedback continues to inform the development of future iterations of the portal, helping to keep it relevant and useful to new teachers in the district.

For more information on making decisions, go to Evaluation 101: Take Action and Lab 8: Taking Action.