New tablet technology for digital data collection is streamlining classroom observation research at the Center for Advanced Study of Teaching and Learning (CASTL) by replacing paper-and-pencil score sheets.
Twenty-five observers working with the Preschool Relationships Enhancement Project are piloting the new technology using both the Individualized Classroom Assessment Scoring System™ (inCLASS) and the Classroom Assessment Scoring System™ (CLASS). The project, led by Amanda Williford, examines the effectiveness of a strategy for reducing preschoolers' problem behavior by strengthening the teacher-child relationship through brief, periodic one-on-one play sessions. Observers use both the inCLASS and the CLASS to assess children and teacher-child interactions before and after the intervention, visiting classrooms in the areas around Greensboro, North Carolina, and Hampton Roads, Virginia. Development of the tablet technology was funded through this project by the National Center for Special Education Research.
Both the inCLASS and the CLASS are instruments developed by CASTL researchers. The instruments require highly trained observers to view classroom activity and score specific behaviors from low to high on a 7-point Likert scale. With the inCLASS, observers characterize preschool children’s patterns of interactions with teachers, other children, and learning activities. With the CLASS, they characterize the quality of a teacher’s interactions with children.
Until now, those scores and all information related to an observation were recorded on paper score sheets, said Jennifer LoCasale-Crouch, who co-leads the project and headed the electronic data collection effort. Thousands of classroom observations have been recorded this way in previous research projects at locations across the U.S. Observers have had to mail the paper copies back to the CASTL offices in Charlottesville, where each is checked twice for completeness and accuracy and then entered manually into a database for analysis.
“There was lots of room for human error,” LoCasale-Crouch said, “and there is a long stretch of time from observation to availability of data, which delays our ability to answer the questions in our studies.” In previous projects the data have taken as long as one or two years to be ready for analysis, she said. The electronic version of the score sheets has changed the equation dramatically. It automatically fills in certain information, such as beginning and end times of observations. Required-entry fields ensure that data entry is complete and conforms to protocol.
The touchscreen interface allows the observer to focus more attention on watching the research subjects than on writing. Afterwards, the data are uploaded instantly to a centralized location, where they can be reviewed within minutes.
“This capability is something we have wanted for a long time, but the technology to do it in a cost-effective, scalable way has just recently become available,” LoCasale-Crouch said.
Since October, more than 2,700 observations have been made, with CLASS and inCLASS scores entered digitally and ready for immediate analysis. "We're finding that the technology also allows observers to make more observations per day because of the way it streamlines data collection," Williford said.
The Oracle-based, dashboard-style interface allows project leaders to see at a glance which sites their observers are visiting and when, as well as the results of their assessments. The research protocol requires double-coding of 20 percent of observations, meaning two observers visit the same classroom at the same time so the reliability of their observations can be checked. LoCasale-Crouch said this is another area where the technology has made a difference.
“If there is a significant difference in the scores of the two observers, we can provide support right away to correct misunderstandings about how to code child and teacher behavior, which increases precision in subsequent observations,” she said. “We can also see exactly how long it took to code each observation and provide helpful feedback if it appears the observer exhibited some uncertainty.”
The potential future use of this technology extends beyond research, according to Williford. “If we find this technology works well, we can apply it to our professional development efforts, too,” she said. Instead of CASTL consultants waiting days or weeks to receive observation data in the mail, they could have access to that data within minutes of an observation, she said. “Classrooms are very dynamic places, and the capability for our trained consultants to provide nearly immediate feedback to teachers will be much more effective in helping them improve their teaching.”
by Lynn Bell