Monday, July 12, 2010

Automated Essay Scoring – WriteToLearn Software

Automated Essay Scoring is defined as the use of computer technology to evaluate and score written prose (Shermis & Barrera, 2002). As previously noted, this type of technology first became available in 1966 with a system called Project Essay Grader, or PEG, designed by Ellis Page at the request of the College Board, which was looking for a more efficient and effective way to score the vast number of essays it deals with annually (Dikli, 2006). Since then, several other researchers and companies have developed their own technologies and methods for scoring essays automatically. One such company is Pearson Knowledge Technologies.
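
To make the idea concrete, the sketch below shows roughly how a PEG-style scorer might work: published accounts describe PEG as estimating essay quality from measurable surface features (essay length, word length, punctuation use, and the like) combined by a regression fit to human grades. The particular features, weights, and sample text here are purely illustrative assumptions, not Page's actual model.

import re

def surface_features(essay: str) -> list[float]:
    """Extract a few simple surface features from an essay."""
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    num_words = len(words)
    avg_word_len = sum(len(w) for w in words) / num_words if num_words else 0.0
    avg_sentence_len = num_words / len(sentences) if sentences else 0.0
    num_commas = essay.count(",")
    return [num_words, avg_word_len, avg_sentence_len, num_commas]

def peg_style_score(essay: str, weights: list[float], intercept: float) -> float:
    """Combine the features linearly; real weights would be fit to human scores."""
    return intercept + sum(w * f for w, f in zip(weights, surface_features(essay)))

# Toy weights chosen only for demonstration.
weights = [0.01, 0.5, 0.05, 0.1]
sample = ("Automated scoring began with Page's work in the 1960s. "
          "His system estimated quality from surface measures of the text.")
print(round(peg_style_score(sample, weights, intercept=1.0), 2))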

Pearson Knowledge Technologies markets its Intelligent Essay Assessor (IEA) system as part of its WriteToLearn software. WriteToLearn is a web-based tool that combines IEA, which focuses on students responding to essay prompts, with another tool called Summary Street, which has students read short passages and summarize the information they have read. The goal of these tools is to provide students with immediate feedback on their writing activities. Both components feature tutorial feedback and opportunities for revision (Landauer, Lochbaum, & Dooley, 2009).

According to a case study performed by researchers from the University of Colorado, the use of Summary Street had a statistically significant positive effect on all students using the program. Students were given four expository texts over the course of the four-week trial; half of the class was given access to Summary Street and the remainder served as a control group. The authors point out that since the prompts increased in difficulty, students showing no change in their raw scores should be considered to have improved. A summary of the results shows a clear pattern of improvement for students using Summary Street in all but one category, mechanics; it should be noted that students in the control group were allowed to use a word processor that included a spelling and grammar checker. The authors go on to point out that although students with higher skill levels, based on the previous year's standardized test, had higher overall raw scores, as would be expected, students at the lower and medium ability levels made the greatest gains. The researchers concluded their case study with student interviews in which students were asked to name the components of a good summary; those who had used Summary Street were reported as giving the more detailed and substantive answers (Franzke, Kintsch, Caccamise, Johnson, & Dooley, 2005).
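
As a rough illustration of the kind of feedback Summary Street provides, the sketch below checks whether a student summary covers each section of a source text and flags sections it seems to miss. The published system uses latent semantic analysis for this comparison; the plain bag-of-words cosine similarity, the stop-word list, the threshold, and the sample texts below are simplified stand-ins of my own, not the actual implementation.

import re
from collections import Counter
from math import sqrt

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in"}

def bag(text: str) -> Counter:
    """Bag-of-words vector, lowercased, with a few stop words removed."""
    return Counter(w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS)

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def coverage_feedback(summary: str, sections: dict[str, str], threshold: float = 0.2) -> list[str]:
    """Flag source sections that the summary does not appear to cover."""
    summary_bag = bag(summary)
    return [f"Your summary says little about the section '{title}'."
            for title, text in sections.items()
            if cosine(summary_bag, bag(text)) < threshold]

# Invented source sections and student summary, for illustration only.
sections = {
    "Causes of erosion": "wind and water wear away soil and rock over time",
    "Preventing erosion": "planting cover crops and building terraces protects the soil",
}
print(coverage_feedback("Erosion happens when wind and water wear away the soil.", sections))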

References

Dikli, S. (2006). An overview of automated scoring of essays. Journal of Technology, Learning and Assessment, 5 (1).

Franzke, M., Kintsch, E., Caccamise, D., Johnson, N., & Dooley, S. (2005). Summary Street®: Computer support for comprehension and writing. Journal of Educational Computing Research, 53-80.

Landauer, T. K., Lochbaum, K. E., & Dooley, S. (2009). A new formative assessment technology for reading and writing. Theory Into Practice, 44-52.

Shermis, M. D., & Barrera, F. D. (2002). Automated essay scoring for electronic portfolios. Assessment Update, 14 (4), 1-5.
