³Ô¹ÏÍøWrite is Measurement Incorporated's automated writing evaluation (AWE) program. AWE programs support the teaching and learning of writing by providing automated scores and feedback on students' writing. By easing the burden of providing feedback, ³Ô¹ÏÍøWrite allows teachers to assign more writing and to focus their feedback efforts where they are needed most. In turn, ³Ô¹ÏÍøWrite affords students the increased writing practice they need to improve writing quality. Moreover, ³Ô¹ÏÍøWrite's automated writing quality scores provide timely and reliable assessment data that can be used to examine changes in performance over time, and its automated feedback helps students improve their knowledge of writing quality criteria. ³Ô¹ÏÍøWrite is distinguished by the following features:
- Appropriate for grades 3-12
- Immediate scores and feedback aligned with Education Northwest's 6+1 Trait Writing Model
- Pre-packaged writing prompts—many including stimulus material—for a range of content areas
- Capabilities for teachers to create and assign custom prompts
- A library of pre-writing tools to support writing planning
- Peer review tools
- Interactive student lessons
- Usage and performance reports (for students and teachers)
- Integrated teacher feedback and communication tools
- Tools to support differentiation (prompt recommendations, grade level scoring options, and personalized feedback)
- Accessibility resources such as adaptable font size, background color, and highlighting
- Rostering and class management tools
Most importantly, ³Ô¹ÏÍøWrite is supported by an extensive research base. Researchers have examined (1) the efficacy of automated scoring and feedback in improving writing outcomes, (2) the accuracy of automated scoring as a screener for at-risk writers, (3) the effects of AWE in naturalistic implementation contexts, and (4) best practices in AWE implementation to improve writing instruction. Select peer-reviewed publications are listed below.
Efficacy of AWE in improving writing outcomes
This research uses experimental and quasi-experimental designs to evaluate the efficacy of ³Ô¹ÏÍøWrite in improving writing outcomes.
Delgado, A., Wilson, J., Palermo, C., Cruz Cordero, T., Myers, M., Eacker, H., Potter, A., Coles, J., & Zhang, S. (2024). Relationships between middle-school teachers' perceptions and application of automated writing evaluation and student performance. In M. Shermis & J. Wilson (Eds.), The Routledge International Handbook of Automated Essay Evaluation (pp. 261-277). New York, NY: Routledge.
Cruz Cordero, T., Wilson, J., Myers, M., Palermo, C., Eacker, H., Potter, A., & Coles, J. (2023). Writing motivation and ability profiles and transition after a technology-based writing intervention. Frontiers in Psychology—Educational Psychology, 14.
Palermo, C., & Thomson, M. M. (2018). Teacher implementation of self-regulated strategy development with an automated writing evaluation system: Effects on the argumentative writing performance of middle school students. Contemporary Educational Psychology, 54, 255-270.
Wilson, J., & Czik, A. (2016). Automated essay evaluation software in English language arts classrooms: Effects on teacher feedback, student motivation, and writing quality. Computers and Education, 100, 94-109.
Wilson, J., Palermo, C., & Wibowo, A. (2024). Elementary English learners' engagement with automated feedback. Learning and Instruction, 91.
Wilson, J., & Roscoe, R. D. (2020). Automated writing evaluation and feedback: Multiple metrics of efficacy. Journal of Educational Computing Research, 58(1), 87-125.
Wilson, J., Zhang, F., Palermo, C., Cruz Cordero, T., Myers, M., Eacker, H., Potter, A., & Coles, J. (2024). Predictors of middle school students' perceptions of automated writing evaluation. Computers & Education, 211.
Writing screening with automated scoring
This research examines the viability of ³Ô¹ÏÍøWrite as a screener for at-risk writers.
Chen, D., Hebert, M., & Wilson, J. (2022). Examining human and automated ratings of elementary students' writing quality: A multivariate generalizability theory application. American Educational Research Journal.
Wilson, J. (2018). Universal screening with automated essay scoring: Evaluating classification accuracy in Grades 3 and 4. Journal of School Psychology, 68, 19-37.
Wilson, J., Chen, D., Sandbank, M. P., & Hebert, M. (2019). Generalizability of automated scores of writing quality in grades 3-5. Journal of Educational Psychology, 111, 619-640.
Wilson, J., Olinghouse, N. G., McCoach, D. B., Andrada, G. N., & Santangelo, T. (2016). Comparing the accuracy of different scoring methods for identifying sixth graders at risk of failing a state writing assessment. Assessing Writing, 27, 11-23.
Wilson, J., & Rodrigues, J. (2020). Classification accuracy and efficiency of writing screening using automated essay scoring. Journal of School Psychology, 82, 123-140.
Naturalistic implementation contexts
This research examines outcomes associated with naturalistic and large-scale implementation of ³Ô¹ÏÍøWrite.
Huang, Y., & Wilson, J. (2021). Using automated feedback to develop writing proficiency. Computers and Composition, 62, 102675.
Palermo, C., & Thomson, M. M. (2019). Classroom applications of automated writing evaluation: A qualitative examination of automated feedback. In L. Bailey (Ed.), Educational Technology and the New World of Persistent Learning (pp. 145-175). IGI Global.
Potter, A., & Wilson, J. (2021). Statewide implementation of automated writing evaluation: Analyzing usage and associations with state test performance in grades 4-11. Educational Technology Research and Development, 69(3), 1557-1578.
Wilson, J. (2017). Associated effects of automated essay evaluation software on growth in writing quality for students with and without disabilities. Reading and Writing, 30, 691-718.
Wilson, J., Ahrendt, C., Fudge, E. A., Raiche, A., Beard, G., & MacArthur, C. (2021). Elementary teachers' perceptions of automated feedback and automated scoring: Transforming the teaching and learning of writing using automated writing evaluation. Computers & Education, 168, 104208.
Wilson, J., & Andrada, G. N. (2016). Using automated feedback to improve writing quality: Opportunities and challenges. In Y. Rosen, S. Ferrara, & M. Mosharraf (Eds.), Handbook of research on technology tools for real-world skill development (pp. 678-703). IGI Global.
Wilson, J., Huang, Y., Palermo, C., Beard, G., & MacArthur, C. A. (2021). Automated feedback and automated scoring in the elementary grades: Usage, attitudes, and associations with writing outcomes in a districtwide implementation of ³Ô¹ÏÍøWrite. International Journal of Artificial Intelligence in Education, 31, 234-276.
Wilson, J., Myers, M. C., & Potter, A. (2022). Investigating the promise of automated writing evaluation for supporting formative writing assessment at scale. Assessment in Education: Principles, Policy & Practice, 29(2), 183-199.
Wilson, J., Olinghouse N. G., & Andrada, G. N. (2014). Does automated feedback improve writing quality? Learning Disabilities: A Contemporary Journal, 12, 93-118.
Best practices in AWE implementation
This research investigates how to best implement ³Ô¹ÏÍøWrite to improve writing instruction.
Wilson, J., Zhang, S., Palermo, C., Cruz Cordero, T., Zhang, F., Myers, M., Potter, A., Eacker, H., & Coles, J. (2024). A latent Dirichlet allocation approach to understanding students' perceptions of automated writing evaluation. Computers and Education Open.
Palermo, C., & Wilson, J. (2020). Implementing automated writing evaluation in different instructional contexts: A mixed-methods study. Journal of Writing Research, 12(1), 63-108.
Wilson, J., Potter, A., Cordero, T. C., & Myers, M. C. (2022). Integrating goal-setting and automated feedback to improve writing outcomes: A pilot study. Innovation in Language Learning and Teaching, 1-17.
Scoring Services
We offer on-demand essay scoring services to researchers and others seeking reliable, generalizable essay scores. Scoring services rely on the same automated essay scoring models that power ³Ô¹ÏÍøWrite. The models can score essays written by students in grades 3-12 in response to any informational, narrative, or persuasive/argumentative prompt. How it works:
- Contact us at MIMarketing@measinc.com with your scoring request.
- Send us your essays using the formatting and secure delivery specifications we provide.
- Receive trait scores for each essay across six traits: Conventions, Ideas, Organization, Sentence Fluency, Style, and Word Choice.