Social validity refers to users’ perceptions of the usability, usefulness, and desirability of a tool or intervention. Numerous studies have examined the social validity of MI Write from the viewpoints of both elementary and middle school students and their teachers. Overall, both students and educators generally find the fundamental features of MI Write easy to navigate. However, certain aspects, such as interactive lessons and collaborative tools like peer review functions, appear to be underutilized. Several studies have highlighted that a lack of professional training or resources can lead to less effective implementation of the tool.
Teachers commonly agree that MI Write is a valuable classroom aid, helping to structure writing activities and foster a writing-friendly environment. Nonetheless, some educators remain skeptical of the automated scoring data the tool provides. Students, by contrast, tend to exhibit greater confidence in the scoring process, although their attitudes toward writing and confidence in their own abilities tend not to change over time.
Key Findings
Teachers and students express generally positive views about the usability and usefulness of MI Write.
• Based on survey results, most studies report that both students and teachers find MI Write easy to use, though certain features (interactive lessons, peer collaboration tools) tend to be underutilized.
• Teachers agree that MI Write makes writing instruction easier overall; it serves as a time-saving classroom aid but does not replace their input. MI Write helps teachers diagnose strengths and weaknesses, establishing a starting point for working with each student and for discussing progress with parents/guardians.
Teachers have reported advantages and disadvantages of features of MI Write.
• Survey responses from Grade 3–5 teachers reflect mixed feelings about various components of MI Write. [2]
• While automated feedback reduces grading time, it is not always aligned with the skills currently being taught in the classroom. Moreover, students often need help to interpret feedback, which can include unfamiliar terms.
• Immediate scoring motivates students to improve, but students may become overly focused on scores or may adopt dishonest strategies, such as plagiarizing or gaming the system, to inflate their scores.
• Some teachers have expressed frustration that MI Write may not always produce scores for some good-faith submissions (e.g., due to too many misspellings) and can penalize creative writing choices (e.g., use of all caps, single-word sentences for emphasis, onomatopoeia).
• Some teachers have been skeptical about the accuracy of the automated scores. [4] Different teachers have described scores as both too harsh and too lenient. Such mixed feelings are reflected in teachers’ survey responses, which have shown teachers divided about the accuracy and fairness of scores. [1]
MI Write does not generally produce changes in students’ attitudes and beliefs about writing.
• Students generally do not report being more motivated to write with MI Write, though teachers report that students produce more writing [4] and work harder to improve their scores [2]. One exception: Grade 8 students using MI Write reported greater motivation to write than comparison students who used Google Docs. [3]
• Grade 5–8 students engaged in monthly writing activities expressed improved confidence in making progress toward immediate goals over time, but their beliefs and attitudes about writing did not change. A more collaborative, supportive writing environment combined with MI Write could show stronger effects. [5]
References
1. Palermo, C., & Thomson, M. M. (2018). Teacher implementation of self-regulated strategy development with an automated writing evaluation system: Effects on the argumentative writing performance of middle school students. Contemporary Educational Psychology, 54, 255–270. https://doi.org/10.1016/j.cedpsych.2018.07.002
2. Wilson, J., Ahrendt, C., Fudge, E. A., Raiche, A., Beard, G., & MacArthur, C. (2021). Elementary teachers’ perceptions of automated feedback and automated scoring: Transforming the teaching and learning of writing using automated writing evaluation. Computers & Education, 168, 104208. https://doi.org/10.1016/j.compedu.2021.104208
3. Wilson, J., & Czik, A. (2016). Automated essay evaluation software in English language arts classrooms: Effects on teacher feedback, student motivation, and writing quality. Computers & Education, 100, 94–109. https://doi.org/10.1016/j.compedu.2016.05.004
4. Wilson, J., Huang, Y., Palermo, C., Beard, G., & MacArthur, C. A. (2021). Automated feedback and automated scoring in the elementary grades: Usage, attitudes, and associations with writing outcomes in a districtwide implementation of MI Write. International Journal of Artificial Intelligence in Education, 31, 234–276. https://doi.org/10.1007/s40593-020-00236-w
5. Wilson, J., Potter, A., Cordero, T. C., & Myers, M. C. (2022). Integrating goal-setting and automated feedback to improve writing outcomes: A pilot study. Innovation in Language Learning and Teaching, 17, 518–534. https://doi.org/10.1080/17501229.2022.2077348