
Posts Tagged ‘essay grading software’

Grammar Checkers for Teachers

Conscientious teachers still mark up and comment on student essays. Despite recent trends toward holistic grading and the views of some kind-hearted souls who believe that “red marking” student writing irreparably crushes self-esteem, the vast majority of teachers do respond to student writing. Of this majority, some comment on writing content; some on essay structure; some on the quality and relevance of evidence; some on the proper use of citations; some on grammar and usage; some on mechanics (punctuation, capitalization, spelling, etc.); and some attend to matters of writing style. Rarely does a teacher do it all.

It’s exhausting and time-consuming. So, naturally, teachers look for shortcuts that save energy and time but still give students what they need as developing writers. Enter spell checker and grammar checker software. Whereas spelling checkers, whether stand-alone programs or tools embedded in word processors such as Microsoft Word®, do a reasonable job of finding spelling errors (other than troublesome homonyms), grammar checkers simply cannot replicate that effectiveness. But there are some helpful resources to lighten the teacher’s load…

Wikipedia has a nice article, Grammar Checker, which explains the programming limitations of grammar checkers, but suffice it to say for non-techies: grammar checking software is a whole lot harder to program than spell checking. My take is that we should encourage students to spell check and revise accordingly, but skip the grammar check and proofread instead. Geoffrey K. Pullum, Professor of General Linguistics at the University of Edinburgh, agrees, with even stronger reservations:

“For the most part, accepting the advice of a computer grammar checker on your prose will make it much worse, sometimes hilariously incoherent. If you want an amusing way of whiling away a rainy afternoon, take a piece of literary prose you consider sublimely masterful and run the Microsoft Word™ grammar checker on it, accepting all the suggested changes.” (Monkeys Will Check Your Grammar, 2007)
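To get a rough, non-technical feel for why grammar checking is so much harder than spell checking, consider the sketch below (in Python, with a deliberately tiny, made-up word list): spell checking is essentially a lookup problem, while grammar checking requires analyzing how the words in a sentence relate to one another.

```python
# Toy illustration: a spell checker can get by with a dictionary lookup,
# so a sentence in which every word is spelled correctly passes the check
# even when it is ungrammatical. The tiny word list is hypothetical,
# just large enough for the example.

WORD_LIST = {"the", "a", "dog", "dogs", "bark", "barks", "loudly"}

def spell_check(sentence):
    """Return the words not found in the word list."""
    words = sentence.lower().rstrip(".!?").split()
    return [w for w in words if w not in WORD_LIST]

# Every word is spelled correctly, so a word-by-word check finds nothing...
print(spell_check("The dogs barks loudly."))   # -> []

# ...but flagging the agreement error ("dogs barks") requires knowing which
# word is the subject and which is the verb -- that is, parsing the sentence,
# which is why grammar checkers are so much harder to build than spell checkers.
```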

The popular website Top Ten Reviews does a nice job reviewing the four most popular grammar checkers, although their top choice, Grammarly, did happen to advertise rather prominently on their site. In the review site’s testing, Grammarly caught 10 of 14 “grammar” errors. Now, to put on my English teacher’s hat, these were not all grammatical errors, but I nitpick. Of course, I had to try my own writing submission with the Grammarly software:

To pee, or to pee not: that is not the question. When in the path of alien invasions, it becomes necessary for the rights of the governed to outweigh the rights of the graham crackers, it is the right of the fig newton to abolish that nonsense speak.

The results? I could break down all of the issues, but you get the idea.

So, are there any computer shortcuts for essay response and grading that do help the conscientious teacher in providing quality essay response throughout the writing process? Yes, there are, but these must remain where they belong: in the control of the teacher. At present, computer-scored essays remain a pipe dream.

However, a comfortable balance can be struck between technological efficiency and teacher judgment. Using the computer to grade both paper and online essays can serve both purposes. For those teachers interested in saving time and doing a more thorough job of essay response and grading, check out The Pennington Manual of Style. This style manual serves as a wonderful writer’s reference guide with all of the writing tips from the author’s comprehensive essay writing curriculum, Teaching Essay Strategies. The style manual (included in the Common Core aligned Teaching Essay Strategies) also includes a download of the 438 writing, grammar, mechanics, and spelling comments teachers use most often in essay response and grading. Placed in the AutoCorrect function of Microsoft Word® 2003, 2007, 2010, or 2013 (XP, Vista, Windows 7, 8, and 10), each comment can be inserted into online student essays with a simple mouse click, or printed/e-mailed for paper submissions. Each comment identifies the error or writing issue, defines terms, and gives examples so that student writers are empowered to correct and revise on their own. This approach to essay comments produces significantly more accountability and transfer to subsequent writing.
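For readers curious about the mechanics, here is a rough Python sketch of how canned comments can be loaded into Word’s AutoCorrect list so that a short code typed in a student’s document expands into the full comment. It assumes Windows, an installed copy of Microsoft Word, and the pywin32 package; the shortcut codes and comment texts are hypothetical stand-ins, not the actual 438 comments in the download.

```python
# Rough sketch: loading a few canned essay-response comments into Word's
# AutoCorrect list. Assumes Windows, Microsoft Word, and pywin32.
# The shortcut codes and comment texts are hypothetical examples only.

import win32com.client

COMMENTS = {
    "frag1": "Sentence fragment: this group of words is missing a subject or "
             "a verb and cannot stand on its own. Example: 'Because the test "
             "was hard.' Revise it into a complete sentence.",
    "cs1": "Comma splice: two complete sentences are joined with only a comma. "
           "Use a period, a semicolon, or a comma plus a coordinating conjunction.",
}

def load_autocorrect_entries(comments):
    """Add each shortcut code and its full comment to Word's AutoCorrect list."""
    word = win32com.client.Dispatch("Word.Application")
    for code, text in comments.items():
        # Typing the code in any document now expands into the full comment.
        word.AutoCorrect.Entries.Add(code, text)
    print(f"Added {len(comments)} AutoCorrect entries.")

if __name__ == "__main__":
    load_autocorrect_entries(COMMENTS)
```

In practice, the manual’s download handles this setup; the point is simply that one short keystroke or click can stand in for a complete, teachable comment.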


Computer-Scored Essays

Teachers recognize the value of essay compositions as vital tools for learning, self-expression, and assessment. The essay remains a staple of college and post-graduate applications, as well as job applications. For formulating coherent explanation, analysis, or argument, the essay provides the best means. Even a well-constructed objective exam cannot match the essay in assessing the degree to which teaching objectives have been mastered.

“Essays are considered by many researchers as the most useful tool to assess learning outcomes, implying the ability to recall, organize and integrate ideas, the ability to express oneself in writing and the ability to supply rather than merely identify interpretation and application of data. It is in the measurement of such outcomes, corresponding to the evaluation and synthesis levels of the Bloom’s (1956) taxonomy that the essay questions serve their most useful purpose.” (Valenti, Nitko and Cucchiarelli 2003)

However, essays are rather subjective vehicles of expression. Even the best attempts to develop objective evaluation criteria with analytical rubrics and check-lists fall short of unbiased objectivity. Yet, this shortcoming has not eliminated the cherished role of the essay in the British and American educational establishments.

There’s just one problem: essays simply take too much time to read, respond to, and evaluate. A conscientious teacher may realistically spend an hour per student essay if that teacher responds to multiple student drafts in the context of the writing process and evaluates the final published essay.

Of course, teachers can spend less time if they use simplistic holistic rubrics or buy into the convenient notion that making comments on a student’s essay somehow disenfranchises the autonomy of the writer. However, most teachers recognize that interactive dialogue between student and teacher on the student’s essay is essential. And it does take time.

Enter the age of computers: word processing, spell check, grammar check, word count, reading level, databases, etc. Is there a savior?

Computer-scoring of student writing is being actively marketed to K-12 schools and universities. Multinational corporations, such as Educational Testing Service (ETS), claim that current technology can not only provide objective assessment but also give accurate and useful feedback to the student writer. Criterion, a machine-reading service marketed by ETS, has become widely popular in both American K-12 schools and universities. Other similar automatic grading programs are open for business.

Both of the new assessment consortia that have been delegated the tasks of developing national assessments for the Common Core State Standards (now adopted by 43 states) have indicated that they are using machine-scored essay software. “Automated assessment systems would provide consistency in essay scoring, while enormous cost and time savings could occur if the AES system is shown to grade essays within the range of those awarded by human assessors,” suggest the aforementioned researchers.
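To make the appeal, and the critics’ worry, concrete, here is a toy Python sketch of the general approach behind automated essay scoring: reduce an essay to a handful of surface features, then apply a weighted model calibrated against human scores. The features and weights below are invented for illustration; this is not how ETS’s Criterion or any consortium’s actual engine is implemented.

```python
# Toy illustration of automated essay scoring: surface features plus a
# weighted model. The features and weights are invented for illustration;
# real engines are far more elaborate, but the basic shape -- score the
# measurable surface, not the meaning -- is the same.

import re

def surface_features(essay):
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return {
        "word_count": len(words),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "avg_word_length": sum(len(w) for w in words) / max(len(words), 1),
        "vocabulary_variety": len({w.lower() for w in words}) / max(len(words), 1),
    }

# Hypothetical weights standing in for a model fit to human-scored essays.
WEIGHTS = {"word_count": 0.01, "avg_sentence_length": 0.1,
           "avg_word_length": 0.5, "vocabulary_variety": 2.0}

def machine_score(essay):
    features = surface_features(essay)
    return round(sum(WEIGHTS[name] * value for name, value in features.items()), 2)

# Longer sentences and bigger words raise the score whether or not the essay
# says anything meaningful -- the heart of the objections quoted below.
print(machine_score("Dogs are nice. I like dogs."))
print(machine_score("Canine companionship demonstrably ameliorates the "
                    "quotidian tribulations of contemporary existence."))
```

Even this crude stand-in awards the second, nearly content-free sentence roughly double the score of the first, which is exactly the distortion the position statements below describe.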

But, what do teachers say about computer-scored essays?

The National Council of Teachers of English (NCTE) summarizes its position on machine-scored writing:

“We oppose the use of machine-scored writing in the assessment of writing. Automated assessment programs do not respond as human readers. While they may promise consistency, they distort the very nature of writing as a complex and context-rich interaction between people. They simplify writing in ways that can mislead writers to focus more on structure and grammar than on what they are saying by using a given structure and style.”

“Writing Assessment: A Position Statement.” NCTE.org. Nov 2006

The Conference on College Composition and Communication (CCCC) summarizes its position on machine-scored writing:

“We oppose the use of machine-scored writing in the assessment of writing. Automated assessment programs do not respond as human readers. While they may promise consistency, they distort the very nature of writing as a complex and context-rich interaction between people. They simplify writing in ways that can mislead writers to focus more on structure and grammar than on what they are saying by using a given structure and style… We believe ourselves that machine-scoring fundamentally alters the social and rhetorical nature of writing—that writing to a machine is not writing at all.”

The CCCC Position Statement on Teaching, Learning, and Assessing Writing in Digital Environments

But is there a middle ground? Can teachers use technology to save time and do a more thorough job of responding to student essays? Can teachers maintain autonomy in the evaluation process and exercise their own judgment about which comments need to be made, which grammatical errors need to be marked, and which grade needs to be assigned?

Perhaps so. There can be a balance between technological efficiency and teacher judgment. Using the computer to grade both paper and online essays can serve both purposes. For those teachers interested in saving time and doing a more thorough job of essay response and grading, check out The Pennington Manual of Style. This style manual serves as a wonderful writer’s reference guide with all of the writing tips from the author’s comprehensive essay writing curriculum, Teaching Essay Strategies. The style manual (included in the Common Core aligned Teaching Essay Strategies) also includes a download of the 438 writing, grammar, mechanics, and spelling comments teachers use most often in essay response and grading. Placed in the AutoCorrect function of Microsoft Word® 2003, 2007, 2010, or 2013 (XP, Vista, Windows 7, 8, and 10), each comment can be inserted into online student essays with a simple mouse click, or printed/e-mailed for paper submissions. Each comment identifies the error or writing issue, defines terms, and gives examples so that student writers are empowered to correct and revise on their own. This approach to essay comments produces significantly more accountability and transfer to subsequent writing.
