A simple rubric for on-demand writing

6.1 Designing Student Assessments around Criteria and Standards: Assessment criteria and standards are clear.

Ensuring that assessment criteria and standards are clear requires teachers to know exactly which skills and levels of understanding they expect students to master, and to communicate those expectations plainly to students. Rubrics can help with both tasks. Shermis and Di Vesta (2011) outline three key steps to formulate good rubrics:

  • identify a critical dimension of behavior at a particular developmental level
  • articulate the rubric to a point that different evaluators can obtain consistency
  • communicate with learners what is expected of them (paraphrased from p. 137).

On-demand writing about literature requires particularly succinct and quickly comprehensible rubrics, since students have a limited amount of time in which to develop their ideas about the text and form them into an organized, focused, and convincing essay. At the end of our Voice and Protest unit, I wanted to assess whether our 10th-grade honors students were able to synthesize multiple vignettes from our anchor text, Fountain and Tomb, in order to support an interpretive claim about the implications of the book’s structure. Students knew there would be a final in-class essay about the book, but were not given the prompts ahead of time. When they arrived, they were given the following, on a half-sheet of paper:


For each question I included brief directions on how to create a focused, supported answer: “Choose one (reason or influence/effect or theme) to focus on and use specific textual evidence from three different vignettes.” These directions reminded the students of two critical dimensions of strong writing that we had focused on throughout the unit (focused claims/theses and citing specific textual evidence, either in the form of concrete details or exact quotations); they also prompted students to see this as a synthesis task, one that required combining evidence from multiple stories to infer a big idea about the text’s format and purpose. In this sense, the prompts themselves are part of the rubric, because they identify the writing standards and imply the critical reading task for the assessment. Since I provided three high-level prompts to choose among, students had some power to shape their own assessment experience according to individual interest or preference.

The half sheet next articulates those standards further by delineating the three components on which I based my evaluation. I included quantitative expectations (one claim, at least three pieces of evidence from three different vignettes) as well as descriptive ones (claim is focused, evidence is clearly related, adequate analysis follows). In the preceding unit we had worked on how to choose strong evidence in support of an argument, and how to analyze evidence by picking it apart and paying close attention to the exact language of quotations, the context surrounding details and quotations, and the implications of word choice and syntax. Consequently, students were familiar with those (bold-type) descriptors.

Because I wanted my students to use their time thinking deeply about the prompts and developing their written responses, I planned an assignment format that would take less than five minutes to read, comprehend, and question for clarification. I chose to present the grading criteria as bullet-point descriptions of what a successful in-class essay would include, and to limit the criteria to three, with equal point value. When I asked for clarifying questions, I received three: one about how many paragraphs I wanted students to write; one about whether they needed only one piece of evidence from each vignette; and one about whether they could use three pieces of evidence from one or two vignettes instead of dealing with three vignettes total. These questions indicated to me that students understood my descriptive criteria (i.e., what it meant to have clearly related evidence and adequate analysis) and the purpose of the assignment (i.e., the level of thinking the prompts required) but not what the final product should look like. Consequently, one change I would make to this assignment in the future would be to spend some class time beforehand discussing the differences between common types of on-demand writing (for example, a claim paragraph used as a reading quiz versus an in-class essay at the end of a unit) and presenting a few real-life examples of student work for each.


Shermis, M. and Di Vesta, F. (2011). Classroom Assessment in Action. Lanham, MD: Rowman and Littlefield Publishers, Inc.