Rethinking Rubrics: Rubrics that Make You Think

In 2010, my colleague (and mentor) Dr. Sharon Harsh was presiding over a meeting with staff from the Appalachia Regional Comprehensive Center (ARCC—our organization) and the Virginia Department of Education (VDOE), with whom we had collaborated for five years. She summarized the trends in education from the past decade or so and went out on a limb to predict which trends would soon influence education. She hit them right on the head, especially one prediction: learning progressions will become prevalent and guide the work educators do at all levels.

Simply put, learning progressions describe the most likely steps people take when developing new knowledge and skills. For example, before students can combine fractions with different denominators, they have to recognize what fractions are and understand what they represent. They have to know that a larger number in the denominator doesn’t mean a larger fraction. Later they come to understand how different fractions are related, focusing on how to express two fractions with equivalent denominators and then how to work with unlike denominators. There’s more, but that gives a sense of how some concepts related to fractions progress.
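To make that last step concrete, here’s a quick sketch of the arithmetic near the end of that progression: rewriting two fractions over a common denominator before combining them. This is my own illustration, not an example pulled from any standards document.

```python
# A quick illustration (mine, not from any standards document) of the step
# near the end of that progression: expressing two fractions with a common
# denominator, then combining them.
from fractions import Fraction

a, b = Fraction(1, 3), Fraction(1, 4)

# Rewrite both fractions over a common denominator (3 * 4 = 12).
common = a.denominator * b.denominator
print(f"{a.numerator * b.denominator}/{common} + "
      f"{b.numerator * a.denominator}/{common}")   # 4/12 + 3/12

# Fraction handles the combination (and reduces the result if it can).
print(a + b)                                       # 7/12
```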

Sharon got this so right! Learning progressions strongly influenced the way new standards were developed. And state departments of education, including the VDOE today, are developing and sharing the learning progressions behind their standards so teachers can better understand how students master standards within and across grade levels. Teachers, too, are developing learning progressions at a finer grain that help them understand how students develop skills and knowledge within a single standard (like the idea of combining fractions above). I find learning progressions really intriguing, but I’m a little geeky like that.

Applying Learning Progressions

I’ve long used rubrics to support my instruction and to score student work. In the graduate class I taught, every activity used a rubric, and the students got all of the rubrics on day one and were encouraged to use them as they worked through activities. I’ve never really given multiple-choice tests. Ever. I’ve also helped a lot of teachers develop rubrics, especially when they need to assign some sort of score or grade to complex problems or projects. In many cases, a multiple-choice question isn’t the best option.

Below is an example of a rubric I created in the past. It’s typical of many I’ve seen: if you’re a student who wants to score well, you don’t make mistakes, and the more mistakes you make, the lower your score. It seems logical, at first.

 

Learning Outcome: Grammar and mechanics of language
- Novice: The product contains numerous (7 or more) errors in grammar, punctuation, and/or capitalization of written text, or 3 or more errors in spoken language.
- Developing: The product contains several (4-6) errors in grammar, punctuation, and/or capitalization of written text, or 1 or 2 errors in spoken language.
- Approaching: The product contains a few (1-3) errors in grammar, punctuation, and/or capitalization of written text, or no more than 1 error in spoken language.
- Expert: The product contains no errors in grammar, punctuation, or capitalization of written text, or no errors in spoken language.

Learning Outcome: Solve multistep problems with fractions
- Novice: The student does not show his/her work, presents incomplete work, or inaccurately presents work in regard to the guidelines.
- Developing: The student designs a solution that has more than one error in calculation.
- Approaching: The student designs a solution that has no more than one error in calculation.
- Expert: The student designs a solution that meets the guidelines with no errors.

While this is a pretty typical rubric, it isn’t very helpful for promoting learning. Why? It’s not the number of errors that matters; it’s the kind of errors students make. If a student makes two or three errors but there’s no clear pattern to them, they may just be mistakes born of sloppiness or a lack of time. That doesn’t tell me anything about what the student does or doesn’t understand or how I need to re-teach. But when a student makes consistent errors, like using “its/it’s” incorrectly over and over, or writing too many run-on sentences, or confusing larger denominators with larger fractions, then I know what to focus on. I need something that shows me common errors, as well as the progression learners follow from novice to mastery of the standard.

Improved Rubrics

I’ve finally been able to connect that sage prediction Sharon Harsh made with my own practice. Since standards are based on learning progressions, we should be monitoring where our kids are along those progressions. This helps not just teachers, but students too! Both can see what skills and knowledge students have mastered, where they need to go, and even what steps they might take to get there. Some might recognize that this is also a critical component of using formative assessment strategies to support learning, especially as proposed by Margaret Heritage (e.g., Where am I going? Where am I now? How do I get there?).

So over the past couple of years, I’ve been pushing myself to improve my rubrics. Instead of just counting errors, which tells me little about what my students truly know or can do, I’m now designing rubrics that describe the progression of learning students go through when mastering a content standard.

Please note: In the examples, the scoring categories are labeled as Learning Outcomes, but many teachers will recognize that the language is drawn from actual standards: in these examples, the Virginia Standards of Learning, the Common Core State Standards, and a WIDA ELD standard. So, in this way, the rubrics are actually standards-based. In fact, they’re probably more standards-based than any forced-choice assessment can be, at least for sophisticated learning outcomes.

Now when I work with teachers on complex problems or performance tasks, we co-develop rubrics that describe learning progressions. See the examples below, created recently with some great teachers from the Crestwood School District in Dearborn, Michigan. These are rough drafts, but even at this stage I can see the progression learners go through for each of these learning outcomes. I learn from these teachers as well, and every time I do this, the discussion we have about learning progressions is great.

 

Learning Outcome: Write opinion pieces on topics or texts, supporting a point of view with reasons and information.
- Novice: The student’s product does not contain a clearly stated opinion, or it goes off topic; there is no evidence and possibly no reasons.
- Developing: The student’s product includes a clearly stated opinion but lacks reasons that are expanded on or supported by evidence from the texts.
- Approaching: The student’s product includes a clearly stated opinion with some evidence, but the reasons lack coherence and may not be clearly sequenced or organized.
- Expert: The student’s product contains a clearly stated argument (or point of view) with reasons supported by evidence drawn from the texts and is clearly organized and coherent.

Learning Outcome: Students read informational articles on globalization to consider its impact on their lives (e.g., Internet, mass media, food and beverage distributors, retail stores).
- Novice: The student’s product includes an opinion but does not include information from the articles. There’s no indication the student has read or can read the articles.
- Developing: The student’s product contains phrases or keywords from the articles, but they may not be explained or connected to a position related to the student’s life.
- Approaching: The student’s product includes some examples from the articles, but they may not support the student’s position as it relates to their life.
- Expert: The student’s product includes citations of examples from the articles that support the student’s position and relates those citations to their life.

Learning Outcome: Make a line plot to display a data set of measurements in fractions of a unit (1/2, 1/4, 1/8).
- Novice: The student’s product does not contain a line plot or contains something other than a line plot.
- Developing: The student’s product contains a line plot with simple fractions (e.g., 1/2 and 1/4), but the fractions are out of order (ordered by denominator rather than by value).
- Approaching: The student’s product contains a line plot with points plotted inaccurately, so it does not match the data, though the fractions are in order.
- Expert: The student’s product contains an accurate line plot that displays the appropriate data with the fractions in order.
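To see what the last rubric above is after, here’s a rough sketch (my own, with made-up measurement data, not anything the Crestwood teachers produced) of a line plot whose fractions are ordered by value rather than by denominator, which is exactly the misconception the Developing level describes.

```python
# A rough sketch (my own, with made-up data) of the Grade 5 line-plot task:
# displaying measurements in fractions of a unit, ordered by value rather
# than by denominator.
from collections import Counter
from fractions import Fraction

# Hypothetical measurements in halves, quarters, and eighths of a unit
measurements = [Fraction(1, 2), Fraction(1, 4), Fraction(3, 8),
                Fraction(1, 2), Fraction(1, 8), Fraction(3, 8)]

counts = Counter(measurements)
for value in sorted(counts):          # sorted by the fractions' actual values
    # One X per measurement, stacked above its spot on the number line
    print(f"{str(value):>4} | {'X' * counts[value]}")
```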

 

Summative assessment is just one use of this type of rubric. Now that we’ve described learning progressions for these standards, the rubrics have multiple uses. Teachers can hand them out at the beginning of any unit, lesson, or activity that targets these learning outcomes so students know what they can do to get the grade they want. That saves teachers time, because they don’t need to create rubrics for every activity, just for each standard. More importantly, students can use the rubrics to monitor their own progress. Schools wanting to move towards mastery learning or standards-based report cards can also use these types of learning progressions to describe what the difference between an A and a B (or any two grading categories) really means. It’s not just a score; it’s a point along the path to mastery. Finally, this type of rubric is helpful when talking with parents. When parents want to know, “Why didn’t my kid get an A?” teachers can show them exactly where their child’s current performance falls along the progression and what it will take to master the outcome (and get that A!). Maybe in the future, parents will ask, “How can I help my kid master the standards?” Maybe.

Supporting Formative Assessment with Technology

A few strands of my work have come together recently, and they focus on using technology to support formative assessment. This has been one of the most common requests from the teachers and schools I’ve been working with and is the focus of two additional projects I’m working on. (You can skip directly to the tools here. Updated with two new tools on March 2.)

Through my work with the Appalachia Regional Comprehensive Center (ARCC), I’m collaborating with staff from the Virginia Department of Education to pilot a statewide cadre of teachers exploring formative assessment. Teachers from six schools across the state are working through training materials developed by Dr. Margaret Heritage and the Center for Standards and Assessment Implementation (formerly the Assessment and Accountability Center).

[Book cover: Formative Assessment by Margaret Heritage]

Dr. Heritage is probably the nation’s foremost expert on formative assessment and has been implementing and studying formative assessment strategies in multiple districts for years. (Check out her book on the topic!)

 

What Dr. Heritage does is provide concrete steps teachers can follow to embed formative assessment in their teaching. I agree with Heritage when she notes that formative assessment is not a single “thing” or event. It’s not a quiz. It’s not a test. It’s an ongoing dialog between teachers and students that is designed to collect evidence of where students are in their journey of mastering skills and knowledge.

From my perspective, I overlay this idea of formative assessment being a process onto the selection of relevant technology resources. An online quiz or a classroom responder (clicker) is not itself a formative assessment tool unless it’s used that way. For that reason, I’ve grouped a range of digital resources that can be used formatively, but you first have to identify the way you want to use them. That’s the trick to picking any technology for classroom use, actually: figure out what you need to accomplish, then find a resource to match.

Check out the resources here

This is not an exhaustive list, and it was just updated this past week after my visit to Dubuque, where I learned InfuseLearning is on the way out and GoFormative.com is a new resource to consider. I’d appreciate hearing from you about the digital resources you use and how you use them to support formative assessment so I can update this list over time.

My Great Day! (at PHES)

Yesterday was such a great day that I wanted to share. I helped pilot a new fifth-grade performance task at one of the elementary schools I’m working with. This is one of those events that can go well…or not. While probably not the intent, recent trends in education have been pushing teachers away from student-centered instruction. If you haven’t bought into student-centered instruction, it can be challenging. There’s definitely more activity going on, and if you like a pristine, quiet classroom with kids in rows all doing the same thing, you can find the buzz of activity a little disconcerting. More importantly, if you’re not used to it, it can be unsettling at a philosophical level.

The school and district administrators and I had talked about this last point. In previous visits, we observed what I see in a lot of classrooms—teachers so concerned about their students being successful that they don’t give them opportunities to struggle and even fail. There’s little challenge, as sometimes teachers do all the work and students spend their day copying. These teachers must be exhausted by the end of the day! But their students are just bored. I hope these performance tasks are a way to help teachers understand that student-centered instruction is possible, manageable, and a lot more fun—for students and teachers. I think we saw some of that yesterday.

The task

The performance task was intended to target key skills and knowledge the fifth-grade teachers covered in core content areas during the first nine-weeks grading period. It’s not a multiple-choice test, though, because those overarching skills often require students to analyze and evaluate information and then create something. That’s hard to do when you’re selecting which bubble to fill in.

I unpacked and reviewed all the standards from the first nine weeks and described characteristics of the task. I spoke with the school administrators to identify a relevant topic. They talked with the teachers and came up with the idea of comparing white and wheat bread, because the kids are not happy about the switch to wheat bread in school lunches this year (as part of the new USDA guidelines for school lunches). Because a simple comparison—of cost or health benefits—didn’t reach the rigor of the standards, I ended up expanding the topic and had the kids create the best sandwich possible.

Without going into detail (but I’ll post the task), the kids were introduced to the topic through the guidelines their own school cafeteria faces. The cafeteria has to create lunches that meet certain nutritional guidelines while also staying within cost limits. We simplified a few things and rounded some numbers, but the final goal was that students had to design a healthy sandwich, choosing from a number of ingredients, and then create a product to convince their teachers, the school principals, and the other fifth-graders that it was the best solution. The total lunch had to cost less than $1.75, be less than 650 calories, and have no more than 1/10 of its calories come from saturated fat. Look at all that math! The presentations just had to be awesome—and some truly were.
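For a sense of the arithmetic buried in that, here’s a minimal sketch of the checks a student’s design had to pass. The ingredient names, prices, and nutrition numbers below are made up for illustration; the real task documents had their own figures.

```python
# A minimal sketch of the checks a sandwich/lunch design had to pass.
# The ingredients, prices, and nutrition numbers are made up for
# illustration; the actual task documents had their own figures.

# (name, cost in dollars, calories, calories from saturated fat)
ingredients = [
    ("wheat bread (2 slices)", 0.30, 160, 5),
    ("turkey", 0.75, 90, 5),
    ("cheddar cheese", 0.40, 110, 25),
    ("lettuce and tomato", 0.20, 15, 0),
]

cost = sum(item[1] for item in ingredients)
calories = sum(item[2] for item in ingredients)
sat_fat = sum(item[3] for item in ingredients)

print(f"Cost: ${cost:.2f} -> {'OK' if cost < 1.75 else 'over the $1.75 limit'}")
print(f"Calories: {calories} -> {'OK' if calories < 650 else 'over the 650 limit'}")
print(f"Saturated fat: {sat_fat} of {calories} calories -> "
      f"{'OK' if sat_fat <= calories / 10 else 'more than 1/10 of calories'}")
```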

The day

Teachers were prepared (mentally, emotionally, and with resources) by the building leaders, who did a good job both logistically and professionally in setting up the day. The students had 2 hours to complete the task and had access to laptop carts, video cameras, poster board, and other materials. Unfortunately, one of the teachers was absent, but that turned into a fortuitous learning opportunity. The substitute teacher gallantly carried on with the task. If a task like this is designed correctly, students should be able to complete it on their own, so we (the building leaders and I) observed with interest to see how that class would work out.

The most striking observation was that the students in the class with the substitute immediately got down to work on the task and needed little guidance. The substitute introduced the task and let them at it! She gave little more direction than what was provided to the students in the task documents. The kids stayed on task pretty well throughout and we had a variety of PowerPoint presentations, some hand-drawn posters, and even a video-based commercial—all completed entirely by the students.

In comparison, the teachers in the other three classes were a little more reluctant to let go of control. Some worked through the math components with the kids (so we don’t really know whether the kids could solve multistep problems on their own, which is one of the standards). Some were prescriptive about what the students should create (limiting students’ creative input). Others controlled the pace of the class and wouldn’t let students begin until their work had been checked (preventing us from determining how well students could use their own problem-solving skills); one teacher took 50 minutes to review the task until gently prompted to let the kids get started. In that class, one young girl urgently repeated, “When can we get started?”

Ultimately, I think the teachers discovered that the students could work on their projects independently. I certainly observed that. Some students, of course, needed support from teachers, but it appeared to those of us observing that most kids were authentically engaged in the task and stayed on target throughout. After a quick review, the products from the class with the substitute weren’t substantially different from those in the other classes, though we decided we might look for a middle ground in which teachers draw some attention to the task requirements without being so prescriptive. And the students in that class started finishing up after about 90 minutes, while some of the other classes took almost 3 hours. That’s a pretty telling piece of data on its own.

What’s Next?

I’m waiting to debrief with the principals after they have a chance to chat with the fifth-grade teachers. Were all the student projects wonderful? No, not really. Some did the barest minimum and others were so caught up in finding images and playing with backgrounds and fonts that they missed some of the critical details they were supposed to provide. But that’s valuable information, too.

The students had little or no problem searching the Internet and putting together PowerPoint presentations, and there were a couple of videos that kids made on their own. Incorporating technology more strategically during instruction and using it as a resource for solving problems is a logical focus moving forward. But no one could deny the kids had a blast! One young man was so excited about the solution he presented on his poster that he made up a song for it. There were several ingenious solutions, and lots of variety.

I’m hoping the teachers felt good after it was all over. They handled it well. Like I said, it can be hard to let go of the reins, but as we build more of these opportunities into the curriculum, I’m sure they’ll do fine. And by the end of the year, I’m hoping they (and other teachers in their school and across the district) begin building their own tasks. When done well, these tasks help students see why they’re studying something. It’s no longer just something to do for the teacher or for a grade. This was relevant to the kids, and they not only expressed their opinions but backed them up.

We identified our top 10 favorites, and the kids are going to vote on the best presentations. I’m hoping the cafeteria actually makes some of the winning sandwiches! Talk about real-world application. Now I’m off to make a sandwich of my own. That task made me hungry.