Monthly Archives: June 2014

Transitioning to SBG 2

After my last post, I started analyzing many of my standards, trying to decide what a 2 would look like versus a 1, 3, or 4. For some standards this was very easy, but for others it was very hard; the more specific standards are difficult to break into four distinct levels. Then I ran into this conversation between Frank Noschese and Bob Kuhn (I couldn't find the entire conversation, which included many more collaborators, but this one covers a lot of the main ideas).

I feel that my standards are very granular and, with that said, I really like the idea of a three-point system as opposed to the traditional (almost weird to use that word when referencing SBG) four-point system. A four-point system may still make sense for standards that are written more broadly, such as Casey Rutherford's; not that there's anything wrong with any particular level of specificity.

One of the main hurdles I face with this sort of implementation is how to report it in PowerSchool. I really want to do everything I can to divorce learning levels from points. Frank mentioned that ActiveGrade lets you report with colors: green, yellow, red. PowerSchool doesn't have anything comparable, to my knowledge. At this point it looks like I am just going to have to post 100 for green, 75 for yellow, and 50 for red. If anyone knows of any other methods, I'd love to hear them.

I'm also still wrestling with my district's required Formative-40% / Summative-60% grading policy and how it will fit with my attempt at SBG. At this point I'm still planning on keeping only each student's two most recent attempts. Since I really care about retention, the most recent attempt will always go in the summative column. If the student already has a grade there, that one gets bumped down to the formative column, and anything older is simply disregarded. The only other option I can see is to have 60% standards and 40% standards, but that would imply some are more important than others, and I don't think I have set up my standards that way. If I'm not mistaken, that is what Kelly O'Shea has done with her "A" and "B" standards.
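As a sketch of the bookkeeping behind that policy (the names here are hypothetical, not anything PowerSchool provides), keeping only the two most recent attempts, with the newest always in the summative column, might look like:

```python
# Hypothetical sketch of the two-attempt retention policy described above:
# the newest score on a standard goes in the summative (60%) column,
# the previous summative score is bumped down to the formative (40%)
# column, and anything older is discarded.

def record_attempt(gradebook, standard, score):
    """gradebook maps standard -> {'summative': ..., 'formative': ...}."""
    entry = gradebook.setdefault(standard, {'summative': None, 'formative': None})
    entry['formative'] = entry['summative']  # bump the old grade down
    entry['summative'] = score               # newest attempt weighs the most
    return entry

book = {}
record_attempt(book, 'energy-conservation', 75)
record_attempt(book, 'energy-conservation', 100)
record_attempt(book, 'energy-conservation', 90)
# After three attempts, only the two most recent survive:
# summative = 90, formative = 100; the first attempt (75) is gone.
```

Note that under this scheme a strong middle attempt (the 100 above) still counts, but only at the lower formative weight once a newer attempt exists, which is exactly the retention incentive described.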


Transitioning to SBG

Having completed my first full year of teaching, I now get to sit back and reflect upon the successes and failures of my pedagogy, while sipping a warm beverage in my living room, which contains many leather-bound books and smells of rich mahogany. All in all, I loved my first year. I had great classes that did amazing things; but now it's time to make changes and start laying the foundations for more success stories.

The primary change that I am implementing next year is standards-based grading. I really wanted to start this year, but with all of the other aspects (challenges) that come with being a new teacher I just couldn’t find time to sit down and put forth the effort necessary to make such a change.

Why I want to change.

To give students attainable learning targets – Towards Christmas break last year, I realized that I had never really given my students any clear goals for what I wanted from them, other than "to learn physics/geometry." Students always do much better on projects when they are given a rubric that clearly outlines expectations; by moving away from reporting grades as "Chapter 7 Test" and toward "Can properly apply conservation of energy to analyze the motion of a system," students are much more cognizant of what is expected of them and in a better place to succeed.
Remediation – This year I allowed students unlimited retakes, which I believe everyone should do, but I was really upset with the system in place. Students would often come to me with “what can I do to pull my grade up to…”, which translates to “I don’t care to learn anything, I just want this grade.” By assessing standards instead of assignments, students should be able to look and clearly see in which areas they need to improve. Many of our students lack the metacognitive skills to manage their own education. This is a skill that all students need to be successful in subsequent endeavors and one that we should foster as educators.
Here is a (hardly exhaustive) list of resources that I have used while researching standards-based grading policies and procedures:

Action-Reaction: Frank Noschese
Physics! Blog!: Kelly O’Shea
Think. Thank. Thunk.: Shawn Cornally
Always Formative: Jason Buell

As a first step, I wrote my standards for each class I will be teaching next year (Astronomy, Physics, and Geometry); all are subject to change. I have read a lot about how many standards you should have, but there is no consensus. I think mine are a bit granular, in that I have broken things down a lot, but I'll reassess after this year and see if I want to make my standards more succinct.


With the standards in place, I have started to make quizzes that address as few standards as possible. I really wanted my quizzes to be very specific to certain standards, while assessments will tend to have a lot more standards-crossing questions, primarily for reassessment purposes. This way, if a student comes to me needing help on a single standard, I don’t have to necessarily give them the whole test over again.

Thus far, I have been doing my best to keep quizzes short, 3–10 questions, so that, to conserve paper, they can be printed on a half-sheet. While I am off this summer, I am also trying to make several versions of each quiz for reassessments. Here's an example.

I got the "circle your answer and cross out a response that you know is not correct" idea from P-dog's blog. I really like it and feel like it could help me further probe student thinking and understanding on multiple-choice questions, but I'm starting to wonder whether it will really help. The only time I see it being truly informative is if a student chooses an incorrect answer and then crosses out the correct one, which probably shows that they have almost no knowledge of the concept being assessed and, worse, are confident in that. The other problem I foresee is students just picking one of the other options to cross out without putting any thought into it. Maybe asking students to rate their confidence in their answer would be more beneficial; I'm not sure.

I’d also like to have buffet semester exams, where students choose the standards they need to reassess and therefore, no two students should have the same semester exam. Barry Fuller wrote a great piece about it here.


I plan on scoring students against a 4-point scale (including 0) with each number actually representing a level of understanding. I like how Jason Buell explained each of these levels on his blog, referenced earlier.

0 = No evidence of learning
1 = Can do most of the simple stuff with help
2 = Can do all of the simple stuff
3 = Can do all of the simple stuff and all of the complex stuff
4 = Can go beyond what was directly taught in class
He then goes on to the next logical progression, which I have yet to do, and breaks each standard down into more rubric-like categories, spelling out what a 4, 3, 2, or 1 looks like on that standard.


When I hand the standards out to the students I plan on giving them a document that has the standards, the learning levels for each standard (0, 1, 2, 3, or 4), the state standards that each is aligned with, and some general information, most notably that they will be assessed on each standard twice.

Currently I am planning on quizzing on standards after they have been covered in class and then assessing all standards in a unit at the end. Our school uses PowerSchool to report grades to parents and students. What I plan on doing in PowerSchool is listing each of my standards as an assignment twice, once in our district's mandatory 40% formative category and once in the 60% summative category. Whenever a student assesses a standard, the new grade goes in the 60% category and, if they have already assessed it once, the old grade gets bumped down to the 40% category. I'm doing this because I want what they know right now to weigh the most on students' grades.
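To make the arithmetic of that split concrete, here is a small sketch of how the district's 40/60 weighting combines the two columns into one reported grade (an assumption about averaging within each category before weighting, not a description of PowerSchool's internals):

```python
# Sketch of the district's 40% formative / 60% summative weighting,
# assuming scores are averaged within each category and then combined.

def weighted_grade(formative_scores, summative_scores,
                   formative_weight=0.40, summative_weight=0.60):
    """Average each category, then combine by the district's 40/60 split."""
    def mean(xs):
        return sum(xs) / len(xs) if xs else 0.0
    return (formative_weight * mean(formative_scores)
            + summative_weight * mean(summative_scores))

# Two standards assessed twice each: the first attempts were bumped to
# formative (75, 50) and the most recent sit in summative (100, 75).
print(weighted_grade([75, 50], [100, 75]))  # 0.4 * 62.5 + 0.6 * 87.5 = 77.5
```

The upshot is that an improved recent attempt pulls the reported grade up even when the earlier, lower attempt stays on the books at the 40% weight.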

NOTE: PowerSchool does have an SBG system, but I haven't been able to glean much from it, and the website wasn't very helpful on the topic.

End-of-Year Projects

In an effort to finish the year off right, I borrowed heavily from several physics teachers and had my students do capstone projects. I was very impressed with many of the projects. Here is the rubric they were given and graded against, as well as some images of several of the posters they completed.


Since this was my first year, I wasn't able to give them many examples of what I wanted; however, this wasn't too big of a deal. With the exception of a couple, all of the groups had all of the necessary elements and then some.

The hardest part of the whole ordeal was getting students to think scientifically. Many just wanted to do very simple experiments that featured several variables that couldn't be measured. After the groups and I went back and forth for a while, with me primarily asking, "But what are you going to measure?", they got the gist of it.

Some sample projects are listed below.

  • Volume change in different brands of soap after being microwaved
  • Mass vs. distance of a paper airplane
  • Strength of fingernail polish vs. drying time
  • Height of geysers of various types of soda
  • Effect of salt on the boiling point of water
  • Distance of a toy car vs. number of Alka-Seltzer tablets
  • Number of coffee filters vs. fall time
  • Height of basketball bounce vs. inflation pressure
  • Temperature difference vs. color of water in a microwave
  • Circumference of a balloon vs. amount of baking soda
  • Skin color vs. brand of bronzer
  • Amount of sugar in different types of soda

One thing I'd like to see next year is more students using video analysis and/or computer simulations; only a couple of the above projects incorporated either into their analysis. On the other hand, I was very pleased with the large number of groups using linear regression and the other data-analysis methods we discussed at the beginning of the year to create equations and make predictions.