16 November 2017

Pivot Interactives - these videos are so worth the money.

Screenshot from Pivot Interactives
As you might have noticed, my all-time favorite internet physics resource Direct Measurement Videos has migrated.

Vernier is now selling access to "Pivot Interactives" for $150 per year or $5 per student, whichever is greater.  On one hand, I wish the National Science Foundation had stepped up with a seven-figure grant for primary DMV creator Peter Bohacek so that he could continue to provide these resources free to physics teachers.  I mean, I can list probably 10² NSF grants that flush my tax money down the toilet.  Nevertheless, I'm happy that Vernier has seen the extraordinary value in these exercises and kept them alive.  

The question for the physics teaching community is, then, do we spend the couple hundred beans to get access to Vernier's new site?  For me, the answer is definitely "yes."  

I've made it my official teaching goal this year to replace as many textbook-style homework problems as possible with Pivot exercises.  Since Peter and company have been hard at work adding to their video library, I'm finding this goal easy to meet.

Take, for example, the car-around-a-traffic-circle problem.  I've always started my circular motion unit there, as it's a situation all my students have experienced.  I ask, how fast can the car go around the curve?  We discover that the maximum speed depends only on the curve's radius, g, and the coefficient of (static) friction. 
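The punchline that mass drops out can be sketched in a few lines of Python.  This is my own illustration with invented numbers, not part of any class materials:

```python
import math

def max_speed(mu, radius, g=9.8):
    """Top speed around a flat curve.  Static friction supplies the
    centripetal force: mu * m * g = m * v**2 / r.  The mass cancels,
    leaving v_max = sqrt(mu * g * r)."""
    return math.sqrt(mu * g * radius)

# Invented values: a 30 m radius circle, mu = 0.7 for rubber on dry pavement
v = max_speed(0.7, 30)
print(round(v, 1), "m/s")   # about 14 m/s, roughly 32 mph
```

Notice that a mass argument never appears anywhere.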

Great.  But this is an abstract "imagine if" problem.  I can't take my students to a traffic circle for experimentation - the nearest one is 20 miles away, and is too busy for fooling around, anyway.  All I can do is suggest that the yellow suggested maximum speed signs don't include a mass variable - they say "max 25 mph", not "max speed in mph is 0.025 times the mass of your car in kg."  Interesting... but not experimental.

Well, look at the screenshot at the top of the post.  Peter took his drone to a traffic circle.  He drove the gray car around the circle at a speed that was always on the verge of slipping.  When he imported the video, he included tools to find angles around the circle, the radius of the circle, and a frame-by-frame timer.  I can't do this experiment, but Peter can. And did.

So for a homework problem later in the circular motion unit, I link the class to this Pivot Interactives video.  The site allows me to customize the assignment - the default is quite good, but I can add or eliminate questions and guidance.  I like a clean prompt such as "Determine the (maximum) coefficient of static friction between the car's tires and the ground."  The site allows students to input their solution and reasoning directly in the space provided; you can then scroll simply from one response to the next, awarding points if you'd like.  I prefer to have students answer on paper, but that feature seems to work as well (paper not provided by Vernier).
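The arithmetic behind that prompt is compact enough to sketch.  The measurement values below are hypothetical stand-ins of my own, not numbers taken from the actual video:

```python
import math

def mu_from_video(radius, angle_deg, dt, g=9.8):
    """Estimate the (maximum) coefficient of static friction for a car
    on the verge of slipping.  Speed comes from arc length over elapsed
    time; then mu * m * g = m * v**2 / r gives mu = v**2 / (g * r)."""
    v = radius * math.radians(angle_deg) / dt   # arc length / elapsed time
    return v ** 2 / (g * radius)

# Hypothetical readings: 12 m radius, 45 degrees swept in 1.1 s
print(round(mu_from_video(12, 45, 1.1), 2))   # roughly 0.62
```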

There's so much more that Pivot does.  I prefer the simple open-ended "determine this parameter" exercises.  But Pivot also has some modeling exercises, providing an easy graphing interface that allows students to make and linearize plots.  These multi-layered videos allow you to change multiple parameters.  The prompts guide students through what essentially is a complete lab exercise which you might never be able to do in your own classroom; or, a lab exercise you don't have the time to do in your classroom but can assign for homework.  I know that Vernier has gotten some serious physics teaching experts, including Kelly O'Shea, writing these exercises.  

And Peter is adding videos every time I look.  

Look, I know it's disappointing to have to pay for what had been available for free.  And I generally don't recommend paying for physics content on the internet.  Nevertheless.  

Vernier has hired the varsity for this project.  Everything I see on the site is something that makes me say "wow."  Pivot cannot replace a physics teacher doing active lab work in the classroom, because nothing can.  Using Pivot gets students as close as I believe it is possible to come to an online laboratory experience.  I highly recommend.

GCJ

(Note that Peter and Vernier have not paid me in any way for this endorsement.)

15 November 2017

Umpires, and Training Students to Grade Each Other's Daily Quizzes

The most effective tool I've ever discovered for 9th grade conceptual physics is not the daily quiz - it's the grading of the daily quiz.  Students trade papers, pick up the red pens I provide, and mark right or wrong as I go over each answer.  

The purpose here is pedagogical, not logistical.  I'm perfectly capable of rephrasing these questions as multiple choice for ease of grading; and it's not like I don't have time to grade these simple quizzes, anyway.  No, the reason we trade and grade daily quizzes is that students pay attention to each question threefold:  once when they take the quiz, once to figure out how to grade the quiz in front of them, and once to think about whether they themselves got each question right. 

It's essential, though, to establish an appropriate tone and ground rules in order for trade-and-grade to be successful.  Here are some principles that my class learns that make the daily quizzes work.

(1) We are a team.  Daily quiz performance is just like a football team's conditioning sprints.  Teams have fast and slow players.  Good teammates respect the effort and talent of those who are the fastest; good teammates encourage and respect those who are slower, too.  

The team atmosphere must be established from the first day of school.  The highest form of sin in my class is to denigrate a classmate for a wrong answer.  Even negative body language directed at another student - a teammate - is unacceptable.  

Because we are a team, we can depersonalize the grading process.  A grader is acting as an umpire for the team.  Umpires don't call someone out because they dislike the players; they call safe or out honestly, based on their judgment and a shared respect for the game.  A good team doesn't want a biased umpire, even if the bias is in their favor - they want to win or lose with integrity.  Those who play sports know that a tainted win is nothing to be proud of. 

(2) Don't look at or talk to the person grading your paper.  You have enough to focus on with the paper you are grading.  Trust your classmate to evaluate your work appropriately.  No one respects the player who continually tells the umpires how to do their job.

(3) Don't talk to the person whose paper you're grading.  Imagine an umpire who, presented with a close pitch, asks the catcher: "Hey, did you mean for that pitch to be over the outside corner?"  The batter would go berserk!  So students should never ask "hey, did you mean this line to be straight?"  If you can't tell, mark it wrong.  The burden is on the student writing the quiz to communicate clearly.  

(In my AP classes, I ask regularly: "Are you allowed to travel with me to Kansas City in order to accompany your exam from reader to reader explaining what you meant?  No?  So don't get in that habit now.")

(4) Ask for help on borderline calls.  The class is instructed to raise their hands and read me any answers they're not sure about.  Not generalities like "What if the student kinda hinted at the answer?"  I ask the student to read, verbatim, what is on the page.  I then make a call one way or the other, and we move on - just like an umpire.  

And if the student who wrote the answer tries to chime in what he meant, I politely say, "Since the answer needed clarification, it must be incomplete.  Please mark it wrong."

(5) Be transparent.  Just as a league doesn't assign the same umpire to the same team too many times, be sure you're mixing up occasionally who gets whose paper.  Have the graders write "graded by" and their name at the top of the page, and make it clear that accurate grading is a skill that you are assessing.  Ask students to place their own pencils and pens on the floor while grading goes on, so that the only writing utensil available is the red pen - that way, it's not possible to change an answer dishonestly.  

If you suspect someone is grading inappropriately to benefit (or hurt) a classmate, don't ignore it, but don't get preachy!  I've had small issues early on - students giving credit when they shouldn't, hoping to curry favor with a classmate.  Those were easily and definitively solved when I asked publicly but earnestly, "Luke, can you please show me why you marked this answer correct?  I think it's clearly wrong, but am I missing something?"  That's all it takes.

(6) Make instant replay available.  We have a standing rule - students may never, ever, argue or discuss with the person who graded their paper.  But they may circle in red a question they think was graded inappropriately, signaling me to take a look.  They can't talk to me, they can't tell me what they meant, they can just circle the question.  And, if the grading was indeed incorrect, I will change the score and let both student and grader know what happened and why.

Early on, usually I get a bunch of students circling their answers, hoping to argue for a better score.  That ends quickly when they can't talk to me to lawyer up.  If someone can't stop arguing and circling obviously wrong answers, I tell that student that I'll take off an additional point for each answer they challenge incorrectly.  (Kind of like how in the NFL a coach loses a time out if he incorrectly challenges a ref's call.)

Thing is, students are more careful graders than I am.  The process is quite fair - even more so than if I combed through all the quizzes myself.

The most important point to keep in mind, though, is that the grading process is not even about fairness.  The goals are twofold: to provide an arena in which students pay attention to physics facts, and to establish that physics has right and wrong answers.  

Within a week my class has stopped arguing with me about grades.  They know what my answer will be - it is the student's responsibility to communicate correct physics the first time, in writing.  They've seen that the entire class is held to the same standard, that no one can sycophantically beg for points after the fact.  

26 October 2017

Vertical motion simulator for changing g

Screenshot from the vertical motion simulator at computercow.net
One of my favorite early-season assignments asks how much higher or lower a ball will fly when g doubles.  Most students recognize, or guess, that doubling g leads to a halved maximum height.  To their chagrin, they also discover that doubling both the launch speed and the gravitational field does NOT lead to "canceled out" effects and the same maximum height.*

* ...because height goes as launch speed squared.
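Since maximum height is h = v0²/(2g), doubling g halves h, while doubling both v0 and g quadruples the numerator but only doubles the denominator.  A quick check of the ratios (my own sketch, not a class handout):

```python
def max_height(v0, g):
    """Peak height of a ball launched straight up: h = v0**2 / (2*g)."""
    return v0 ** 2 / (2 * g)

h = max_height(10, 9.8)                            # baseline case
print(round(max_height(10, 2 * 9.8) / h, 3))       # doubled g: ratio is 0.5
print(round(max_height(2 * 10, 2 * 9.8) / h, 3))   # both doubled: 2.0, not 1.0
```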

Even in the first weeks of class, my students expect experimental verification of mathematical predictions, especially counter-intuitive predictions.  But, I can't easily take my PASCO projectile launcher to a planet with gravitational field 2g.

And here, then, is a perfect place to insert an animated simulation.  I stay away from simulations because they're emphatically NOT real.  Physics is in the business of predicting how the natural world works, not how programmers make it seem to work.  Nevertheless, appropriate programmed simulations can be useful for giving students a feel for experiments that can't be quickly or easily set up in lab.  

Problem is: I've never found a free-fall simulation that allows me to change both g and the initial vertical speed with which an object is launched.  There are some wonderful dropped-ball simulators, and others that do a good job with projectiles.  But nothing purely vertical with a varying g and v0.  If you know of a free simulation that does what I want, please let me know in the comments.

(Barry Panas, the Official Humorist of the AP Physics Reading and master Manitoban physics teacher, is right now screaming at me the same way I scream at baseball commentators: The Interactive Physics platform will do exactly what I describe!  When I had IP installed on my old computers, I used to use it.  Barry taught me everything I know about using IP.  But, I don't have that program anymore.  I need something free and quick. Sorry, Barry.)

Good news - I have a student this year who's a programmer.  I explained what I was looking for; in a day, he had something basic but useful.  Take a look at this link. I don't exactly understand how the "pixel" button works, but it allows me to zoom in or out.

And so, after we did this problem, I projected this simulation in front of the class.  We doubled both g and v0, and wouldn't you know, the maximum height didn't remain the same.  Physics works.
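Under the hood, a simulator like this needs nothing fancier than stepping velocity and position forward in tiny time increments.  Here's a bare-bones sketch of the idea - my own, not Milo's actual code:

```python
def simulate_peak(v0, g, dt=1e-4):
    """March a vertically launched ball forward in small time steps
    until it stops rising; return the maximum height reached."""
    y, v = 0.0, v0
    while v > 0:
        y += v * dt      # advance position at the current velocity
        v -= g * dt      # gravity reduces the upward velocity
    return y

print(round(simulate_peak(10, 9.8), 2))    # close to v0**2/(2*g) = 5.1 m
print(round(simulate_peak(20, 19.6), 2))   # doubling v0 and g doubles the height
```

Shrinking dt pushes the numerical answer ever closer to the exact v0²/(2g).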

Disclosure - the link is to my son Milo's site; he is the student who programmed the simulation.

08 October 2017

Should universities award credit for AP Physics 1? Yes, a thousand times yes.

Gary writes with a big-picture question about teaching college physics at the high school level:

A college physics course would be either 4 or more likely 5 credits.  A history or language course is often a 3 credit college class.  Why do so many high schools treat AP Physics the same as the other AP courses?  It is difficult to explain to students, parents, school boards etc that the AP Physics will require much more time and effort than students are used to... I know that many universities are concerned about giving AP Credit especially to science majors.

Firstly, a university "concerned" about giving credit at some level for AP Physics 1 either hasn't done their research, or is operating from a set of assumptions about what "credit" means which is as out of touch with the reality of my students' experience as the Republican party is out of touch with contemporary musical theater.

The answer to Gary's first question is sort of sad, and often damning to the credibility of high school science courses.  Teaching rigorous high school physics is incredibly challenging, because we have to know our content backwards and forwards, we have to develop unique pedagogy... but we also have to navigate the most difficult political environment this side of Congress.  As far as I can tell from extensive anecdotal evidence and personal experience: 

1. Most high school administrations have no clue about the comparative difficulty of the AP courses.  They understand *that* lab exists, but they don't have a real clue why or how laboratory work should be done.  They see their students write baloney with big words in English class, and earn passing scores; but then the same technique earns a 1 on the AP physics exam because physics can't be finessed.

2. Sometimes when the administration does recognize that physics is a different animal, it's not politically possible to treat physics differently.  Let's say they give AP physics more class time, or more academic credit, or a smaller student-teacher ratio, or they give an AP physics teacher an extra planning period to set up laboratory work... there WILL be complaints about "fairness" from other AP teachers.  Not just administrators, but lots of other teachers think that school should be about reading a book, remembering what's in it, then BSing the way through a discussion and a paper.  Why should physics be any different?  After all, they hated physics.

But what I don't understand is why any of that should impact universities awarding credit for good scores on the AP Physics 1 exam.  

The AP Physics 1 (and AP Physics 2) exam is outstanding and challenging - much more challenging than a typical college physics exam.  A student who earns a 5 has not only shown extraordinary mastery of the requisite concepts and skills, but also the aptitude to handle any further physics topic you can throw at her or him.  A student who earns a 3 has not just thrown dung at the exam hoping some would stick; such a student has shown considerable though incomplete mastery of the topic.

As a rule, scores from the old AP Physics B exam have shifted down one number.  Students who get 4s on AP Physics 1 would have gotten 5s on AP Physics B.  Most importantly, many of those who used to pass (i.e. earn at least a 3 on) the old Physics B exam do NOT pass the Physics 1 exam.   Why?  Because a mathematically talented student without physics knowledge could, on a Physics B exam,* plug random numbers into random equations and earn enough credit to pass.  Not so in AP Physics 1.  There are no "pity points" of any sort.  If you pass, you understand.  

* and in most college courses - that's the university level's dirty secret

In my mind, universities should be actively seeking out students with passing AP Physics 1 scores in order to give them credit and woo them to their school.  They are well prepared: perhaps for a physics major eventually, definitely for any rigorous quantitative work at the university level.  And the ones with 4s and 5s - department chairs should be recruiting them as they would a quarterback with NFL potential.


07 September 2017

How should a physics teacher use a "learning management system"?

A reader indicated that he's being pushed by his administration toward using the learning management system Canvas, especially for paperless testing.  Where do I stand, he asked?

My school adopted Canvas about four years ago.  While I'm not personally a fan, I recognize and acknowledge the major benefit to the student, especially the student at a day school.  All assigned work is clearly indicated on and accessible from a simple-to-use calendar.  Student misses a day of class - no worries, assignments are on Canvas.  Student is disorganized and loses assignment sheet, or forgets where to look online for my assignment - no worries, all on the Canvas calendar.  

My concern is partly that we are doing a large amount of organization for the student, rather than the student internalizing the organization skills for him- or herself; my concern is partly that significant technical hurdles to uploading and downloading assignments too often get in the way of teaching.  Nevertheless, in these matters I yield my own judgment to the consensus of our faculty.  Canvas has in general been a positive development for us.

I do not yield my judgment about physics teaching.  Physics teaching is different from teaching other subjects, and way too many people don't recognize that.  

It sounds like this reader's administration isn't considering how physics is done.  Yes, in some classes at my school, essentially all assignments and tests are handed out and submitted through Canvas.  That works fine for multiple choice, for pure text in an English or history paper, for straight-up numerical responses.  

As this reader indicated, though, physics demands communication in writing.  A well-presented problem consists of words, diagrams, equations, and numbers.  Annotated calculations or derivations often include circles and arrows and, well, handwriting techniques that I can't well describe or draw in a blog post.  I'd need hard copy.  

And so, when I create assignments in Canvas, I simply attach a file consisting of the problem set questions.  The actual assignment submission is always on unlined paper.  Canvas is still an important and useful tool, though, ensuring that everyone has the assignments available electronically in calendar format.

It is NOT POSSIBLE to appropriately test physics students using Canvas, or using any computer input.  Physics assignments require paper.  They require pen or pencil, and sometimes ruler and protractor, graphs, the ability to create or annotate diagrams, to draw and refer to pictures... 

If administrative fiat demands that you use Canvas for the multiple choice portion of your tests, so be it... I mean, multiple choice is multiple choice.  But for the free response homework and tests, I encourage the entire physics teaching community to continue to require hard copy.*  To do otherwise, I think, is professional malpractice for a physics teacher.

* Okay, okay, if you have a seamlessly working tablet with a stylus for everyone, perhaps that could work, 'cause that's the same input method as paper.

When the administration similarly requires computer tests and projects in your Studio Art class, then perhaps I'd reconsider.  Perhaps.  :-)

03 September 2017

Can I go to the bathroom?

Of course.  In the future, please don't ask... 

Just place your phone on your desk and go.


23 August 2017

What does good learning look like?

Our faculty has been discussing the excellent book What the Best College Teachers Do.  Until today, we've been focused on what good teaching looks like.

The focus shifted today.  We were asked to write, "what does good learning at our school look like?"  Here's my response, written for an audience of teachers, but not science teachers.  

I speak to science in particular.  Yet, my hope, and the hope of the science department, is that an evidence-based model of learning becomes an innate part of a student’s personality, something that goes on without conscious thought.

In pursuit of the goal, one activity that I’ve done with all ages of student aims to get students to evaluate pseudoscientific claims that they have likely heard repeated as gospel.  Everyone is given a “quiz” on which they mark a number of statements true or false.  Next, they are asked to research in depth a statement they marked as true, treating the claim as scientific.  A scientific claim is false by default – it only acquires the status of “true” once diverse and compelling evidence supporting the claim can be produced.  So, students are charged to lay out for me, as a representative of the scientific community, evidence supporting the claim; or, conversely, to acknowledge that the claim is false and to give me background explaining why some folks might think it true.

For the purpose of describing great learning at our particular boys' boarding school, I’ll use one of the claims from this activity:

Human women have one more rib than do human men.

(1) Great learning begins with an interesting, yet answerable and in-scope, question.  

It’s the teacher’s job to help present and refine relevant questions, especially for younger students – “how can I build a working nuclear missile” is interesting and perhaps relevant, but out-of-scope for most of our classes.

How do I know this particular question about ribs is interesting?  I observe students marking it both true and false; I observe spontaneous initial discussions among classmates explaining their thoughts.  They seem to care about the answer.

(2) Great learning means searching not for the direct answer to the question, but rather for evidence with which to answer the question.  

Note that the research portion of this project doesn’t say just “find an expert who tells us the answer.”  No, students are to look for evidence.  It’s important that I don’t shame the students who marked this true by telling them, as the authority figure, that they’re wrong.  It’s also important that they don’t just quote their 1st grade Sunday School teacher as the authority.  They must find their own evidence.  

In researching the rib question, students have found online pictures of x-rays.  They’ve discussed it with the school trainer.  One student asked a young lady if he might please feel her ribs.  All’s fair in the pursuit of science, I say.

(3) Great learning continues by evaluating the quality of evidence.  

For many questions, students find that the top Google hit is Big Bob’s Website of Dubious Rigor.  I usually don’t have to explain much to students, even 9th graders, about what a reliable or unreliable website looks like.  That they have to present quality evidence to me in person keeps most ridiculousness away.  Because they are asked for evidence, not authority, they stay away from slick, shallow, yet plausible-sounding sites.  Because they have to present in person to me, they are careful to be intellectually honest.  It’s one thing to use Sean-Spicer-logic on dorm or on Twitter; it’s another thing to state clearly disingenuous baloney to a science teacher in the front of the classroom.

When a student does have a legitimate misunderstanding of a claim, I argue about quality of evidence, not about the conclusion.  Students take it personally if I question, in this case, their adherence to the literal truth of the bible.  Yet, they respect and acknowledge my request for more diverse evidence: “A scientific claim requires an abundance of clear evidence from multiple sources before we say it’s true. What sort of evidence could you find that would convince not Big Bob or you, but that would convince me and other scientists?”

(4) Great learning requires a continual re-evaluation of one’s model of how the world works.

Virtually all of the claims on the quiz are false; but generally students mark about half of them true.  The beauty of this exercise is that the students have confronted their misconceptions for themselves, in front of their peers.  I hear them telling classmates what they’ve discovered: “Yeah, my dad said that women have more ribs, but I saw the x-rays.  It’s not true.”  “Yeah, my cousin used to campaign against vaccines; but they don’t cause autism, that’s been debunked.”

Overhearing those conversations is a piece of evidence I use to see that this exercise has produced great learning.

17 August 2017

Mail Time: What if I have to miss the first week or two of school?

A reader will be unavoidably absent until the second week of school.  The question to me: What would I do with an AP Physics 1 class, knowing that the sub is a random adult rather than a physics teacher?

Of all the times you'd have to miss.  Guh, not the first week(s) of school.  This is when physics students most need your guidance.

I don't have an easy answer for this one.  I'd really rather just start school two weeks late than have a random sub for a week.  It's too easy for bad habits to get ingrained, for them to get a false sense of what physics is. 

But I'll bet your school isn't about to cancel physics class for two weeks.  My only suggestion would be to do unrelated enrichment work for a week or two, and then start the course as normal when you return.  See, I think many of us take time out during the school year, or perhaps after the AP exam, to do some one-off activities: a bridge building contest, research about the history of science, etc.

It's not ideal... but you could move some one-off activities to the start of the year.  It's important to choose activities that are not directly related to physics content, I think, so you don't ingrain misconceptions.  So those projects about motion picture physics, or making a video about a physics concept - I don't recommend at year's beginning.  I offer two options that I've done in the past that might be useful here.  I suspect this post's comment section might provide even better ideas.

Option 1: Here is a link to a "pseudoscience" activity I've often done at year's end... they take it as a quiz, then they choose one or two things they marked true to investigate.  In science, all claims are false unless clear evidence is presented to convince an audience they are true. This activity can be done with minimal supervision; the students will discuss well with each other.  Give a few suggested sites to jump-start student research, such as Snopes, The Straight Dope, and the Skeptical Inquirer.  

Option 2: Pose some astronomy questions, and have students investigate them using some online tools.  I like the Regents earth science astronomy questions, paired with a University of Nebraska set of simulations.  Just pick some of the questions from recent tests about observational astronomy, the motion of the sun and moon, or the phases of the moon; then ask students to teach themselves the underlying geometry using the simulations.  If you email me, I can forward you a few assignments I've given using this method.

Other than that, I don't know.  I seriously don't recommend lab work or math review.  I don't recommend any physics content at all until you're there.  I'd love to hear good ideas in the comments.  Best of luck.


26 July 2017

Methods of in-class collaborative work: 421

This weekend I attended a workshop given by Kelly O'Shea and Danny Doucette.  They showed us their outstanding approach to lab practicals, which they assign as group tests.  

The discussion in the room at several points turned to balancing the group / individual dynamic in the classroom.  On one hand, physics is a collaborative endeavor. Cooperation and communication are skills which we must teach and assess.  On the other hand, we are teaching result-obsessed teenagers, who default to letting the (perceived) smart kids do all the work, probably while making fun of them behind their back.  

If we're going to encourage, let alone require, cooperative work in physics class, we must incentivize appropriate collaboration.  Remember, incentives can and should take forms other than mere grades.  Although others have found success in assigning a direct grade for the quality of participation in group work, I have not; I find students spend more time gaming the grade than actually collaborating.

My personal approach to encouraging effective collaboration is enforcement of the five-foot rule.  As always, my way is not the only way.  Another workshop attendee -- I dearly wish I remembered who -- mentioned an extraordinarily clever approach to evaluated group work, one that I'd like to try.

He called it the 421 method.  The laboratory exercise or problem to be solved is presented to the class, and then the class is divided randomly into groups of four.  Then, work proceeds in three stages, with clear time limits assigned to each.  (Yes, stages are numbered strangely.  You'll see.)

Stage 4: Discussion.  Each assigned group of four may discuss the problem together; but they may not write anything down.  No pen, no whiteboard, nothing.

Stage 2: Representation.  The groups are subdivided into pairs.  Each pair may communicate orally and using a whiteboard.  However, they may only write representations - no numbers or words.  This means they can use equations, free-body diagrams, energy bar charts, etc.  

Stage 1: Solution.  Now students separate to use pen and paper.  They are assigned to write a thorough response, including representations, numbers and words.  This is turned in for evaluation.

People in the workshop asked, do you evaluate the group work?  Thing is, by evaluating the individual solution in this case, you are evaluating the group work!  If the students were effectively working together, communicating clearly with one another, pooling their talents well, then necessarily the product should be that each individual student can communicate by himself or herself.  The student who held back from the group, who didn't actively participate, won't have the benefit of the four folks working together.  

This method does require that you assign lab exercises or problems that are beyond the simplistic.  AP-level questions are good here, or a simpler version of Kelly's group test-style lab practicals could work in this style.  If the whole approach to the problem is immediately obvious to more than one or two students in your class, there's little incentive for high-level students to converse in stages 4 or 2.

I'll need to experiment to figure out the precise level of difficulty for this approach.  Nevertheless, I love the idea.  Let me know if/how it works for you.

24 July 2017

Ask for an answer LAST

Greetings from the American Association of Physics Teachers meeting in Cincinnati.  The exhibit hall opened last night with self-serve all-you-can-eat Skyline chili.  I have ascended to my eternal reward.

Since I arrived, I've done a wee bit more than eat chili and tour Great American Ballpark.  Yesterday I attended the High School Teacher Camp, organized by Kelly O'Shea and Martha Lietz.  We spent the day talking shop and meeting colleagues from around the country.  The keynote address was given by Kathy Harper, who discussed student perceptions of "mistakes" in physics class and how to channel those perceptions in a positive direction.

I've got a *lot* of notes on my phone which will inspire future posts.  For now, I'm going to relay an idea from Martha about her revised approach to AP Physics 1 justification problems.

I've written before about issues teaching students to articulate their reasoning on semi-quantitative or conceptual questions.  In sum: English class, history class, geometry class, and Fox News have taught students to begin arguments by picking a conclusion; then, to construct quasi-logical arguments twisting evidence to support that result, truth be danged.  Students are not used to the idea of beginning with the logical evidence, and then dispassionately asking what conclusion should be drawn from that evidence.

Of course, getting students even to articulate a quasi-logical chain of evidence is a tough challenge in physics class.  Come on, teacher, you know the answer, I (think I) know the answer, so if I'm right why do I need to say any more?  To break this first barrier, Martha had been an advocate of the "Claim-Evidence-Reasoning" approach to justifications.  For each problem, she would give space for the student's claim, i.e. their answer; for the student to write evidence from the problem statement or experiment; and then for the student to link the evidence to the claim through verbal and mathematical reasoning.  She required every student to address every element on every problem set.  

And it worked, sort of - Martha's students were willing to articulate their reasoning using words and equations.  Great.

But it was obvious to Martha that many students were merely guessing at the right answer, then cherry-picking evidence and reasoning that could support that original guess.  I've seen this intellectual stubbornness as well.  I don't know why people's brains have so much trouble adapting to new evidence.  I just know that they do.  Once a student decides that the answer is choice C, it takes an actual invasion by the Red Army to convince him that maybe the evidence points to choice D instead.

So, Martha suggested... why not ask for an answer LAST?

She subverted the paradigm to Evidence-Reasoning-Conclusion.  After the problem statement comes space for students to write evidence: facts, equations, and information relevant to the situation.  Then comes space for the reasoning: use logical connections to explain where the evidence points.  And finally, at the end, the conclusion: that is, the answer.  

Because the answer comes last - because students are not asked to commit to a conclusion before examining the evidence - students actually, well, examine the evidence.  They stop contorting their logic into pretzels to prove themselves right, and they start doing physics like physicists.  Martha no longer has to suggest that their answers might be more likely to be correct if they'd use physics.

How am I using this?  I intend to rewrite some of my problem sets, especially in conceptual physics, making just one small change.  Problems have previously looked like this:

[Problem statement blah blah blah]

Answer:____

Justification:









But I'm going to take a cue from Martha, and rewrite this way:

[Problem statement blah blah blah]

Justification:








                                                                                                                        Answer:___


Let me know what thoughts you have, including whether this approach does or doesn't work for you.