I believe the common thread between these 3 anecdotes is an assumption I made when grading students: that problem-solving necessarily involves analysis rooted in a sound understanding of the underlying principles. A decade of anecdotes and observations has challenged this assumption!
Are my students competent?
The competencies targeted by the 3 mandatory physics courses in the Science program all fall under the analysis level of Bloom’s taxonomy (“Analyze physical situations and phenomena using the fundamental laws and principles of [a given area of physics].”). But do students actually analyze when they solve physics problems? The challenge lies in the fact that cognitive processes happen inside the mind, and we can’t truly know a person’s reasoning just by looking at what they’ve written on paper. I’ve identified at least 2 reasons for doubting, at times, that students are genuinely engaging in analysis:
- As my fellow physics teachers have likely noticed, conceptual questions, like in anecdote #3, tend to receive lower scores than problem-solving questions on exams. (And students in physics courses often perform poorly on concept inventory tests such as the Force Concept Inventory.)
However, according to Bloom’s taxonomy, comprehension sits at a lower cognitive level than analysis: analysis builds on comprehension (along with knowledge and application) to go further. Consequently, comprehension questions (those that simply ask students to explain a concept without having to apply it to a specific problem) should logically be easier than problem-solving questions.
- My colleagues will also confirm that many students forget a large part of the material assessed just a few weeks after their exam. Yet, the higher the cognitive level on Bloom’s taxonomy, the better the retention should be. Analytical skills shouldn’t simply disappear over the Christmas break!
In physics, the typical learning sequence is as follows:
- Students familiarize themselves with the theoretical content (principles, concepts, variables, equations, etc.) through teacher explanations, demonstrations, readings, and so on.
- Students study examples of solved problems.
- Students work on problem-solving exercises themselves.
- Students review the answers (or solutions) to the problems they have completed.
If a student gets an exercise wrong, they may react in different ways:
- They might notice, for example, that their answer is half the expected result and assume they probably missed multiplying something by 2 somewhere, without further questioning.
This reflects a beginner’s metacognitive attitude; the student limits their analysis to the level of the answer itself.
- They might wonder if they misunderstood the question.
The student goes back to the exercises.
- They might question what makes this exercise different from the example on the same topic that was shown to them.
The student goes back to the examples.
- They might wonder if there is a concept they didn’t fully understand.
The student goes back to the theory. This reflects an expert metacognitive attitude, one that fosters the development of analytical skills.
I’m convinced that all students would benefit from questioning their understanding of concepts more often. Unfortunately, most of them seem to see a clear divide between the concepts seen in class and the problems they have to solve. I’ve observed that students often adopt a memorization-based study strategy: they learn by heart the “right methodology” for every possible type of problem and try to reproduce it during exams. Some spend countless hours redoing the same exercises 3 times to ensure they have memorized every variation of the given situations!
As a teacher, I’ve always found this situation frustrating, since students don’t seem to recognize the importance of basing their problem-solving on well-mastered physical principles. It’s easy to blame the student for an inadequate learning approach, but discussions with students have forced me to admit 2 things:
- This study strategy enabled them to achieve excellent results in high school (and for many, also in CEGEP), so why change it?
- I need to reflect on the message I’m sending through the exam preparation work I assign. When I suggest a list of 120 textbook exercises for practice and the only feedback students receive is the numerical answer, it’s not surprising that they conclude that the goal is simply to calculate the correct answer as quickly as possible.
Shifting from a teaching approach to a learning approach
A course I took at the Université de Sherbrooke with Professor Judith Cantin helped me realize that early in their careers, teachers tend to focus mainly on the courses they are teaching. It’s only over time that they start to take an interest in the course as experienced by their students. In other words, they shift from a teaching approach to a learning approach.
For a long time, I believed that my role as a teacher was to provide students with the building blocks they needed to construct the glorious “edifice of knowledge”, and I did exactly that. However, over time, I realized that this famous edifice often ended up being more of a shack that barely stood up after one semester.
To help my students develop a deeper understanding of the central concepts of the course, I therefore started incorporating ConcepTest activities into my class. I present them with a multiple-choice conceptual question and have them vote on the answer using Wooclap. They then have 5 minutes to discuss their answers with one another before voting again. Finally, we review the correct answer as a group. (This kind of activity with ConcepTest questions has been shown to have a significant impact on the results of the Force Concept Inventory test.)

Diagram of the peer-instruction method that I use with ConcepTest conceptual questions. This method was developed by Eric Mazur at Harvard University and was mentioned by Luc Tremblay, a physics teacher at Mérici Collégial Privé, who was using it in his courses in 2009 [in French].
However, this method doesn’t solve the problem of the barrier students perceive between conceptual questions and problem-solving. ConcepTest questions can help improve students’ understanding of concepts covered in class, but they fall short when it comes to developing their analytical skills.
From the students’ perspective, improving their ability to answer conceptual questions asked in exams doesn’t actually help them solve the problems presented to them in those same exams.
I therefore decided to focus on the exams themselves…
The need to rethink the evaluation process
In its 2018 report Evaluating So It Truly Counts [PDF], the Conseil supérieur de l’éducation (CSE) identified 2 broad goals of evaluation:
- to certify achievement
- to support learning
As I reflected on my own practices, I began to question whether the written exams I was using truly certified what my students had achieved. Nor did I believe that they were actually supporting learning.
Following the recommendations of the CSE report, and aiming to better achieve these 2 goals, I wanted to make my evaluations more authentic.
The point of school is not to get good at school.
— Grant Wiggins
Grant Wiggins identified 5 characteristics of authentic assessment:
- focus on essential (real-world) tasks
- allow access to internal and external resources
- do not rely on time constraints
- promote deep learning
- include a collaborative dimension
However, meeting the 1st criterion isn’t easy for me. Coming up with truly authentic problem situations in pre-university physics is a real challenge. The situations found in textbooks are often oversimplified. While banks of authentic problems do exist, they usually contain a large amount of contextual information and sometimes require students to do their own research to fill in missing details, which doesn’t work well within a traditional exam context.
Instead of trying to invent authentic problems, I therefore decided to make the evaluation context itself more authentic.
In addition to meeting the other 4 criteria identified by Wiggins, I also took into account Viau’s motivational dynamic model [in French]. I wanted my students to clearly understand:
- the value of the activity
- their own competence in completing the task
- their control over their own performance
The solution I found was to replace written exams with individual oral interviews.
How individual oral interviews work
Before the oral interviews
First, students define their own learning goal. Those who aim for the highest grade possible will naturally have more work to do.
- Fundamental exercises are essential to passing the course and must be completed and mastered by all students to demonstrate achievement of the course competency.
Students who complete the fundamental exercises only can obtain a maximum grade of 75%.
- In-depth exercises build on the course content and must be mastered by students who wish to demonstrate a deeper level of competency.
Students who complete the fundamental and in-depth exercises can obtain a maximum grade of 85%.
- Integration exercises connect the most abstract concepts of the course and must be mastered by students who want to stand out by demonstrating an exceptional mastery of the course competency.
Students who complete all 3 levels of exercises (fundamental, in-depth, and integration) can obtain up to 100% in the course.
Students must complete all the exercises up to their targeted level before their oral interview.