February 24, 2026

Individual Oral Interviews in Science: An Exam Format That Truly Evaluates the Targeted Competency

This article is a translation of a text published on Eductive’s French website.

When I grade written exams, I often wonder whether my students have truly developed the targeted competency or simply memorized information and procedures to get the right answers. Hoping to evaluate their skills more accurately, I began conducting individual oral interviews in Winter 2024. I am very pleased with the results! I believe this approach has encouraged my students to engage in deeper learning, and it has enabled me to evaluate them more accurately. As a bonus, it has also revitalized the classroom dynamic and reduced anxiety within my groups. Here’s how I did it.

Triggers: 3 anecdotes (among others)

The reflection that led me to conduct oral exams was shaped by countless events throughout my career. Here are 3 examples.

Anecdote #1: a top-of-the-class student who doesn’t understand

I remember a student who had excellent grades on all the evaluations but told me at the end of the semester that she didn’t feel she truly understood the course content. Yet, she had just scored 97% on the final exam.

Anecdote #2: patterns and luck

I was teaching in an active learning classroom at one point and noticed that a team of 3 students had a minor mistake in their written process on their whiteboard. When they raised their hands, I thought it would only take a few seconds to help them understand their mistake. However, to my great surprise (and disappointment), I realized that they didn’t understand anything! They hadn’t mastered the basic concepts underlying the problem. They had produced an almost perfect procedure by reproducing patterns they didn’t understand and, with a bit of luck, plugging numerical values into equations whose meaning was completely foreign to them. Yet, if this had been an exam, I would have given them an almost perfect grade …

Anecdote #3: 1 concept; 2 outcomes

In a written exam, I asked students to explain the photoelectric effect, a phenomenon we had studied extensively in class, both in theory and in the lab. The answers were disastrous! While grading the exams, I feared the worst, because I knew I had asked another question about the same phenomenon a few pages later. In this question, however, I was not asking the students to explain the phenomenon, but to solve a problem related to the photoelectric effect using the appropriate mathematical equations. Guess what! Most students did very well on that problem.
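To make the contrast concrete (the article does not reproduce the actual exam problem, so this is only an illustrative sketch), a typical photoelectric-effect calculation boils down to substituting given values into Einstein’s relation

$$K_{\max} = hf - W_0$$

where $h$ is Planck’s constant, $f$ is the frequency of the incident light, and $W_0$ is the work function of the metal. A student can compute $K_{\max}$ correctly from the numbers provided without being able to explain why light below the threshold frequency ejects no electrons at all.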

I believe the common thread between these 3 anecdotes is that one of my assumptions when grading students was that problem-solving necessarily involves analysis rooted in a sound understanding of the underlying principles. A decade of anecdotes and observations challenged this assumption!

Are my students competent?

The competencies targeted by the 3 mandatory physics courses in the Science program all fall under the analysis level of Bloom’s taxonomy (“Analyze physical situations and phenomena using the fundamental laws and principles of [a given area of physics].”). But do students actually analyze when they solve physics problems? The challenge lies in the fact that cognitive processes happen inside the mind, and we can’t truly know a person’s reasoning just by looking at what they’ve written on paper. I’ve identified at least 2 reasons for doubting, at times, that students are genuinely engaging in analysis:

  • As my fellow physics teachers have likely noticed, conceptual questions, like in anecdote #3, tend to receive lower scores than problem-solving questions on exams. (And students in physics courses often perform poorly on concept inventory tests such as the Force Concept Inventory.)
    However, according to Bloom’s taxonomy, comprehension sits at a lower cognitive level than analysis: analysis builds on comprehension (along with knowledge and application) to go further. Consequently, comprehension questions (those that simply ask students to explain a concept without applying it to a specific problem) should logically be easier than problem-solving questions.
  • My colleagues will also confirm that many students forget a large part of the material assessed just a few weeks after their exam. Yet, the higher the cognitive level on Bloom’s taxonomy, the better the retention should be. Analytical skills shouldn’t simply disappear over the Christmas break!

In physics, the typical learning sequence is as follows:

  1. Students familiarize themselves with the theoretical content (principles, concepts, variables, equations, etc.) through teacher explanations, demonstrations, readings, and so on.
  2. Students study examples of solved problems.
  3. Students work on problem-solving exercises themselves.
  4. Students review the answers (or solutions) to the problems they have completed.

If a student gets an exercise wrong, they may react in different ways:

  • They might notice, for example, that their answer is half the expected result and assume they probably missed multiplying something by 2 somewhere, without further questioning.
    This reflects a beginner’s metacognitive attitude; the student limits their analysis to the level of the answer itself.
  • They might wonder if they misunderstood the question.
    The student goes back to the exercises.
  • They might question what makes this exercise different from the example on the same topic that was shown to them.
    The student goes back to the examples.
  • They might wonder if there is a concept they didn’t fully understand.
    The student goes back to the theory. This reflects an expert metacognitive attitude, one that fosters the development of analytical skills.

I’m convinced that all students would benefit from questioning their understanding of concepts more often. Unfortunately, most of them seem to see a clear divide between the concepts seen in class and the problems they have to solve. I’ve observed that students often adopt a memorization-based study strategy: they learn by heart the “right methodology” for every possible type of problem and try to reproduce it during exams. Some spend countless hours redoing the same exercises 3 times to ensure they have memorized every variation of the given situations!

As a teacher, I’ve always found this situation frustrating, since students don’t seem to recognize the importance of basing their problem-solving on well-mastered physical principles. It’s easy to blame the student for an inadequate learning approach, but discussions with students have forced me to admit 2 things:

  1. This study strategy enabled them to achieve excellent results in high school (and for many, also in CEGEP), so why change it?
  2. I need to reflect on the message I’m sending through the exam preparation work I assign. When I suggest a list of 120 textbook exercises for practice and the only feedback students receive is the numerical answer, it’s not surprising that they conclude that the goal is simply to calculate the correct answer as quickly as possible.

Shifting from a teaching approach to a learning approach

A course I took at the Université de Sherbrooke with Professor Judith Cantin helped me realize that early in their careers, teachers tend to focus mainly on the courses they are teaching. It’s only over time that they start to take an interest in the course as experienced by their students. In other words, they shift from a teaching approach to a learning approach.

For a long time, I believed that my role as a teacher was to provide students with the building blocks they needed to construct the glorious “edifice of knowledge”, and I did exactly that. However, over time, I realized that this famous edifice often ended up being more of a shack that barely stood up after one semester.

To help my students develop a deeper understanding of the central concepts of the course, I therefore started incorporating ConcepTest activities into my class. I present them with a multiple-choice conceptual question and have them vote on the answer using Wooclap. They then have 5 minutes to discuss their answers with one another before voting again. Finally, we review the correct answer as a group. (This kind of activity with ConcepTest questions has been shown to have a significant impact on the results of the Force Concept Inventory test.)

Diagram showing the peer-instruction method that I use with ConcepTest conceptual questions. This method was developed by Eric Mazur at Harvard University and was mentioned by Luc Tremblay, a physics teacher at Mérici Collégial Privé, who was using it in his courses in 2009 [in French].

However, this method doesn’t solve the problem of the students’ barrier between conceptual questions and problem-solving. ConcepTest questions can help improve students’ understanding of concepts covered in class, but they fall short when it comes to developing their analytical skills.

From the students’ perspective, improving their ability to answer conceptual questions asked in exams doesn’t actually help them solve the problems presented to them in those same exams.

I therefore decided to focus on the exams themselves …

The need to rethink the evaluation process

In its 2018 report Evaluating So It Truly Counts [PDF], the Conseil supérieur de l’éducation (CSE) identified 2 broad goals of evaluation:

  • to certify achievement
  • to support learning

As I reflected on my own practices, I began to question whether the written exams I was using truly certified what my students had achieved. Nor did I believe they were genuinely supporting learning.

Following the recommendations of the CSE report, and aiming to better achieve these 2 goals, I wanted to make my evaluations more authentic.

The point of school is not to get good at school.

Grant Wiggins

Grant Wiggins identified 5 characteristics of authentic assessment:

  1. focus on essential (real-world) tasks
  2. allow access to internal and external resources
  3. do not rely on time constraints
  4. promote deep learning
  5. include a collaborative dimension

However, meeting the 1st criterion isn’t easy for me. Coming up with truly authentic problem situations in pre-university physics is a real challenge. The situations found in textbooks are often oversimplified. While banks of authentic problems do exist, they usually contain a large amount of contextual information and sometimes require students to do their own research to fill in missing details, which doesn’t work well within a traditional exam context.

Instead of trying to invent authentic problems, I therefore decided to make the evaluation context itself more authentic.

In addition to meeting the other 4 criteria identified by Wiggins, I also took into account Viau’s motivational dynamic model [in French]. I wanted my students to clearly understand:

  • the value of the activity
  • their own competence in completing the task
  • their control over their own performance

The solution I found was to replace written exams with individual oral interviews.

How individual oral interviews work

Before the oral interviews

First, students define their own learning goal. Those who aim for the highest grade possible will naturally have more work to do.

  • Fundamental exercises are essential to passing the course and must be completed and mastered by all students to demonstrate achievement of the course competency.
    Students who complete the fundamental exercises only can obtain a maximum grade of 75%.
  • In-depth exercises build on the course content and must be mastered by students who wish to demonstrate a deeper level of competency.
    Students who complete the fundamental and in-depth exercises can obtain a maximum grade of 85%.
  • Integration exercises connect the most abstract concepts of the course and must be mastered by students who want to stand out by demonstrating an exceptional mastery of the course competency.
    Students who complete all 3 levels of exercises (fundamental, in-depth, and integration) can obtain up to 100% in the course.

Students must complete all the exercises up to their targeted level before their oral interview.

Allowing students to choose the level of “performance” they wish to aim for meets Wiggins’ criterion of no time constraints. Furthermore, it gives students a sense of control over their learning, aligning with Viau’s model. To complete the exercises, students can:

  • work in teams
  • come to me with questions
  • use the Internet and artificial intelligence

Here, I’m addressing 2 more of Wiggins’ criteria: a collaborative dimension and access to resources. In short, students can do whatever they find helpful to solve (and truly understand!) the problems.

The week before the oral interviews, I send students a link via Teams to the Calendly website so they can book a time slot for their interview. Each student books a slot by entering their name, email address, and the level at which they wish to be assessed (fundamental, in-depth, or integration).

Exam logistics

Oral exams take place twice per semester in Mechanics and in Electricity and Magnetism, and 3 times in Waves and Modern Physics. I also conduct a formative exam earlier in the semester to give students a chance to become comfortable with the idea of an oral exam. For the formative exam, students generally come in groups of 2 or 3 and choose an exercise they feel confident explaining. We have a short discussion about the problem, after which I provide feedback based on the evaluation grid. Students really appreciate having the opportunity to see what the oral exam would be like, and it was beneficial practice for me as well!

When I conduct the individual oral interviews, I cancel classes for the week to free up as much time as possible for the exams. Reservations are handled on a first-come, first-served basis. Students receive a confirmation email, which allows them to cancel and reschedule if necessary.

Example schedule for a week of interviews with 82 students [in French]

During the oral interviews

Students must bring their workbook (or tablet) to the interview to show me their work (which helps them recognize the value of the exercises they’ve completed, consistent with Viau’s model). Their work must be complete and clearly presented.

Students are expected to arrive 15 minutes prior to their scheduled time. 10 minutes before the interview starts, I tell them the 2 or 3 problems that will be discussed, allowing them to refresh their memory. They can reread the questions, check their solutions in their workbook, and review their steps.

I always greet each student with a warm smile and ask them a bit about how their preparation went, which helps reduce stress. I then ask the student to explain not only how they solved a given problem but also, more importantly, why they chose a particular equation or applied a specific concept. I might also ask how they would have adjusted their problem-solving if some elements had been different (for example, what would change in the problem if the electric charge were positive instead of negative).

Each interview lasts about 15 minutes. The discussions take place in a conversational, supportive manner. When a student goes off track in their explanation, I try to give them a chance to correct their mistake on their own, offering minimal guidance. If they cannot do so, I tell them they are mistaken and offer a direction for reflection, to see whether they can then correct themselves or whether the concept is genuinely not mastered.

After each interview, I review the student’s workbook to check that the problems have been completed and to evaluate their problem-solving process (diagrams, mathematical formalism). The fact that students apply themselves in explaining their steps in their workbook encourages deep learning, aligning with one of Wiggins’ criteria.

After the oral interview

My evaluation focuses on both the workbook and the interview.

In practice, after each interview, I ask the student to wait outside the room and leave their workbook with me. I then bring in the next student, whose interview will start 10 minutes later, to show them which specific problems their exam will cover. During those 10 minutes, I evaluate the previous student’s workbook and write feedback.
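As a rough back-of-the-envelope estimate (the exact total isn’t given here), each student therefore occupies a cycle of about 25 minutes: a 15-minute interview plus the 10 minutes spent grading the workbook while the next student previews their problems. For 82 students, that amounts to

$$82 \times 25\ \text{min} \approx 2050\ \text{min} \approx 34\ \text{h}$$

which is consistent with freeing up an entire week of class time for the interviews.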

Example of anonymized feedback for a student who aimed for a maximum grade of 85% (in-depth level) and achieved 65%

Student 4 (in-depth level) 65%

  • Problem 2.9.5: C
  • Problem 1.13.9: C
  • Approach: A
  • Diagrams: C

Hi Student 4,

Here is your feedback for the oral interview. You explained the problem of hyperopia well by correctly identifying that the light rays converge behind the retina in an eye that is too short. However, you needed some help understanding that a person with this condition can still see distant objects by straining their eyes, and that glasses are prescribed to eliminate the need for that effort. You also had some difficulty interpreting the symbol q and connecting it to whether the image was real or virtual.

For the problem on the Doppler effect, you explained the concept well but struggled more with justifying the signs. At first, you were confused about the sign of the receiver’s velocity. It was a bit unclear whether the receiver was unsuccessfully trying or actually managing to move away from the source, and how that affected the sign of the velocity. After a brief discussion, you understood.

In your workbook, your processes are clear and complete. Well done! Your diagrams are often included, although some are missing. It would be good to include a diagram for each problem whenever possible to demonstrate your understanding of the context, and to ensure all the problem’s parameters are clearly represented in your diagrams.
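For context (the feedback itself doesn’t spell these out), the two sticking points above presumably come down to standard sign conventions. In the thin-lens equation

$$\frac{1}{p} + \frac{1}{q} = \frac{1}{f}$$

the image distance $q$ is positive for a real image and negative for a virtual one (under the usual real-is-positive convention). Likewise, in the Doppler relation

$$f' = f\,\frac{v \pm v_r}{v \mp v_s}$$

the signs of the receiver’s velocity $v_r$ and the source’s velocity $v_s$ depend on whether each is moving toward or away from the other, which is the sign question we discussed during the interview.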

In the feedback, the letters associated with the problems, the approach, and the diagrams refer to the criteria-based evaluation grid presented earlier [docx] [in French] and shared with students at the beginning of the semester.

Grading System

Grade   Fundamental   In-depth   Integration
A       75%           80-85%     90-100%
B       70%           70-75%     75-85%
C       60-65%        60-65%     60-70%
D       45-50%        45-50%     45-50%
E       30%           30%        30%

To share this feedback with students, I use the “comment on grade” feature in our Coba platform (col.net), which is similar to LÉA in many other colleges. I wait until I’ve completed all the interviews before making the grades and feedback available.

The weeks of interviews are, without a doubt, very busy for me. But by Friday evening, I have no grading to take home: everything has already been completed immediately after each meeting!

What do students think?

The results of an anonymous survey (80 student respondents) were extremely positive:

  • 60% found the oral exam less stressful (either much less or a little less) than a traditional exam, while only 23% found it more stressful
  • 82% indicated that it required more work (either much more or a little more) – even though my list of exercises was much shorter than the usual preparation for a traditional exam! (6% said they had worked a little less)
  • And most importantly, 81% reported that the oral exam had a positive impact on their understanding (they said they understood the material a little better or much better, thanks to this type of evaluation, compared to just 2% who said the opposite)

In the open-ended question (and in hallway conversations), several students mentioned that it was the 1st time they felt they understood what they were doing in a physics course.

Finally, when asked, “Would you like to have oral exams in your courses next semester?”
76% answered yes, and only 11% said no!

In short, students:

  • feel less anxious
  • have more confidence in their abilities
  • work harder than ever…
  • … and want more!

A win-win

For me too, the experience is very positive:

  • Students participate more actively in class.
  • Students discuss physics concepts with me and with one another, instead of simply asking, “Why didn’t I get the right answer?”
  • Makeup exams are much easier to manage.

I also appreciate that the semester ends with a conversation rather than the tense silence of a written exam. I get to greet each student, ask how their semester and studies went, wish them a lovely break, and tell them how glad I was to have them in my class. It’s truly meaningful to be able to end the semester on such a positive note.

On a personal level, I’m therefore very pleased with this new evaluation format. It aligns perfectly with my values and teaching style. Regarding physics learning, the next step will be to develop a robust research protocol to accurately assess how individual oral exams influence motivation, metacognition, and learning.

The data and anecdotal evidence I’ve gathered so far are very promising – I can’t wait to see what’s next!

Do you use oral evaluations in your courses? Tell us more in the comments section below!

About the author

Jean-François Désilets

Jean-François Désilets holds a Master’s degree in Mathematical Physics from the Université de Montréal and has been teaching physics at Collège Montmorency since 2013. He is interested in innovative teaching approaches to enhance his practice, and this led him to complete a microprogram in Higher Education at the Université de Sherbrooke in 2023. His main objective is to design pedagogical materials that help students deepen their understanding of physics while also supporting their growth as learners and critical thinkers.
