This article is a translation of a text published in Eductive’s French edition.

In my Accounting and Management Technology course entitled Plan d’affaires (Business Plan), I encourage my students to use generative artificial intelligence (AI) to complete certain assignments. This allows them to produce high-quality work more efficiently. It also prepares them for the job market, where they will need to use these tools. However, to ensure that this use of AI is ethical and responsible, I keep a close eye on it.

“But what is the world coming to if young people use AI at school?”

AI is getting bad press in education these days. I think it’s analogous to what happened when the calculator, Google or Wikipedia came along. At first, people wanted to ban them, thinking that these technologies would prevent students from learning properly, or even (to dramatize) that they would send civilization back to the Stone Age.

Personally, I’ve been interested in generative AI since its inception. I’m convinced that it’s a remarkable tool for our students today, and will continue to be so when they enter the workforce.

AI can help all students, in different ways

Some shy or anxious people may be reluctant to ask questions in class. (For example, a 3rd-year student may be embarrassed to have forgotten the definition of a term they remember studying in 1st year.) Before AI, these students could search the web on their own to find the answer. With AI, it's even simpler: they can ask a chatbot, then verify the answer as needed.

AI is also useful for dyslexic individuals or non-Francophone immigrants. It can correct a text, translate it into another language or even provide the basis for a text in proper French. It breaks down barriers!

Meanwhile, more advanced students can chat with the AI to get suggestions on how to improve an assignment, or simply to do additional research on a topic. While I help weaker students reach the pass threshold of the course, more advanced students can use AI to go further.

In this way, AI can help both weaker and stronger students. And what about "average" students? AI can help them too!

AI can save us from writer’s block. It can write a 1st version of a document and give us more time to improve it afterwards. AI can help us brainstorm and come up with original ideas or arguments. It can quickly correct errors in a text. In all of these cases, highly gifted students may not need it, but for average students, AI will bring significant productivity gains.

Is it ethical to use AI to simplify life?

In my courses, my students have to write letters (for example, to present a business project to a potential partner). In the past, I taught them to use Le français au bureau [in French] as needed, to find a letter template and personalize it. Was it cheating to use these templates? Of course not. Now, AI can write an even richer and more specific basis for the students than generic templates. This means that the final letter can be written faster and with a potentially more interesting result. Is this cheating? In my opinion, no!

In the job market, why would a company want to pay someone to do laboriously what another person could do more efficiently because they’ve used AI? Teaching students how to work with AI means equipping them to meet the expectations of the world of work.

Reflecting on the purpose of the tasks asked of students

To decide whether or not to encourage our students to use AI, we need to think about the purpose of the course or of the activity in question.

In the 3rd-year project management course, one of the activities is to write a message to a company requesting funding for a project. In a real-world setting, the goal would be for the company to say yes. What I want to evaluate is not the syntax of the text or its literary qualities as such, but specifically the text's ability to prompt positive action from the recipient. In this context, it seems legitimate (and effective!) to use AI to produce or improve the basis of the text.

It’s the same thing when I ask students to write messages to the members of their work team. The purpose is to change the behaviour of their teammates. Students often tend to write only 1 or 2 sentences. AI generates richer, more detailed texts that give the recipient more information and increase the chances that they will understand what is expected of them. I believe this minimizes the risk of communication problems.

Revisiting Bloom’s taxonomy in the age of AI

In my opinion, Bloom’s taxonomy needs to be reinvented to take into account today’s tools. This has been done by the Ecampus team at Oregon State University.

[Figure: Bloom’s taxonomy in the age of artificial intelligence (Source)]

I wouldn’t want the 1st-year students in my program to use AI indiscriminately, because they need to get to grips with the concepts. But in the 3rd year, students apply the concepts. In that context, someone who uses AI will move faster than someone who doesn’t.

For example, in a course where the competency is presenting and formatting information, AI can help students gather information quickly. They can then analyze it and organize it into a slideshow for presentation. This allows students to spend more time actually working on developing the competency.

Mastering the tools of efficiency is certainly an asset in the job market. But it’s important to know where to draw the line. Students must distinguish between situations in which it is ethical to use AI and those in which it is not.

Supervising students

I support and monitor my students’ use of AI. First of all, I want students to tell me how they use it:

  • Name of the conversational agent (ChatGPT, Bard, etc. At the moment, ChatGPT seems to give the best results, so I encourage my students to use it.)
  • Date of the conversation
  • Question asked

The students keep a log of all their interactions with the AI.

What’s more, instead of having them copy and paste the AI-generated text (with or without quotation marks), I ask them to paraphrase it, to rephrase the relevant ideas in their own words, citing the AI as the source. This forces them to think about the information rather than just copy it.

The student logs allow me to informally evaluate the quality of the questions posed to the AI. With generative AI, when a person is able to express their needs in a detailed and precise way, they get more interesting results.

Better results thanks to AI

In recent years, when writing a business plan, my students might have needed an hour in class to create an organizational chart. Today, with AI, it takes them 5 minutes. But I haven’t reduced the amount of time spent on the activity in class. Students take advantage of the time they save to refine and improve their work.

Their business plan writing is richer, more interesting and more in tune with the real world. The students are proud of their results. For my part, I enjoy evaluating their work even more.

Distinguishing truth from falsehood amid AI hallucinations

One of the persistent challenges with generative AI is distinguishing truth from falsehood. AI does not always give correct information, far from it…

I think students need to learn how to navigate that reality. They can use some of the time saved by AI to do additional checks. My students fall into traps, but they learn not to get caught again. Several of the business plans produced by my students mentioned the importance of obtaining an online business license (which doesn’t exist) or applying for other imaginary permits… Of course, such mistakes penalize their grade on the assignment…

There’s no point in trying to protect our students by putting them in a bubble. If we do, the shock will simply come later, when they leave it and enter the job market.

I think companies will eventually no longer want to hire people who can’t use AI effectively anyway.

Students need to be trained

One might assume that CEGEP students would naturally master generative AI, but that’s not what I’ve observed. You’d think they would spontaneously try to use AI in all contexts, for all their schoolwork, but I get the impression that many are afraid of it.

We need to work with our students and train them to use AI. Personally, in the fall of 2023, I briefly explained to my students how to use AI, but I didn’t spend much time on it. (In the winter of 2024, I’m not teaching because I’m in charge of the Défi OSEntreprendre [in French] Saguenay-Lac-Saint-Jean.) I think it would be better to do this through lunchtime experimentation workshops in the library, for example. This could benefit students in all their courses. They could learn to:

  • formulate a prompt (which words to choose, which adjectives, etc.)
  • discover the possibilities offered by generative AI (many students don’t know that AI can be asked to rewrite a text by changing the verb tenses or the person, to adapt a text to the Quebec reality, etc.)
  • etc.

In the fall of 2023, I thought my students knew more about AI than they actually did. I thought AI would put everyone on an equal footing, but it actually increased inequality. Those who knew how to use it effectively were far ahead of the others…

A playground awaiting institutional guidelines

In my opinion, institutions will have to regulate the use of AI in courses to avoid ethical abuses. But right now, there aren’t really any rules at my college. Fall 2023 was an opportunity to experiment, a playground. Previously, my course plan stated that the student had to be the sole author of their assignment, or risk getting a 0. This fall, my course plan stated instead that the use of ChatGPT was allowed if I explicitly authorized it.

AI can’t (and certainly shouldn’t!) replace teachers. But we can be coaches in how our students use it.

I’m aware that the situation varies from program to program and from discipline to discipline. In my field, the notion of efficiency is key; I know the situation is not the same in literature or philosophy. Even there, though, AI can be used to brainstorm or to formulate arguments for any position. There’s interesting pedagogical potential in that.

In any case, I think the current situation requires us to rethink our posture. What about you? How can AI transform the reality of your program? How have you adapted your practices and integrated AI into your courses?

Thanks to Eductive editor Catherine Rhéaume for her collaboration in writing this story.

About the author

François Cormier

Saguenay native François Cormier is the perfect example of the symbiosis between education and entrepreneurship. Since 2015, he has been passionately teaching accounting and management at the Collège d’Alma, where he has established himself as a pillar of education. At the same time, he ran a digital web agency for 5 years, demonstrating his strategic vision and boldness in the business world. His expertise in business strategy and his role as a mentor in the Mentorat Saguenay network testify to his deep commitment to professional development. François Cormier embodies the harmony between teaching and entrepreneurship, a source of inspiration for current and future generations.
