Canadian universities are embracing generative artificial intelligence in their teaching plans as more students and instructors opt to use the rapidly evolving technology.
Several large institutions, including McGill University, University of Toronto and York University, said they are adopting certain AI tools because they can enhance learning. Those include tested tools that help students summarize academic research or assist professors in course planning.
The shift comes as post-secondary students' AI use continues to grow. A survey conducted in late 2024 by the online learning platform Studiosity found that 78 per cent of Canadian students used AI to study or complete their school work.
The Pan-Canadian Report on Digital Learning also found that the share of educators who reported generative AI use in student learning activities reached 41 per cent last year, up from 12 per cent in 2023.
McGill University's associate provost, Christopher Buddle, said the school has integrated digital AI assistant Microsoft Copilot into its systems to help staff, students and faculty with their work. The tool can be used to make a first draft of a letter, summarize online content or to organize day-to-day tasks.
"People use it for all kind of things and from what I understand it's being used effectively and used quite a lot by our university community," he said.
Buddle said offering generative AI tools through the school's IT infrastructure ensures they are vetted properly to address privacy risks and ensure data protection.
"We've not approached it through the idea of banning (AI) or saying 'no.' In fact, what we'd rather see and what we support instructors doing and students doing is effective use of generative AI in teaching and learning," he said.
Buddle said the university has left it up to instructors to decide how much AI use they want to allow in their classes.
"We don't tell instructors what to do or not to do. We provide them tools and give them the principles and let them make the best decisions for their course because it's so discipline specific," he said.
Some professors, for example, have their students use generative AI to create a first draft of a written assignment and then the students evaluate the outcome, Buddle said.
The school is launching an online module for students and instructors this fall to help them navigate and understand the benefits and risks of AI in education, he added.
"Generative AI is pervasive. It's everywhere and it will remain that way going forward," Buddle said.
University of Toronto professor Susan McCahan, who led the school's task force on AI, said the institution is integrating AI tools but is also taking a balanced approach that allows instructors to explore the technology while thinking critically about its value in education.
"We have a wide range of opinions on AI and the use of AI in classrooms and in teaching and in learning," she said. "And we want to support faculty who are interested in innovating and using it in their classes. We want to support faculty who find that it is not useful for them or for their students."
McCahan said the university has used AI systems for years, including for auditing financial reports and helping students find mental health resources. More recently, the school also made Microsoft Copilot available to all faculty, students and staff.
"They can use in any way they wish. And because it's within our system, you can do things like open a library article in the library, and ask Copilot to summarize it," she said. "It doesn't share that data back with Microsoft ... so you can put in more sensitive information into that."
McCahan said the university has also made ChatGPT Edu licences available to students and staff who would like to use the tool with added security protection. The school has been experimenting with AI tutors and will expand that in the coming school year with Cogniti, an open-source system developed at the University of Sydney in Australia, she added.
At York University, the goal is "to take a thoughtful and principled approach to this modern technology," deputy spokesperson Yanni Dagonas said.
"Transparency works to demystify AI, helping our community better understand its impact and potential," Dagonas said.
The university has created an online AI hub with a dedicated section for instructors, who are discouraged from using AI detection tools when evaluating students' work because many such tools are considered unreliable and raise concerns about data security and confidentiality.
Despite the "huge uptake" in students' generative AI use, many professors are still worried about bias in AI models, ethical and privacy issues, as well as the technology's environmental impact, said Mohammed Estaiteyeh, an assistant professor of education at Brock University.
"Students are kind of using (AI) to save time. They think it is more efficient for various reasons," he said.
But when it comes to instructors, "it depends on your domain. It depends (on) your technological expertise. It depends on your stance towards those technologies," he said.
"Many instructors have concerns."
Estaiteyeh said most Canadian universities are providing guidance to instructors on the use of AI in their classes but leaving much of it to their discretion.
"For example, (at) Brock, we don't have very strict guidelines in terms of students can do this or that. It's up to the instructor to decide in relation to the course, in relation to the materials, if they want to allow it or not," he said.
"We are still navigating the consequences, we're still not 100 per cent sure about the benefits and the risks. A blanket, a one-size-fits-all approach may not suit well."
Estaiteyeh said instructors and students need AI training and resources on top of guidance to reduce the risk of relying too much on the technology.
"If you offload all the skills to the AI tools then you're not really acquiring significant skills throughout your three- or four-year degree at the university," he said.
"Those tools have been in place for around two years only. And it's too early for us to claim that students have already grasped or acquired the skills on how to use them."
The Canadian Alliance of Student Associations said AI technologies must complement the learning experience and universities should discourage the use of AI for evaluations and screening of student work.
The alliance said in a report released earlier this year that research has shown untested AI systems can introduce "bias and discriminatory practices" against certain student groups.
"For instance, AI-powered plagiarism detection tools have been found to disproportionately misclassify the work of non-native English speakers as AI-generated or plagiarized," the report said.
The alliance has been calling for "clear ethical and regulatory guidelines" governing the use of generative AI in post-secondary education.
This report by The Canadian Press was first published Aug. 19, 2025.
Maan Alhmidi, The Canadian Press