AI in education should be applied the same way we use GPS in cars.
When used correctly, it becomes a powerful guide, helping students navigate complex ideas and find efficient routes to understanding by suggesting shortcuts and offering new perspectives.
However, students certainly shouldn't let the GPS do all the driving. Over-reliance on AI means they miss out on the thrill of discovery and the chance to build essential skills.

They might reach their destination, but without truly understanding how they got there, they are left with only a surface-level grasp of their subjects. This undermines the critical thinking, problem-solving, and resilience that education aims to foster.
In a world where digital tools are essential in the workplace, knowing how to "drive" without AI is a vital skill for our future workforce. AI is undeniably an exceptional learning tool, but it isn't flawless and, in education, brings with it many challenges.
Educators can teach students how and when to use AI by setting guidelines and frameworks that keep students in the driver's seat, empowering them to make decisions and even get a little lost sometimes.
Used wisely, AI becomes a co-driver, enhancing learning and giving students confidence. Used blindly, it becomes the autopilot, robbing students of the skills they need for life beyond the classroom.
The goal is to nurture, not restrict, a transformative technology. AI should help students find their way, but never take the wheel. This approach will cultivate a more capable and responsible generation, ready for a world where AI is ubiquitous.
A double-edged sword
AI has quickly become a fixture in classrooms, presenting both promise and peril.
On one hand, it's a powerful educational tool, offering students ways to brainstorm, draft, and problem-solve like never before. On the other, misuse can turn it into a shortcut that undermines meaningful learning. The reality, however, is clear: AI literacy is now a requirement for work and life.
A recent survey from Turnitin found 67 per cent of students agreed AI was essential in helping them prepare for the workforce and their future career, while 86 per cent agreed it was the responsibility of their institution, college, or university to educate them on how to use AI ethically.
Future graduates will enter a workforce where AI tools are ubiquitous. So, outright bans are both impractical and unwise. But this doesn't mean students should have free rein.
If left unchecked, students may produce polished work without ever engaging in the creative process, comprehending the topic, or retaining vital knowledge. They develop a surface-level understanding to get by and miss the opportunities that build foundational skills, like critical thinking, problem-solving, and resilience.
This is where guidance becomes key. Detection tools enable responsible adoption of AI. Rather than resisting this technology, educators should adapt their teaching methods to show students how to use AI ethically, critically, and creatively. Otherwise, we risk raising a generation who appear fluent in the tools but lack real depth of thought.
A tool, not a shortcut
As AI becomes more accessible, educators should focus on guided, intentional use, harnessing its potential without undermining learning.
We know many educators are eager to explore AI: Turnitin's survey revealed 47 per cent of educators and academic administrators want to leverage AI to make better decisions for their teaching, students, and organisation, but don't necessarily know how.
When used correctly, AI can play a meaningful role in helping students generate ideas and improve drafts. When used inappropriately, however, such as to write entire essays, its value as a support tool dwindles.
Educators need to understand that detection tools aren't for punishment; they're part of ensuring students are engaged with their learning. Detection services help educators understand how AI was used, not just whether it was used, focusing on transparency and intent.
It is important that educators revisit curricula and look to integrate AI into assessments with clear guidelines around usage. This positions AI to enhance student work, rather than replace their thinking.
Equally vital is transparency. Implementing systems that allow AI to be used, but also clearly show educators how it was used and what the student intended, is not just important; it provides a whole new level of insight. Educators will be able to distinguish between genuine effort and over-reliance, and truly understand how students are learning.
Visibility into the process, not just the final product, safeguards academic integrity, reveals learning behaviours, and makes faking understanding harder. Like "showing your working" in maths, evidence of learning builds trust and creates a feedback loop that ensures key outcomes are being met. Detectors are one way educators can gain a comprehensive picture of student engagement, including ethical AI use, offering confidence rather than suspicion.
This feedback is crucial: Turnitin's research revealed 64 per cent of students expressed worries about AI's impact on education, more than either educators or administrators did.
AI isn't the end of learning; it's a partner in it. But like any partner, it needs guidance, structure, and accountability.
As detection tools have evolved to understand AI, so have safeguards, relieving anxiety for both students and educators. AI detectors look at more than just matching text; they evaluate behaviour, patterns, and process, mirroring the shift toward understanding.
Equipping students to critically engage with AI will allow digital literacy to flourish, while its application in curricula will become more refined.
The only way to truly develop a safe and ethical framework is to have open discussions about the strengths, limitations, and ethical implications of relying on such a transformative technology.
The goal shouldn't be to prevent AI use, but to nurture judgement. Students should treat AI as a GPS when navigating their education: helping them find the way, but never taking the wheel. Doing so builds not only a stronger workforce, but a more responsible one as well.
It's wisdom, not just information, that truly matters.
- James Thorley is regional vice-president at Turnitin.
