Building organizational readiness
For managers, the challenge extends beyond individual skill. Slaunwhite argues that organizations must create a culture that makes space for both curiosity and caution. That starts with clarity of purpose.
“The key is to identify high-value problems that AI can solve, and have the right tools to do the job effectively and responsibly,” Slaunwhite says. “If you’re not confident in your objectives, whatever tool you pick, you might not solve the problem.”
Strong adoption also requires early involvement from compliance, legal, and risk teams. Introducing a tool without those voices at the table can stall projects or expose organizations to unnecessary risk.
By contrast, when managers set objectives, define guidelines, and encourage staff to point out grey areas, employees feel empowered to experiment responsibly.
Organizational readiness is less about buying the right software than about building the right culture. When leadership presents AI as one tool among many, useful in some cases and unsuitable in others, it tempers both reckless enthusiasm and blanket skepticism.
Using AI as a thinking partner
In his classroom at Concordia Continuing Education, Slaunwhite encourages learners to use generative AI not as an answer engine, but as a partner for deeper thinking. Asking a model to explain possible approaches to a problem, without giving the final solution, prompts reflection rather than shortcutting the learning process.
He emphasizes that what matters is how professionals use the time saved.
“If we’re using AI, we are getting some time back,” he says. “The question is, what are you doing with that extra time to create impact?”
For managers, the lesson is similar. Auditing output is not just about catching mistakes; it's also about ensuring that the efficiency gained translates into better decisions, stronger relationships, or more meaningful contributions to the organization.
Human judgment prevails
Slaunwhite frames the adoption of AI as an opportunity to elevate professional standards. By pairing generative tools with expertise, professionals can test assumptions, surface counterarguments, and refine outputs to a higher level of quality. But accountability cannot be outsourced.
“The tools can speed up the process,” Slaunwhite says. “But they don’t replace our responsibility to apply judgment.”