Supt. Mark Pearmain: Canada’s schools need a voice in the country's AI strategy
Anyone on the front lines of the education sector will tell you the arrival of AI is one of the biggest disruptions they have faced. The conversation goes far beyond whether students are using AI to do their work.
There are deeper and more urgent questions about how it will impact student and community safety, mental health, attention spans, social-emotional development, relationships and our kids’ future career aspirations.
From the advent of smartphones to the dominance of social media, two decades of unchecked technological evolution have come at a staggering cost to the next generation. Social media was embedded into young people’s lives long before guardrails were in place, and young people bore the worst of the cost — mental-health impacts, misinformation, polarization and eroded trust. All of this was accelerated by unchecked screen time, COVID-19 and personalized algorithms.
It was reassuring, then, to see Canadian Artificial Intelligence Minister Evan Solomon summon OpenAI to Ottawa to question representatives about the company’s safety protocols. Solomon came away “disappointed” and suggested the government may introduce its own regulations.
Critics who decry regulation may say we can’t let fear of the unknown and privacy concerns stifle our willingness to explore these technologies and risk making Canada less competitive on the world stage. On a more personal level, they may argue our children need access to learn these technologies, explore them, and learn to solve problems with them.
Sounding the alarm around security, safety and privacy doesn’t mean the public education sector is avoiding or running from AI. Far from it. As the largest district in B.C., Surrey Schools is fortunate to have a progressive board of education that is calling for innovation, adaptability and creativity.
We are actively exploring new tools and technologies, conducting privacy assessments, and modelling responsible use while protecting our students’ privacy and security.
We’re helping students understand both the extraordinary possibilities of AI and the importance of protecting their data, their ideas and their digital identities.
We’re collaborating with the Education and Child Care Ministry, as well as districts across the province, because all of us — communities big and small — must “be at the table.” We’re sharing knowledge, strengthening best practices and anticipating the risks associated with AI.
It’s impossible, though, to expect school districts, or provinces, to create their own, disparate and disconnected approaches to the safe, secure adoption of AI. Instead, AI safety will need to be a collective responsibility shared by schools, students, families, research institutions, industry and governments.
A patchwork of policies, uneven protections for students and widening inequities across provinces and communities only leaves our most vulnerable citizens behind. A co-ordinated national strategy will align standards and privacy protections and, most importantly, prioritize the safety and well-being of our kids. Moving forward, consulting with educators must be central to this process — not a footnote.
When it comes to AI’s impact on our world, educators like me across this country are thinking about the bright-eyed kindergarten students who enter the school system every year — and how we prepare them for an AI-driven future. Beyond preparing them for a world in which AI exists, our aspiration is to ensure they’re safe, healthy, thriving and can find their purpose.
The choices we make today for our children will reverberate into the future. The adults in the room must centre the needs of our children, or our children will accuse us, and rightly so, of waiting too long to act while their safety and future were at stake.
Mark Pearmain, Superintendent of Surrey Schools
Published in the Vancouver Sun