Every post-secondary institution is grappling with generative AI, but most are stuck between two poles: blanket prohibition or uncritical adoption. Neither serves students or instructors well.
What’s missing is a strategic approach that addresses academic integrity, pedagogical opportunity, privacy, and practical integration — all grounded in what the technology actually does, not what the marketing says.
I bring something unusual to this work: I build things. I’ve prototyped AI-powered tools for teaching and learning contexts using local LLMs, API integrations, and learning platform connections. That means the strategic advice I offer is grounded in hands-on understanding of what these tools can and can’t do — not just what a vendor demo suggests.
What an engagement looks like
AI strategy development
Working with institutional leadership to develop a coherent approach to AI in teaching and learning — covering policy, pedagogy, professional development, and infrastructure.
This isn’t a generic framework applied to your institution. It’s built around your specific context, values, and capacity. It accounts for questions that most AI strategies skip: Indigenous data sovereignty, accessibility implications, the difference between institutional AI tools and consumer AI tools, and what happens when the hype cycle moves on.
Practical AI prototyping and evaluation
I build working prototypes, not slide decks. If you want to understand what an AI-powered course assistant, a curriculum mapping tool, or an automated feedback system could look like at your institution, I can build and test a functional prototype using current tools.
This grounds strategic conversations in reality. It’s one thing to discuss whether AI tutoring could work; it’s another to put a working prototype in front of instructors and see what they actually do with it.
AI literacy professional development
Workshops and hands-on sessions for faculty and instructional designers. These aren’t “here’s what ChatGPT can do” overviews — they’re practical, pedagogically grounded sessions on integrating AI tools into course design and assessment, with honest discussion of limitations and risks.
Typical format: half-day or full-day sessions, run standalone or as part of a broader engagement.
Why me
I’ve built working AI prototypes for teaching and learning contexts — including a Brightspace Course Coach application that connects a local LLM to institutional course data via the Brightspace API using Python, Flask, and Ollama. I understand these systems at the code level, not just the concept level.
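To make the pattern concrete, here is a minimal sketch of how a tool like the Course Coach can be wired together. This is an illustration, not the actual application (which uses Flask): the function names, the course data, and the model name are assumptions, and only Ollama's standard local `/api/generate` endpoint is real. The idea is simply that course content retrieved from the Brightspace API is folded into a prompt and answered by a locally hosted model, so no student data leaves the institution's machine.

```python
import json
import urllib.request

# Ollama's default local REST endpoint for single-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_prompt(course_title, items, question):
    """Fold course content (e.g. fetched via the Brightspace API)
    and a student question into a single grounded prompt."""
    context = "\n".join(f"- {item}" for item in items)
    return (
        f"You are a course coach for '{course_title}'.\n"
        f"Course content:\n{context}\n\n"
        f"Student question: {question}\n"
        "Answer using only the course content above."
    )


def ask_local_llm(prompt, model="llama3"):
    """Send the prompt to a locally running Ollama server.
    Requires Ollama to be installed and serving the named model."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (needs a local Ollama instance running):
#   prompt = build_prompt("BIOL 201",
#                         ["Week 1: Cell structure", "Week 2: Metabolism"],
#                         "When is metabolism covered?")
#   print(ask_local_llm(prompt))
```

Because the model runs locally, this architecture keeps course and student data inside institutional infrastructure, which is exactly the kind of design decision that separates institutional AI tools from consumer AI tools.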
I also co-lead work engaging with Indigenous perspectives on AI and data sovereignty, bringing ethical dimensions to the table that most AI consultants ignore entirely. The hard questions about AI in education aren't technical; they're about power, privacy, and purpose.
I have a PhD in Computational Media Design and I’ve been building educational technology tools since before most current AI consultants had email accounts. The technology changes; the pedagogical questions don’t.