I am new to most of this modern technology. I graduated from high school nearly 20 years ago, and my last post-secondary experience was over 10 years ago, long before tools like Generative AI even existed. As both a mom and a student teacher, I am constantly juggling two worlds that are changing faster than I can keep up. At home, my kids are growing up surrounded by screens and smart tech, even while I try to limit their use. At school, I am expected to step into classrooms where digital tools are becoming part of everyday teaching. Lately, I have been exploring how Generative AI, like Magic School AI and ChatGPT, might fit into all of this. I am still very new to life in the classroom as an educator, so I have been curious about whether these tools actually support good teaching, or if they just create more work.

So far, I have found GenAI helpful for generating ideas for lesson plans, especially because I do not yet have teaching experience to fall back on. When I am staring at a blank page trying to think of how to introduce a concept or make a lesson more engaging, tools like Magic School and ChatGPT give me a starting point. Magic School, in particular, feels like something I will use a lot as an educator. It can quickly suggest activities, scaffolds, or question ideas that I can then adapt to my students and context. I also love the idea of using it for report card comments when the time comes to learn that part of being an educator. I still see the planning as my job, but it is nice to have something to “plan with” instead of planning alone.

At the same time, I am very aware that these tools are far from perfect and come with limitations. One of the biggest issues I have noticed is that GenAI does not say “I do not know.” Instead, it will create an answer even when it is not correct. I have also seen it start to pull from sources like the BC Curriculum, then drift off and make things up or misinterpret what is there. This means I cannot just copy and paste what it gives me, which I actually find reassuring: I need to read it critically, check it against the actual curriculum, and make sure it aligns with what I know is true and accurate. In other words, GenAI is helpful, but it still needs a human teacher.

Looking ahead, I think GenAI can also be a useful tool for students if it is introduced with clear expectations and guidance. I want my students to see it as a support for learning, not as a way to avoid doing the work. Together, we can use it to brainstorm ideas, explore examples, or check understanding, but I will always emphasize the importance of questioning what the technology produces. I am not naive enough to think that older students will avoid using AI just because I set boundaries, so instead I want to teach them how to use it responsibly and with integrity. Just like I do, students need to learn how to read critically, verify information, and reference where their ideas come from. Teaching students to use GenAI responsibly helps them build the digital literacy and critical thinking skills they will need for the rest of their lives.

For this week, I used ChatGPT to help me brainstorm ideas for my blog post, but I quickly realized it was actually more work to use what it gave me than to just write it myself. The suggestions helped get me started, but they didn’t sound like me. I would have had to make a lot of edits to match my own voice and take the post in the direction I wanted. It reaffirmed that while generative AI can be a great tool for inspiration, our own voices and perspectives are what make our writing meaningful.
