Ethical AI Use for CALP Staff

Laureen Guldbrandsen, East-Central Regional Support Staff

Have you ever wished you had more time to focus on your learners instead of lesson planning? AI tools can help, but only if used wisely!

Artificial Intelligence (AI) is becoming more common in education, offering ways to create learning materials, assist with lesson planning, and streamline tasks. Used wisely, AI can help save time and create engaging, relevant learning experiences. It's important, though, to ensure AI use aligns with our values.

How AI Can Help CALP Staff

AI can be a valuable tool when used with intention and care. Some key benefits include:

  • Generating lesson plans and learning materials tailored to different skill levels.
  • Saving time by quickly drafting emails, worksheets, and activities.
  • Providing creative ideas for engaging learners in new ways.
  • Helping staff find simplified explanations for complex topics.

AI is a tool for educators, not a replacement. While it can assist in many areas, human review and interaction remain essential to ensure learning is meaningful. To realize these benefits, it's important to use AI responsibly. Here are four things to keep in mind for ethical and effective use of AI in CALP settings.

Four Things to Keep in Mind

1. Keep Learners’ Privacy Safe

AI tools process and analyze whatever data is entered, and some platforms also store that data to train future models. This means the information you give an AI tool could resurface in someone else's interactions with it. Protecting learners' privacy is essential: personal or sensitive information should never be shared with AI platforms.

  • Avoid inputting learners’ names, stories, or any identifying details into AI tools.
  • Use generic or fictionalized examples when generating content.
  • If you’re unsure how a tool handles information, assume that anything entered could be retained.

Did you know?

Modeling safe AI use also helps learners develop privacy awareness and understand how to protect their own data.


2. Make Sure Content is Accurate and Relevant

AI can generate text quickly, but that doesn't mean it's always correct or appropriate. You may recall, for example, that not so long ago ChatGPT couldn't correctly count the number of Rs in the word "strawberry."

It may provide outdated information, use complex language, or reflect biases found in its training data. Before using AI-generated materials, always take the time to review them carefully.

Ask yourself:

  • Is this information factually correct and up to date?
  • Is it written at the right literacy level for my learners?
  • Does it reflect real-life situations and needs?

AI is a great brainstorming and drafting tool, but human editing is necessary to ensure content is accurate, relevant, and useful for learners.

3. Be Aware of AI Biases

AI tools generate content based on the data they’ve been trained on — but that data isn’t always neutral. AI can reflect biases present in its training material, leading to stereotypes, cultural inaccuracies, or perspectives that don’t align with the diverse experiences of adult learners.

For example:

  • AI might create materials with assumptions about gender roles (e.g., “engineers are men” or “nurses are women”).
  • It may struggle to provide culturally relevant examples for learners from different backgrounds.
  • Most AI tools favor formal or academic language, making content harder to understand for those with foundational literacy needs.

To reduce bias, always review AI-generated materials critically. Ask:

  • Does this reflect diverse perspectives?
  • Is the language inclusive and accessible?
  • Would my learners see themselves represented in this content?

If AI output seems biased or inappropriate, tweak your prompts or manually adjust the content to make it more relevant and equitable for your learners. AI is a great starting point, but human editing is key to ensuring fairness and inclusion.

4. AI as a Support, Not a Substitute 

AI is a great tool for brainstorming, drafting lesson materials, and simplifying complex topics, but it can't replace the knowledge, experience, and adaptability of CALP staff. Learners benefit most from direct interaction, guidance, and encouragement — things AI can't do.

Use AI strategically to save time on repetitive tasks so you can focus more on personalized support and engagement. For example:

  • AI can draft a reading comprehension worksheet, but an instructor can adapt it to match a learner’s interests and skill level.
  • AI can suggest a math lesson plan, but a tutor’s encouragement and explanations build confidence.
  • AI can provide writing prompts, but feedback and discussion help learners develop critical thinking skills.

By using AI to enhance rather than replace instruction, CALP staff can provide more effective, learner-centered support.

Final Thought

AI has the potential to support CALP staff by making lesson planning and content creation more efficient, leaving more time for direct engagement with learners. Ethical use of AI is key. By prioritizing privacy and accuracy, staying aware of AI biases, and ensuring that AI remains a support rather than a substitute, we can harness its benefits.

With careful use, AI can be a valuable tool in adult learning, helping to create accessible and engaging materials. But at the heart of working with adult learners, human connection and thoughtful instruction make the biggest impact on foundational learning. AI should enhance, not replace, the meaningful work CALP staff do every day.


Published on March 18, 2025 9:00am MDT