ChatGPT is Biased. What does that mean for Teachers and Lesson Plans?

ChatGPT is fantastic.

It’s our robot friend who can help us develop lesson plans, come up with report card comments, and explain complex concepts like we’re five years old.

The problem with ChatGPT is that it reinforces the stereotypes baked into the data it was trained on.

This is an issue for teachers and change makers who are trying to make the world a better place.

Why?

  • Because ChatGPT is our “robot friend.”
  • Robots seem like unbiased, impartial sources of “facts.”
  • The danger is that we accept potentially biased analysis and information.

Our goal at Educircles and SEOT Mindset is to amplify the stories we don’t always hear.

Read more about how ChatGPT is biased and how that affects Education, International Development, and Policy Making.