School leaders are increasingly called upon to navigate the opportunities and risks of generative AI (GenAI) in schools. One of the most pressing – and often overlooked – challenges is bias: the ways in which GenAI can reinforce stereotypes and even exclude certain individuals and communities. But what can schools do to address it?
Clare Jarmy, Director of Innovation at Haberdashers’ Elstree Schools (Habs) in Hertfordshire, recently co-authored a book chapter on the topic with Sabrina Nanji (Habs EDI Lead and Head of Geography) and Enora Hauduc (Student AI Champion). Here, she offers some thoughts and outlines the approach they are taking.
Why does GenAI bias matter in schools?
At Habs, our community is richly diverse, with students from a wide range of ethnic and religious backgrounds. We have worked hard to foster a sense of belonging, but our experience with GenAI has shown us how easily technology can undermine this work.
Take a simple example: we asked GenAI to create a game using girls’ names. The names it generated – Olivia, Sophia, Alice – did not reflect our student body at all. Where were Aalia, Saanvi, or Yasmine? Names are deeply tied to identity, and their omission sends a powerful message about who is seen and valued…