Asking the tricky questions on AI, education and equity

How do we utilise AI as a force for good, rather than further entrenching inequality?

News • December 2023


AI is the shiny new toy, the new cool kid on the block, and people are rightly excited about the ways in which it can improve life and be applied to transform education. We hear this every day when we speak to people, but we also hear many valid concerns about how realistic it is to use any technology in the remote schools we want to help. These concerns matter to us, as AI-for-Education is about building together and collaborating to find ways that all the kids get to play with the new toys.

One thing we know well is that ideas are easy – and implementing them is hard. Especially where there is no electricity, it is hot and dusty, and there is at best one device for the whole school. So even while thinking about how AI can help, we cannot escape the realities of where we are trying to help.

This means we cannot be evangelists for AI; we must start by acknowledging the risks and challenges and finding answers to some tough questions. But that doesn’t mean we give up, or say this technology isn’t for everyone – it just means we have to work on solutions and realistic designs.

So, what are the common challenges?

Last week we held our first working group to discuss the potential ways AI can improve foundational literacy and numeracy in low- and middle-income countries. We heard from educators, NGOs, donors, developers, implementers, and large assessment companies.

At this event, and at our first convening in October, I facilitated sessions in breakout groups. As well as hearing about inspiring projects and innovative ways people were building AI solutions, participants raised concerns, worries and barriers. It left me thinking that as well as promoting the good, we must tackle these issues head on as a community. I’ll illustrate a couple of issues here, but this initiative came out of a pretty fundamental question…

How do we ensure AI benefits the poorest and most vulnerable?

In the discussion I chaired, people raised a number of potential limitations. For instance, in many schools children do not have access to devices or have low internet connectivity. This means that many potential uses of AI to support student learning, such as creating personalised lesson plans, or AI chatbots interacting with students who need more support, may not be practical. In this scenario AI could deepen inequality, as children in better-resourced schools may have devices and reap the benefits. It poses the question: how do we avoid AI furthering the digital divide?

People in the discussions had still found ways to use AI. For instance, Smart Paper used it to support assessments by collecting written tests and uploading them in bulk once in an area with connectivity. Neurabuild, working in the north of South Africa, took the inverse approach: creating materials while they had a connection, then taking those materials offline for students.

How can we ensure that AI works in a culturally appropriate way?

After all, most of this technology is devised in Europe and North America, and this bias can come through in the languages that AI works in, or the examples it generates. If AI does not work in the local language, how can it work for the benefit of students?

Then there were seemingly simple things, like the places AI talks about or the animals it uses in examples. People in the discussion described tools they have developed to get around this: Robots Mali have connected translation tools to their AI, and Neurabuild have fashioned tools that make examples more relevant to their students.

Will AI for education work in countries with weak governing institutions, or with little budget to invest?

EdTech Hub raised the issue that some countries might not have suitable infrastructure to support AI implementation at scale. In these scenarios, costly solutions may be out of reach for whole countries, or the implementation might only benefit the richer parts of society. Part of our answer to this is sharing tools and open-sourcing code to spread expertise and reduce costs, but surely more work, and more ideas, are needed on how to scale AI for education in low- and middle-income countries.

AI – a blessing and a challenge for my work

On a personal level, I am also interested in the role of AI in building more climate resilient education systems. This is a classic example of AI being a blessing, but bringing challenges.

AI can help us build climate resilience – for example, it can help develop climate curricula tailored to the local context at lower cost. There are also many exciting innovations that can help with sector planning – most obviously, predicting extreme weather such as heat, flooding, and drought.

But I’m also worried that we don’t join up across different challenges. We know that AI requires a lot of energy, and this has the potential to harm the climate: lots of servers means lots of electricity and lots of heat. Again, this has equity implications, as those who use the technology the least often feel its impacts the most.

This is something to keep in mind as we explore more – this doesn’t mean we shouldn’t use these technologies, just that we need to do so with the constraints and challenges in mind.

In essence, it boils down to a simple question: how do we utilise AI as a force for good, rather than further entrenching inequality?

Article by Dr Ian Sullivan, Climate and Education Lead, Fab Inc.

Learning By Doing

We are providing small grants to support the development of AI products & components in LMICs. We know that innovation investment is high-risk. Our aim is that our community can benefit from the lessons learned in these pilots – what works and what doesn’t.

Learn more about our pilot projects here. We will be following each project and reporting on key learnings.


AI-for-Education was set up by Fab Inc. in partnership with Team4Tech. We are grateful to the Bill & Melinda Gates Foundation and the Jacobs Foundation for their support.
