Learning by Doing: Research & Development Project


Sharing the benefits of AI more equitably is difficult when only 25% of people in low-income countries have access to the internet, compared to 85% of people in high-income ones. Currently, almost all large language models require good internet access to work properly.

Yet we cannot wait for mass infrastructure overhaul to share the benefits of AI more equitably. That is why Beekee is exploring the limits of what is possible in the context of limited connectivity to ensure learners in under-resourced communities do not get left behind.

The advent of powerful and affordable single-board computers, such as the Raspberry Pi, has created new opportunities to run optimised AI models in areas with limited infrastructure.

Aims of the Research & Development

The Raspberry Pi and similar devices, while increasingly capable, do not match the processing power of large cloud servers and may struggle with complex AI computations. Open-source AI models that are light enough to run on these devices may offer a solution, yet they often come with trade-offs in terms of sophistication and capabilities. The challenge is further compounded by the need for these systems to be power-efficient, given the unreliable electricity supply in many remote areas.

This project aims to identify the most efficient AI models that balance performance with the operational constraints of single-board computers. By addressing these challenges head-on, Beekee will identify AI educational applications that are not only effective but also well-suited to the realities of the target environments.

How will this be done?

Beekee will conduct a thorough review of existing AI educational tools and their impact on learning to pinpoint solutions that can adapt to individual learners, thereby providing personalized assistance. The project has three deliverables that build on each other.

Deliverable 1: Identification of Transformative AI Applications: Identify AI applications with the highest potential to significantly enhance learning experiences in environments where resources are scarce. This task will involve a comprehensive literature review and analysis of existing AI educational tools with an emphasis on intelligent tutoring systems. The selection will be based on how well these tools can be adapted to the constraints and needs of learners.

Deliverable 2: Indexing AI Models for Single-Board Deployment: Compile a state-of-the-art index of existing open-source AI models that are compatible with single-board computers like the Raspberry Pi. The indexing will evaluate the models based on criteria relevant to the educational field, such as computational requirements, ease of use, adaptability, and the specific educational needs they serve.
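The criteria above can be captured in a simple, machine-readable index entry. The schema below is a hypothetical sketch (the field names and the example model are illustrative, not Beekee's actual index):

```python
from dataclasses import dataclass, asdict

@dataclass
class ModelEntry:
    """One row of a hypothetical model index for single-board deployment."""
    name: str                  # model identifier
    params_billion: float      # parameter count, in billions
    min_ram_gb: float          # approximate RAM needed to run it
    licence: str               # open-source licence
    runs_on_pi4_8gb: bool      # fits an 8 GB Raspberry Pi 4?
    educational_uses: tuple    # e.g. ("tutoring", "question answering")

# Illustrative entry; the figures are placeholders, not measurements.
entry = ModelEntry("example-7b", 7.0, 5.0, "Apache-2.0", True, ("tutoring",))
index = [asdict(entry)]
```

Storing entries as plain dataclasses keeps the index easy to export as JSON or CSV for publication alongside the literature review.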

Deliverable 3: Implementation and Assessment on Single-Board Computers: Implement and assess the performance of a select set of promising AI models on the Raspberry Pi. This phase is critical, as it will provide tangible insights into the practicality of running such models in real-world settings. Beekee will measure the models’ performance in terms of response times, accuracy, and stability under the constraints typical of low-resource environments, such as intermittent power and lack of internet connectivity. The outcome of this assessment will guide the refinement of these models to ensure they are optimised for the target context.
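Response-time measurement of this kind can be scripted generically. The sketch below is a minimal benchmarking harness, assuming only that the model is exposed as a Python callable taking a prompt and returning text (for example, a llama.cpp binding); the stand-in lambda is not a real model:

```python
import statistics
import time

def benchmark(generate, prompts, runs=3):
    """Time repeated calls to a local model callable and summarise latency.

    `generate` is any function prompt -> text; on a Raspberry Pi it would
    wrap an on-device inference engine.
    """
    latencies = []
    for prompt in prompts:
        for _ in range(runs):
            start = time.perf_counter()
            generate(prompt)  # output quality is assessed separately
            latencies.append(time.perf_counter() - start)
    latencies.sort()
    return {
        "mean_s": statistics.mean(latencies),
        "p95_s": latencies[int(0.95 * (len(latencies) - 1))],
    }

# Stand-in for a real local model call, used here only to show the interface.
stats = benchmark(lambda p: p.upper(), ["What is 2 + 2?", "Define photosynthesis."])
```

Running the same harness across models and prompt lengths gives directly comparable latency figures for the assessment report.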

The synthesis of these three deliverables will create a robust framework for utilising AI to uplift educational outcomes. The action plan is designed to be iterative and responsive, allowing for adjustments and enhancements as new insights and feedback emerge from each implementation phase.


Results of the literature review, indexing and testing will be made available on the AI-for-Education website.

Project timeline

3 months (February – April 2024)

Current progress

The team has finished mapping out LLMs that can run offline. They have ordered low-cost hardware equipment in preparation for testing those models. In testing, the team will focus on documenting the performance (including output quality) and limitations (such as time taken to produce outputs) on education-related tasks.

Meet the team

The work is led by Sergio Estupiñán and Vincent Widmer, co-founders of Beekee. They have been working with partners in resource-constrained areas. Beekee has created two offline-first devices: the Beekee Box and the Beekee Hub.

What is the biggest challenge you are currently facing?

Sergio Estupiñán: Our goal is to identify practical approaches that allow for the use of LLMs under these hardware limitations, aiming to maintain their usefulness without overstressing the available resources.

Reduced power and memory: The research itself involves reconciling the limited capabilities of single-board computers with the computational demands of LLMs… For instance, existing AI frameworks may have to be adjusted to fit the reduced processing power and memory.
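A quick back-of-the-envelope check makes the memory constraint concrete. The sketch below estimates the RAM footprint of a quantised model from its parameter count and bits per weight; the 20% overhead factor for runtime buffers is an assumption, not a measured figure:

```python
def model_memory_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough RAM estimate for a quantised model.

    weights = params * bits/8 bytes; `overhead` (assumed ~20%) covers
    the KV cache and runtime buffers.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8 * overhead
    return bytes_total / 1e9

# A 7B-parameter model at 4-bit quantisation needs roughly 4.2 GB,
# which rules out a 4 GB Raspberry Pi but may fit an 8 GB one.
estimate = model_memory_gb(7, 4)
```

This kind of estimate helps shortlist models before any hardware testing: anything whose footprint exceeds the board's RAM can be excluded immediately.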

Overheating: Additionally, running LLMs for a continuous period of time can lead to overheating of devices, affecting their performance and longevity.
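Overheating can be monitored in software: Raspberry Pi OS exposes the SoC temperature as millidegrees Celsius under `/sys/class/thermal/thermal_zone0/temp`, and the firmware begins throttling around 80 °C. A minimal sketch of a software back-off check (the 75 °C soft limit is an assumed safety margin, not a Beekee setting):

```python
def parse_millidegrees(raw):
    """Convert the millidegree string from the thermal sysfs file to °C."""
    return int(raw.strip()) / 1000.0

def read_cpu_temp(path="/sys/class/thermal/thermal_zone0/temp"):
    """Read the SoC temperature on Raspberry Pi OS."""
    with open(path) as f:
        return parse_millidegrees(f.read())

def should_pause(temp_c, soft_limit=75.0):
    """Back off generation before the firmware's ~80 °C throttle point."""
    return temp_c >= soft_limit
```

Pausing inference (or inserting cool-down gaps between requests) when `should_pause` returns True keeps the device below the throttling threshold and avoids the performance cliff.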

The big issue is connectivity, but it’s not just about that. Because we work with low-resource communities, we need to be mindful of the electricity consumption too. It’s about identifying the right software and hardware.

What are you hoping to get from the community?

We aim to gather insights, technical advice, and first-hand feedback from the community. This includes model optimisation strategies, guidance on managing infrastructure and hardware, and examples of successful LLM applications on low-power devices.

Advice on tools, libraries, and best practices is also appreciated.

Pilot information

Exploring how to take Large Language Models offline.

Focus: Large Language Models
Based in: Switzerland
Contact: Sergio Estupiñán, Beekee




Learning By Doing

We are providing small grants to support the development of AI products & components in LMICs. We know that innovation investment is high-risk. Our aim is that our community can benefit from the lessons learned in these pilots – what works and what doesn’t.

Learn more about our pilot projects here. We will be following each project and reporting on key learnings.


AI-for-Education was set up by Fab Inc. in partnership with Team4Tech. We are grateful to the Bill & Melinda Gates Foundation and the Jacobs Foundation for their support.
