Promoting Gender Equity in Kenya’s Digital Landscape: A Conversation with Lilian Olivia Orero
As public discourse around the implications of AI and technology in the Global South ramps up, Rodrigo Ugarte, editor of the Just Tech platform, and Just Tech Program Assistant An Doan interviewed Lilian Olivia Orero about AI and digital technology regulation and policy in Africa broadly, and in Kenya specifically, with particular focus on their impact on women and on how organizations and advocates can push governments to include protections as legislation is drafted.
Orero is a lawyer in Kenya whose research specializes in gender, technology, and law, focusing on cybersecurity, women’s online safety, and gender bias in AI across Sub-Saharan Africa. She is also the founder of SafeOnline Women Kenya (SOW-Kenya).
Rodrigo Ugarte (RU): You’re an attorney doing advocacy work, so we wanted to learn more about your trajectory. How did you get involved and interested in tech and AI policy?
Lilian Olivia Orero (LOO): I am a gender and technology lawyer, and an advocate of the High Court of Kenya. One would expect that my professional trajectory would be to represent clients in court. However, as my legal career progressed, I got the opportunity, sometime in 2020, to work at Adanian Labs, a Pan-African venture-building studio that focuses on nurturing and scaling tech startups in Africa. At this technology studio, I provided legal and compliance advice to a diverse range of startups in fintech, edtech, agritech, healthcare tech, and e-commerce. I found that super exciting, because I was learning a lot of interesting ideas and concepts around technology that I never studied in school. Based on this experience, I also developed a keen interest in data governance, and I was fortunate enough to participate in one of the research sprints of the Berkman Klein Center for Internet and Society at Harvard University, which sparked my interest in digital identity risks and how they affect marginalized communities.
So, I would say that both my entry into the research sphere and my advocacy work began there. Once I started doing research, particularly the research I've done on gender-inclusive AI in Sub-Saharan Africa, I asked myself, how can I put into action what I have been researching? That is how I became interested in starting my own organization, SafeOnline Women Kenya (SOW-Kenya). But this was not the only inspiration for founding the organization. I also experienced cyberbullying in response to some of my work.
That is why I decided to start SOW-Kenya. Currently, I am pursuing my master's in law, innovation, and technology at the University of Bristol, because I really wanted to expand my expertise by looking at the broader ecosystem of AI and tech policy.

RU: Thank you, and I’m sorry this happened. In that vein, can you talk to us about your work with SOW-Kenya and its goals?
LOO: SafeOnline Women Kenya is a community-based organization founded in September 2023 on the idea of providing a safe environment where young women and girls can gain the digital literacy skills to navigate online spaces. Having experienced cyberbullying, I realized that, even though I was educated and knew what online safety was about, I still faced this form of violence. What about people who do not have access to technology? Or who, even when they do have access, do not know that they are experiencing a form of technology-facilitated gender-based violence? Because of the increasing numbers of women who face online stalking, cyberbullying, body-shaming, or doxxing in Kenya, I saw an opportunity to establish an organization that would provide a safe space. The first goal of SOW-Kenya is to provide digital literacy. We have developed our own curriculum, the TechSecure Literacy Curriculum. In a very simplified form, it explains complex technological terms so that even women who have not advanced far in their education can understand when I ask, "What is a deepfake? What is AI?"
Coming up with the curriculum has not been easy, because explaining online safety in all the different local dialects we have in Kenya is a challenge. So far, we have the curriculum only in simple English, and we are in the process of translating it into Kiswahili, our national language. We are also having discussions about translating it into local dialects, because in instances of online violence, the words used are sometimes in languages culturally unique to Kenya. Thus, digital literacy is the first goal. The second goal is using innovation for good. We are developing a mobile application called SafeHer. It uses anti-cyberbullying algorithms to aid in reporting accounts that consistently spew hate, not only in English but also in our local dialects.
The challenge here is how to integrate this application with social media platforms like Meta, X, or even TikTok, because while those platforms have their own reporting mechanisms, most of the time the reports are never taken seriously. With this application on board, if, for instance, over a thousand people keep reporting one account, then Meta might take it seriously. The mobile application is also supposed to integrate the materials we have in our training, in a simplified form, instilling the concepts of online safety in its users. It would be like a resource hub where you have access to steps for keeping yourself safe and can internalize them. It is also supposed to foster a community of people who have faced similar forms of online violence by linking them with psychosocial support. This would combine several features. First, it would include a way for users to find counselors or mental health professionals who specialize in trauma caused by online violence, through a directory or referral system, with either in-app messaging or scheduled sessions. Second, I'd want the app to have a forum or community space where people can share their stories and find others who have faced similar experiences, so survivors feel less alone and are supported. It's important to me that this space feels safe, so there would be strict privacy controls in place. So, we are also working on how to have professionals guide survivors of online violence through our app.
The third goal of SOW-Kenya is advocacy for policy and legislative change. If you look at Kenya's legislation on computer misuse, we have the Computer Misuse and Cybercrimes Act, passed in 2018. At the time, we did not have definitions of, or even discussions around, cyberbullying. One of the core things SOW-Kenya is trying to do is explain to the public, policymakers, and legislators that we have a serious problem of online violence, and we need our laws to change accordingly.
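To make the escalation idea Orero describes concrete, the sketch below shows one way an app could aggregate individual reports against an account and flag it for escalation to a platform once a threshold is crossed. It is a minimal illustration only: the ReportTracker class, the threshold value, and the escalation summary are hypothetical and do not describe the actual SafeHer application.

```python
from collections import defaultdict

# Hypothetical illustration of the report-aggregation idea described above:
# individual reports against an account are collected, and once enough unique
# people have reported it, the case is escalated to the platform's own channel.
ESCALATION_THRESHOLD = 1000  # e.g., "over a thousand people keep reporting one account"


class ReportTracker:
    def __init__(self, threshold: int = ESCALATION_THRESHOLD):
        self.threshold = threshold
        self.reports = defaultdict(set)  # account -> set of users who reported it

    def add_report(self, account: str, reporter: str) -> bool:
        """Record one report; return True once the account should be escalated."""
        self.reports[account].add(reporter)  # a set prevents double-counting a reporter
        return len(self.reports[account]) >= self.threshold

    def escalate(self, account: str) -> dict:
        """Build a summary that could be handed to a platform's reporting channel."""
        return {
            "account": account,
            "unique_reporters": len(self.reports[account]),
            "action": "submit consolidated report to platform",
        }


# Usage sketch with a small threshold for demonstration
tracker = ReportTracker(threshold=3)
for user in ["amina", "wanjiru", "achieng"]:
    if tracker.add_report("@abusive_account", user):
        print(tracker.escalate("@abusive_account"))
```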
RU: I am curious about the curriculum. What does the dissemination plan look like? Are you planning on collaborating with schools, other nonprofit organizations, or NGOs?
LOO: The TechSecure Literacy Curriculum is supposed to go beyond SOW-Kenya. Our target audience is women between the ages of 13 and roughly 35, because people in this age range are often on social media platforms. Therefore, we've sought to collaborate with schools, high schools, and universities. We have tried to use our social media platforms to launch the curriculum. However, when we carried out the first two training sessions, we could accommodate only 15 women in each, and interest was overwhelming, more than triple what we expected. Due to a lack of resources, we were not able to host sessions large enough to meet that demand. Previously, we've been invited by universities to speak about SOW-Kenya's work during campaigns such as the 16 Days of Activism. When such opportunities come about, we try to talk about the curriculum. But so far, I would say we've not progressed much further in terms of next steps. We are looking for partnerships and collaborators.

An Doan (AN): I wanted to circle back and talk more about your work in tech policy. Can you describe the policy space in Kenya?
LOO: When we talk about tech policy in Kenya, we have to place it within the broader scope of tech in Africa. Kenya has been called the Silicon Savannah of Africa because many Kenyans are interested and participate in the tech world. We've seen Big Tech companies, such as Google, open offices in Nairobi, which shows how the world is interested in investing in Africa, and particularly Kenya, because a majority of Kenyans speak English and often work in the gig economy. These Big Tech companies have employed Kenyans as content moderators or fact-checkers. However, we've had issues concerning labor laws, where content moderators have been dismissed without notice, leading to court cases that are still ongoing. Hundreds of former content moderators have sued Meta and OpenAI over the mental health impact of viewing gruesome content and over the low wages they were paid.
When tech companies look at Kenya, they're thinking about getting cheap labor. It's easy for tech companies to target Kenyans because of our high unemployment rate, which makes it easier for people to accept the opportunities that come up in this space. Thus, when we talk about tech policy, most of the time we are thinking about what is not being done or what problems Kenya can solve. However, we often neglect the fact that we also have African innovators who are not only consumers of technology but also big players. When we look at the policy space in Kenya, we have legislation such as the Data Protection Act of 2019, but that act was mostly a copy of the European Union's General Data Protection Regulation (GDPR). So, as Kenya plans its national AI strategy, will we also simply copy the current EU AI Act?
The conversations are still ongoing, and we have a draft National AI Strategy. As we speak, we have just had some stakeholder engagement, reaching out to government, the tech world, civil society, and the public to see what AI law looks like in our context, which is different from the West's because we live by the Ubuntu philosophy.
This philosophy says, "I am because we are." It believes in living in community. Growing up, you would see, even at the village level, that it's not an individualistic society. We might not really know each other, but because you are my neighbor, I'm here to help you. In the context of AI policy, we are trying to see how we can apply the same ethics and develop responsible AI for Kenya. Those are the conversations we are currently having.
RU: What do you think AI policy in Kenya should look like in terms of gender and the inclusion of women and their circumstances?
LOO: At the moment, if I compare how gender was included in our previous legislation, most of the time gender is added as a form of tokenism, to show that everyone has been included. But when it comes to implementation, little to nothing is done. My hope is that, as Kenya moves forward with planning its national AI strategy, the AI technologies developed and/or deployed in Kenya protect marginalized communities such as women. This would begin, first, during the design phase: are there any forms of bias in the data being used? And if so, how can we do away with them?
The second step would be how the strategy deals, for instance, with the need for accountability. When you talk about regulatory mechanisms for AI transparency, it means the people involved should be able to disclose the origins of the data used and how it impacts women and marginalized communities. Then, as these discussions happen, there's a multi-stakeholder side. However, based on previous experiences, I look forward to seeing more marginalized people involved in these discussions. I'd just like to add that the stereotype has been that women do not really understand what AI is, that they think AI is maybe magic, and so they don't bring much to the discussion. But I would counter that and say we have brilliant women who are leading startups in Kenya and who are innovators; they've just not had the platform to share their work.
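To illustrate the design-phase question about bias in data, a minimal sketch might simply compare how groups are represented in a dataset and how outcomes differ across them before a model is built. The column names, toy data, and the 0.8 rule-of-thumb threshold below are assumptions chosen for illustration, not part of any Kenyan framework or of Orero's proposals.

```python
import pandas as pd

# Hypothetical illustration of a design-phase bias check: compare group
# representation and outcome rates in training data. The columns "gender"
# and "approved" are assumed names, not a real dataset.


def representation_and_outcome_rates(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.DataFrame:
    """Share of rows and positive-outcome rate per group."""
    summary = df.groupby(group_col).agg(
        share_of_data=(outcome_col, "size"),
        positive_outcome_rate=(outcome_col, "mean"),
    )
    summary["share_of_data"] = summary["share_of_data"] / len(df)
    return summary


def disparate_impact_ratio(summary: pd.DataFrame) -> float:
    """Ratio of the lowest to the highest positive-outcome rate across groups."""
    rates = summary["positive_outcome_rate"]
    return rates.min() / rates.max()


# Usage sketch with toy data
data = pd.DataFrame({
    "gender": ["female", "female", "male", "male", "male", "female"],
    "approved": [0, 1, 1, 1, 1, 0],
})
summary = representation_and_outcome_rates(data, "gender", "approved")
print(summary)
print("Disparate impact ratio:", round(disparate_impact_ratio(summary), 2))
# A ratio well below ~0.8 (a common rule of thumb) would flag the data for review.
```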
RU: How would you involve these stakeholders in these conversations? What would be the ideal way of making Kenya’s AI policy more inclusive of women and considering their circumstances?
LOO: First would be embedding gender-sensitive practices into AI governance frameworks as a policy strategy, for instance by looking at how gender issues intersect and manifest in socioeconomic or cultural settings, because women from diverse backgrounds have distinct circumstances that need to be taken into consideration. A woman in Nairobi would not be impacted by AI regulation in the same way as a woman in a rural area. To accomplish this, it would be essential to have transparent and accountable data practices, especially clarity about where we are getting our data, to avoid perpetuating prejudices, as I mentioned before. AI technologies, if deployed thoughtfully and ethically, can address a wide range of social, economic, and environmental issues in Kenya, such as healthcare quality, agricultural productivity, education, financial inclusion, and gender-based violence prevention.
The other thing policymakers should do is work together with grassroots organizations. Most of the time, when we speak about data, it's grassroots organizations that have gathered it, because they interact one-on-one with the real-life experiences of the women and girls involved. Policymakers should engage with organizations like SOW-Kenya to ensure that our national AI framework considers diverse perspectives. And when it comes to Big Tech, policymakers should be cautious about the influence it has on how we shape our AI regulations, because there have been instances where Big Tech's interests have entered the conversation. Because of its dominance and the way investment flows into Kenya, policymakers are automatically tempted to go with what Big Tech is saying as opposed to what citizens want.
When we talk about AI policy in Africa, we need to be aware that Africa is a diverse continent, so what might work for West Africa might not work for East or Southern Africa. The mistake that normally happens is that when people talk about AI policy at, say, the African Union level, where there is a continental AI strategy, the challenge has been how different member states implement such a strategy, because it does not cover the nuances of their different contexts.
RU: To wrap up our interview, I wanted to ask which people, organizations, books, or other literature have influenced or inspired you in this line of work.
LOO: The book Responsible AI in Africa, coedited by Damian Okaibedi Eke, Kutoma Wakunuma, and Simisola Akintoye, has been very instrumental in the work I do, because it explains Africa's different dynamics and how Africa is often portrayed as technologically underdeveloped, when this is rarely the case. The book tries to demystify the perceptions and narratives that link Africa to poverty rather than to innovation. Even when global discussions around AI and tech policy take place, Africa comes last, yet we are such a huge continent, and the data used in most cases often comes from the Global South. Even the people who work for Big Tech, such as content moderators, come from Kenya, for example. So, the book really gives a very good picture of what AI and tech policy in Africa should look like.
Regionally, I look to organizations like the Kenya ICT Action Network (KICTANet) with regard to policy. Among international organizations, the work UN Women is doing in Kenya on technology-facilitated gender-based violence has also been instrumental to my work. Based on my experience with national, regional, and international organizations, I have aligned myself with those that treat Africa as a partner, bringing Africa on board in these discussions and asking how we can collaborate instead of following a top-down approach.