Please note that you must sign up for this hackathon with the same email address you used to sign up to Zindi. If you need to change your email address on Zindi, please watch this video on how to edit your profile.
This hackathon is only open to participants who physically attend the Nairobi MCV Hackathon event hosted by Africa’s Talking on the 13th and 14th of October.
If you have any questions, please reach out to Kathleen Siminyu - witty-kitty.
Mozilla Common Voice is an open-source initiative to make voice technology more inclusive. Supported by funding from the Bill & Melinda Gates Foundation, the FCDO and GIZ, we have been growing open-source datasets for African languages in East Africa. Our open-source dataset for Kinyarwanda has been used to build a chatbot that answers COVID-19 questions, and our partners in Uganda are working on a keyword spotter in Luganda.
In 2021, we focused on building an open source data set for Kiswahili and now Mozilla is opening up this competition as part of a series of activities that work towards equipping the technical community in East Africa with skills to build voice technology. This Hackathon, and the wider series of activities, is aimed at increasing the impact of the MCV Kiswahili dataset by fostering a developer community with the skills to take the dataset and make use of it in problem solving.
The objective of the challenge is to train an ASR model for Kiswahili using the MCV Kiswahili data. Beyond building a well-performing model, we are keen on models that have comparable performance across demographic characteristics, particularly age and gender. We will compare the model's overall performance with its performance on evaluation sets representing the following demographics:
You can use an ASR library of your choice to train the Kiswahili model. Here is an example notebook using NVIDIA Nemo and MCV data - Training Esperanto ASR model using Mozilla Common Voice Dataset. You can either use the data to train a model from scratch or to fine-tune a pre-trained model.
Training on GPU is recommended. You can access some GPU resources on platforms such as Colab and Kaggle.
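As an illustration only (not an official baseline), the sketch below uses the Hugging Face transformers library to load an openly available pretrained multilingual checkpoint and transcribe a single clip; the model name and clip path are placeholders, and fine-tuning with NVIDIA NeMo as in the linked notebook is equally valid.

from transformers import pipeline

# Load an open pretrained ASR model; swap in any Kiswahili-capable checkpoint
# or a model you have fine-tuned yourself on the MCV Kiswahili data.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")

# Transcribe one Common Voice clip (mp3 decoding requires ffmpeg on the system).
prediction = asr("clips/common_voice_sw_37214280.mp3")
print(prediction["text"])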
About Mozilla Foundation
 The Mozilla Foundation works to ensure the internet remains a public resource that is open and accessible to us all. We invest in bold ideas, global leaders, and the convening and campaign power of people. For more than two decades, we’ve worked across borders, disciplines, and technologies to fuel a movement to realize the full potential of the internet.
What We Do
Rally Citizens
Issues around the health of the internet impact us all. Mozilla empowers consumers to demand better online privacy, trustworthy artificial intelligence, and safe online experiences from Big Tech and governments.
Connect Leaders
Mozilla supports activists and thought leaders shaping the future of our online lives, advocating for a better internet. We provide funding, mentorship, and connection to a growing community of leaders and innovators.
Shape the Agenda
The internet is a vast ecosystem. Mozilla's research, from our annual Internet Health Report to our white papers, makes sense of all of it. We investigate threats to a healthy internet, from the explosion of online tracking, to increasing internet shutdowns, to the advent of deepfakes. But we also highlight bright spots and provide solutions, from the potential of open data to nascent social media cooperatives.
Mozilla’s research, like its products, is open-source. It’s built with input from experts across the globe. And it’s free for others to re-publish, remix, and use in their own campaigns and curricula.
The evaluation metric for this challenge is Word Error Rate (WER).
You can read more about WER in this Medium article. Here are some libraries to start off with.
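For illustration, assuming the open-source jiwer package (one of several WER implementations), the metric can be computed as follows:

import jiwer

reference = "ni chanzo cha mto"    # ground-truth transcription
hypothesis = "ni chanzo cha moto"  # model output
print(jiwer.wer(reference, hypothesis))  # 0.25: one substituted word out of four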
Prizes will be based on a combination of your leaderboard score (60%) and a demographic score (40%), which is determined by reviewing your submissions.
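As a purely illustrative calculation, assuming both scores are normalised so that higher is better, a leaderboard score of 0.80 and a demographic score of 0.70 would combine to 0.6 × 0.80 + 0.4 × 0.70 = 0.76.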
For every row in the dataset, submission files should contain 2 columns: Path and Sentence.
Your submission file should look like this:
Path                           Sentence
common_voice_sw_37214280.mp3   Miti mizee ya ...
common_voice_sw_37803619.mp3   Ni chanzo cha mto ...
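As a minimal sketch of assembling this file with pandas, assuming the test clips are listed in a Path column of a file named test.csv and that transcribe stands in for your own model's inference:

import pandas as pd

def transcribe(path):
    # Placeholder: replace with your model's Kiswahili transcription for this clip.
    return ""

test = pd.read_csv("test.csv")                     # assumed filename and column layout
test["Sentence"] = test["Path"].apply(transcribe)  # one predicted sentence per clip
test[["Path", "Sentence"]].to_csv("submission.csv", index=False)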
1st prize: $1 000 USD
2nd prize: $500 USD
3rd prize: $200 USD
There are 500 Zindi points available. You can read more about Zindi points here.
Prizes will only be paid to users currently living in Kenya.
This challenge starts on 14 October 2023 at 9:00 AM.
Competition closes on 15 November 2023 at midnight.
Final submissions must be made by 23:59 EAT on 15 November 2023.
The private leaderboard and winners will be announced on Saturday 18th November at Swahilipot Hub, Mombasa, Kenya.
We reserve the right to update the contest timeline if necessary.
How to enroll in your first Zindi competition
How to create a team on Zindi
How to update your profile on Zindi
How to use Colab on Zindi
How to mount a drive on Colab
This challenge is only open to those who registered via the Pwani Innovation Week MCV Hackathon I, held at Swahilipot Hub in Mombasa, Kenya.
Teams and collaboration
You may participate in competitions as an individual or in a team of up to four people. When creating a team, the team must have a total submission count less than or equal to the maximum allowable submissions as of the formation date. A team will be allowed the maximum number of submissions for the competition, minus the total number of submissions among team members at team formation. Prizes are transferred only to the individual players or to the team leader.
Multiple accounts per user are not permitted, and neither is collaboration or membership across multiple teams. Individuals and their submissions originating from multiple accounts will be immediately disqualified from the platform.
Code must not be shared privately outside of a team. Any code that is shared must be made available to all competition participants through the platform (i.e. on the discussion boards).
The Zindi data scientist who sets up a team is the default Team Leader but they can transfer leadership to another data scientist on the team. The Team Leader can invite other data scientists to their team. Invited data scientists can accept or reject invitations. Until a second data scientist accepts an invitation to join a team, the data scientist who initiated a team remains an individual on the leaderboard. No additional members may be added to teams within the last hour of a hackathon.
The team leader can initiate a merge with another team. Only the team leader of the second team can accept the invite. The default team leader is the leader from the team who initiated the invite. Teams can only merge if the total number of members is less than or equal to the maximum team size of the competition.
A team can be disbanded if it has not yet made a submission. Once a submission is made individual members cannot leave the team.
All members in the team receive points associated with their ranking in the competition and there is no split or division of the points between team members.
Datasets and packages
The solution must use publicly-available, open-source packages only.
You may use only the datasets provided for this competition. Automated machine learning (AutoML) tools are not permitted.
You may use pretrained models as long as they are openly available to everyone.
You are allowed to access, use and share competition data under a CC0 license.
You must notify Zindi immediately upon learning of any unauthorised transmission of or unauthorised access to the competition data, and work with Zindi to rectify any unauthorised transmission or access.
Your solution must not infringe the rights of any third party and you must be legally entitled to assign ownership of all rights of copyright in and to the winning solution code to Zindi.
Submissions and winning
You may make a maximum of 10 submissions per day.
You may make a maximum of 100 submissions for this competition.
Before the end of the competition you need to choose 2 submissions to be judged on for the private leaderboard. If you do not make a selection your 2 best public leaderboard submissions will be used to score on the private leaderboard.
During the competition, your best public score will be displayed regardless of the submissions you have selected. When the competition closes your best private score out of the 2 selected submissions will be displayed.
Zindi maintains a public leaderboard and a private leaderboard for each competition. The test set is used to score the public leaderboard and the eval set is used for the private leaderboard.
Note that to count, your submission must first pass processing. If your submission fails during the processing step, it will not be counted and not receive a score; nor will it count against your daily submission limit. If you encounter problems with your submission file, your best course of action is to ask for advice on the Competition’s discussion forum.
If you are in the top 5 at the time the leaderboard closes, we will email you to request your code. On receipt of the email, you will have 2 hours to respond and submit your code following the "Reproducibility of the submitted code" guidelines detailed below. Failure to respond will result in disqualification.
If your solution places 1st, 2nd, or 3rd on the final leaderboard, you will be required to submit your winning solution code to us for verification, and you thereby agree to assign all worldwide rights of copyright in and to such winning solution to Zindi.
If two solutions earn identical scores on the leaderboard, the tiebreaker will be the date and time in which the submission was made (the earlier solution will win).
The winners will be paid via bank transfer, PayPal if payment is less than or equivalent to $100, or other international money transfer platform. International transfer fees will be deducted from the total prize amount, unless the prize money is under $500, in which case the international transfer fees will be covered by Zindi. In all cases, the winners are responsible for any other fees applied by their own bank or other institution for receiving the prize money. All taxes imposed on prizes are the sole responsibility of the winners. The top winners or team leaders will be required to present Zindi with proof of identification, proof of residence and a letter from your bank confirming your banking details. Winners will be paid in USD or the currency of the competition. If your account cannot receive US Dollars or the currency of the competition then your bank will need to provide proof of this and Zindi will try to accommodate this.
Please note that due to the ongoing Russia-Ukraine conflict, we are not currently able to make prize payments to winners located in Russia. We apologise for any inconvenience that may cause, and will handle any issues that arise on a case-by-case basis.
Payment will be made after code review and sealing the leaderboard.
You acknowledge and agree that Zindi may, without any obligation to do so, remove or disqualify an individual, team, or account if Zindi believes that such individual, team, or account is in violation of these rules. Entry into this competition constitutes your acceptance of these official competition rules.
Zindi is committed to providing solutions of value to our clients and partners. To this end, we reserve the right to disqualify your submission on the grounds of usability or value. This includes but is not limited to the use of data leaks or any other practices that we deem to compromise the inherent value of your solution.
Zindi also reserves the right to disqualify you and/or your submissions from any competition if we believe that you violated the rules or violated the spirit of the competition or the platform in any other way. The disqualifications are irrespective of your position on the leaderboard and completely at the discretion of Zindi.
Please refer to the FAQs and Terms of Use for additional rules that may apply to this competition. We reserve the right to update these rules at any time.
A README markdown file is required
It should cover:
Your code needs to run properly; code reviewers do not have time to debug code. If your code does not run easily, you will be bumped down the leaderboard.
Consequences of breaking any rules of the competition or submission guidelines:
Monitoring of submissions