
Your Voice, Your Device, Your Language Challenge

Helping Africa · 1 000 CHF · Challenge completed ~1 month ago
Tags: Automatic Speech Recognition · Natural Language Processing
278 joined · 73 active
Start: Jul 22, 2025 · Close: Sep 22, 2025 · Reveal: Sep 22, 2025
Clarification Needed: Does the T4 GPU Constraint Apply to Training or Just Inference?
Help · 8 Aug 2025, 09:25 · 2

Hello fellow Zindians,

I’m seeking clarification about the following requirement:

*"To reflect the realities of deploying AI in low-resource environments, your solution must run on a single NVIDIA T4 GPU with ≤16 GB RAM, such as what’s available through Google Colab’s free tier."*

Question: Does this constraint apply to:

  1. Both fine-tuning/training and inference? (i.e., the entire pipeline must run on a T4), or
  2. Only inference (after training on more powerful hardware)?

I want to ensure my approach aligns with the rules. If anyone has insights or has asked organizers directly, I’d appreciate your input!
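As a rough sanity check (my own sketch, not anything from the organizers), you can estimate whether a model's weights fit in a T4's 16 GB of VRAM from its parameter count. The `overhead` factor below is an assumption covering activations, the KV/decoder cache, and framework buffers; real usage varies by model and batch size.

```python
def fits_on_t4(num_params: float,
               bytes_per_param: int = 2,   # fp16/bf16 weights
               overhead: float = 1.5,      # assumed headroom for activations/buffers
               vram_gb: float = 16.0) -> bool:
    """Rough estimate of whether inference for a model of this size
    fits in the stated 16 GB T4 budget. Back-of-the-envelope only."""
    needed_gb = num_params * bytes_per_param * overhead / 1e9
    return needed_gb <= vram_gb

# Whisper-large-v3 (~1.55B params) in fp16: ~4.7 GB estimated, fits
print(fits_on_t4(1.55e9))   # True
# A 20B-parameter model in fp16: ~60 GB estimated, does not fit
print(fits_on_t4(20e9))     # False
```

If an estimate is borderline, quantization (int8/int4) or a smaller checkpoint is the usual way to get under the limit.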

Discussion · 2 answers
msamwelmollel
University of Glasgow

Just inference!

8 Aug 2025, 18:18
Upvotes 0