Barbados Traffic Analysis Challenge

Helping Barbados
$11 000 USD
Completed (2 months ago)
Computer Vision
Prediction
Object Tracking
Video Analytics
Deep Learning
1836 joined
222 active
Start
Nov 07, 25
Close
Jan 25, 26
Reveal
Jan 26, 26
MuhammadQasimShabbeer
Engmatix
Data Upload to kaggle request
Help · 8 Nov 2025, 17:00 · 16

Can anyone please upload the data to Kaggle within the next two days? It would be very helpful for people like me who don't have much free space on their laptops (500 GB or even 100 GB) and want to work on the competition without issues. Thanks for your cooperation. If it isn't done by the end of the week, I'll upload it to Kaggle myself and share it with everyone. Wishing everyone the best of luck in this challenge.

Discussion 16 answers
Ojuka
Mount Kenya University

On the lookout too, because the dataset is huge; I can't manage it with my limited storage.

8 Nov 2025, 17:33
Upvotes 0

I can do it, but only if organizers allow it.

I'm not sure if it is allowed by the rules.

8 Nov 2025, 18:24
Upvotes 1

Even if you upload it to Kaggle, there is still no way to actually load the whole dataset on a Kaggle instance. Kaggle gives 20 GB of disk, but the data is far larger than that, so uploading to Kaggle doesn't necessarily solve the storage problem, in my opinion.

8 Nov 2025, 19:32
Upvotes 0
MuhammadQasimShabbeer
Engmatix

@picekl We can split the data into chunks and, while working in a notebook, fetch those chunks as needed; a reduced ~100 GB version of the data would also be fine. If you need more space, I can provide a friend's empty Kaggle account that we can use to upload this big dataset in chunks.

Looking forward to your solution, and hoping for the final result and a Kaggle data link 😭

MuhammadQasimShabbeer
Engmatix

I’ll upload the dataset in chunks so it fits Kaggle — no worries, the link and notebook are coming soon. Think of it like cutting a giant cake… we just slice it so Kaggle doesn’t choke 😄
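The slicing step described above can be done with the standard GNU `split` utility; a minimal sketch using a small placeholder file (for the real dataset you would split the actual archive with something like `split -b 15G dataset.tar.gz dataset.part.`):

```shell
# Demo with a small placeholder file standing in for the real archive.
printf 'example-bytes-0123456789' > dataset.bin

# Split into 8-byte pieces with numeric suffixes (dataset.part.00, .01, ...).
split -b 8 --numeric-suffixes dataset.bin dataset.part.

# On the Kaggle side, reassemble the pieces and confirm nothing was lost.
cat dataset.part.* > rebuilt.bin
cmp dataset.bin rebuilt.bin && echo "chunks reassemble cleanly"
```

Each piece can then be uploaded as a separate Kaggle dataset file and concatenated back inside a notebook.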

Hi, any update on the upload to kaggle?

MuhammadQasimShabbeer
Engmatix

@Bone I prepared a script and set it running on my laptop to download the data; I cleared all my space so I can download every video file. Right now it estimates 156 hours to pull all the data from Google to local storage. I was able to download 1,056 files in 7 hours, and I'm keeping it running in the hope of finishing within a week. So if anyone has already downloaded the data, you could ask them to upload it to Kaggle; it would help a lot of people kick-start the competition.

Thank you for going through the stress to help others.

Bro, I see that the bucket has already gone down. Is your download still going smoothly? The resolution of the videos compressed by the organizer is extremely low, and the quality is very poor, which affects feature extraction. I'm really looking forward to hearing how your download is going.

MuhammadQasimShabbeer
Engmatix

@sisterofpeipei My download has the same issue: the bucket keeps going down, and it's still not working.

I've already replied in the official data-update chat asking them to reopen the high-resolution download to keep the contest fair.

Also, @picekl, do you already have the full original dataset? Its licence already allows mirroring to other storage or Kaggle. If you have it, we can discuss together where to upload it in batches and share it with other participants.

Using Colab is another way to do this... I prefer the Kaggle platform, but it has problems with some libraries, for example torch_geometric and others that don't come installed by default. (translated from Spanish)

8 Nov 2025, 21:36
Upvotes 1

Instead of downloading everything, you might consider processing the data directly from the public bucket.

Here’s a potential workflow that could help with your space constraints:

  • Process in Chunks: You can download videos one-by-one or in small chunks.
  • Extract Features: Immediately extract the necessary features (the data you'll actually use for training).
  • Delete the Raw Video: Once the features are extracted and saved, you can delete the raw video file to free up the space before moving to the next chunk.

The main benefit is that you only need to store the resulting features locally, which are typically a small fraction of the raw video size.
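The chunked workflow above can be sketched in Python. Here `download_video` and `extract_features` are hypothetical placeholders (simulated with local byte blobs and a hash); in practice they would stream one video from the public bucket and run your actual feature extractor:

```python
import hashlib
import os
import tempfile

def download_video(name: str, dest_dir: str) -> str:
    """Placeholder: the real version would stream one video from the
    public bucket. Here we just write dummy bytes to a local file."""
    path = os.path.join(dest_dir, name)
    with open(path, "wb") as f:
        f.write(name.encode() * 1000)  # stand-in for video bytes
    return path

def extract_features(path: str) -> str:
    """Placeholder: a real pipeline would run a (frozen) model here.
    A file hash serves as a stand-in 'feature'."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def process_in_chunks(video_names):
    """Download, featurize, and delete one video at a time, so peak
    disk usage is a single video plus the small feature table."""
    features = {}
    with tempfile.TemporaryDirectory() as workdir:
        for name in video_names:
            raw = download_video(name, workdir)     # 1. fetch one video
            features[name] = extract_features(raw)  # 2. keep only features
            os.remove(raw)                          # 3. free space immediately
    return features

feats = process_in_chunks(["clip_000.mp4", "clip_001.mp4"])
print(len(feats))  # only the small feature table remains on disk
```

The same loop works with chunks of several videos at a time if your extractor benefits from batching.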

10 Nov 2025, 10:49
Upvotes 2

The constraint in the competition is the No Backpropagation rule, which prevents the use of standard end-to-end deep learning where raw data (like video pixels) directly tunes the entire model.