Can anyone please upload the data to Kaggle in the next two days? It would be very helpful for people like me who don't have much free space on their laptops (only 100 GB or 500 GB) to work on the competition without issues. Thanks for your cooperation. If it isn't done by the end of the week, I'll upload it to Kaggle myself and share it with everyone. Wishing everyone the best of luck in this challenge!
Keeping an eye on this, because the dataset is huge and I can't manage it with my limited storage.
I can do it, but only if the organizers allow it.
I'm not sure if it is allowed by the rules.
Even if you upload it to Kaggle, there's still no way to actually load the dataset on a Kaggle instance. Kaggle gives you 20 GB, but the data is far more than that, so uploading it to Kaggle doesn't necessarily solve the storage problem, in my opinion.
The Kaggle dataset limit is 200GB.
https://www.kaggle.com/docs/datasets#technical-specification
@picekl We can split the data into chunks and fetch them one at a time while working in a Notebook, or a reduced ~100 GB version of the data would also be fine. If you need more space, I can provide a friend's Kaggle account that is empty; we can use it to upload this big dataset in chunks.
Looking forward to your solution, and hoping for the final result and the Kaggle data link 😭
I’ll upload the dataset in chunks so it fits Kaggle — no worries, the link and notebook are coming soon. Think of it like cutting a giant cake… we just slice it so Kaggle doesn’t choke 😄
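For anyone doing the same thing on their own machine, here's a rough sketch of the slicing, assuming one big archive file. The chunk size, file names, and helper names are all placeholders, not anything official:

```python
CHUNK_BYTES = 50 * 1024**3  # stay well under Kaggle's 200 GB dataset limit


def split_file(path, chunk_bytes=CHUNK_BYTES, buf=1024 * 1024):
    """Split one big archive into numbered part files for separate uploads.

    Copies in small buffers so a 50 GB chunk never has to fit in RAM.
    """
    parts = []
    with open(path, "rb") as src:
        idx = 0
        while True:
            data = src.read(buf)
            if not data:
                break  # source exhausted, no empty trailing part
            part = f"{path}.part{idx:03d}"
            with open(part, "wb") as dst:
                written = 0
                while data:
                    dst.write(data)
                    written += len(data)
                    if written >= chunk_bytes:
                        break  # this part is full, start the next one
                    data = src.read(buf)
            parts.append(part)
            idx += 1
    return parts


def join_files(parts, out_path):
    """Reassemble the chunks inside a notebook after attaching the datasets."""
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())
```

Each `.partNNN` file could then go into its own Kaggle dataset, and a notebook would attach them all and run `join_files` before extracting.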
Hi, any update on the upload to kaggle?
@Bone I prepared a script and set it up on my laptop to download the data, and I cleared all my space so I can download every video file. Right now it's estimating 156 hours to load everything from Google to local storage; I managed to download 1,056 files in 7 hours, and I'm keeping it running in the hope of finishing within a week. So if anyone has already downloaded the data, you can ask them to upload it to Kaggle; it would help a lot of people kick-start the competition.
Thank you for going through the stress to help others.
Bro, I see that the bucket has already gone down. Is your download still going smoothly? The videos the organizers compressed are extremely low resolution and the quality is very poor, which hurts feature extraction. I'm really looking forward to hearing how your download is going.
@sisterofpeipei My download has the same issue: the bucket is down and it's still not working.
I've already replied in the official data-update chat asking them to reopen the high-resolution download to keep the contest fair.
Also, @picekl, do you already have the full original dataset? The licence already allows mirroring it to other storage or Kaggle. If you have the full dataset, we can discuss together where to upload it in batches and share it with the other participants.
Using Colab is another way to do this... I prefer the Kaggle platform, but it has problems with some libraries that don't come pre-installed, for example torch_geometric and a few others.
Instead of downloading everything, you might consider processing the data directly from the public bucket.
Here’s a potential workflow that could help with your space constraints: stream one file at a time from the bucket, run your feature extraction on it, then delete the raw file before fetching the next one.
The main benefit is that you only need to store the resulting features locally, not the raw videos.
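One way the fetch-extract-delete loop could look, where `fetch` and `extract` are hypothetical stand-ins for your own bucket download and feature code:

```python
import os


def extract_and_discard(file_ids, fetch, extract):
    """Process files one at a time so raw data never accumulates on disk.

    fetch(file_id) -> local path of the downloaded file (e.g. one video
                      streamed from the public bucket)
    extract(path)  -> a small feature object worth keeping

    Only the features are retained; each raw file is deleted before the
    next one is fetched, so peak disk usage is one file plus the features.
    """
    features = {}
    for fid in file_ids:
        path = fetch(fid)
        try:
            features[fid] = extract(path)
        finally:
            os.remove(path)  # free the space even if extraction fails
    return features
```

With this shape you can cap disk usage at roughly one video at a time, which is what makes the approach workable on a small laptop or a Kaggle instance.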
The constraint in the competition is the No Backpropagation rule, which prevents the use of standard end-to-end deep learning where raw data (like video pixels) directly tunes the entire model.
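To make that rule concrete, here's a toy numpy-only sketch of a pipeline that complies with it: a frozen feature extractor whose weights are never updated, plus a gradient-free classifier on top. This is purely an illustration of "no backpropagation anywhere", not the official baseline; all the shapes and helpers are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "feature extractor": a fixed random projection. Its weights W
# are never updated, so no gradients ever flow back through it.
W = rng.normal(size=(64, 8))


def extract(x):
    # x: batch of raw input vectors (a stand-in for video pixels)
    return np.maximum(x @ W, 0.0)  # frozen ReLU features


# Gradient-free classifier on top: nearest class centroid in feature space.
def fit_centroids(X, y):
    feats = extract(X)
    return {c: feats[y == c].mean(axis=0) for c in np.unique(y)}


def predict(centroids, X):
    feats = extract(X)
    classes = sorted(centroids)
    dists = np.stack(
        [np.linalg.norm(feats - centroids[c], axis=1) for c in classes]
    )
    return np.array(classes)[dists.argmin(axis=0)]
```

The same pattern applies with a pretrained backbone run in inference mode: as long as the extractor stays frozen and the model fit on top needs no gradient updates through it, nothing is tuned end-to-end by the raw pixels.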