For a dataset this large, I run out of disk space on Google Colab just by unzipping the images. What are some free platforms that allow users to train ML models with a large dataset?
Or is there any way to load only a fraction of the zipped images folder?
Hi,
you can delete the zip file after unzipping it.
Well, the issue is that I'd have to finish unzipping it before I could delete it, but I run out of disk space before the unzipping is over.
I work on Colab too and it doesn't run out of space; maybe it's because you have Drive mounted, I don't know ^^
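On the "load only a fraction" question: Python's standard-library `zipfile` module can list an archive's members and extract (or even read) just a subset of them, so you never need the disk space for the full unzipped dataset. A minimal sketch (the demo archive, file names, and the subset size of 3 are illustrative assumptions, not from the thread):

```python
import os
import zipfile

# Build a tiny demo archive so this sketch is self-contained;
# in practice, point ZIP_PATH at your real dataset archive instead.
ZIP_PATH = "demo_images.zip"
with zipfile.ZipFile(ZIP_PATH, "w") as zf:
    for i in range(10):
        zf.writestr(f"images/img_{i:03d}.jpg", b"fake-jpeg-bytes")

# Extract only a fraction of the members instead of the whole archive.
with zipfile.ZipFile(ZIP_PATH) as zf:
    names = [n for n in zf.namelist() if n.lower().endswith(".jpg")]
    subset = names[:3]                   # e.g. only the first 3 images
    for name in subset:
        zf.extract(name, path="subset")  # writes just these files to disk

    # Or read image bytes straight out of the zip, with no extraction:
    raw = zf.read(names[0])

print(len(subset), sorted(os.listdir("subset/images")))
```

Reading members directly with `zf.read()` also pairs well with a training loop: you can decode each image from bytes on the fly and never materialize the unzipped folder at all.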