Same here!
Enlighten us, please🙃
Hey @Kuala,
2nd Place Approach,
Ever since I was introduced to fastai through @Professor's wonderful notebooks, I have been in love with it. At first I doubted its performance, since not many winning solutions on Kaggle use the framework, but seeing Victor in the top 10 of most DL competitions with it motivated me.
Jeremy Howard, the founder of fastai, says you have to find a way to iterate quickly over your experiments. At first I had trouble doing that, but once I teamed up with my brother @J0NNY everything became easy. He found a way to resize the images, which significantly cut our experiment time, so we could iterate over multiple experiments quickly.
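The pre-resizing step can be sketched like this with Pillow (a minimal sketch, assuming a flat folder of JPEGs; the paths, target size, and quality are illustrative, not our exact settings):

```python
from pathlib import Path
from PIL import Image

def resize_dataset(src_dir, dst_dir, size=(384, 384)):
    """Resize every image once up front so each training epoch
    skips the repeated decode-and-resize cost."""
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    for img_path in sorted(Path(src_dir).glob("*.jpg")):
        img = Image.open(img_path).convert("RGB")
        img.resize(size).save(dst / img_path.name, quality=95)
```

Running this once before training means the dataloader only ever reads small, uniformly sized images.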
We used heavy augmentation thanks to the albumentations library, including:
Resize, Rotate, Flip, ShiftScaleRotate, HueSaturationValue, RandomBrightnessContrast, RandomGamma
One interesting augmentation was RandomGridShuffle. Even though it was not in our best single notebook (a ViT that scored 0.9794), it was crucial in our other notebooks that used different models.
I think models were crucial in this competition, and not only here: transformers seem to be taking over the DL space in general. Jeremy has a very cool notebook on the best models to fine-tune, and everyone should take a look at it via this link:
https://www.kaggle.com/code/jhoward/the-best-vision-models-for-fine-tuning/notebook
ConvNeXts, ViTs, and Swins are the best models for fine-tuning, so that's what we used and then ensembled.
Cross-validation: my bro @J0NNY used StratifiedKFold while I used a simple random split, and then we ensembled both our approaches.
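With scikit-learn, the difference between the two splitting strategies looks roughly like this (toy labels for illustration; in the competition these would be the image classes):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, train_test_split

# Imbalanced toy dataset: 80 negatives, 20 positives.
y = np.array([0] * 80 + [1] * 20)
X = np.arange(len(y)).reshape(-1, 1)

# StratifiedKFold preserves the class ratio in every validation fold.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for train_idx, val_idx in skf.split(X, y):
    assert y[val_idx].mean() == 0.2  # exactly 20% positives per fold

# A simple random split only approximates the class ratio.
X_tr, X_val, y_tr, y_val = train_test_split(
    X, y, test_size=0.2, random_state=42
)
```

Ensembling models trained on the two different split schemes gives the blend some extra diversity.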
I hope this helps.
Lastly thank you @J0NNY for teaming up with me. You have accelerated my machine learning and deep learning journey in ways I could have not imagined. Thank you @Professor you inspire me and it was great competing with you.
@Koleshjr this is gold! Thanks for sharing.
Do you have any recommended resources on ensembling neural networks? Can't say I know how to do that.
We never did complex ensembling techniques, just weighted averaging based on the local scores and the public LB scores.
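Weighted averaging amounts to a few lines of NumPy (a sketch: the weights here are illustrative, e.g. derived from each model's score, not our exact values):

```python
import numpy as np

def weighted_average(prob_list, weights):
    """Blend per-model class probabilities with normalized weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the blend stays a probability
    return sum(wi * p for wi, p in zip(w, prob_list))

# Two models' predicted probabilities for 3 samples x 2 classes.
p_vit = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
p_swin = np.array([[0.8, 0.2], [0.4, 0.6], [0.3, 0.7]])

# Weight each model by its validation/LB score.
blended = weighted_average([p_vit, p_swin], weights=[0.9794, 0.9750])
```

Since the weights are normalized, each row of `blended` still sums to 1 and can be argmaxed for the final prediction.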
Oh. Never thought about this. Smart! Thanks.
What a hackathon! 😅 It's great to hear that sharing some of my work inspired you @Koleshjr.
Congratulations to you and your teammate on an amazing win! More wins ahead 🥂
Nice approach, thanks for sharing, this is helpful.
May I ask what the score of a single model was, without the ensemble?
Congrats to you and to JONNY.
Kay0, I guess this video on youtube can help you understand ensembling/blending: https://www.youtube.com/watch?v=ISZYWtvoCAc&t=2s
0.9794 with vit_base_patch16_384
That's great. Again, thanks for sharing, and good luck in the next competitions.