I have implemented my model with TensorFlow but I'm getting terrible scores.
I have tried:
* Class weights to handle the imbalanced dataset
* Training with different learning rates and convolution configurations
* Training from scratch and transfer learning from mobileNetV2
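For context on the class-weights point, the usual approach is inverse-frequency weighting; a minimal sketch (with hypothetical label counts) of how such weights are computed before being passed to Keras's `model.fit(..., class_weight=...)`:

```python
from collections import Counter

def balanced_class_weights(labels):
    """Inverse-frequency weights: weight_c = n_samples / (n_classes * count_c).
    The resulting dict can be passed to Keras via
    model.fit(..., class_weight=weights)."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * cnt) for c, cnt in counts.items()}

# Hypothetical imbalanced labels (0 = healthy, 1 = damaged): 90 vs 10
labels = [0] * 90 + [1] * 10
weights = balanced_class_weights(labels)
# the minority class gets a proportionally larger weight
```

This matches the "balanced" heuristic: the rarer a class, the more each of its samples contributes to the loss.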
Here's my notebook: https://github.com/Sciederrick/zindi-cgiar/blob/main/cgiar-crop-damage-classification-with-tf.ipynb
Isn't the competition over?
Hi Derrick, from my experience working on this competition, using class weights makes the model prone to overfitting. Some personal tricks that worked for me to some degree:
1. Definitely use transfer learning
2. Instead of class weights, try focal loss, although plain cross-entropy worked better for me.
3. Instead of randomly changing the LR, use an LR finder as a starting point, then use some kind of scheduler (e.g. one-cycle) to gradually decay your LR.
4. Freeze all layers except the final layer and train for a few epochs, then fine-tune by unfreezing the entire model (or some of the layers) for the remainder of the epochs.
5. Try different LRs for different layers of the model. Earlier layers can have a very small LR so fine-tuning doesn't greatly affect what they have already learnt.
These tricks helped me reach an LB score of around 0.6, which I believe is the usual baseline.
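To make tip 2 concrete, here is a minimal NumPy sketch of binary focal loss; the labels and probabilities below are made up, and in an actual TF pipeline you would more likely reach for `tf.keras.losses.BinaryFocalCrossentropy` instead of hand-rolling it:

```python
import numpy as np

def focal_loss(y_true, p_pred, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss: mean of -alpha_t * (1 - p_t)^gamma * log(p_t),
    where p_t is the predicted probability of the true class and
    alpha_t = alpha for positives, 1 - alpha for negatives.
    With gamma=0 and alpha=0.5 this reduces to 0.5 * cross-entropy."""
    p = np.clip(p_pred, eps, 1 - eps)
    p_t = np.where(y_true == 1, p, 1 - p)
    a_t = np.where(y_true == 1, alpha, 1 - alpha)
    return float(np.mean(-a_t * (1 - p_t) ** gamma * np.log(p_t)))

# Made-up labels/probabilities; gamma > 0 down-weights easy examples
y = np.array([1, 0, 1, 0])
p = np.array([0.9, 0.1, 0.6, 0.4])
```

The `(1 - p_t)^gamma` factor is what shrinks the contribution of well-classified examples, which is why it can help with imbalance without explicit class weights.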
6. Other things to try include test-time augmentation,
7. using more complex models,
8. using larger image sizes,
9. or simply doing some house cleaning on the data.
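For tip 6, test-time augmentation usually means averaging predictions over simple transforms such as horizontal flips; a framework-free sketch where `predict_fn` and `toy_predict` are hypothetical stand-ins for a real `model.predict`:

```python
import numpy as np

def predict_with_tta(predict_fn, images):
    """Test-time augmentation: average the model's class probabilities
    over the original images and their horizontal flips.
    `predict_fn` is a stand-in for model.predict, mapping a batch of
    shape (N, H, W, C) to probabilities of shape (N, n_classes)."""
    flipped = images[:, :, ::-1, :]  # flip along the width axis
    return (predict_fn(images) + predict_fn(flipped)) / 2.0

# Toy predict_fn whose output depends on the left half of the image,
# so the plain and flipped predictions disagree and averaging matters.
def toy_predict(batch):
    left = batch[:, :, : batch.shape[2] // 2, :].mean(axis=(1, 2, 3))
    return np.stack([left, 1 - left], axis=1)

imgs = np.zeros((1, 4, 4, 1))
imgs[0, :, :2, :] = 1.0  # bright left half, dark right half
```

For crop-damage photos, flips are a safe choice because a mirrored field image is still a plausible field image.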
Even though the competition is over, I still believe one can gain some valuable insights from it.
Your pipeline is broken; this is not about adjustments. Train loss should go down, but in your notebook it does not. Remove everything and keep it simple: no augmentations, Adam, lr = 3e-4, single-batch training, CE loss. Make sure the train loss goes down.
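This is the classic "overfit one batch" sanity check. A framework-free illustration of the principle, with logistic regression and synthetic data standing in for the real Keras model and images:

```python
import numpy as np

# Overfit-one-batch sanity check: train on the SAME fixed batch and
# verify the loss falls toward zero. Data and labels are synthetic;
# a sound pipeline must be able to memorize a single batch.
rng = np.random.default_rng(0)
X = rng.normal(size=(16, 8))            # one fixed batch of 16 samples
y = (X[:, 0] > 0).astype(float)         # learnable toy labels

w, b, lr = np.zeros(8), 0.0, 0.3        # lr tuned for this toy problem

def ce_loss(X, y, w, b):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    p = np.clip(p, 1e-7, 1 - 1e-7)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)), p

losses = []
for _ in range(200):                    # same batch at every step
    loss, p = ce_loss(X, y, w, b)
    losses.append(loss)
    grad = (p - y) / len(y)             # gradient of mean CE w.r.t. logits
    w -= lr * (X.T @ grad)
    b -= lr * grad.sum()
# the loss curve should fall steadily from ~0.69 (ln 2) toward zero
```

If the equivalent single-batch run in the notebook does not show this steady decrease, the bug is in the data/label plumbing or the loss, not in the hyperparameters.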
Thank you for the responses, @atia @Gleb_bak. I will give it another shot.
Although, cleaning the data can pose a challenge: it's kind of difficult to know what to clean without being an expert in the field. I'll try implementing the suggestions on the model first and see where I go from there.