Many thanks to Zindi and other organizers for this exciting challenge.
Solution Approach
Data Download and Manipulation
- Images were downloaded in batches to avoid out-of-memory errors, since the Colab TPU runtime has a maximum of 35 GB RAM
- The images were zipped and stored in Google Drive
- Images at a 10-day frequency were used to extract the raw image pixels
- Images at the start and end of every month were also processed into raw NumPy arrays
- PySpark was used to compute the mean pixel values for each field; it was chosen because the data was quite large and compute resources were limited (see the sketch below)
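A minimal PySpark sketch of the per-field aggregation step, assuming the extracted pixel values have been flattened into rows of `(field_id, date, band, pixel_value)`; the column names and file paths are illustrative, not the exact ones used in the repository:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("field_pixel_means").getOrCreate()

# Hypothetical intermediate file holding one row per (field_id, date, band, pixel)
pixels = spark.read.parquet("pixels.parquet")

# Mean pixel value per field, band and acquisition date
field_means = (
    pixels
    .groupBy("field_id", "date", "band")
    .agg(F.mean("pixel_value").alias("mean_pixel_value"))
)

field_means.write.mode("overwrite").parquet("field_means.parquet")
```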
Feature Engineering and Preprocessing
- Reduced skewness using a square-root transform
- Calculated vegetation indices
- Aggregated the vegetation indices (mean)
- Computed differences of the vegetation indices between periods
- Computed quantile features
- Filled missing and infinite values with -999999 (a combined sketch of these steps follows this list)
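A pandas sketch of these feature-engineering steps, assuming a long-format table of per-field, per-date band means; the column names (`field_id`, `date`, `B04`, `B08`) and the choice of NDVI as the vegetation index are assumptions, since the write-up does not name the exact bands or indices:

```python
import numpy as np
import pandas as pd

def build_features(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()

    # Example vegetation index: NDVI from red (B04) and near-infrared (B08) band means
    df["ndvi"] = (df["B08"] - df["B04"]) / (df["B08"] + df["B04"])

    # Square-root transform to reduce skewness of the (non-negative) band values
    df["B08_sqrt"] = np.sqrt(df["B08"].clip(lower=0))

    # Per-field aggregates over the time series: means and quantiles
    agg = df.groupby("field_id").agg(
        ndvi_mean=("ndvi", "mean"),
        b08_sqrt_mean=("B08_sqrt", "mean"),
    )
    grp = df.groupby("field_id")["ndvi"]
    agg["ndvi_q25"] = grp.quantile(0.25)
    agg["ndvi_q75"] = grp.quantile(0.75)

    # Difference of the index between the first and last observation per field
    ts = df.sort_values("date").groupby("field_id")["ndvi"]
    agg["ndvi_diff"] = ts.last() - ts.first()

    # Replace infinities (e.g. zero denominators) and fill missing values with a sentinel
    return agg.replace([np.inf, -np.inf], np.nan).fillna(-999999)
```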
Model Training
- CatBoost classifier trained on the vegetation-indices features using a 10-fold stratified cross-validation strategy
- LightGBM classifier trained on the vegetation-indices features using a 10-fold stratified cross-validation strategy (see the CV sketch after this list)
- A PyTorch classifier trained on the raw image pixels
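A minimal sketch of the 10-fold stratified cross-validation loop for the boosting models; the hyperparameters and helper name are illustrative, and a CatBoostClassifier can be dropped into the same loop:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from lightgbm import LGBMClassifier

def cv_train(X: np.ndarray, y: np.ndarray, n_classes: int, n_splits: int = 10, seed: int = 42):
    """Train one model per fold and collect out-of-fold class probabilities."""
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    oof = np.zeros((len(X), n_classes))
    models = []
    for trn_idx, val_idx in skf.split(X, y):
        model = LGBMClassifier(n_estimators=1000, learning_rate=0.05)  # illustrative params
        model.fit(X[trn_idx], y[trn_idx])
        oof[val_idx] = model.predict_proba(X[val_idx])
        models.append(model)
    return models, oof
```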
Final Model
- The final model is a weighted ensemble of the boosting models (LightGBM and CatBoost) and the PyTorch classifier (sketched below)
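A sketch of the weighted blend, assuming each model outputs class probabilities of the same shape; the weight values below are placeholders, since the actual weights were tuned on local CV and the leaderboard (see the discussion further down) and sum to 1:

```python
import numpy as np

def blend(prob_lgbm, prob_catboost, prob_torch, weights=(0.4, 0.4, 0.2)):
    """Weighted average of per-model class probabilities; weights sum to 1."""
    w = np.asarray(weights, dtype=float)
    assert np.isclose(w.sum(), 1.0), "ensemble weights should sum to 1"
    stacked = np.stack([prob_lgbm, prob_catboost, prob_torch])  # shape (3, n_samples, n_classes)
    return np.tensordot(w, stacked, axes=1)                     # weighted sum over the model axis
```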
Link to solution: https://github.com/DariusTheGeek/Radiant-Earth-Spot-the-Crop-Challenge
Thank you for sharing this!!
This is a great wealth of knowledge, congrats!!!
Thank you for sharing. I was stuck in a learning block and the shared resources gave me a starting point.
@Brainiac Congrats and thanks for sharing. Interesting approach, but for the ensemble weights, is there an approach for choosing them? Anyone can try different weights randomly to adjust the model predictions and improve the score, and I see the sum of the weights could be more than 1.
Thank you @masawdah. The weights were selected based on the models' local CV and leaderboard scores.
All weights add up to 1.
@Brainiac So the weight selection is more subjective than objective; it comes with experience. Thanks for your reply and congrats again :)