Change the KNN to a Decision Tree. KNN takes too long to train on such a dataset.
Alright, thanks.
Did you turn on your GPU on Colab? And forget about KNN because it will take a whole day to run.
Yes, GPU is enabled on Colab. Can you suggest some models for me?
Use LightGBM, it's much faster.
Use the boosting family: CatBoost, LightGBM, or XGBoost.
Yeah, I have tried that and it worked well. I want to improve the score, any tips?
Try ensemble techniques, cross-validation, stacking, etc.
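Cross-validation is the quickest of those to add. A minimal sketch, assuming scikit-learn and toy data in place of the real dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Toy stand-in for the competition data (an assumption)
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

# 5-fold CV gives a more reliable score estimate than a single train/test split
scores = cross_val_score(GradientBoostingClassifier(random_state=0), X, y, cv=5)
print(f"mean accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```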
Focus on models from different algorithm families to build the ensembles. Add a little randomness to the models and combine their predictions with the mode (majority vote) or the mean of the probability scores.
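That advice maps onto scikit-learn's `VotingClassifier`. A sketch with three different algorithm families, assuming toy data (model choices and parameters here are illustrative, not from the thread):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1500, n_features=15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Three different algorithm families; random_state supplies the "little randomness"
members = [
    ("lr", LogisticRegression(max_iter=1000)),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=1)),
    ("nb", GaussianNB()),
]

# voting="soft" averages probability scores; voting="hard" takes the mode
ens = VotingClassifier(estimators=members, voting="soft")
ens.fit(X_tr, y_tr)
acc = ens.score(X_te, y_te)
print(f"ensemble accuracy: {acc:.3f}")
```

Soft voting needs every member to implement `predict_proba`; with models that don't, fall back to `voting="hard"`.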
You don't need stacking or ensembling yet; just focus on feature engineering and feature selection. After all that, you will see a great change when you use stacking and the like.
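One simple way to start on feature selection is a univariate filter. A sketch using `SelectKBest`, on toy data with mostly uninformative columns (an assumption about what the real dataset looks like):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# 30 columns but only 5 carry signal (an assumption for illustration)
X, y = make_classification(n_samples=1000, n_features=30, n_informative=5,
                           random_state=0)

# Keep the 10 features with the strongest ANOVA F-score against the target
selector = SelectKBest(score_func=f_classif, k=10)
X_sel = selector.fit_transform(X, y)
print(X_sel.shape)  # → (1000, 10)
```

Model-based alternatives like `SelectFromModel` over a tree ensemble's feature importances are often stronger, but this is the fastest baseline to try.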
Thanks, I will try that out.
Alright, thanks, I will try it out.
Use random search to get good parameters, then fine-tune using a smaller grid of values around the best ones. Alternatively, you can use Bayesian search; it's fast and usually needs fewer trials.
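The coarse-then-fine workflow above can be sketched with scikit-learn's search classes, assuming a random forest and toy data (the parameter ranges are illustrative):

```python
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=800, n_features=12, random_state=0)

# Step 1: broad random search over wide ranges
wide = {"n_estimators": randint(50, 400), "max_depth": randint(2, 16)}
rs = RandomizedSearchCV(RandomForestClassifier(random_state=0), wide,
                        n_iter=10, cv=3, random_state=0)
rs.fit(X, y)
best = rs.best_params_

# Step 2: fine-tune with a small grid around the best values found
narrow = {
    "n_estimators": [max(50, best["n_estimators"] - 25),
                     best["n_estimators"],
                     best["n_estimators"] + 25],
    "max_depth": [max(1, best["max_depth"] - 1),
                  best["max_depth"],
                  best["max_depth"] + 1],
}
gs = GridSearchCV(RandomForestClassifier(random_state=0), narrow, cv=3)
gs.fit(X, y)
print(gs.best_params_, round(gs.best_score_, 3))
```

For Bayesian search, `optuna` or `scikit-optimize`'s `BayesSearchCV` follow the same fit-then-inspect pattern.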
Thanks