Hello Zindians,
The metric for this competition is AUC. sklearn's cross_val_score function does not accept 'auc' as a scoring value, and the advice I found online about using AUC in cross-validation is mixed. So:
- Are you using AUC with sklearn's cross_val_score function? If so, how do you do it?
- Why is AUC sometimes not recommended in cross-validation?
- Why use AUC instead of F1?
PS: I have been using F1-score since the start of the competition.
Hi @ff,
For AUC in cross_val_score, set the scoring parameter to 'roc_auc'. That is sklearn's name for the area under the ROC curve.
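A minimal sketch of what that looks like, using a synthetic dataset and logistic regression as stand-ins for the competition data and model:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy binary classification data standing in for the competition dataset
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

model = LogisticRegression(max_iter=1000)

# scoring='roc_auc' makes cross_val_score compute ROC AUC on each fold
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(scores.mean())
```

The full list of accepted scoring strings is in sklearn.metrics.get_scorer_names(); 'f1' is also there if you want to compare both metrics on the same folds.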
okay thanks!