Local validation
Help · 2 Nov 2021, 14:57 · 5

Hi, does anyone know how to set up local validation with AUC? As I understand it, sklearn's auc evaluation metric doesn't work here because our target contains only 0s and 1s. So, are there any other ways to set up CV? I'd be glad to receive any tips or code snippets.

Discussion (5 answers)

You need to generate the fpr and tpr with the roc_curve function after training the model (using the labels and the predicted probability of the positive class), then calculate the AUC with the auc function.
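
In code, a minimal sketch (y_true and y_prob are placeholder names for your validation labels and predicted positive-class probabilities):

from sklearn.metrics import roc_curve, auc

# y_true: 0/1 labels, y_prob: predicted probability of the positive class
fpr, tpr, _ = roc_curve(y_true, y_prob, pos_label=1)
score = auc(fpr, tpr)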

2 Nov 2021, 15:03
Upvotes 0

Oh I got it, thank you a lot!

Muhamed_Tuo
from sklearn.metrics import roc_curve, auc


def auc_metric(gt, pred):
    """
    gt: ground-truth labels (array or list of 0s and 1s)
    pred: array/list of predicted probabilities for the positive class
    """
    fpr, tpr, thresholds = roc_curve(gt, pred, pos_label=1)
    return auc(fpr, tpr)


# Example
from lightgbm import LGBMClassifier

lgb_model = LGBMClassifier(**lgb_params)
lgb_model.fit(X, Y)                       # fit on the training split
p = lgb_model.predict_proba(x)[:, 1]      # probability of the positive class on the validation split
auc_sc = auc_metric(y, p)

Thanks, that helps a lot!

I've been using sklearn.metrics.roc_auc_score:

cv_score = roc_auc_score(y_true, y_pred)

y_true contains zeros and ones; works fine so far.
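
If you also want a full CV setup, here is a minimal sketch with StratifiedKFold; the LGBMClassifier is just a placeholder (any estimator with predict_proba works), and X/y are assumed to be numpy arrays:

import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import roc_auc_score
from lightgbm import LGBMClassifier

def cv_auc(X, y, n_splits=5, seed=42):
    """Stratified k-fold CV, scoring each fold with ROC AUC."""
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    scores = []
    for train_idx, valid_idx in skf.split(X, y):
        model = LGBMClassifier()
        model.fit(X[train_idx], y[train_idx])
        # probability of the positive class on the held-out fold
        proba = model.predict_proba(X[valid_idx])[:, 1]
        scores.append(roc_auc_score(y[valid_idx], proba))
    return np.mean(scores), np.std(scores)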

8 Nov 2021, 20:40
Upvotes 0