
Farm to Feed Shopping Basket Recommendation Challenge

Helping Kenya
€8,250 prize
Completed (2 months ago)
Machine Learning · Prediction · Feature Engineering
747 joined · 266 active
Start: Dec 02, 25
Close: Jan 19, 26
Reveal: Jan 20, 26
AJoel
Zindi
Multi-metric and LB score.
Platform · 12 Dec 2025, 15:01

Understanding the Public Leaderboard Score

The public leaderboard score is calculated as the simple average of four normalised metrics. Each metric contributes equally (25%) to the final score.

Targets and Metrics

• Target_purchase_next_1w – AUC

• Target_qty_next_1w – MAE

• Target_purchase_next_2w – AUC

• Target_qty_next_2w – MAE
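For anyone who wants to sanity-check these metrics locally, here is a minimal, dependency-free sketch (not the official scoring script) of AUC for the binary purchase targets and MAE for the quantity targets:

```python
def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def auc(y_true, y_score):
    """ROC AUC via the Mann-Whitney pairwise formulation:
    the fraction of (positive, negative) pairs where the positive
    example gets the higher score (ties count half)."""
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
print(mae([3, -0.5, 2, 7], [2.5, 0.0, 2, 8]))    # 0.5
```

In practice you would likely use `sklearn.metrics.roc_auc_score` and `mean_absolute_error`, which agree with the above on these toy inputs.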

Normalisation Rules

• AUC is already bounded between 0 and 1 and is used directly.

• MAE is normalised using min–max scaling and then inverted so that lower error results in a higher score.

Normalisation Ranges

• AUC: xmin = 0, xmax = 1

• MAE (Target_qty_next_1w): xmin = 0, xmax = 59.5561260135

• MAE (Target_qty_next_2w): xmin = 0, xmax = 81.6314093155
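Putting the rules and ranges together, the normalisation looks like this (a sketch with constants copied from this post; the function names are mine, not from the scoring script):

```python
# xmax constants for the MAE min-max scaling (xmin = 0 in both cases).
QTY_MAX = {
    "Target_qty_next_1w": 59.5561260135,
    "Target_qty_next_2w": 81.6314093155,
}

def normalise_auc(auc):
    # AUC is already bounded in [0, 1], so it is used directly.
    return auc

def normalise_mae(mae, target):
    # Min-max scale with xmin = 0, then invert so lower error => higher score.
    return 1 - (mae - 0) / (QTY_MAX[target] - 0)
```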

Benchmark Scores (see LB)

• Target 1 WAUC = 0.837729787

• Target 2 WAUC = 0.828651274

• Target 1 Qty MAE = 0.616150905

• Target 2 Qty MAE = 1.748611334

Normalised Scores (No Rounding Applied)

• Target 1 WAUC normalised = 0.837729787

• Target 2 WAUC normalised = 0.828651274

• Target 1 Qty MAE normalised = 1 − (0.616150905 / 59.5561260135) = 0.9896542816626398

• Target 2 Qty MAE normalised = 1 − (1.748611334 / 81.6314093155) = 0.9785791847934202

Final Score Formula

Final score = 0.25 × (Target 1 WAUC normalised + Target 2 WAUC normalised + Target 1 MAE normalised + Target 2 MAE normalised)

Final public leaderboard score = 0.908653631864015
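The whole worked example can be reproduced in a few lines (all numbers copied from this post):

```python
# Benchmark metrics from the leaderboard.
auc_1w = 0.837729787
auc_2w = 0.828651274
mae_1w = 0.616150905
mae_2w = 1.748611334

normalised = [
    auc_1w,                      # AUC used directly
    auc_2w,
    1 - mae_1w / 59.5561260135,  # inverted min-max scaling, xmin = 0
    1 - mae_2w / 81.6314093155,
]
final_score = sum(normalised) / 4  # equal 25% weight per metric
print(final_score)  # should match 0.908653631864015 from the LB
```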

Discussion 1 answer
Brainiac

Thanks for the update

12 Dec 2025, 15:37