agriBORA Commodity Price Forecasting Challenge
Helping Kenya · €8,250 prize
Data analysis · GIS · Time-series · Forecasting · Nowcasting
Start: 14 Nov 2025 · Close: 27 Dec 2025 · Reveal: 13 Jan 2026
Freelance
Clarification on leaderboard scoring method
Help · 16 Nov 2025, 17:03 · 4

I don't know if anyone else is as confused as I am about how the evaluation metrics for this competition are combined. I understand that it is a multi-metric method, but I am unsure which direction indicates a better model.

The Info tab says the score is a weighted average of MAE and RMSE. If that is the case, a lower score should indicate a better model, since both are error metrics to be minimised. On the leaderboard, however, it is the opposite: a higher score indicates a better model.

As a hypothetical example, suppose one model has an MAE of 5 and an RMSE of 7, and another has an MAE of 2 and an RMSE of 5. With equal weighting, their averaged scores would be 6 and 3.5, respectively. By that reading the second model is better, yet under the leaderboard's higher-is-better ordering the first model would rank above it (see the quick check below).
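
A quick check of that arithmetic as a minimal Python sketch. Note the equal-weight average here is my reading of "weighted average", not a confirmed platform formula:

def equal_weight_score(mae, rmse):
    # Plain equal-weight average of the two error metrics (lower is better).
    return (mae + rmse) / 2

print(equal_weight_score(5, 7))  # 6.0
print(equal_weight_score(2, 5))  # 3.5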

I don't know; maybe I'm wrong about how the scores are combined. Perhaps @Zindi could share the formula for how the two metrics are averaged.

Peace!

Discussion · 4 answers
nymfree

Indeed, very confusing.

16 Nov 2025, 17:56
Upvotes 1
J0NNY

The formula is:

score = 1 - (RMSE / 4.76112697162654 + MAE / 3.82592018387808) / 2

The higher, the better.
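
As a minimal sketch, assuming the two long constants above are fixed normalisation benchmarks chosen by the organisers, the combined score can be computed like this:

def leaderboard_score(mae, rmse):
    # Normalisation constants as quoted in this thread.
    RMSE_NORM = 4.76112697162654
    MAE_NORM = 3.82592018387808
    # Each error is scaled by its benchmark, the two are averaged, and the
    # result is subtracted from 1, so lower errors yield a higher score.
    return 1 - (rmse / RMSE_NORM + mae / MAE_NORM) / 2

# The hypothetical models from the question now rank as expected:
print(leaderboard_score(5, 7))  # ~ -0.389
print(leaderboard_score(2, 5))  # ~  0.214 (lower errors, higher score)

This also resolves the hypothetical above: the model with the lower errors gets the higher leaderboard score.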

17 Nov 2025, 03:57
Upvotes 3
Freelance

Hi @J0NNY, thank you for providing the formula. It confirms that the higher the score, the better. 👍

Amy_Bray
Zindi

Hi Gozie, our apologies. We have updated the blog post to make this clearer: https://zindi.africa/learn/introducing-multi-metric-evaluation-or-one-metric-to-rule-them-all

18 Nov 2025, 14:05
Upvotes 2