
IBM SkillsBuild Hydropower Climate Optimisation Challenge

Helping the World
$3,000 USD
Completed (12 months ago)
Prediction · Forecast
1231 joined · 466 active
Start: Mar 03, 2025
Close: Apr 13, 2025
Reveal: Apr 14, 2025
Knowledge_Seeker101
Freelance
Key to LB improvement
12 Apr 2025, 09:27 · 12

Okay guys, I think post-processing might be the key to LB improvement. What's your opinion?

Discussion · 12 answers
CodeJoe

Did you apply a post-processing trick to your subs?

12 Apr 2025, 09:29
Knowledge_Seeker101

Yeah, I did a single post-processing step which improved my score from 6.38 to 6.18.

CodeJoe

Wow, that's a boost. I tried the multiplier trick but it only boosted me a little. Are you using GBDT models?
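The "multiplier trick" mentioned here is usually just a constant rescaling of the submitted predictions, with the constant probed against the public leaderboard. A minimal sketch of that idea (the function name, factor, and values are illustrative, not from the thread):

```python
import numpy as np

def apply_multiplier(preds, factor):
    """Scale every prediction by a constant factor.

    If a model systematically over- or under-predicts, multiplying
    the whole submission by a tuned constant can shift the metric.
    In practice the factor is often tuned by probing the public LB.
    """
    return np.asarray(preds, dtype=float) * factor

# illustrative predictions and factor
preds = [10.0, 20.0, 30.0]
adjusted = apply_multiplier(preds, 0.95)
```

Note that a factor tuned purely on the public leaderboard risks overfitting to that split, which is exactly the "public board probing" the thread jokes about later.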

Knowledge_Seeker101

Yes, I am only using CatBoost. I think you need to understand the dynamics of your data for post-processing to work.

CodeJoe

Have you tried LightGBM? For me, LightGBM is outperforming CatBoost.

CodeJoe

I'm also impressed you used CatBoost to reach that score. Big ups!

Knowledge_Seeker101

Yeah, I did a lot of feature engineering, and it got me into the top 20. LightGBM didn't work very well for me; I'm not good at parameter tuning.

CodeJoe

You don't really have to tune the LightGBM parameters here. I even used an early stopping of 5 and just 1000 n_estimators.

Knowledge_Seeker101

Wow, for me the score was >7 with 1000 n_estimators and no early stopping. Maybe I was overfitting.

CodeJoe

Definitely. When I tried with even 10 early-stopping rounds, I got a worse score. Can't tell though, we are just public-leaderboard probing 🤣🤣

Knowledge_Seeker101

🤣🤣 that's a fact

Knowledge_Seeker101

I didn't do any validation. I lost a lot of signal during splitting, so I'm actually relying on the LB so far @CodeJoe
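One common way to validate time-indexed data like this without discarding signal is an expanding-window split, where each fold trains on everything up to a cutoff and validates on the next block. A sketch using scikit-learn's `TimeSeriesSplit`; the data and split count are illustrative, not from the thread:

```python
import numpy as np
from sklearn.model_selection import TimeSeriesSplit

# stand-in for 100 time-ordered samples
X = np.arange(100).reshape(-1, 1)

tscv = TimeSeriesSplit(n_splits=4)
folds = []
for train_idx, val_idx in tscv.split(X):
    # train window grows fold by fold; validation block stays the same size
    folds.append((len(train_idx), len(val_idx)))
```

Because every fold's training window is a prefix of the series, no future information leaks into training, and the later folds use almost all of the data.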