GEOAI Challenge for Cropland Mapping in Dry Environments

Helping Uzbekistan, Russian Federation
1 000 CHF
Completed (6 months ago)
Classification
Earth Observation
617 joined
184 active
Start: Jul 02, 2025
Close: Sep 29, 2025
Reveal: Sep 29, 2025
Amy_Bray
Zindi
🏆Announcing the winners 🏆
Platform · 27 Oct 2025, 11:20 · 6

Hi everyone!

The results are in, and we are thrilled to announce the winners! A huge congratulations to the top performers who have shown incredible skill, creativity, and perseverance throughout this challenge. Without further ado, here are your champions:

🥇 1st Place: @Gozie

🥈 2nd Place: Team Bösendorff

🥉 3rd Place: @rogue_26

Your hard work and innovative solutions are truly inspiring, and we’re so proud of everything this community has achieved during this challenge.

How winners were selected

Only users who submitted valid solutions on time were judged. Their work was evaluated across three key dimensions:

  • Performance (50%): Private leaderboard score
  • Novelty (25%): How original or creative is your approach? Did you explore unusual architectures, pre-processing methods, data augmentation techniques, or ensemble strategies? We’re looking for new ideas, not just boilerplate pipelines.
  • Practicality (25%): Could this solution be realistically deployed or adapted for the real-world use case? Consider factors like speed, interpretability, resource constraints (e.g. memory, inference time), and how the model might perform outside of the test data.
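The weighting above amounts to a simple weighted average. A minimal sketch of how such a combined score could be computed, assuming each dimension is normalised to a 0–1 scale (the function name and scales are illustrative assumptions, not Zindi's actual scoring code):

```python
def final_score(performance: float, novelty: float, practicality: float) -> float:
    """Hypothetical combined score: each input is assumed to be on a 0-1 scale.

    Weights follow the stated criteria: 50% performance, 25% novelty,
    25% practicality.
    """
    return 0.50 * performance + 0.25 * novelty + 0.25 * practicality


# Example: strong leaderboard score, moderate novelty, good practicality.
print(final_score(0.9, 0.6, 0.8))  # → 0.8
```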

A note to all participants: For those who didn’t make it to the top, remember: on Zindi, every challenge is a chance to grow, learn, and connect. We encourage you to share your solutions and thought processes in the forums so we can all learn from each other.

Thank you all for making this challenge an incredible experience. Stay tuned for more exciting competitions and opportunities to showcase your talent.

Keep Zinding! 🌟

Discussion · 6 answers

But the leaderboard shows different results.

27 Oct 2025, 12:43
Upvotes 1

Can we get the phase 2 score for each winner?

I'm fairly sure I got a high score on Performance (50%), and I built my model architecture from scratch, so I'd guess nothing beats that on Novelty (25%). My solution's inference can run on a system with 2 GB of RAM, which should satisfy Practicality (25%).

A lack of transparency makes it easy for people to feel discriminated against in an open space.

Phase 2 was also only announced after the competition ended, when my solution was already submitted.

28 Oct 2025, 04:40
Upvotes 0
Amy_Bray
Zindi

Hi bit_guber,

We are actively working on a feature to make this clearer.

At the moment, the leaderboard reflects the best score.

Here are the ranks of the final solutions including the novelty and practicality score.

I hope this helps.

Best regards,

Still, how is a boosting-trees model more novel than a purpose-built neural network architecture?

P.S.: I talked to two others on the list; they used ensemble boosting models.

Masese
University of Nairobi

Is there somewhere we can see their solutions?

28 Oct 2025, 12:34
Upvotes 0

import tensorflow as tf

seed = 42              # fixed seed for reproducible dropout/noise
activation = 'relu'    # activation used in all hidden layers
init = 'glorot_uniform'

# `features` is the training array of shape (samples, 48 timesteps, n_bands)
inputs = tf.keras.layers.Input(shape=(48, features.shape[-1]))
x = tf.keras.layers.BatchNormalization()(inputs)
x = tf.keras.layers.GaussianNoise(0.006, seed=seed)(x)
# Permute so SpatialDropout1D drops whole timesteps instead of whole bands
x = tf.keras.layers.Permute((2, 1))(x)
x = tf.keras.layers.SpatialDropout1D(0.1, seed=seed)(x)
x = tf.keras.layers.Permute((2, 1))(x)
x = tf.keras.layers.Conv1D(96, 2, padding='valid', activation=activation,
                           kernel_initializer=init, groups=4)(x)
x = tf.keras.layers.MaxPool1D()(x)
x = tf.keras.layers.SpatialDropout1D(0.1, seed=seed)(x)
x = tf.keras.layers.Conv1D(192, 2, padding='valid', activation=activation,
                           kernel_initializer=init, groups=4)(x)
x = tf.keras.layers.MaxPool1D()(x)
x = tf.keras.layers.SpatialDropout1D(0.1, seed=seed)(x)
x = tf.keras.layers.Dense(384, activation=activation, kernel_initializer=init)(x)
x = tf.keras.layers.SpatialDropout1D(0.1, seed=seed)(x)
x = tf.keras.layers.GlobalMaxPooling1D()(x)
x = tf.keras.layers.Dense(256, activation=activation, kernel_initializer=init)(x)
x = tf.keras.layers.Dropout(0.4, noise_shape=(1, 256), seed=seed)(x)
x = tf.keras.layers.Dense(128, activation=activation, kernel_initializer=init)(x)
x = tf.keras.layers.Dropout(0.4, noise_shape=(1, 128), seed=seed)(x)
x = tf.keras.layers.Dense(64, activation=activation, kernel_initializer=init)(x)
x = tf.keras.layers.Dropout(0.9, noise_shape=(1, 64), seed=seed)(x)
outputs = tf.keras.layers.Dense(2, activation='softmax')(x)

model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.compile(
    optimizer=tf.keras.optimizers.AdamW(learning_rate=0.001, weight_decay=0.1),
    loss=tf.keras.losses.CategoricalCrossentropy(label_smoothing=0.1,
                                                 from_logits=False),
)
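For readers who want to try the posted architecture, here is a minimal end-to-end sketch of training and inference. The data arrays and the stand-in model are assumptions (the poster's real `features` array and full architecture are not reproduced here); only the input/output shapes — 48 timesteps per sample, 2-class softmax output — match the code above:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in data: 32 samples, 48 timesteps, 10 features each.
X = np.random.rand(32, 48, 10).astype("float32")
y = tf.keras.utils.to_categorical(np.random.randint(0, 2, size=32), num_classes=2)

# Tiny stand-in model with the same input/output contract as the posted one.
inp = tf.keras.layers.Input(shape=(48, 10))
h = tf.keras.layers.GlobalMaxPooling1D()(inp)
out = tf.keras.layers.Dense(2, activation="softmax")(h)
model = tf.keras.Model(inp, out)
model.compile(optimizer="adam", loss="categorical_crossentropy")

model.fit(X, y, epochs=1, batch_size=16, verbose=0)
probs = model.predict(X, verbose=0)
print(probs.shape)  # (32, 2)
```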