
Malawi Public Health Systems LLM Challenge

Helping Malawi
$2,000 USD
Question Answering · Generative AI
407 joined · 74 active
Start: Jan 24, 2024
Close: Mar 03, 2024
Reveal: Mar 03, 2024
Ambiguous Hardware Restrictions [RAM?!]
Help · 19 Feb 2024, 18:52 · 2

The hardware restriction rules have been a bit confusing:

```
You are limited to the following or equivalent hardware specifications: **Multi-core processor (e.g., Intel Core i7/i9, AMD Ryzen 7/9)** for efficient processing.
```

No explicit requirement for RAM has been specified. Is there a specific number for RAM?

A better idea for hardware restrictions would be to limit the **LLM by size**.

Example: only open-source models under 13B parameters may be used.

Discussion · 2 answers

I think they are always referring to inference, so I would say RAM would not make a big difference.

19 Feb 2024, 19:00
Upvotes 0

RAM would probably make a difference when loading models on CPU.

Example: I can load huge models in the 30B/70B range into 256 GB of RAM and run inference one request at a time.

But the question is whether that is even valid.
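To see why RAM does matter for CPU inference, here is a back-of-the-envelope sketch of the memory needed just to hold the weights of models at the sizes mentioned above. This is an illustrative estimate, not a rule from the organizers: it assumes weights alone dominate and ignores activations, KV cache, and framework overhead.

```python
def model_ram_gb(num_params_b: float, bytes_per_param: float) -> float:
    """Rough RAM (GB) to hold model weights only.

    num_params_b: parameter count in billions (e.g. 13 for a 13B model).
    bytes_per_param: 2 for fp16/bf16, 1 for int8, 0.5 for int4.
    Ignores activations, KV cache, and runtime overhead.
    """
    # params (billions) * 1e9 params * bytes each, converted to GB (1e9 bytes)
    return num_params_b * 1e9 * bytes_per_param / 1e9


for params_b in (13, 30, 70):
    for dtype, bytes_pp in (("fp16", 2), ("int8", 1), ("int4", 0.5)):
        print(f"{params_b}B @ {dtype}: ~{model_ram_gb(params_b, bytes_pp):.0f} GB")
```

By this estimate a 70B model in fp16 needs roughly 140 GB for weights alone, which is why it fits in 256 GB of system RAM but not on a typical consumer machine. A size cap like "under 13B" (about 26 GB at fp16, 13 GB at int8) pins down the hardware requirement far more precisely than naming a CPU family.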