ValueError: Cannot handle batch sizes > 1 if no padding token is defined.
The error persists despite running:
tokenizer.pad_token = tokenizer.eos_token
model.config.pad_token_id = tokenizer.pad_token_id
Has anyone managed to further fine-tune this model? Any guidance would be appreciated.
Hey Tony,
We've updated the script to handle this issue. If you want to fine-tune the model yourself, you can use the following scripts:
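For reference, here is a minimal sketch of the usual fix. GPT-2-style checkpoints ship without a pad token, and setting `tokenizer.pad_token` alone is sometimes not enough: the same id also has to be mirrored into the model's config (sequence-classification heads read `config.pad_token_id` at forward time), and both must be set on the objects actually passed to training. The `ensure_pad_token` helper below is a hypothetical name, not part of the `transformers` API; the guarded demo assumes the public `gpt2` checkpoint and `transformers` installed.

```python
def ensure_pad_token(tokenizer, model_config):
    """Reuse the EOS token as PAD and mirror its id into the model config.

    Works with any tokenizer/config pair exposing the standard
    `pad_token` / `eos_token` / `pad_token_id` attributes.
    """
    if tokenizer.pad_token is None:
        # Setting pad_token on a transformers tokenizer also updates
        # tokenizer.pad_token_id to the corresponding id.
        tokenizer.pad_token = tokenizer.eos_token
    # The model reads the pad id from its config, not the tokenizer,
    # so this assignment is what actually silences the ValueError.
    model_config.pad_token_id = tokenizer.pad_token_id
    return tokenizer.pad_token_id


if __name__ == "__main__":
    # Demo with real objects; requires `pip install transformers`.
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tok = AutoTokenizer.from_pretrained("gpt2")
    mdl = AutoModelForSequenceClassification.from_pretrained("gpt2")
    ensure_pad_token(tok, mdl.config)  # do this BEFORE building the Trainer
```

The key design point is ordering: apply the helper before constructing your `Trainer`/data collator, so every batched forward pass already sees a valid `pad_token_id` in the config.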