LLaMA 7B LoRA fine-tune: 2 epochs of training on WebNLG 2017, with maximum lengths of 256 tokens for the context and 128 tokens for the completion

#1 · opened by Jojo567
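
A minimal sketch of what this setup might look like with the `transformers` and `peft` libraries. The checkpoint name, LoRA rank, alpha, dropout, and target modules are illustrative assumptions; only the 2 epochs and the 256/128 context/completion lengths come from the description above.

```python
# Sketch, assuming the Hugging Face transformers + peft stack.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base_model = "huggyllama/llama-7b"  # assumed LLaMA 7B checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                   # assumed LoRA rank
    lora_alpha=16,                         # assumed scaling factor
    lora_dropout=0.05,                     # assumed dropout
    target_modules=["q_proj", "v_proj"],   # common choice for LLaMA
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Values stated in the post title
NUM_EPOCHS = 2
MAX_CONTEXT_LEN = 256     # max tokens for the WebNLG 2017 input/context
MAX_COMPLETION_LEN = 128  # max tokens for the generated completion
```

The context/completion split would then be applied when tokenizing the WebNLG 2017 examples, truncating inputs to 256 tokens and targets to 128 tokens before training.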
