Text Generation
Transformers
PyTorch
mpt
Composer
MosaicML
llm-foundry
custom_code
text-generation-inference

A Newbie Question

#48
by Psok - opened

Forgive my ignorance, but I have searched everywhere I can think of for an answer to this question...
You require trust_remote_code=True and provide the code to be used. Where exactly do I put that? I have searched the configuration files and Python files for a proper place, but I'm just too new to understand what to do. I'm using ooba, and my system is a Ryzen 5800X, a 3070 with 8 GB of VRAM, and 32 GB of RAM. Thank you!

To run the LLM with ooba, I run the command

python server.py --model-menu --notebook --model mosaicml_mpt-7b-storywriter --trust-remote-code

That is where I put the --trust-remote-code flag, and it worked for me.
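If you ever load the model directly with the Transformers library instead of ooba, the same switch is the `trust_remote_code=True` argument to `from_pretrained`. A minimal sketch (the `load_model` helper is just for illustration):

```python
def load_model(name="mosaicml/mpt-7b-storywriter"):
    """Load an MPT checkpoint, allowing its custom modeling code to run."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(name)
    # trust_remote_code=True tells Transformers to execute the Python
    # modeling files shipped inside the model repository (MPT's custom
    # architecture is not built into the library itself).
    model = AutoModelForCausalLM.from_pretrained(name, trust_remote_code=True)
    return tokenizer, model
```

So you never edit any configuration file yourself: the custom code already lives in the model repository, and the flag (on the CLI) or the argument (in Python) simply gives permission to run it.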

Cheers!

Thank you so much!!!
