GPT-NeoX-20B
#27 opened 2 months ago by Shubham1611
How to use this with Ollama?
#26 opened 2 months ago by Pawankumar9413
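Ollama does not ship GPT-NeoX-20B in its model library, so the usual route (not confirmed in this thread) is to convert the Hugging Face checkpoint to GGUF with llama.cpp's conversion script, register it locally with `ollama create gpt-neox-20b -f Modelfile`, and then query the local Ollama server. A minimal sketch of that last step, assuming the hypothetical local model name `gpt-neox-20b`:

```python
import requests

# Query a locally running Ollama server (default port 11434).
# Assumes the weights were already converted to GGUF and registered with
# `ollama create gpt-neox-20b -f Modelfile`; the model name is hypothetical.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "gpt-neox-20b",
        "prompt": "GPT-NeoX-20B is a",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=600,
)
resp.raise_for_status()
print(resp.json()["response"])
```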
Upload FlaxGPTNeoXForCausalLM
#24 opened over 1 year ago by heegyu
I have been asked to put ketchup in pie and take vitamin S!
#21 opened almost 2 years ago by sreeparna
Max context length/input token length.
#20 opened almost 2 years ago by gsaivinay
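GPT-NeoX-20B was trained with a 2048-token context window, and that limit covers prompt plus generated tokens. Rather than hard-coding the number, it can be read from the published config; a minimal check:

```python
from transformers import AutoConfig

# The context window (prompt + generated tokens) comes from the model config;
# for EleutherAI/gpt-neox-20b this reports 2048 positions.
config = AutoConfig.from_pretrained("EleutherAI/gpt-neox-20b")
print("max positions:", config.max_position_embeddings)
```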
Is it possible to train this model on a commercially available cloud machine?
#19 opened almost 2 years ago by Walexum
<Response [422]>
#18 opened almost 2 years ago by skrishna
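`<Response [422]>` is just the printed form of a `requests.Response` object; HTTP 422 means the Inference API rejected the request payload. A hedged sketch for surfacing the server's actual error message, assuming the hosted Inference API endpoint for this model and an `HF_TOKEN` environment variable:

```python
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/EleutherAI/gpt-neox-20b"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

# A 422 usually means a malformed payload (wrong field names, unsupported
# parameters), so inspect the body instead of printing the bare Response.
resp = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "GPT-NeoX-20B is", "parameters": {"max_new_tokens": 20}},
)
print(resp.status_code)
print(resp.json())  # the error payload explains why the request was rejected
```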
The generated results from the Inference API and the webpage are very different! Is the model called from the API the same as the one called from the webpage?
#17 opened almost 2 years ago by zouhanyi
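The widget and a raw API call hit the same checkpoint, but both sample by default, so outputs differ run to run unless the decoding parameters are pinned. A sketch using `huggingface_hub`'s `InferenceClient` with sampling disabled (parameter values are illustrative, not the widget's defaults):

```python
from huggingface_hub import InferenceClient

# Greedy decoding is deterministic for a given prompt, which makes API output
# directly comparable with the widget once its sampling settings are matched.
client = InferenceClient(model="EleutherAI/gpt-neox-20b")
out = client.text_generation(
    "GPT-NeoX-20B is a 20 billion parameter",
    max_new_tokens=40,
    do_sample=False,
)
print(out)
```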
Fine-Tuning GPT-NeoX-20B using Hugging Face Transformers
#16 opened almost 2 years ago by Dulanjaya
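A minimal causal-LM fine-tuning sketch with the `Trainer` API; the dataset, hyperparameters, and output path below are placeholders, and at 20B parameters this realistically needs multi-GPU sharding (DeepSpeed ZeRO or FSDP) on top of what is shown here.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "EleutherAI/gpt-neox-20b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-NeoX has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Placeholder corpus; swap in your own text dataset.
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

# mlm=False gives the causal-LM objective: labels are the shifted inputs.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt-neox-20b-finetuned",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    learning_rate=1e-5,
    bf16=True,  # assumes Ampere-or-newer GPUs
    logging_steps=10,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```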
Unusual behaviour with inference using transformers library
#15 opened almost 2 years ago by vmajor
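For comparison, a baseline inference snippet with explicit dtype, device placement, and greedy decoding, since implicit defaults (float32 on CPU, random sampling) are a frequent source of surprising outputs; `device_map="auto"` assumes `accelerate` is installed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-neox-20b"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# float16 weights plus device_map="auto" (needs `accelerate`) keep the 20B
# parameters within GPU memory instead of silently falling back to CPU float32.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
)

inputs = tokenizer("GPT-NeoX-20B is a", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```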