An 8-bit quantized version of the ruGPT-3.5-13B model.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="pe4enov/ruGPT-3.5-13B-8bit")
```
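Once the pipeline is created, generation is a single call. A minimal sketch; the prompt and sampling parameters below are illustrative, not taken from the model card:

```python
# Generate a short Russian continuation; parameters are illustrative.
result = pipe("Москва — столица", max_new_tokens=30, do_sample=True)
print(result[0]["generated_text"])
```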
```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

# The tokenizer is loaded from the original ai-forever repo;
# the quantized repo provides the model weights.
tokenizer = AutoTokenizer.from_pretrained("ai-forever/ruGPT-3.5-13B")
model = AutoModelForCausalLM.from_pretrained("pe4enov/ruGPT-3.5-13B-8bit")
```
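With the tokenizer and model loaded directly, generation goes through the standard `generate` API. A minimal sketch; the prompt and decoding parameters are illustrative assumptions, not from the model card:

```python
import torch

# Example prompt; any Russian text works as input.
prompt = "Гравитация — это"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling settings here are illustrative, not prescribed by the model card.
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        top_p=0.95,
        temperature=0.8,
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```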