How to Get Started with the Model
Install transformers, safetensors, and torch to use this model:

pip install transformers safetensors torch
Run the following Python code:
import torch
from transformers import AutoConfig, AutoModelForCausalLM

model_id = "ivanzhouyq/levanter-backpack-1b-100k"

# trust_remote_code is required because the Backpack architecture
# lives in the model repository, not in transformers itself
config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
torch_model = AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    trust_remote_code=True,
)
torch_model.eval()

# A random batch of token ids: batch size 1, sequence length 512
input_ids = torch.randint(0, 50264, (1, 512), dtype=torch.long)
with torch.no_grad():
    torch_out = torch_model(input_ids, position_ids=None)

# Convert logits to next-token probabilities over the vocabulary
probs = torch.nn.functional.softmax(torch_out.logits, dim=-1)
print(probs.shape)  # (1, 512, vocab_size)
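The final softmax step turns the model's logits of shape (batch, seq_len, vocab_size) into a probability distribution over the vocabulary at each position. A minimal sketch of that step in isolation, using random logits and small hypothetical dimensions in place of the real model output:

```python
import torch

# Hypothetical small dimensions standing in for the model's real ones
batch, seq_len, vocab_size = 1, 8, 50264

# Random logits in place of the model output (torch_out.logits)
logits = torch.randn(batch, seq_len, vocab_size)

# Softmax over the last (vocabulary) axis; the shape is unchanged,
# but each position now holds probabilities that sum to 1
probs = torch.nn.functional.softmax(logits, dim=-1)

print(probs.shape)               # torch.Size([1, 8, 50264])
print(float(probs[0, 0].sum()))  # ~1.0
```

Sampling or taking an argmax over the last axis of `probs` then yields a predicted next token per position.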