Description

This model is derived from OpenCoder-1.5B-Base by applying additional context-extension fine-tuning. The repository context is composed with the Random .py composer; this composer, along with the others, is described in the On Pretraining for Project-Level Code Completion paper (arXiv). Specifically, Section A.1 of the Appendix describes the context composition method, and Table 3 compares it with the other composers from the same collection.
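As a rough illustration of the idea (not the exact procedure from the paper, which is specified in Section A.1), a random .py composer can be thought of as gathering a repository's Python files in random order and concatenating them as context ahead of the file being completed. The sketch below is only a minimal approximation; the input format, separator, and function name are assumptions for illustration:

import random

def compose_random_py_context(repo_files, completion_path, seed=0):
    # repo_files: dict mapping file paths to file contents (hypothetical input format).
    # Keep only Python files, excluding the file that is being completed.
    py_files = [p for p in repo_files if p.endswith(".py") and p != completion_path]
    random.Random(seed).shuffle(py_files)
    # Concatenate the shuffled files, marking each one with its path.
    parts = [f"# {path}\n{repo_files[path]}" for path in py_files]
    return "\n\n".join(parts)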

We publish this checkpoint to support the reproducibility and accessibility of our research results.

Quickstart

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# The fine-tuned checkpoint; the tokenizer is reused from the original base model.
model_name = "JetBrains-Research/OpenCoder-1.5B-Random-Py"
tokenizer_name = "infly/OpenCoder-1.5B-Base"

model = AutoModelForCausalLM.from_pretrained(model_name,
                                             torch_dtype=torch.bfloat16,
                                             device_map="auto",
                                             trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(tokenizer_name, trust_remote_code=True)

# Generate a completion for a simple prompt.
inputs = tokenizer("# write a quick sort algorithm", return_tensors="pt")
outputs = model.generate(**inputs.to(model.device), max_new_tokens=256)

result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)