
Dataset credit goes to theblackcat102.

How to run inference:

```python
import transformers
import torch


def fmt_prompt(prompt: str) -> str:
    # Wrap the raw prompt in the instruction template used during fine-tuning.
    return f"""[Instructions]:\n{prompt}\n\n[Response]:"""


if __name__ == "__main__":
    model_name = "abacaj/starcoderbase-1b-sft"
    tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)

    # Load the model, move it to the first GPU, and switch to eval mode.
    model = (
        transformers.AutoModelForCausalLM.from_pretrained(
            model_name,
        )
        .to("cuda:0")
        .eval()
    )

    prompt = "Write a python function to sort the following array in ascending order, don't use any built in sorting methods: [9,2,8,1,5]"
    prompt_input = fmt_prompt(prompt)
    inputs = tokenizer(prompt_input, return_tensors="pt").to(model.device)
    # Remember the prompt length so only newly generated tokens are decoded later.
    input_ids_cutoff = inputs.input_ids.size(dim=1)

    with torch.no_grad():
        # Nucleus sampling with a low temperature keeps code completions focused.
        generated_ids = model.generate(
            **inputs,
            use_cache=True,
            max_new_tokens=512,
            temperature=0.2,
            top_p=0.95,
            do_sample=True,
            eos_token_id=tokenizer.eos_token_id,
            pad_token_id=tokenizer.pad_token_id,
        )

    # Decode only the completion, skipping the prompt and any special tokens.
    completion = tokenizer.decode(
        generated_ids[0][input_ids_cutoff:],
        skip_special_tokens=True,
    )

    print(completion)
```
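
If GPU memory is tight, the same script works with the weights loaded in half precision. This is a minimal sketch, assuming float16 is acceptable for your use case; the script above loads the model in its default precision.

```python
import torch
import transformers

model_name = "abacaj/starcoderbase-1b-sft"

# Loading in float16 roughly halves GPU memory use; the rest of the
# inference script above is unchanged.
model = (
    transformers.AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.float16,
    )
    .to("cuda:0")
    .eval()
)
```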

Evals: (evaluation chart image)

Training charts: (training chart image)

Link to charts: https://api.wandb.ai/links/abacaj1/c4nkcs9r

Code to train model: https://github.com/abacaj/train-with-fsdp
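
The linked repository is the authoritative training code. For orientation only, a bare-bones FSDP setup for supervised fine-tuning of a causal LM looks roughly like the sketch below; the base checkpoint name, learning rate, and the omitted training loop are assumptions for illustration, not the repository's actual configuration.

```python
# Illustrative FSDP sketch only; the real training script is in the repo above.
# Launch with: torchrun --nproc_per_node=<num_gpus> train_sketch.py
import os

import torch
import torch.distributed as dist
import transformers
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP


def main():
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])  # set by torchrun
    torch.cuda.set_device(local_rank)

    # "bigcode/starcoderbase-1b" is an assumed base checkpoint for this sketch.
    model = transformers.AutoModelForCausalLM.from_pretrained("bigcode/starcoderbase-1b")

    # FSDP shards parameters, gradients, and optimizer state across ranks.
    model = FSDP(model, device_id=local_rank)
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)  # illustrative lr

    # ... iterate over the SFT dataset, compute the causal LM loss,
    # and call loss.backward() / optimizer.step() here ...

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```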
