
🎩 Magicoder: Source Code Is All You Need

Refer to our GitHub repo ise-uiuc/magicoder for an up-to-date introduction to the Magicoder family!

  • 🎩Magicoder is a model family empowered by 🪄OSS-Instruct, a novel approach to enlightening LLMs with open-source code snippets so that they generate low-bias, high-quality instruction data for code.
  • 🪄OSS-Instruct mitigates the inherent bias of LLM-synthesized instruction data by grounding the LLM in a wealth of open-source references, producing more diverse, realistic, and controllable data (see the illustrative sketch after this list).
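
To make the idea concrete, the sketch below illustrates OSS-Instruct at a high level: a random seed snippet taken from open-source code is embedded into a prompt that asks a teacher LLM to invent a related programming problem and solution. The function name, seed snippet, and prompt wording here are hypothetical placeholders; the actual prompts and data-generation pipeline are in the ise-uiuc/magicoder repo.

# Illustrative sketch of the OSS-Instruct idea (hypothetical names and prompt
# wording; the real prompts and pipeline live in ise-uiuc/magicoder).

def build_oss_instruct_prompt(seed_snippet: str) -> str:
    """Wrap a random open-source code snippet into a data-generation prompt."""
    # The seed snippet grounds the teacher LLM in real, diverse code, which is
    # what reduces the bias of purely self-generated instruction data.
    return (
        "Gain inspiration from the following random code snippet to create a "
        "high-quality programming problem, then provide a correct solution.\n\n"
        "Code snippet for inspiration:\n"
        f"{seed_snippet}\n"
    )

# A stand-in seed snippet; in practice these are mined from open-source repositories.
seed_snippet = "def rolling_sum(xs, k):\n    return [sum(xs[i:i + k]) for i in range(len(xs) - k + 1)]"

oss_instruct_prompt = build_oss_instruct_prompt(seed_snippet)
print(oss_instruct_prompt)

# A teacher LLM (per the Important Note below, OpenAI models were used) would
# now be called on `oss_instruct_prompt`; the resulting problem/solution pair
# becomes one synthetic training example for fine-tuning Magicoder.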

[Figure: Overview of OSS-Instruct]  [Figure: Overview of Results]

Model Details

Model Description

Model Sources

  • Repository: https://github.com/ise-uiuc/magicoder
  • Paper: Magicoder: Source Code Is All You Need (arXiv:2312.02120)

Training Data

Uses

Direct Use

Magicoder models are designed for, and best suited to, coding tasks.

Out-of-Scope Use

Magicoder models may not work well on non-coding tasks.

Bias, Risks, and Limitations

Magicoder models may sometimes make errors, produce misleading content, or struggle with tasks that are not related to coding.

Recommendations

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.

How to Get Started with the Model

Use the code below to get started with the model. Make sure you have installed the transformers library (the example also imports torch).

from transformers import pipeline
import torch

# Magicoder instruction-following prompt format: the user instruction goes
# under "@@ Instruction" and the model completes the text under "@@ Response".
MAGICODER_PROMPT = """You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliable responses to user instructions.

@@ Instruction
{instruction}

@@ Response
"""

instruction = "<Your code instruction here>"  # replace with your own coding instruction

prompt = MAGICODER_PROMPT.format(instruction=instruction)

# Load the model into a text-generation pipeline (bfloat16 weights, automatic
# device placement).
generator = pipeline(
    model="ise-uiuc/Magicoder-S-CL-7B",
    task="text-generation",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Generate a single completion; sampling is off by default, so decoding is greedy.
result = generator(prompt, max_length=1024, num_return_sequences=1, temperature=0.0)
print(result[0]["generated_text"])
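
The pipeline returns the prompt together with the completion in generated_text. If you only want the model's answer, one option (a small convenience snippet, not part of the official example) is to split on the "@@ Response" marker from the prompt template above:

# Keep only the text generated after the "@@ Response" marker
# (convenience snippet; assumes the MAGICODER_PROMPT template above).
response = result[0]["generated_text"].split("@@ Response", 1)[-1].strip()
print(response)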

Technical Details

Refer to our GitHub repo: ise-uiuc/magicoder.

Citation

@misc{magicoder,
    title={Magicoder: Source Code Is All You Need}, 
    author={Yuxiang Wei and Zhe Wang and Jiawei Liu and Yifeng Ding and Lingming Zhang},
    year={2023},
    eprint={2312.02120},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}

Acknowledgements

Important Note

Magicoder models are trained on synthetic data generated by OpenAI models. Please pay attention to OpenAI's terms of use when using these models and datasets. Magicoder models will not compete with OpenAI's commercial products.
