---
license: mit
license_link: https://huggingface.co/Vezora/WaveCoder-6.7b-Ultra-bf16/blob/main/LICENSE
language:
  - en
library_name: transformers
datasets:
  - humaneval
pipeline_tag: text-generation
tags:
  - code
metrics:
  - code_eval
---

This is a re-upload of WaveCoder-Ultra in bf16; the original model was uploaded in fp32 and no other bf16 version is available. Licensing remains the same as the original base model.

# 🌊 WaveCoder: Widespread And Versatile Enhanced Code LLM

[📜 Paper][🐱 GitHub]
[🐦 Twitter][💬 Reddit][🍀 Unofficial Blog]

Repo for "WaveCoder: Widespread And Versatile Enhanced Instruction Tuning with Refined Data Generation"

## 🔥 News

- [2024/04/10] 🔥🔥🔥 WaveCoder repo and models released at 🤗 HuggingFace!
- [2023/12/26] WaveCoder paper released.

## 💡 Introduction

WaveCoder 🌊 is a series of large language models (LLMs) for the coding domain, designed to solve code-related problems through instruction-following learning. Its training dataset was generated from a subset of CodeSearchNet data using the LLM-based generator-discriminator framework proposed in the paper, and covers four general code-related tasks: code generation, code summarization, code translation, and code repair.

| Model | HumanEval | MBPP (500) | HumanEval Fix (Avg.) | HumanEval Explain (Avg.) |
|---|---|---|---|---|
| GPT-4 | 85.4 | - | 47.8 | 52.1 |
| 🌊 WaveCoder-DS-6.7B | 65.8 | 63.0 | 49.5 | 40.8 |
| 🌊 WaveCoder-Pro-6.7B | 74.4 | 63.4 | 52.1 | 43.0 |
| 🌊 WaveCoder-Ultra-6.7B | 79.9 | 64.6 | 52.3 | 45.7 |
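The HumanEval and MBPP columns report functional-correctness (pass@k-style) scores. As a rough illustration of how such numbers are computed, the sketch below uses the `code_eval` metric from the Hugging Face `evaluate` library (listed in the metadata above); this is only an assumption for illustration, and the authors' actual evaluation harness lives in the GitHub repo.

```python
# Minimal pass@k sketch with the `evaluate` library's code_eval metric.
# Assumption: a toy problem and candidates, not the real HumanEval harness.
import os

os.environ["HF_ALLOW_CODE_EVAL"] = "1"  # code_eval executes generated code; opt in explicitly

import evaluate

code_eval = evaluate.load("code_eval")

# One problem: its unit test, plus two model-generated candidate solutions.
test_cases = ["assert add(2, 3) == 5"]
candidates = [["def add(a, b):\n    return a + b", "def add(a, b):\n    return a - b"]]

pass_at_k, _ = code_eval.compute(references=test_cases, predictions=candidates, k=[1, 2])
print(pass_at_k)  # e.g. {'pass@1': 0.5, 'pass@2': 1.0}
```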

## 🪁 Evaluation

Please refer to WaveCoder's GitHub repo for inference, evaluation, and training code.

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("microsoft/wavecoder-ultra-6.7b")
model = AutoModelForCausalLM.from_pretrained("microsoft/wavecoder-ultra-6.7b")
```
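To load the bf16 weights hosted in this repository and run a quick generation, a minimal sketch is shown below. It assumes this model card's repo id (`Vezora/WaveCoder-6.7b-Ultra-bf16`) and a plain-text prompt; the official prompt format and full inference scripts are in WaveCoder's GitHub repo.

```python
# Minimal generation sketch (assumption: plain-text prompting; see the GitHub
# repo for the official prompt template and inference code).
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Vezora/WaveCoder-6.7b-Ultra-bf16"  # this card's bf16 re-upload
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in bf16
    device_map="auto",           # requires `accelerate`
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```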

## 📖 License

This code repository is licensed under the MIT License. The use of DeepSeek Coder models is subject to its license.

## ☕️ Citation

If you find this repository helpful, please consider citing our paper:

```bibtex
@article{yu2023wavecoder,
  title={Wavecoder: Widespread and versatile enhanced instruction tuning with refined data generation},
  author={Yu, Zhaojian and Zhang, Xin and Shang, Ning and Huang, Yangyu and Xu, Can and Zhao, Yishujie and Hu, Wenxiang and Yin, Qiufeng},
  journal={arXiv preprint arXiv:2312.14187},
  year={2023}
}
```

## Note

WaveCoder models are trained on the synthetic data generated by OpenAI models. Please pay attention to OpenAI's terms of use when using the models and the datasets.