|
--- |
|
license: cc |
|
datasets: |
|
- VMware/open-instruct-v1-oasst-dolly-hhrlhf |
|
- conceptofmind/cot_submix_original |
|
language: |
|
- en |
|
library_name: transformers |
|
pipeline_tag: text-generation |
|
--- |
|
|
|
# VMware/xgen-7b-8k-open-instruct |
|
Instruction-tuned version of [Salesforce/xgen-7b-8k-base](https://huggingface.co/Salesforce/xgen-7b-8k-base). The model is open for <b>COMMERCIAL USE</b>. <br>
|
|
|
<b>NOTE</b>: The model was trained using the Alpaca prompt template. <br>

<b>NOTE</b>: The tiktoken library is required for the tokenizer; pass trust_remote_code=True when loading it.<br>
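
For reference, the exact template (also used in the usage example below) is:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Response:
```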
|
|
|
We expanded Open-instruct-v1 with additional commercially viable zero-shot CoT datasets from Flan v2 (~70k examples). <br>
|
|
|
|
|
Open-instruct-v1 |
|
- Mosaic/Dolly-HHRLHF + filtered OASST1 - CC BY 3.0
|
|
|
Subset of CoT Submix (from Flan v2), zero-shot examples
|
- ESNLI - MIT

- ECQA - CDLA-Sharing 1.0

- StrategyQA - MIT

- CREAK - MIT

- GSM8K - MIT

- AQuA - MIT

- QASC - Apache 2.0
|
|
|
<br> |
|
|
|
The model supports a context length of up to <b>8192 tokens</b>.
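
If a prompt risks exceeding the context window, it can be truncated at tokenization time. A minimal sketch, assuming the tokenizer loaded as in the usage example below (`long_prompt` is a hypothetical input):

```
# Cap the prompt at the model's 8192-token context window.
# `long_prompt` is a placeholder; tokenizer is loaded as shown in the usage example below.
input_ids = tokenizer(
    long_prompt,
    return_tensors="pt",
    truncation=True,
    max_length=8192,  # model's maximum context length
).input_ids
```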
|
|
|
|
|
|
|
## License |
|
- <b>Commercially viable</b>

- The instruction datasets used for instruction tuning are open for commercial usage (see the dataset list above).

- The base language model, [Salesforce/xgen-7b-8k-base](https://huggingface.co/Salesforce/xgen-7b-8k-base), is under Apache 2.0.
|
|
|
|
|
|
|
## Use in Transformers |
|
|
|
```
# tiktoken is required by the XGen tokenizer; transformers and torch are needed for the example below
pip install transformers torch tiktoken
```
|
|
|
```
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = 'VMware/xgen-7b-8k-open-instruct'

# The XGen tokenizer is built on tiktoken, so trust_remote_code must be enabled
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False, trust_remote_code=True)

model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map='sequential')

# Alpaca prompt template used during instruction tuning
prompt_template = "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Response:"

prompt = 'Explain with examples how differentiation in math works'

input_text = prompt_template.format(instruction=prompt)
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")

output_ids = model.generate(input_ids, max_length=8192)

# Keep only the newly generated tokens, dropping the echoed prompt
input_length = input_ids.shape[1]
output = tokenizer.decode(output_ids[0, input_length:])

print(output)
```
|
|
|
```
# output

Differentiation is a fundamental concept in calculus and related mathematical disciplines, which allows us to find the rate of change of a function at a particular point. Here are some examples of how differentiation works:
|
|
|
1. Finding the slope of a curve: One of the most common uses of differentiation is to find the slope of a curve at a particular point. This can be done by evaluating the derivative of the function at the point in question, which essentially measures the rate of change of the function at that point. For example, consider the following function: |
|
|
|
f(x) = 2x + 3 |
|
|
|
To find the derivative of this function at the point x = 2, we would evaluate the function at that point and then differentiate it: |
|
|
|
f'(x) = 2 + 3 |
|
|
|
This derivative evaluates to 2, which tells us that the slope of the curve at x = 2 is 2. |
|
|
|
2. Finding the instantaneous rate of change of a function: In calculus, the instantaneous rate of change of a function at a point is the derivative of the function at that point. This is particularly useful in situations where the time variable is not explicitly included in the function. For example, consider the following function: |
|
|
|
f(x) = 2x^3 + 5x^2 - 3x + 1 |
|
|
|
To find the instantaneous rate of change of this function at the point x = 2, we would differentiate it with respect to x at that point: |
|
|
|
f'(x) = 6x^2 + 10x - 3 |
|
|
|
This derivative evaluates to 6, which tells us that the instantaneous rate of change of the function at x = 2 is 6. |
|
|
|
3. Finding the concavity of a function: The concavity of a function is a measure of how the curve changes shape as you move away from the point of inflection. For example, consider the following function: |
|
|
|
f(x) = x^3 - 2x^2 + 1 |
|
|
|
To find the concavity of this function at the point x = 1, we would differentiate it with respect to x at that point: |
|
|
|
f'(x) = 3x^2 - 4x + 1 |
|
|
|
This derivative evaluates to 3, which tells us that the concavity of the function at x = 1 is negative, which means that the curve is concave downward at that point. |
|
|
|
4. Finding the critical points of a function: Critical points of a function are points where the derivative of the function is equal to zero. These points are important in optimization problems, where they can be used to determine the values of the independent variables that minimize or maximize the value of the function. For example, consider the following function: |
|
|
|
f(x) = x^3 - 2x^2 + 1 |
|
|
|
To find the critical points of this function, we would differentiate it with respect to x and then set the derivative equal to zero: |
|
|
|
f'(x) = 3x^2 - 4x + 1 = 0 |
|
|
|
This evaluates to x = 0, which tells us that the critical points of the function are all equal to zero. |
|
|
|
5. Finding the maximum and minimum values of a function: The maximum and minimum values of a function are the largest and smallest values that the function can take, respectively. These values can be found by differentiating the function and setting the derivative to zero, which essentially finds the points where the curve "bends" the most. For example, consider the following function: |
|
|
|
f(x) = x^3 - 2x^2 + 1 |
|
|
|
To find the maximum and minimum values of this function, we would differentiate it with respect to x and then set the derivative equal to zero: |
|
|
|
f'(x) = 3x^2 - 4x + 1 = 0 |
|
|
|
This evaluates to x = 0, which tells us that the maximum and minimum values of the function are both 0. |
|
|
|
These are just a few examples of how differentiation can be used in mathematics. In general, differentiation allows us to find the rate of change, slope, instantaneous rate of change, critical points, and maximum and minimum values of a function at a particular point or over a particular interval.<|endoftext|>
```
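
The example above uses greedy decoding with `max_length=8192`, which generates until the context window is exhausted or an end-of-text token is produced. For shorter or more varied completions, sampling is an option; a minimal sketch, with parameter values that are illustrative assumptions rather than tuned recommendations:

```
# Sampling variant; the specific values below are illustrative, not tuned for this model.
output_ids = model.generate(
    input_ids,
    max_new_tokens=512,   # bound the generated length rather than the total length
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(output_ids[0, input_ids.shape[1]:]))
```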
|
|
|
## Finetuning details |
|
The finetuning scripts will be available in our [RAIL GitHub repository](https://github.com/vmware-labs/research-and-development-artificial-intelligence-lab/tree/main/instruction-tuning)
|
## Evaluation |
|
|
|
<b>TODO</b>