---
license: cc
datasets:
- VMware/open-instruct-v1-oasst-dolly-hhrlhf
- conceptofmind/cot_submix_original
language:
- en
library_name: transformers
pipeline_tag: text-generation
---
# VMware/xgen-7b-8k-open-instruct
Instruction-tuned version of [Salesforce/xgen-7b-8k-base](https://huggingface.co/Salesforce/xgen-7b-8k-base). The model is open for <b>COMMERCIAL USE</b>. <br>
<b>NOTE</b>: The model was trained using the Alpaca prompt template. <br>
<b>NOTE</b>: The tiktoken library is required by the tokenizer; set trust_remote_code=True when loading it. <br>
We expanded Open-instruct with additional commercially viable zero-shot chain-of-thought (CoT) datasets from Flan v2 (~70k examples). <br>
Open-instruct-v1
- Mosaic/Dolly-HHRLHF + filtered OASST1 - CC BY 3.0

Subset of the COT SUBMIX (from Flan v2), zero-shot examples:
- ESNLI - MIT
- ECQA - CDLA 1.0 - Sharing
- StrategyQA - MIT
- CREAK - MIT
- GSM8K - MIT
- AQuA - MIT
- QASC - Apache 2.0
<br>
The model supports sequences of up to <b>8192 tokens</b>.
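Because the prompt and the generated continuation share the same 8192-token window, the number of tokens that can still be generated shrinks as the prompt grows. A minimal sketch of that budgeting (the constant comes from this card; the helper name is ours, not part of any API):

```python
# Context window shared by the prompt and the completion (from the model card)
MAX_CONTEXT = 8192

def generation_budget(prompt_tokens: int, max_context: int = MAX_CONTEXT) -> int:
    """Return how many new tokens can still be generated for a prompt of the given length."""
    if prompt_tokens >= max_context:
        return 0  # the prompt alone already fills the context window
    return max_context - prompt_tokens
```

For example, a 100-token prompt leaves a budget of 8092 new tokens, which is one way to pick `max_new_tokens` for `generate`.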
## License
- <b>Commercially Viable</b>
- The instruction datasets used for instruction tuning are open for commercial usage (see the dataset list above)
- The base language model, [Salesforce/xgen-7b-8k-base](https://huggingface.co/Salesforce/xgen-7b-8k-base), is under apache-2.0
## Use in Transformers
```
pip install tiktoken
```
```
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = 'VMware/xgen-7b-8k-open-instruct'

# tiktoken is required by the tokenizer; trust_remote_code allows its custom code to load
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map='sequential')

# Alpaca prompt template the model was instruction-tuned on
prompt_template = "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Response:"

prompt = 'Explain with examples how differentiation in math works'
input_text = prompt_template.format(instruction=prompt)
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")

output_ids = model.generate(input_ids, max_length=8192)

# Strip the prompt tokens so only the generated response is decoded
input_length = input_ids.shape[1]
output_ids = output_ids[:, input_length:]
print(tokenizer.decode(output_ids[0]))
```
```
# output
Differentiation is a fundamental concept in calculus and related mathematical disciplines, which allows us to find the rate of change of a function at a particular point. Here are some examples of how differentiation works:
1. Finding the slope of a curve: One of the most common uses of differentiation is to find the slope of a curve at a particular point. This can be done by evaluating the derivative of the function at the point in question, which essentially measures the rate of change of the function at that point. For example, consider the following function:
f(x) = 2x + 3
To find the derivative of this function at the point x = 2, we would evaluate the function at that point and then differentiate it:
f'(x) = 2 + 3
This derivative evaluates to 2, which tells us that the slope of the curve at x = 2 is 2.
2. Finding the instantaneous rate of change of a function: In calculus, the instantaneous rate of change of a function at a point is the derivative of the function at that point. This is particularly useful in situations where the time variable is not explicitly included in the function. For example, consider the following function:
f(x) = 2x^3 + 5x^2 - 3x + 1
To find the instantaneous rate of change of this function at the point x = 2, we would differentiate it with respect to x at that point:
f'(x) = 6x^2 + 10x - 3
This derivative evaluates to 6, which tells us that the instantaneous rate of change of the function at x = 2 is 6.
3. Finding the concavity of a function: The concavity of a function is a measure of how the curve changes shape as you move away from the point of inflection. For example, consider the following function:
f(x) = x^3 - 2x^2 + 1
To find the concavity of this function at the point x = 1, we would differentiate it with respect to x at that point:
f'(x) = 3x^2 - 4x + 1
This derivative evaluates to 3, which tells us that the concavity of the function at x = 1 is negative, which means that the curve is concave downward at that point.
4. Finding the critical points of a function: Critical points of a function are points where the derivative of the function is equal to zero. These points are important in optimization problems, where they can be used to determine the values of the independent variables that minimize or maximize the value of the function. For example, consider the following function:
f(x) = x^3 - 2x^2 + 1
To find the critical points of this function, we would differentiate it with respect to x and then set the derivative equal to zero:
f'(x) = 3x^2 - 4x + 1 = 0
This evaluates to x = 0, which tells us that the critical points of the function are all equal to zero.
5. Finding the maximum and minimum values of a function: The maximum and minimum values of a function are the largest and smallest values that the function can take, respectively. These values can be found by differentiating the function and setting the derivative to zero, which essentially finds the points where the curve "bends" the most. For example, consider the following function:
f(x) = x^3 - 2x^2 + 1
To find the maximum and minimum values of this function, we would differentiate it with respect to x and then set the derivative equal to zero:
f'(x) = 3x^2 - 4x + 1 = 0
This evaluates to x = 0, which tells us that the maximum and minimum values of the function are both 0.
These are just a few examples of how differentiation can be used in mathematics. In general, differentiation allows us to find the rate of change, slope, instantaneous rate of change, critical points, and maximum and minimum values of a function at a particular point or over a particular interval.<|endoftext|>
```
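The Alpaca-style prompt construction used in the example above can be factored into a small helper (the function name is ours for illustration, not part of the model's API):

```python
# Alpaca template the model was instruction-tuned on (from the usage example above)
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request."
    "\n\n### Instruction:\n{instruction}\n\n### Response:"
)

def build_prompt(instruction: str) -> str:
    """Format a user instruction with the Alpaca template before tokenization."""
    return ALPACA_TEMPLATE.format(instruction=instruction)
```

The formatted string can then be passed to the tokenizer exactly as in the snippet above; keeping the template in one place avoids drift between experiments.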
## Finetuning details
The finetuning scripts will be available in our [RAIL GitHub Repository](https://github.com/vmware-labs/research-and-development-artificial-intelligence-lab/tree/main/instruction-tuning).
## Evaluation
<B>TODO</B> |