Update README.md
README.md

```python
inputs = tokenizer.encode("Machine Learning is", return_tensors="pt").to(device)
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
```

## Intermediate checkpoints

We release intermediate checkpoints for this model every 1,000 training steps, each in a separate branch. Branch names follow the convention `step-001000-2BT`.

You can load a specific model revision with `transformers` by passing the `revision` argument:

```python
model = AutoModelForCausalLM.from_pretrained(checkpoint, revision="step-001000-2BT")
```
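
The `revision` argument also works for `AutoTokenizer`, so you can pair a checkpoint with the tokenizer files as stored on that same branch. A minimal sketch, assuming `checkpoint` is the Hub repo id used in the examples above and that the branch includes tokenizer files:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

revision = "step-001000-2BT"
# Assumption: `checkpoint` is the Hub repo id used earlier in this README.
model = AutoModelForCausalLM.from_pretrained(checkpoint, revision=revision)
tokenizer = AutoTokenizer.from_pretrained(checkpoint, revision=revision)
```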

You can list all the available revisions for the model with the following code:

```python
from huggingface_hub import list_repo_refs
out = list_repo_refs(checkpoint)
branches = [b.name for b in out.branches]
```
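
Putting the two snippets together, one way to visit every intermediate checkpoint is to filter the branch list by the `step-` prefix and load each revision in turn. A sketch under the naming convention above; the filter and the evaluation placeholder are illustrative, not part of the release:

```python
from huggingface_hub import list_repo_refs
from transformers import AutoModelForCausalLM

out = list_repo_refs(checkpoint)
# Zero-padded step numbers make a lexicographic sort match step order.
step_branches = sorted(b.name for b in out.branches if b.name.startswith("step-"))

for branch in step_branches:
    model = AutoModelForCausalLM.from_pretrained(checkpoint, revision=branch)
    # ... evaluate or inspect this checkpoint here ...
```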

## Training
### Model
- Architecture: Llama model