phoebeklett committed
Commit 7a2fcfd
Parent(s): da61f3d
Update README.md

README.md CHANGED
@@ -27,10 +27,10 @@ from transformers import AutoModelForCausalLM, AutoTokenizer
 
 ag_wiki_entry = """Alexander Grothendieck (/ˈɡroʊtəndiːk/; German pronunciation: [ˌalɛˈksandɐ ˈɡʁoːtn̩ˌdiːk] (listen); French: [ɡʁɔtɛndik]; 28 March 1928 – 13 November 2014) was a stateless (and then, since 1971, French) mathematician who became the leading figure in the creation of modern algebraic geometry.[7][8] His research extended the scope of the field and added elements of commutative algebra, homological algebra, sheaf theory, and category theory to its foundations, while his so-called "relative" perspective led to revolutionary advances in many areas of pure mathematics.[7][9] He is considered by many to be the greatest mathematician of the twentieth century.[10][11]"""
 
-tokenizer_hf = AutoTokenizer.from_pretrained("normalcomputing/extended-mind-
+tokenizer_hf = AutoTokenizer.from_pretrained("normalcomputing/extended-mind-mpt-30b")
 memories = tokenizer_hf(ag_wiki_entry).input_ids
 
-model_hf = AutoModelForCausalLM.from_pretrained("normalcomputing/extended-mind-
+model_hf = AutoModelForCausalLM.from_pretrained("normalcomputing/extended-mind-mpt-30b", external_memories=memories, trust_remote_code=True)
 ```
 After this, you can generate text with the model as usual. The model will automatically use the memories during generation. You can update any config parameters (we set `topk` below) by passing new values to the `model.generate()` method.
 
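For reference, the README paragraph kept as context in this hunk points to a generation snippet further down in that file, which is not part of this diff. A minimal sketch of what such usage could look like, given the `tokenizer_hf` and `model_hf` objects defined above, is shown below; the prompt text and the `max_new_tokens` and `topk` values are illustrative assumptions, not content of this commit.

```python
# Sketch only: generate with the external memories already attached to model_hf above.
# The prompt and the topk/max_new_tokens values are assumptions for illustration.
inputs = tokenizer_hf(
    "When did Alexander Grothendieck become a French citizen?",
    return_tensors="pt",
).input_ids

# Per the README text, config parameters such as `topk` can be overridden
# by passing new values directly to model.generate().
output = model_hf.generate(inputs, max_new_tokens=50, topk=2)
print(tokenizer_hf.decode(output[0], skip_special_tokens=True))
```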