Update README.md
README.md

```diff
@@ -42,9 +42,9 @@ Soon real PoSE to extend Llama's context length to 64k with using my merge metho
 
 I have found that most of merge's model outside so far do not actually have 64k in their configs. I will improve it in the next merge.
 
-256k is not possible. My computer is running out of memory.
+256k is not possible. My computer is running out of memory.
 
-also, i would like to conduct great tests by building a network with high-capacity traffic and high-speed 10G speeds for you.
+If you support me, i will try it on a computer with maximum specifications, also, i would like to conduct great tests by building a network with high-capacity traffic and high-speed 10G speeds for you.
 
 ### Merge Method
 
```
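The claim in the diff, that many merged "64k" models do not actually carry a 64k context length in their configs, is easy to check: a Llama-family `config.json` stores the trained context window in its `max_position_embeddings` field. A minimal sketch (the file path and helper name are illustrative, not part of the commit):

```python
import json

def context_length(config_path: str) -> int:
    """Read max_position_embeddings from a local Llama-style config.json.

    Returns 0 if the field is missing entirely.
    """
    with open(config_path) as f:
        config = json.load(f)
    return config.get("max_position_embeddings", 0)
```

A model genuinely extended to 64k should report 65536 here; a value like 4096 means the merge never updated the config, which is the problem the commit describes.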