[Feedback]
Good model, as always!
The model breaks for me once the context goes higher than 8k:
Output generated in 42.72 seconds (35.96 tokens/s, 1536 tokens, context 8729, seed 712205367)
Output generated in 38.33 seconds (40.07 tokens/s, 1536 tokens, context 8729, seed 751044917)
Output generated in 38.44 seconds (39.96 tokens/s, 1536 tokens, context 8729, seed 1025991785)
Output generated in 38.78 seconds (39.61 tokens/s, 1536 tokens, context 8729, seed 1942480785)
Output:
Disclosure of Celeste's location poses significant risks exacerbating exposure compromising security compromising trust compromising the delicate equipoise compromising the equipoise compromising equipoise Collapse of equipoise Collapse Collapse Collapse Collapse Collapse Collapse Collapse Collapse Collapse Collapse ... (the word "Collapse" repeats until the 1536-token limit)
Try new 64K version.
The new version has fixed this issue.