bhenrym14 committed
Commit 4170423
1 Parent(s): 8ab4145

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -38,8 +38,8 @@ Here I explore whether training on long sequences that have clear conceptual dep
 
 ## Relative Performance (perplexity)
 
-| Context (tokens) | airophin-13b-pntk-16k-fp16| bhenrym14/airoboros-13b-gpt4-1.4.1-PI-8192-GPTQ |bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16 | jondurbin/airoboros-33B-gpt4-1.4-GPTQ |
-| ---| ------- | -----| ------ | --- | --- |
+| Context (tokens) | airophin-13b-pntk-16k-fp16| bhenrym14/airoboros-13b-gpt4-1.4.1-PI-8192-fp16 |bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16 | jondurbin/airoboros-33B-gpt4-1.4-GPTQ |
+| ---| ----- | -----| ------| --- |
 | 512 | 7.62 | 8.24 | 7.90 | **6.36** |
 | 1024 | 6.20 | 6.71 | 6.17 | **5.12** |
 | 2048 | 5.38 | 5.87 | 5.23 | **4.43** |
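
The table being edited compares perplexity at several context lengths. As a rough illustration of how such figures are commonly obtained (a generic sketch, not the author's evaluation script; the model id and `eval.txt` corpus path are placeholders), perplexity over fixed-size context windows can be measured with `transformers` like this:

```python
# Sketch: perplexity of a causal LM over non-overlapping windows of a fixed
# context length. Assumes a long plain-text corpus in eval.txt and enough
# GPU memory for the chosen window size.
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bhenrym14/airophin-13b-pntk-16k-fp16"  # placeholder model id
context_len = 2048                                  # one of the table's context sizes

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)
model.eval()

text = open("eval.txt").read()                      # placeholder evaluation corpus
input_ids = tokenizer(text, return_tensors="pt").input_ids[0]

nlls, n_tokens = [], 0
with torch.no_grad():
    # Slide over the corpus in non-overlapping windows of context_len tokens.
    for start in range(0, input_ids.size(0) - context_len, context_len):
        window = input_ids[start : start + context_len].unsqueeze(0).to(model.device)
        # With labels == input_ids, the model returns the mean token cross-entropy
        # (over the shifted targets); scale back to a per-window total.
        loss = model(window, labels=window).loss
        nlls.append(loss.float() * context_len)
        n_tokens += context_len

ppl = math.exp(torch.stack(nlls).sum().item() / n_tokens)
print(f"Perplexity @ {context_len} tokens: {ppl:.2f}")
```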