Korbinian Pöppel committed
Commit 54366c1
1 Parent(s): 90f1e33

Fix example.

Files changed (1): README.md (+5, -1)
README.md CHANGED
@@ -28,7 +28,11 @@ xlstm = AutoModelForCausalLM.from_pretrained("NX-AI/xLSTM-7b", device_map="auto"
  # this is a fork of EleutherAI/gpt-neox-20b
  tokenizer = AutoTokenizer.from_pretrained("NX-AI/xLSTM-7b")
 
- xlstm(tokenizer("Hello xLSTM, how are you doing?"))
+ tokens = tokenizer("Hello xLSTM, how are you doing?", return_tensors='pt')['input_ids'].to(device="cuda")
+
+ out = xlstm.generate(tokens, max_new_tokens=20)
+
+ print(tokenizer.decode(out[0]))
  ```
 
  ## Speed results
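
For reference, a sketch of the complete README example as it reads after this fix is given below. The original one-liner passed the raw tokenizer output (a dict of Python lists) straight to the model, which is not a valid model input; the fix tokenizes to PyTorch tensors on the GPU and calls `generate` instead. The import line is an assumption based on standard `transformers` usage and is not part of this hunk; the remaining lines come from the hunk header and the added lines above.

```python
# Sketch of the full README example after this commit. The import line is an
# assumption (standard transformers usage); everything else mirrors the diff.
from transformers import AutoModelForCausalLM, AutoTokenizer

xlstm = AutoModelForCausalLM.from_pretrained("NX-AI/xLSTM-7b", device_map="auto")

# this is a fork of EleutherAI/gpt-neox-20b
tokenizer = AutoTokenizer.from_pretrained("NX-AI/xLSTM-7b")

# Tokenize the prompt into PyTorch tensors and move the input ids to the GPU.
tokens = tokenizer("Hello xLSTM, how are you doing?", return_tensors='pt')['input_ids'].to(device="cuda")

# Generate a short continuation and decode it back into text.
out = xlstm.generate(tokens, max_new_tokens=20)
print(tokenizer.decode(out[0]))
```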