AntonV committed
Commit 2d54e1e
1 Parent(s): b960d2c

Update README.md

Files changed (1)
  1. README.md +39 -3
README.md CHANGED
@@ -1,3 +1,39 @@
- ---
- license: mit
- ---
+ ---
+ tags:
+ - mamba2
+ license: mit
+ library_name: transformers
+ ---
+
+ # mamba2-1.3b-hf
+
+ The files of the original [mamba2-1.3b](https://huggingface.co/state-spaces/mamba2-1.3b) model, converted to a Hugging Face `transformers`-compatible format.
+ Not affiliated with either the original authors or Hugging Face.
+
+ ## Usage
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ tokenizer = AutoTokenizer.from_pretrained("AntonV/mamba2-1.3b-hf")
+ model = AutoModelForCausalLM.from_pretrained("AntonV/mamba2-1.3b-hf")
+
+ input_ids = tokenizer("Hey how are you doing?", return_tensors="pt")["input_ids"]
+ out = model.generate(input_ids, max_new_tokens=10)
+ print(tokenizer.batch_decode(out))
+ ```
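+
+ A minimal sketch of GPU inference in half precision, assuming PyTorch with a CUDA device is available (adjust the device and dtype for your setup):
+
+ ```python
+ import torch
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ device = "cuda"  # assumes a CUDA-capable GPU; use "cpu" otherwise
+
+ tokenizer = AutoTokenizer.from_pretrained("AntonV/mamba2-1.3b-hf")
+ # load the weights in float16 to reduce memory use, then move the model to the device
+ model = AutoModelForCausalLM.from_pretrained(
+     "AntonV/mamba2-1.3b-hf", torch_dtype=torch.float16
+ ).to(device)
+
+ input_ids = tokenizer("Hey how are you doing?", return_tensors="pt")["input_ids"].to(device)
+ out = model.generate(input_ids, max_new_tokens=10)
+ print(tokenizer.batch_decode(out))
+ ```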
+
+
+ ## Citation
+
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
+
+ **BibTeX:**
+
+ ```bibtex
+ @inproceedings{mamba2,
+   title={Transformers are {SSM}s: Generalized Models and Efficient Algorithms Through Structured State Space Duality},
+   author={Dao, Tri and Gu, Albert},
+   booktitle={International Conference on Machine Learning (ICML)},
+   year={2024}
+ }
+ ```