DeepMount00 committed on
Commit 6491825 · verified · 1 Parent(s): c17f4f7

Update README.md

Files changed (1)
  1. README.md +0 -19
README.md CHANGED
@@ -35,25 +35,6 @@ Despite its compact size, Alireo-400M demonstrates impressive performance:
  * **GPU**: Optional, but recommended for faster inference
  * **Disk Space**: ~1GB (including model and dependencies)
 
- <h3 style="font-size: 32px; color: #2980b9;">Usage Example 💡</h3>
-
- ```python
- from transformers import AutoModelForCausalLM, AutoTokenizer
-
- # Load model and tokenizer
- model = AutoModelForCausalLM.from_pretrained("montebovi/alireo-400m")
- tokenizer = AutoTokenizer.from_pretrained("montebovi/alireo-400m")
-
- # Example text
- text = "L'intelligenza artificiale sta"
-
- # Tokenize and generate
- inputs = tokenizer(text, return_tensors="pt")
- outputs = model.generate(**inputs, max_new_tokens=50)
- result = tokenizer.decode(outputs[0], skip_special_tokens=True)
- print(result)
- ```
-
  <h3 style="font-size: 32px; color: #2980b9;">Citation 📄</h3>
 
  ```bibtex
 