prithivMLmods committed
Commit e1a7f43 · verified · 1 Parent(s): 31a5f37

Update README.md

Files changed (1): README.md (+30 −3)
2. **Supervised Fine-Tuning (SFT)**: Aligns the model to specific tasks through curated datasets.
3. **Reinforcement Learning with Human Feedback (RLHF)**: Ensures the model adheres to human values and safety guidelines through iterative training processes.

# **How to use with transformers**

Starting with `transformers >= 4.43.0`, you can run conversational inference using the Transformers `pipeline` abstraction or by leveraging the Auto classes with the `generate()` function.

Make sure to update your transformers installation via `pip install --upgrade transformers`.

```python
import torch
from transformers import pipeline

model_id = "prithivMLmods/Triangulum-10B"
pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]
outputs = pipe(
    messages,
    max_new_tokens=256,
)
# The pipeline returns the full conversation; the last message is the model's reply.
print(outputs[0]["generated_text"][-1])
```
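The Auto-classes route mentioned above works as follows; this is a minimal sketch assuming the checkpoint ships a chat template, as LLaMA-style models typically do:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "prithivMLmods/Triangulum-10B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]
# Apply the model's chat template and tokenize the conversation.
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```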
# **Use Cases**

- Multilingual content generation
- Question answering and dialogue systems
- Text summarization and analysis
- Translation and localization tasks

# **Technical Details**

Triangulum 10B employs a state-of-the-art autoregressive architecture inspired by LLaMA. The optimized transformer framework ensures both efficiency and scalability, making it suitable for a variety of use cases.