STEM-AI-mtl
committed
Commit: af88ba0
Parent(s): 572ff9c
Update README.md
README.md CHANGED
@@ -22,7 +22,7 @@ auto_sample: true
 inference_code: chat-GPTQ.py
 library_tag: transformers
 ---
-#
+# For the electrical engineering community
 
 A unique, deployable and efficient 2.7-billion-parameter model for the field of electrical engineering. This repo contains the adapters from the LoRA fine-tuning of Microsoft's phi-2 model. It was trained on the STEM-AI-mtl/Electrical-engineering dataset combined with garage-bAInd/Open-Platypus.
 
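Since the repo ships LoRA adapters for Microsoft's phi-2 rather than full model weights, a minimal loading sketch with `transformers` and `peft` may help illustrate how the adapters are meant to be used. The adapter repo ID below is a hypothetical placeholder (this diff does not state the repo's full name), and `chat-GPTQ.py` named in the metadata is the repo's own inference script.

```python
# Minimal sketch, assuming a standard PEFT adapter layout on the Hub.
# The adapter repo ID is a placeholder, not confirmed by this diff.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "microsoft/phi-2"            # base model named in the README
adapter_repo_id = "STEM-AI-mtl/<this-repo>"  # hypothetical placeholder for this adapter repo

# Load the frozen base model and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Attach the LoRA adapters from this repo on top of the base weights.
model = PeftModel.from_pretrained(base_model, adapter_repo_id)

# Quick electrical-engineering style prompt to exercise the adapted model.
prompt = "Explain the purpose of a flyback diode in a relay driver circuit."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```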