---
license: other
license_name: stem.ai.mtl
license_link: LICENSE
language:
- en
tags:
- phi-2
- electrical engineering
- Microsoft
datasets:
- STEM-AI-mtl/Electrical-engineering
- garage-bAInd/Open-Platypus
task_categories:
- question-answering
- text-generation
pipeline_tag: text-generation
widget:
- text: "Enter your instruction here"
inference: true
auto_sample: true
inference_code: chat-GPTQ.py
library_tag: transformers
---
# For the electrical engineering community
A unique, deployable, and efficient 2.7-billion-parameter model for the field of electrical engineering. This repo contains the adapters from the LoRA fine-tuning of the phi-2 model from Microsoft. It was trained on the [STEM-AI-mtl/Electrical-engineering](https://huggingface.co/datasets/STEM-AI-mtl/Electrical-engineering) dataset combined with [garage-bAInd/Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus).
- **Developed by:** STEM.AI
- **Model type:** Q&A and code generation
- **Language(s) (NLP):** English
- **Finetuned from model:** [microsoft/phi-2](https://huggingface.co/microsoft/phi-2)
### Direct Use
Q&A related to electrical engineering and the KiCad software, as well as generation of Python code, both general-purpose and for KiCad's scripting console.
Refer to the [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) model card for the recommended prompt format.
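As a minimal sketch, the phi-2 model card recommends an `Instruct:`/`Output:` prompt layout for Q&A-style queries; a small helper like the one below (the function name is illustrative, not part of this repo) can build such prompts before passing them to the inference scripts:

```python
def format_prompt(instruction: str) -> str:
    """Wrap a user instruction in phi-2's recommended Instruct/Output format."""
    return f"Instruct: {instruction}\nOutput:"

# Example: build a prompt for an electrical-engineering question
prompt = format_prompt("State Ohm's law and give the formula.")
print(prompt)
```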
### Inference script
- [Standard](https://github.com/STEM-ai/Phi-2/blob/4eaa6aaa2679427a810ace5a061b9c951942d66a/chat.py)
- [GPTQ format](https://github.com/STEM-ai/Phi-2/blob/ab1ced8d7922765344d824acf1924df99606b4fc/chat-GPTQ.py)
## Training Details
### Training Data
Dataset related to electrical engineering: [STEM-AI-mtl/Electrical-engineering](https://huggingface.co/datasets/STEM-AI-mtl/Electrical-engineering)
It is composed of queries: 65% about general electrical engineering, 25% about KiCad (EDA software), and 10% about Python code for KiCad's scripting console.
In addition, a dataset related to STEM and NLP was used: [garage-bAInd/Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus)
### Training Procedure
[LoRa script](https://github.com/STEM-ai/Phi-2/blob/4eaa6aaa2679427a810ace5a061b9c951942d66a/LoRa.py)
A LoRA PEFT (parameter-efficient fine-tuning) was performed on a 48 GB Nvidia A40 GPU.
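The core idea behind LoRA is that the frozen base weight `W` is adapted by adding a trainable low-rank product `(alpha / r) * B @ A`, so only the small matrices `A` and `B` are trained. The sketch below illustrates that update with plain nested-list matrices; it is a conceptual illustration only, not the training script (see `LoRa.py` above), and the `alpha`/`r` values are illustrative defaults, not the hyperparameters used for this model:

```python
def lora_delta(B, A, alpha, r):
    """Compute the low-rank update (alpha / r) * B @ A on nested-list matrices."""
    scale = alpha / r
    inner, cols = len(A), len(A[0])
    return [[scale * sum(B_row[k] * A[k][j] for k in range(inner))
             for j in range(cols)] for B_row in B]

def apply_lora(W, B, A, alpha=16, r=2):
    """Return the adapted weight W + (alpha / r) * B @ A."""
    delta = lora_delta(B, A, alpha, r)
    return [[W[i][j] + delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]
```

Because `B` is (out_dim x r) and `A` is (r x in_dim) with r much smaller than the weight dimensions, the adapters in this repo are far smaller than the full phi-2 weights, which is what makes the fine-tuning parameter-efficient.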
## Model Card Authors
STEM.AI: stem.ai.mtl@gmail.com\
[William Harbec](https://www.linkedin.com/in/william-harbec-56a262248/)