---
license: mit
language:
- en
pipeline_tag: text2text-generation
library_name: adapter-transformers
---
|
# Model Card for T5 English Grammar Correction

<!-- Briefly summarize what the model is/does. -->

This is an English grammar correction model fine-tuned from T5. Given an input sentence prefixed with `grammar: `, it generates a grammatically corrected version of the sentence.
|
|
|
|
|
## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

- **Developed by:** Amin Rahmani
- **Model type:** T5
- **Language(s) (NLP):** English
- **License:** MIT
|
|
|
## How to Get Started with the Model

```python
from happytransformer import HappyTextToText, TTSettings

# Load the fine-tuned model (replace the path with your local model directory or Hub ID)
happy_tt = HappyTextToText("T5", "PATH TO MODEL")

# Beam-search generation settings
beam_settings = TTSettings(num_beams=8, min_length=1, max_length=100)

# Inputs must carry the "grammar: " task prefix
input_text_1 = "grammar: hi dear"

output_text_1 = happy_tt.generate_text(input_text_1, args=beam_settings)
print(output_text_1.text)
```
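As the quickstart shows, every input must carry the `grammar: ` task prefix. A minimal helper for preparing batches of sentences (the function name and prefix handling here are illustrative, not part of the model's or happytransformer's API):

```python
def prepare_inputs(sentences, prefix="grammar: "):
    """Prepend the task prefix the model expects, skipping
    sentences that already carry it."""
    return [s if s.startswith(prefix) else prefix + s for s in sentences]

batch = prepare_inputs(["hi dear", "grammar: she go to school"])
print(batch)  # ['grammar: hi dear', 'grammar: she go to school']
```

Each prepared string can then be passed to `happy_tt.generate_text` as in the example above.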
|
|
|
|
|
|
|
|
|
|
#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
|
|
|
#### Speeds, Sizes, Times [optional]

- **Validation loss:** 0.04
- **Learning rate:** [More Information Needed]
- **Epochs:** 3
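happytransformer fine-tunes text-to-text models from a CSV of paired examples; the `input`/`target` column names and row layout below are assumptions based on that library's training workflow, so verify them against the happytransformer version you use. A standard-library sketch of building such a file:

```python
import csv
import io

# Hypothetical training rows: a prefixed source sentence paired with its correction.
rows = [
    ("grammar: hi dear", "Hi dear."),
    ("grammar: she go to school every day", "She goes to school every day."),
]

buf = io.StringIO()  # swap in open("train.csv", "w", newline="") to write a real file
writer = csv.writer(buf)
writer.writerow(["input", "target"])  # column names assumed, not confirmed by this card
writer.writerows(rows)
print(buf.getvalue())
```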
|
|
|
|
|
## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** RTX 3090
|
|
|
|
|
## Technical Specifications [optional]