# AmpGPT2

AmpGPT2 is a language model capable of generating de novo antimicrobial peptides (AMPs). Over 95% of sequences generated by AmpGPT2 are predicted to have antimicrobial activity.

## Model description

AmpGPT2 is a fine-tuned version of [nferruz/ProtGPT2](https://huggingface.co/nferruz/ProtGPT2) based on the GPT2 Transformer architecture.

| Model | Sequences generated | AMP percentage (AMP%) | Average length |
|:--------:|:----:|:-----:|:------:|
| AmpGPT2  | 1000 | 95.86 | 64.08  |
| ProtGPT2 | 1000 | 51.85 | 222.59 |

These results show that AmpGPT2 outperforms ProtGPT2 in AMP%, suggesting that the model successfully learned from the AMP-specific fine-tuning data.
To validate the results, the [Antimicrobial Peptide Scanner vr.2](https://www.dveltri.com/ascan/v2/ascan.html), a deep learning tool specifically designed for AMP recognition, was used.
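
The per-sequence statistics above (e.g. average length) can be reproduced with a short script. A minimal sketch in Python, assuming the generated output follows the ProtGPT2-style raw text format, with sequences separated by `<|endoftext|>` markers and line breaks inside each sequence; the helper names and the example input here are illustrative, not part of this repository:

```python
# Compute simple statistics over generated peptide sequences.
# Assumption: raw generator output separates sequences with the
# "<|endoftext|>" token and may contain newlines inside a sequence.

def clean_sequences(raw_text: str) -> list[str]:
    """Split raw generated text into one amino-acid string per sequence."""
    sequences = []
    for chunk in raw_text.split("<|endoftext|>"):
        seq = "".join(chunk.split())  # drop newlines/whitespace within a sequence
        if seq:
            sequences.append(seq)
    return sequences

def average_length(sequences: list[str]) -> float:
    """Mean number of residues per sequence."""
    return sum(len(s) for s in sequences) / len(sequences)

# Toy example (made-up sequences, for illustration only):
raw = "GLFDIVKKVV\nGALGSL<|endoftext|>KWKLFKKIEKVGQNIRDGIIKAGPAVAVVGQATQIAK<|endoftext|>"
seqs = clean_sequences(raw)
print(len(seqs), average_length(seqs))  # → 2 26.5
```

The AMP% column, by contrast, cannot be computed locally this way: it comes from classifying each cleaned sequence with the external Antimicrobial Peptide Scanner tool.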

## Training and evaluation data