wabu committed on
Commit
ede8220
1 Parent(s): 9ddd36d

Update README.md

Files changed (1)
  1. README.md +9 -5
README.md CHANGED
@@ -24,8 +24,12 @@ To validate the results the Antimicrobial Peptide Scanner vr.2 (https://www.dvel
 
## Training and evaluation data

- run_clm.py

### Training hyperparameters

The following hyperparameters were used during training:
@@ -38,11 +42,11 @@ The following hyperparameters were used during training:
- num_epochs: 50.0

### Training results
- tr

- | Training Loss | Epoch | Step | Validation Loss | Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|
- | 3.7948 | 50.0 | 7400 | 3.9890 | 0.4213 |


### Framework versions
 
## Training and evaluation data

+ AmpGPT2 was trained on 32,014 AMP sequences from the Compass database (https://compass.mathematik.uni-marburg.de/).

+ ## How to use AmpGPT2
+ ```python
+ from transformers import pipeline
+ from transformers import GPT2LMHeadModel, GPT2Tokenizer
+ 
+ # Load the generation pipeline plus the underlying model and tokenizer
+ ampgpt2 = pipeline('text-generation', model="wabu/AmpGPT2")
+ model_amp = GPT2LMHeadModel.from_pretrained('wabu/AmpGPT2')
+ tokenizer_amp = GPT2Tokenizer.from_pretrained('wabu/AmpGPT2')
+ 
+ # Sample 10 candidate sequences from an empty prompt
+ amp_sequences = ampgpt2(
+     "",
+     do_sample=True,
+     repetition_penalty=1.2,
+     num_return_sequences=10,
+     eos_token_id=0,
+ )
+ 
+ # Print the generated sequences in FASTA format
+ for i, seq in enumerate(amp_sequences):
+     sequence_identifier = f"Sequence_{i + 1}"
+     sequence = seq['generated_text'].strip()
+     print(f">{sequence_identifier}\n{sequence}")
+ ```
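The FASTA-printing step above can be sketched in isolation, without downloading the model. The mock `amp_sequences` list below imitates the list-of-dicts output of the `transformers` text-generation pipeline; the two peptide strings are placeholders for illustration, not real model output:

```python
# Mock of the text-generation pipeline output: a list of dicts,
# each carrying the generated string under 'generated_text'.
# The sequences here are placeholders, not AmpGPT2 output.
amp_sequences = [
    {'generated_text': 'GLFDIVKKVVGALGSL'},
    {'generated_text': 'KWKLFKKIEKVGQNIR'},
]

def to_fasta(sequences):
    """Format pipeline output as FASTA records (>Sequence_N header + sequence)."""
    records = []
    for i, seq in enumerate(sequences):
        header = f">Sequence_{i + 1}"
        records.append(f"{header}\n{seq['generated_text'].strip()}")
    return "\n".join(records)

print(to_fasta(amp_sequences))
```

The resulting FASTA text can be fed directly to downstream validators such as the Antimicrobial Peptide Scanner mentioned above.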
### Training hyperparameters

The following hyperparameters were used during training:
- num_epochs: 50.0

### Training results
+ These are the training and validation metrics after the final epoch:

+ | Training Loss | Epoch | Validation Loss | Accuracy |
+ |:-------------:|:-----:|:---------------:|:--------:|
+ | 3.7948 | 50.0 | 3.9890 | 0.4213 |


### Framework versions
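Assuming the validation loss in the table above is the mean token-level cross-entropy in nats (as reported by causal-LM training scripts such as Hugging Face's `run_clm.py`), the corresponding validation perplexity can be derived as a quick sanity check:

```python
import math

# Validation loss from the results table, assumed to be
# mean cross-entropy per token in nats.
validation_loss = 3.9890

# Perplexity is the exponential of the cross-entropy loss.
perplexity = math.exp(validation_loss)
print(f"Perplexity: {perplexity:.1f}")  # ≈ 54
```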