---
license: mit
tags:
- chemistry
- smiles
widget:
- text: "^"
example_title: "Sample molecule | SMILES"
---
# Model Card for hogru/MolReactGen-GuacaMol-Molecules
<!-- Provide a quick summary of what the model is/does. -->
MolReactGen is a model that generates molecules in SMILES format (this model) and [reaction templates in SMARTS format](https://huggingface.co/hogru/MolReactGen-USPTO50K-Reaction-Templates).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
MolReactGen is based on the GPT-2 transformer decoder architecture and has been trained on the [GuacaMol dataset](https://figshare.com/projects/GuacaMol/56639). More information can be found in these [introductory slides](https://github.com/hogru/MolReactGen/blob/main/presentations/Slides%20(A4%20size).pdf).
- **Developed by:** Stephan Holzgruber
- **Model type:** Transformer decoder
- **License:** MIT
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/hogru/MolReactGen
- **Presentation:** https://github.com/hogru/MolReactGen/blob/main/presentations/Slides%20(A4%20size).pdf
- **Poster:** https://github.com/hogru/MolReactGen/blob/main/presentations/Poster%20(A0%20size).pdf
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
The main use of this model is to help the author pass his master's examination ;-)
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
The model can be used in a Hugging Face text generation pipeline. For the intended use case, a wrapper around the raw text generation pipeline is needed; this wrapper is [`generate.py` in the repository](https://github.com/hogru/MolReactGen/blob/main/src/molreactgen/generate.py).
The model ships with a default `GenerationConfig()` (`generation_config.json`) that can be overridden. Depending on the number of molecules to be generated (`num_return_sequences` in the `JSON` file), generation might take a while; the generation script mentioned above shows a progress bar while it runs.
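For quick experiments without the wrapper, the raw pipeline can be used directly. Below is a minimal sketch: the `"^"` prompt mirrors the widget example in this card's metadata, while the sampling settings are assumptions; `generate.py` remains the intended entry point.

```python
from transformers import pipeline

# Minimal raw-pipeline sketch; generate.py in the repository is the intended
# entry point and adds post-processing on top of this.
generator = pipeline(
    "text-generation",
    model="hogru/MolReactGen-GuacaMol-Molecules",
)

# "^" is the prompt from the widget example in this card's metadata; the
# model's default GenerationConfig supplies the remaining generation settings.
outputs = generator(
    "^",
    num_return_sequences=10,  # number of molecules to generate
    do_sample=True,           # sampling is required for multiple distinct sequences
)

smiles = [output["generated_text"] for output in outputs]
print(smiles)
```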
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
The model generates molecules that are similar to the GuacaMol training data, which itself is based on [ChEMBL](https://www.ebi.ac.uk/chembl/). Any checks on the generated molecules, e.g. for chemical feasibility, must be addressed by the user of the model.
## Training Details
### Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[GuacaMol dataset](https://figshare.com/projects/GuacaMol/56639)
### Training Procedure
The default Hugging Face `Trainer()` has been used, with an `EarlyStoppingCallback()`.
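A minimal sketch of this setup is shown below; the datasets, the patience value, and the model config details beyond the vocabulary size are placeholders rather than the repository's actual configuration.

```python
from transformers import (
    EarlyStoppingCallback,
    GPT2Config,
    GPT2LMHeadModel,
    Trainer,
    TrainingArguments,
)

# A fresh GPT-2 style decoder; apart from the vocabulary size (88, see
# "Preprocessing" below) the config details are assumptions.
model = GPT2LMHeadModel(GPT2Config(vocab_size=88))

# Early stopping needs periodic evaluation and checkpointing so the best
# checkpoint can be restored at the end of training.
training_args = TrainingArguments(
    output_dir="checkpoints",      # placeholder path
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,   # required by EarlyStoppingCallback
    # plus the hyperparameters listed under "Training Hyperparameters" below
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,   # tokenized GuacaMol training split (placeholder)
    eval_dataset=eval_dataset,     # tokenized validation split (placeholder)
    callbacks=[EarlyStoppingCallback(early_stopping_patience=3)],  # patience is an assumption
)
trainer.train()
```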
### Preprocessing
The training data was pre-processed with a `PreTrainedTokenizerFast()` trained on the same data, using a character-level pre-tokenizer and Unigram as the sub-word tokenization algorithm with a vocabulary size of 88. Other tokenizers can be configured.
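As an illustration, such a tokenizer could be built with the `tokenizers` library roughly as follows; the special tokens and the toy corpus are assumptions, and the repository's tokenizer configuration is authoritative.

```python
from tokenizers import Regex, Tokenizer, models, pre_tokenizers, trainers
from transformers import PreTrainedTokenizerFast

# Unigram model with a character-level pre-tokenizer: every character becomes
# its own "word" before sub-word training.
tokenizer = Tokenizer(models.Unigram())
tokenizer.pre_tokenizer = pre_tokenizers.Split(Regex("."), behavior="isolated")

trainer = trainers.UnigramTrainer(
    vocab_size=88,
    special_tokens=["<unk>"],  # assumed; the repository may define more
    unk_token="<unk>",
)

smiles = ["CCO", "c1ccccc1", "CC(=O)O"]  # toy corpus for illustration
tokenizer.train_from_iterator(smiles, trainer=trainer)

# Wrap the trained tokenizer for use with the transformers library.
fast_tokenizer = PreTrainedTokenizerFast(tokenizer_object=tokenizer)
```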
### Training Hyperparameters
- **Batch size:** 64
- **Gradient accumulation steps:** 4
- **Mixed precision:** fp16 (native AMP)
- **Learning rate:** 0.0025
- **Learning rate scheduler:** Cosine
- **Learning rate scheduler warmup:** 0.1
- **Optimizer:** AdamW with betas=(0.9,0.95) and epsilon=1e-08
- **Number of epochs:** 50
More configuration (options) can be found in the [`conf`](https://github.com/hogru/MolReactGen/tree/main/src/molreactgen/conf) directory of the repository.
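Mapped onto Hugging Face `TrainingArguments`, these settings would look roughly like this; `output_dir` is a placeholder, and the batch size is assumed to be per device.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="checkpoints",        # placeholder path
    per_device_train_batch_size=64,  # "Batch size" above, assumed per device
    gradient_accumulation_steps=4,
    fp16=True,                       # mixed precision via native AMP
    learning_rate=2.5e-3,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.95,
    adam_epsilon=1e-8,
    num_train_epochs=50,
)
```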
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
Please see the slides and poster linked above.
### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
Please see the slides and poster linked above.
### Results
Please see the slides and poster linked above.
## Technical Specifications
### Framework versions
- Transformers 4.27.1
- PyTorch 1.13.1
- Datasets 2.10.1
- Tokenizers 0.13.2
### Hardware
- Local PC running Ubuntu 22.04
- NVIDIA GeForce RTX 3080 Ti (12 GB)