HUANGYIFEI committed: add steps in readme.md
Graph/GraphMAE_MQ9/README.md
## Overview

We run the Graph Mask AutoEncoder on the QM9 dataset for pretraining. We use each atom's position together with a learned embedding of its element type as the input feature (dim=7), and reconstruct the input feature with GraphSAGE using a 4-dim hidden representation.
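As a rough illustration of the input described above, the 7-dim feature could come from concatenating the 3-D atom position with a 4-dim element-type embedding. The 3 + 4 = 7 split, the number of element types, and all names here are assumptions for the sketch, not code from this repository:

```python
import torch
import torch.nn as nn

# Hypothetical sketch: build a 7-dim per-atom input feature by concatenating
# each atom's 3-D position with a 4-dim learned embedding of its element type
# (3 + 4 = 7). Sizes and names are assumptions, not the repository's code.
NUM_ELEMENT_TYPES = 5   # e.g. H, C, N, O, F in QM9
ELEM_EMB_DIM = 4        # assumed so that 3 (position) + 4 (embedding) = 7

elem_embed = nn.Embedding(NUM_ELEMENT_TYPES, ELEM_EMB_DIM)

pos = torch.randn(10, 3)                            # 10 atoms, xyz coordinates
elem = torch.randint(0, NUM_ELEMENT_TYPES, (10,))   # element-type indices

# Concatenate position and element embedding into the 7-dim input feature.
feat = torch.cat([pos, elem_embed(elem)], dim=1)
print(feat.shape)  # torch.Size([10, 7])
```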

## How to run

### If you do not want to re-train the model

- **Unzip `model.zip`** to get the model weights and the embedded graph from each epoch.

### If you want to try out the training process

- step1. **Preprocess the dataset** (we also provide the preprocessed dataset)

```bash
python prepare_QM9_dataset.py --label_keys "mu" "gap"
```
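The `--label_keys` flag above takes several space-separated QM9 target names. A minimal sketch of how such a flag can be parsed with `argparse`; the actual `prepare_QM9_dataset.py` may define its arguments differently:

```python
import argparse

# Hypothetical sketch: parse a --label_keys flag accepting several
# space-separated target names, as in the command above. The real
# prepare_QM9_dataset.py may differ.
parser = argparse.ArgumentParser(description="Preprocess the QM9 dataset")
parser.add_argument("--label_keys", nargs="+", required=True,
                    help="QM9 regression targets to keep, e.g. mu gap")

args = parser.parse_args(["--label_keys", "mu", "gap"])
print(args.label_keys)  # ['mu', 'gap']
```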

- step2. **Train the Graph Mask AutoEncoder on the preprocessed dataset**

```bash
python run.py [--dataset_path] [--batch_size] [--epochs] [--device] [--save_dir]
```
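The model `run.py` trains is described in the Overview as GraphSAGE with a 4-dim hidden representation. A self-contained sketch of one mean-aggregation GraphSAGE layer over a dense adjacency matrix, as an illustration only, not the repository's actual model:

```python
import torch
import torch.nn as nn

# Illustrative mean-aggregation GraphSAGE layer with a 4-dim hidden
# representation, matching the Overview's description in spirit.
# Dense adjacency and all names here are assumptions for the sketch.
class SAGELayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        # Concatenate self and neighbor features, then project.
        self.lin = nn.Linear(2 * in_dim, out_dim)

    def forward(self, feat, adj):
        # adj: dense (N, N) adjacency; mean-pool neighbor features.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = (adj @ feat) / deg
        return torch.relu(self.lin(torch.cat([feat, neigh], dim=1)))

feat = torch.randn(10, 7)                 # 7-dim input features (see Overview)
adj = (torch.rand(10, 10) > 0.5).float()  # random dense adjacency
hidden = SAGELayer(7, 4)(feat, adj)
print(hidden.shape)  # torch.Size([10, 4])
```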