---
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
- Image Regression
datasets:
- "tonyassi/tony__assi-ig-ds5"
metrics:
- accuracy
model-index:
- name: "tony__assi-ig-prediction"
results: []
---
# tony__assi-ig-prediction
## IG Prediction
This model was trained with [IGPrediction](https://github.com/TonyAssi/IGPrediction). It predicts how many likes an image will get.
```python
from IGPredict import predict_ig
predict_ig(repo_id='tonyassi/tony__assi-ig-prediction', image_path='image.jpg')
```
---
## Dataset
Dataset: tonyassi/tony__assi-ig-ds5\
Value Column:\
Train Test Split: 0.2
---
## Training
Base Model: [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224)\
Epochs: 20\
Learning Rate: 0.0001
---
## Usage
### Download
```bash
git clone https://github.com/TonyAssi/IGPrediction.git
cd IGPrediction
```
### Installation
```bash
pip install -r requirements.txt
```
### Import
```python
from IGPredict import ig_download, upload_dataset, train_ig_model, upload_ig_model, predict_ig
```
### Download Instagram Images
- **username** Instagram username
- **num_images** maximum number of images to download
```python
ig_download(username='instagram_username', num_images=100)
```
Instagram images will be downloaded to the *'./images'* folder, each named *"index-likes.jpg"*. E.g. *"3-17.jpg"* is the third image and has 17 likes.
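Since the index and like count are encoded in each filename, they can be recovered with a simple split. The helper below is only an illustrative sketch (it is not part of IGPrediction) and assumes the *index-likes.jpg* naming described above.
```python
import os

def parse_filename(filename):
    # "3-17.jpg" -> (3, 17): image index 3 with 17 likes
    stem, _ = os.path.splitext(filename)
    index, likes = stem.split('-')
    return int(index), int(likes)

# Print the like count recovered from each downloaded image
for name in sorted(os.listdir('./images')):
    index, likes = parse_filename(name)
    print(f'image {index}: {likes} likes')
```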
### Upload Dataset
- **dataset_name** name of dataset to be uploaded
- **token** go [here](https://huggingface.co/settings/tokens) to create a new 🤗 token
```python
upload_dataset(dataset_name='tonyassi/tony__assi-ig-ds5', token='YOUR_HF_TOKEN')
```
Go to your 🤗 profile to find your uploaded dataset; it should look similar to [tonyassi/tony__assi-ig-ds](https://huggingface.co/datasets/tonyassi/tony__assi-ig-ds).
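To sanity-check the upload before training, you can load the dataset back with the 🤗 `datasets` library. A minimal sketch, assuming the dataset repo is accessible with your credentials and uses the default *train* split:
```python
from datasets import load_dataset

# Load the uploaded dataset from the 🤗 Hub
ds = load_dataset('tonyassi/tony__assi-ig-ds5')

# Inspect the splits, row counts, and column names
print(ds)
print(ds['train'].column_names)
```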
### Train Model
- **dataset_id** 🤗 dataset id
- **test_split** test fraction used for the train/test split
- **num_train_epochs** training epochs
- **learning_rate** learning rate
```python
train_ig_model(dataset_id='tonyassi/tony__assi-ig-ds5',
test_split=0.2,
num_train_epochs=20,
learning_rate=0.0001)
```
The trainer will save checkpoints in the *'results'* folder. The *model.safetensors* file inside a checkpoint folder contains the trained weights you'll use for inference (prediction).
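Checkpoint folders follow the 🤗 Trainer naming convention `checkpoint-<step>`, so the highest step number is the most recent one. The hypothetical helper below (not part of IGPrediction) picks it out for the upload step:
```python
import os

def latest_checkpoint(results_dir='./results'):
    # Folders are named like "checkpoint-940"; sort them by step number
    checkpoints = [d for d in os.listdir(results_dir) if d.startswith('checkpoint-')]
    checkpoints.sort(key=lambda d: int(d.split('-')[-1]))
    return os.path.join(results_dir, checkpoints[-1])

print(latest_checkpoint())  # e.g. './results/checkpoint-940'
```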
### Upload Model
This function will upload your model to the 🤗 Hub.
- **model_id** model id the checkpoint will be uploaded under
- **token** go [here](https://huggingface.co/settings/tokens) to create a new 🤗 token
- **checkpoint_dir** checkpoint folder that will be uploaded
```python
upload_ig_model(model_id='tony__assi-ig-prediction',
token='YOUR_HF_TOKEN',
checkpoint_dir='./results/checkpoint-940')
```
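To confirm the upload worked, you can query the Hub with `huggingface_hub`; this check is not part of IGPrediction, just a quick sanity test:
```python
from huggingface_hub import HfApi

# Fetch the model repo metadata from the 🤗 Hub
info = HfApi().model_info('tonyassi/tony__assi-ig-prediction')

# List the uploaded files; model.safetensors should be among them
print([s.rfilename for s in info.siblings])
```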
### Inference (Prediction)
- **repo_id** 🤗 repo id of the model
- **image_path** path to image
```python
predict_ig(repo_id='tonyassi/tony__assi-ig-prediction',
image_path='image.jpg')
```
The first time this function is called, it will download the safetensors model; subsequent calls reuse the cached weights and run faster.
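Because the weights are cached after the first call, predicting over a whole folder of images in a loop is cheap once the initial download has finished. A minimal sketch, assuming a local *'./images'* folder of .jpg files:
```python
import os
from IGPredict import predict_ig

# Run prediction for every .jpg in the folder.
# The model weights are downloaded on the first call and cached for the rest.
for name in sorted(os.listdir('./images')):
    if name.lower().endswith('.jpg'):
        print(name)
        predict_ig(repo_id='tonyassi/tony__assi-ig-prediction',
                   image_path=os.path.join('./images', name))
```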