## Run info
- complete_hash: b97998c098e0d34fa00d918d670b8f9b
- short_hash: b9799b8f9b
### Configuration
```yaml
data:
  batch_size: 32
  data_seed: 42
  drop_last: false
  eval_batch_size: 128
  max_length: 512
  multiprocessing_context: null
  num_workers: 8
  persistent_workers: false
  pin_memory: true
  replacement: false
  shuffle: true
dataset: mnli
estimator:
  accelerator: gpu
  convert_to_bettertransformer: false
  deterministic: true
  precision: bf16-true
  tf32_mode: high
fit:
  enable_progress_bar: true
  limit_train_batches: null
  log_interval: 100
  max_epochs: 20
  min_epochs: null
  optimizer_kwargs:
    init_kwargs:
      fused: true
    lr: 3.0e-05
    name: adamw
  scheduler_kwargs:
    name: constant_schedule_with_warmup
    num_warmup_steps: 2000
model:
  base_model: roberta-base
  name: roberta-base
  revision: null
  seed: 42
seed: 42
```
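
For reference, the sketch below shows one way the model, optimizer, and scheduler named in this configuration could be instantiated with standard PyTorch and Hugging Face `transformers` APIs. It is an illustration only, not the training code used for this run; the `num_labels=3` argument is an assumption reflecting MNLI's three classes.

```python
# Illustrative sketch only -- not this run's actual training code.
import torch
from transformers import (
    AutoModelForSequenceClassification,
    get_constant_schedule_with_warmup,
)

# model.name / model.base_model: roberta-base; MNLI is a 3-class task
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=3)

# estimator.accelerator: gpu -- fall back to CPU if no GPU is available
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# optimizer_kwargs: name=adamw, lr=3.0e-05, init_kwargs.fused=true
# (fused AdamW requires parameters on a CUDA device)
optimizer = torch.optim.AdamW(
    model.parameters(), lr=3.0e-05, fused=torch.cuda.is_available()
)

# scheduler_kwargs: constant_schedule_with_warmup with 2000 warmup steps
scheduler = get_constant_schedule_with_warmup(optimizer, num_warmup_steps=2000)
```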