gosshh/output
Tags: Text Classification · Transformers · TensorBoard · Safetensors · Habana · bert · Generated from Trainer · Inference Endpoints
License: apache-2.0
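The tags above (Text Classification, Transformers, bert, Safetensors) suggest the checkpoint can be loaded with the standard transformers auto classes. A minimal sketch, assuming the config and tokenizer files in the listing below are complete and that plain CPU inference without optimum-habana is acceptable; the example sentence is a placeholder:

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

repo_id = "gosshh/output"  # repo id taken from this page

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Tokenize a placeholder sentence and run a forward pass without gradients.
inputs = tokenizer("An example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to a label name if one is configured.
predicted_id = logits.argmax(dim=-1).item()
print(predicted_id, model.config.id2label.get(predicted_id, predicted_id))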
output (revision 1265085) · 1 contributor · History: 10 commits

Latest commit: gosshh · "Training in progress, step 500" · 1265085 · verified · 4 months ago
File                       Size       LFS   Last commit message               Last updated
runs/                      -          -     Training in progress, step 500    4 months ago
.gitattributes             1.52 kB    -     initial commit                    4 months ago
config.json                1.02 kB    -     Training in progress, step 500    4 months ago
gaudi_config.json          246 Bytes  -     Training in progress, step 500    4 months ago
model.safetensors          46.7 MB    LFS   Training in progress, step 500    4 months ago
special_tokens_map.json    286 Bytes  -     Training in progress, step 500    4 months ago
spiece.model               760 kB     LFS   Training in progress, step 500    4 months ago
tokenizer.json             2.27 MB    -     Training in progress, step 500    4 months ago
tokenizer_config.json      1.22 kB    -     Training in progress, step 500    4 months ago
training_args.bin          4.79 kB    LFS   Training in progress, step 500    4 months ago
vocab.txt                  232 kB     -     Training in progress, step 500    4 months ago

training_args.bin is a pickled file. Detected pickle imports (9):
"transformers.trainer_utils.IntervalStrategy",
"optimum.habana.transformers.training_args.GaudiTrainingArguments",
"transformers.trainer_utils.SchedulerType",
"transformers.trainer_utils.HubStrategy",
"optimum.habana.accelerate.utils.dataclasses.GaudiDistributedType",
"optimum.habana.accelerate.state.GaudiPartialState",
"transformers.trainer_pt_utils.AcceleratorConfig",
"transformers.training_args.OptimizerNames",
"torch.device"
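training_args.bin is flagged as a pickle because it stores the full GaudiTrainingArguments object from optimum-habana rather than plain tensors. A minimal sketch of inspecting it, assuming the file is trusted and that torch, transformers, and optimum-habana are installed locally (the referenced Gaudi classes must be importable for unpickling to succeed):

import torch
from huggingface_hub import hf_hub_download

# Download the pickled training arguments from the repo listed above.
path = hf_hub_download(repo_id="gosshh/output", filename="training_args.bin")

# Loading a pickle executes arbitrary code, so only do this for checkpoints
# you trust. weights_only=False is required because the file stores full
# Python objects (GaudiTrainingArguments), not just tensors.
training_args = torch.load(path, weights_only=False)

print(type(training_args))
print(training_args.per_device_train_batch_size, training_args.learning_rate)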