---
language: en
tags:
  - exbert
license: apache-2.0
datasets:
  - bookcorpus
  - wikipedia
---

# BERT base model (uncased)

Pretrained model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/abs/1810.04805) and first released in [this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference between english and English.
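As a quick illustration of the fill-mask objective, the sketch below loads the model through the `transformers` pipeline API and predicts a masked token; exact scores and rankings may vary with the library version.

```python
from transformers import pipeline

# Load the fill-mask pipeline backed by bert-base-uncased
# (weights are downloaded on first use)
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# The tokenizer lowercases input, so "Hello" and "hello" are treated identically
predictions = unmasker("Hello, I'm a [MASK] model.")

# Each prediction is a dict with the candidate token and its probability
for pred in predictions:
    print(f"{pred['token_str']}: {pred['score']:.4f}")
```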

## Original implementation

Follow [this link](https://github.com/google-research/bert) to see the original implementation.