
ERNIE-Gram-zh

Introduction

ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding

More details: https://arxiv.org/abs/2010.12148

Released Model Info

| Model Name    | Language | Model Structure                |
|---------------|----------|--------------------------------|
| ernie-gram-zh | Chinese  | Layer:12, Hidden:768, Heads:12 |

This released PyTorch model was converted from the officially released PaddlePaddle ERNIE model, and a series of experiments was conducted to verify the accuracy of the conversion.

How to use

from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-gram-zh")
model = AutoModel.from_pretrained("nghuyong/ernie-gram-zh")
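As a minimal sketch of what loading the model gives you, the snippet below runs a short Chinese sentence through the tokenizer and model and inspects the hidden states; the example sentence is arbitrary, and the hidden size of 768 matches the model structure listed above.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-gram-zh")
model = AutoModel.from_pretrained("nghuyong/ernie-gram-zh")

# Tokenize an example Chinese sentence and run a forward pass.
inputs = tokenizer("百度是一家高科技公司", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, sequence_length, hidden_size);
# hidden_size is 768 per the released model structure.
print(outputs.last_hidden_state.shape)
```

The resulting token-level representations can then be fed into a task-specific head for fine-tuning, as with other BERT-style encoders in Transformers.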