Latest commit: add unicodedata (99300a7)
__pycache__      fix __main__ module issue on phoBERT.py
gru_model.tf     Upload variables.data-00000-of-00001
lstm_model.tf    Upload variables.data-00000-of-00001
train            change file path
1.86 kB    Upload variables.data-00000-of-00001
208 Bytes  First commit
188 Bytes  initial commit
22 Bytes   add bert tokenizer
2.49 kB    add unicodedata
1.14 MB    add bert tokenizer
245 MB     change file path
202 MB     Upload 2 files
244 MB     change file path
202 MB     Upload 2 files
462 Bytes  remove add.py
2.13 kB    fix __main__ module issue on phoBERT.py
phoBertModel.pth    542 MB    add BERT model
Detected Pickle imports (26):
- "__main__.PhoBertModel",
- "transformers.models.roberta.modeling_roberta.RobertaModel",
- "transformers.models.roberta.modeling_roberta.RobertaAttention",
- "torch.nn.modules.normalization.LayerNorm",
- "transformers.models.roberta.modeling_roberta.RobertaPooler",
- "torch.LongStorage",
- "transformers.models.roberta.modeling_roberta.RobertaEncoder",
- "torch.nn.modules.activation.Tanh",
- "torch._C._nn.gelu",
- "collections.OrderedDict",
- "transformers.models.roberta.modeling_roberta.RobertaSelfOutput",
- "transformers.models.roberta.modeling_roberta.RobertaOutput",
- "torch._utils._rebuild_parameter",
- "torch.FloatStorage",
- "transformers.activations.GELUActivation",
- "torch.nn.modules.sparse.Embedding",
- "__builtin__.set",
- "transformers.models.roberta.modeling_roberta.RobertaLayer",
- "torch.nn.modules.dropout.Dropout",
- "torch.nn.modules.container.ModuleList",
- "torch._utils._rebuild_tensor_v2",
- "transformers.models.roberta.modeling_roberta.RobertaEmbeddings",
- "transformers.models.roberta.configuration_roberta.RobertaConfig",
- "transformers.models.roberta.modeling_roberta.RobertaIntermediate",
- "torch.nn.modules.linear.Linear",
- "transformers.models.roberta.modeling_roberta.RobertaSelfAttention"
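The scan flags "__main__.PhoBertModel", which means the checkpoint was saved with `torch.save(model)` (the whole object) rather than a state_dict: unpickling it later requires a `PhoBertModel` class importable under that exact module path, which is the "__main__ module issue" the commit history mentions. A minimal stdlib sketch of why the class path ends up baked into the pickle (the class here is an illustrative stand-in for the real one in phoBERT.py):

```python
import pickle

class PhoBertModel:          # stand-in for the class defined in phoBERT.py;
    pass                     # its module path gets recorded in the pickle

blob = pickle.dumps(PhoBertModel())
# The serialized bytes embed the class name, so unpickling in another
# process fails unless that module defines PhoBertModel again.
print(b"PhoBertModel" in blob)   # True
```

Saving `model.state_dict()` instead, and reloading with `torch.load(path, weights_only=True)` into a freshly constructed model, avoids pickling the class path (and arbitrary code execution on load) entirely.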
129 Bytes  add unicodedata
167 Bytes  add bert tokenizer
tokenizer.pkl    9.15 MB    change file path
Detected Pickle imports (4):
- "collections.OrderedDict",
- "collections.defaultdict",
- "builtins.int",
- "keras.src.preprocessing.text.Tokenizer"
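tokenizer.pkl likewise carries a pickled Keras `Tokenizer`. Keras tokenizers can instead be serialized as JSON (`tokenizer.to_json()` paired with `keras.preprocessing.text.tokenizer_from_json`), which drops the pickle dependency and executes no code on load. The same idea with a plain word-index dict, stdlib only (the vocabulary below is illustrative):

```python
import json

# Illustrative word-index mapping standing in for Tokenizer.word_index.
word_index = {"xin": 1, "chào": 2, "việt": 3, "nam": 4}

blob = json.dumps(word_index, ensure_ascii=False)  # plain text, safe to load
restored = json.loads(blob)
print(restored == word_index)   # True
```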
1.14 kB    add bert tokenizer
895 kB     add bert tokenizer