vkehfdl1 committed on
Commit
042e08f
1 Parent(s): c46f31f

Change max_position_embeddings to 512


When embedding in a CUDA-enabled environment, a device-side assertion error occurs when I try to embed a sequence longer than 512 positions. It looks like config.json is wrong; the real max_position_embeddings is 512.
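As a minimal sketch of the failure mode described above: a position-embedding table has one row per position, so looking up a position id at or beyond `max_position_embeddings` is an out-of-range index, which surfaces on CUDA as a device-side assert. The names and the toy table below are illustrative, not part of the model; the only value taken from this commit is the 512 limit.

```python
# Illustrative sketch (names are hypothetical): why exceeding
# max_position_embeddings fails, and a simple truncation guard.

MAX_POSITION_EMBEDDINGS = 512  # the value this commit sets in config.json

def truncate_to_max_positions(token_ids, max_positions=MAX_POSITION_EMBEDDINGS):
    """Clip a token-id sequence so every position index stays in range.

    Indexing a position-embedding table of size 512 with a position id
    >= 512 is the out-of-range lookup that shows up as a device-side
    assertion error on CUDA.
    """
    return token_ids[:max_positions]

# Toy position-embedding table: one row per valid position.
position_table = [[0.0] * 4 for _ in range(MAX_POSITION_EMBEDDINGS)]

long_input = list(range(600))                  # 600 tokens: too long
safe_input = truncate_to_max_positions(long_input)

# After truncation, every position index has a row in the table.
rows = [position_table[pos] for pos in range(len(safe_input))]
assert len(safe_input) == MAX_POSITION_EMBEDDINGS
```

In practice the same guard is usually applied at tokenization time (e.g. by truncating inputs to the model's maximum length) rather than on raw id lists.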

Files changed (1)
  1. config.json +1 -1
config.json CHANGED
@@ -12,7 +12,7 @@
   "initializer_range": 0.02,
   "intermediate_size": 3072,
   "layer_norm_eps": 1e-05,
- "max_position_embeddings": 514,
+ "max_position_embeddings": 512,
   "model_type": "mpnet",
   "num_attention_heads": 12,
   "num_hidden_layers": 12,