model.pt (7.64 GB)
Detected Pickle imports (24):
- quanto.nn.qlinear.QLinear
- transformers_modules.microsoft.Phi-3-mini-128k-instruct.d548c233192db00165d842bf8edff054bb3212f8.modeling_phi3.Phi3Attention
- torch.device
- torch.nn.modules.sparse.Embedding
- torch._utils._rebuild_parameter
- __builtin__.set
- transformers_modules.microsoft.Phi-3-mini-128k-instruct.d548c233192db00165d842bf8edff054bb3212f8.modeling_phi3.Phi3RMSNorm
- transformers_modules.microsoft.Phi-3-mini-128k-instruct.d548c233192db00165d842bf8edff054bb3212f8.modeling_phi3.Phi3ForCausalLM
- torch._utils._rebuild_tensor_v2
- torch.nn.modules.container.ModuleList
- transformers_modules.microsoft.Phi-3-mini-128k-instruct.d548c233192db00165d842bf8edff054bb3212f8.modeling_phi3.Phi3LongRoPEScaledRotaryEmbedding
- torch.float8_e4m3fn
- torch.nn.modules.dropout.Dropout
- transformers_modules.microsoft.Phi-3-mini-128k-instruct.d548c233192db00165d842bf8edff054bb3212f8.modeling_phi3.Phi3DecoderLayer
- torch.BFloat16Storage
- torch.nn.modules.activation.SiLU
- transformers_modules.microsoft.Phi-3-mini-128k-instruct.d548c233192db00165d842bf8edff054bb3212f8.modeling_phi3.Phi3Model
- transformers.generation.configuration_utils.GenerationConfig
- torch.bfloat16
- torch.FloatStorage
- quanto.tensor.qtype.qtype
- transformers_modules.microsoft.Phi-3-mini-128k-instruct.d548c233192db00165d842bf8edff054bb3212f8.modeling_phi3.Phi3MLP
- transformers_modules.microsoft.Phi-3-mini-128k-instruct.d548c233192db00165d842bf8edff054bb3212f8.configuration_phi3.Phi3Config
- collections.OrderedDict
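An import list like the one above can be produced by walking the pickle opcode stream and collecting the module/attribute pairs that GLOBAL-style opcodes pull in, without ever unpickling the file. A minimal sketch using only the standard library's `pickletools` (the helper name `detect_pickle_imports` is hypothetical, and this simplification only handles the text-form `GLOBAL` opcode used by protocols ≤ 3, not the `STACK_GLOBAL` form of newer protocols):

```python
import collections
import pickle
import pickletools

def detect_pickle_imports(data: bytes) -> list[str]:
    """Collect module.name references loaded by GLOBAL opcodes,
    the same class of information the scan above reports."""
    found = set()
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # pickletools renders the opcode's two-line argument
            # as a single space-separated "module name" string.
            module, name = arg.split(" ", 1)
            found.add(f"{module}.{name}")
    return sorted(found)

# A tiny stand-in payload; a real checkpoint such as model.pt would
# be scanned the same way (it, too, pickles collections.OrderedDict).
payload = pickle.dumps(collections.OrderedDict(a=1), protocol=2)
print(detect_pickle_imports(payload))  # → ['collections.OrderedDict']
```

Scanning opcodes rather than calling `pickle.load` matters here: unpickling executes the referenced constructors, which is exactly the risk a scan like this is meant to surface before loading.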