Latest commit: Update README.md (c93776b, verified)
-           1.52 kB     initial commit
-           5.38 kB     Update README.md
-           1.68 MB     Upload folder using huggingface_hub (#1)
-           0 Bytes     Update README.md
model.pt    15.7 GB     Upload folder using huggingface_hub (#1)
Detected Pickle imports (26):
- "transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f2acea34b26f535fecb1f2abb9a304695.configuration_phi3_small.Phi3SmallConfig",
- "quanto.nn.qlinear.QLinear",
- "torch.nn.modules.container.ModuleList",
- "torch.device",
- "__builtin__.set",
- "transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f2acea34b26f535fecb1f2abb9a304695.triton_blocksparse_attention_layer.BlockSparseAttentionLayer",
- "transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f2acea34b26f535fecb1f2abb9a304695.modeling_phi3_small.Phi3SmallModel",
- "torch.BoolStorage",
- "transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f2acea34b26f535fecb1f2abb9a304695.modeling_phi3_small.Phi3SmallMLP",
- "collections.OrderedDict",
- "torch._utils._rebuild_parameter",
- "transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f2acea34b26f535fecb1f2abb9a304695.modeling_phi3_small.Phi3SmallSelfAttention",
- "transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f2acea34b26f535fecb1f2abb9a304695.modeling_phi3_small.Phi3SmallDecoderLayer",
- "torch.nn.modules.dropout.Dropout",
- "torch.FloatStorage",
- "transformers.generation.configuration_utils.GenerationConfig",
- "transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f2acea34b26f535fecb1f2abb9a304695.triton_flash_blocksparse_attn.BlockSparseParams",
- "quanto.tensor.qtype.qtype",
- "torch._utils._rebuild_tensor_v2",
- "torch.BFloat16Storage",
- "transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f2acea34b26f535fecb1f2abb9a304695.modeling_phi3_small.Phi3SmallForCausalLM",
- "torch.int8",
- "torch.nn.modules.normalization.LayerNorm",
- "torch.bfloat16",
- "transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f2acea34b26f535fecb1f2abb9a304695.positional_embedding.RotaryEmbedding",
- "torch.nn.modules.sparse.Embedding"
-           1.03 kB     Upload folder using huggingface_hub (#1)
-           99 Bytes    Upload folder using huggingface_hub (#1)
-           769 Bytes   Upload folder using huggingface_hub (#1)
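
The import scan explains what model.pt is: not a plain state_dict, but a fully pickled Phi3SmallForCausalLM in which the Linear layers have been swapped for quanto QLinear modules (torch.int8 weights alongside torch.bfloat16 tensors). The repo does not document the exact recipe, but the imports are consistent with the standard quanto quantize/freeze flow; the following is a hedged sketch of that producer side, not the uploader's confirmed script:

```python
import torch
from quanto import quantize, freeze, qint8
from transformers import AutoModelForCausalLM

# Load the original model with its remote code (Phi-3-small ships custom
# modeling files, hence the transformers_modules.* entries in the scan).
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-small-8k-instruct",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
)

# Swap nn.Linear for quanto QLinear and bake in the int8 weights.
quantize(model, weights=qint8)
freeze(model)

# Pickling the whole module object (rather than its state_dict) is what
# embeds every class listed in the scan into the file.
torch.save(model, "model.pt")
```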
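
Loading the file is the mirror image: every module path in the scan must be importable before torch.load can unpickle it, which means having quanto installed plus the repo's remote code registered at the pinned commit (69caae1f2acea34b26f535fecb1f2abb9a304695, taken from the module paths above). A minimal sketch, assuming a transformers version that places remote code under per-revision transformers_modules paths as the pickled names suggest:

```python
import quanto  # noqa: F401 -- makes quanto.nn.qlinear.QLinear importable
import torch
from transformers.dynamic_module_utils import get_class_from_dynamic_module

# Fetch and register the remote modeling code at the exact commit the
# pickle references; this materializes the
# transformers_modules.microsoft.Phi-3-small-8k-instruct.69caae1f....*
# modules (and the local files they import, like the blocksparse layers).
get_class_from_dynamic_module(
    "modeling_phi3_small.Phi3SmallForCausalLM",
    "microsoft/Phi-3-small-8k-instruct",
    revision="69caae1f2acea34b26f535fecb1f2abb9a304695",
)

# The checkpoint is a pickled model object, so weights_only=True cannot
# work here; only run this on files from a source you trust.
model = torch.load("model.pt", map_location="cpu", weights_only=False)
model.eval()
```

Because unpickling executes arbitrary constructors from the classes above, the usual safetensors/weights_only safeguards do not apply to this file; that is exactly what the scanner widget is flagging.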