DataCanvas/Alaya-7B-Base
Tags: Text Generation · Transformers · PyTorch · Chinese · English · mpt · custom_code · text-generation-inference · Inference Endpoints
License: apache-2.0
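
The tags above mark this as an MPT-style text-generation model that ships its own modeling code (custom_code), so loading it through transformers requires trust_remote_code=True. The sketch below is minimal and illustrative: the repo id is taken from this page, while the prompt, dtype, and generation settings are assumptions, not recommendations from the model authors.

```python
# Minimal sketch: load Alaya-7B-Base with transformers.
# trust_remote_code=True is required because the repo ships custom MPT code
# (configuration_mpt.py, modeling_mpt.py). Prompt and generation settings
# below are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DataCanvas/Alaya-7B-Base"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # checkpoints are stored in half precision (torch.HalfStorage)
    trust_remote_code=True,
    device_map="auto",           # needs accelerate; remove for plain CPU loading
)

inputs = tokenizer("今天天气真好，", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
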
Alaya-7B-Base · 1 contributor · History: 5 commits
Latest commit 9811d73: ShellyYuJJ, "Update README.md", about 1 year ago
| File | Size | Commit message | Last updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | about 1 year ago |
| README.md | 3.08 kB | Update README.md | about 1 year ago |
| adapt_tokenizer.py | 1.72 kB | upload model | about 1 year ago |
| attention.py | 21.6 kB | upload model | about 1 year ago |
| blocks.py | 2.84 kB | upload model | about 1 year ago |
| config.json | 1.25 kB | upload model | about 1 year ago |
| configuration_mpt.py | 11 kB | first upload | about 1 year ago |
| custom_embedding.py | 292 Bytes | upload model | about 1 year ago |
| fc.py | 167 Bytes | upload model | about 1 year ago |
| ffn.py | 1.75 kB | upload model | about 1 year ago |
| flash_attn_triton.py | 28.2 kB | upload model | about 1 year ago |
| generation_config.json | 91 Bytes | upload model | about 1 year ago |
| hf_prefixlm_converter.py | 27.6 kB | upload model | about 1 year ago |
| meta_init_context.py | 3.96 kB | upload model | about 1 year ago |
| modeling_mpt.py | 20.1 kB | first upload | about 1 year ago |
| norm.py | 3.12 kB | upload model | about 1 year ago |
| param_init_fns.py | 11.9 kB | upload model | about 1 year ago |
| pytorch_model-00001-of-00002.bin (LFS, pickle) | 9.89 GB | upload model | about 1 year ago |
| pytorch_model-00002-of-00002.bin (LFS, pickle) | 3.49 GB | upload model | about 1 year ago |
| pytorch_model.bin (LFS, pickle) | 13.4 GB | upload model | about 1 year ago |
| pytorch_model.bin.index.json | 31.5 kB | first upload | about 1 year ago |
| special_tokens_map.json | 548 Bytes | first upload | about 1 year ago |
| tokenizer.json | 3.25 MB | first upload | about 1 year ago |
| tokenizer.model (LFS) | 1.2 MB | first upload | about 1 year ago |
| tokenizer_config.json | 917 Bytes | first upload | about 1 year ago |

Pickle scan: each of the three pytorch_model*.bin checkpoints is stored via Git LFS and is flagged with the same detected pickle imports: torch.HalfStorage, collections.OrderedDict, torch._utils._rebuild_tensor_v2.
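
The weights are published both as a single pytorch_model.bin and as two shards tied together by pytorch_model.bin.index.json; transformers resolves the index automatically when loading, but the shard layout can also be inspected by hand. A minimal sketch follows, assuming huggingface_hub is installed; only the repo id and filename come from this page.

```python
# Minimal sketch: inspect the sharded-checkpoint index listed above.
# Assumes huggingface_hub is installed; transformers normally follows this
# index for you when you call from_pretrained.
import json
from huggingface_hub import hf_hub_download

index_path = hf_hub_download(
    repo_id="DataCanvas/Alaya-7B-Base",
    filename="pytorch_model.bin.index.json",
)

with open(index_path) as f:
    index = json.load(f)

# "weight_map" maps each parameter name to the shard file that stores it.
shards = sorted(set(index["weight_map"].values()))
print("total size (bytes):", index.get("metadata", {}).get("total_size"))
print("shard files:", shards)
```

The pickle imports flagged on the .bin files (torch.HalfStorage, collections.OrderedDict, torch._utils._rebuild_tensor_v2) are the ones an ordinary PyTorch state dict needs; if you do load a shard directly, torch.load(path, weights_only=True) restricts unpickling to tensor data.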