Model card of JOSIExM7-7b
This is my token-customized version of the liminerity/M7-7b model: it is based on liminerity/M7-7b with custom special tokens added. It will most likely serve as the base for my next model, trained on my own dataset.
New BOS and EOS tokens:
- BOS = `<|startoftext|>`
- EOS = `<|endoftext|>`
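The base Mistral tokenizer has a 32,000-token vocabulary, so the two new special tokens receive IDs 32000 and 32001, which is why the embedding table in the architecture printout has 32,002 rows. A minimal sketch of that bookkeeping in plain Python (the real workflow would use `tokenizer.add_special_tokens` plus `model.resize_token_embeddings` from transformers; the placeholder vocabulary here is only an illustration):

```python
# Sketch: what adding two special tokens does to the vocabulary size.
# Plain Python standing in for transformers' tokenizer.add_special_tokens
# and model.resize_token_embeddings.

BASE_VOCAB_SIZE = 32000  # Mistral-7B's original vocabulary size

# Placeholder entries standing in for the real 32,000-token vocabulary.
vocab = {f"<token_{i}>": i for i in range(BASE_VOCAB_SIZE)}

def add_special_tokens(vocab, tokens):
    """Append new tokens at the end of the vocabulary, as HF tokenizers do."""
    ids = {}
    for tok in tokens:
        if tok not in vocab:
            vocab[tok] = len(vocab)  # next free ID
        ids[tok] = vocab[tok]
    return ids

new_ids = add_special_tokens(vocab, ["<|startoftext|>", "<|endoftext|>"])
print(new_ids)     # {'<|startoftext|>': 32000, '<|endoftext|>': 32001}
print(len(vocab))  # 32002 -> matches Embedding(32002, 4096)
```

After resizing, the model's embedding matrix and `lm_head` both grow from 32,000 to 32,002 rows to cover the new IDs.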
Model architecture:

```
MistralForCausalLM(
  (model): MistralModel(
    (embed_tokens): Embedding(32002, 4096)
    (layers): ModuleList(
      (0-31): 32 x MistralDecoderLayer(
        (self_attn): MistralSdpaAttention(
          (q_proj): Linear(in_features=4096, out_features=4096, bias=False)
          (k_proj): Linear(in_features=4096, out_features=1024, bias=False)
          (v_proj): Linear(in_features=4096, out_features=1024, bias=False)
          (o_proj): Linear(in_features=4096, out_features=4096, bias=False)
          (rotary_emb): MistralRotaryEmbedding()
        )
        (mlp): MistralMLP(
          (gate_proj): Linear(in_features=4096, out_features=14336, bias=False)
          (up_proj): Linear(in_features=4096, out_features=14336, bias=False)
          (down_proj): Linear(in_features=14336, out_features=4096, bias=False)
          (act_fn): SiLU()
        )
        (input_layernorm): MistralRMSNorm()
        (post_attention_layernorm): MistralRMSNorm()
      )
    )
    (norm): MistralRMSNorm()
  )
  (lm_head): Linear(in_features=4096, out_features=32002, bias=False)
)
```
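The total parameter count can be recomputed by hand from the shapes in this printout. A quick sanity check (all linears are bias-free, each RMSNorm contributes one weight vector, and the rotary embeddings have no parameters):

```python
# Recompute the parameter count from the architecture printout above.
vocab, hidden, inter, layers = 32002, 4096, 14336, 32
kv_dim = 1024  # grouped-query attention: 8 KV heads x 128 head dim

embed = vocab * hidden                              # embed_tokens
attn = 2 * hidden * hidden + 2 * hidden * kv_dim    # q/o plus k/v projections
mlp = 3 * hidden * inter                            # gate, up, down projections
norms = 2 * hidden                                  # input + post-attention RMSNorm
per_layer = attn + mlp + norms

# Add the final norm and the (untied) lm_head.
total = embed + layers * per_layer + hidden + vocab * hidden
print(f"{total:,}")  # 7,241,748,480 -> ~7.24B parameters
```

This matches the "7b" in the model name; the two extra vocabulary rows add only 2 × 4096 weights each to `embed_tokens` and `lm_head`.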
Model tree for Goekdeniz-Guelmez/JOSIExM7-7b:
- Base model: liminerity/M7-7b