
Null-GPT2

Description

This is a GPT-2 model consisting only of the architecture; none of its parameters (weights, biases, attention matrices, etc.) are pre-trained.

This is useful for researchers who want to experiment with training the model from scratch (rather than fine-tuning a pre-trained one).

Generated via the GitHub repository Model Architecture Generator.
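
For context, the snippet below is a minimal sketch (not part of the original card) of what "architecture only" amounts to: building GPT-2 from its configuration alone leaves every parameter randomly initialized. The repository id Inoob/NullGPT2 is assumed from this card; adjust it if your copy lives elsewhere.

# Minimal sketch: instantiate GPT-2 from its config alone, giving random weights.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config.from_pretrained("Inoob/NullGPT2")  # architecture hyperparameters only
model = GPT2LMHeadModel(config)                        # freshly, randomly initialized parameters

print(sum(p.numel() for p in model.parameters()))      # roughly 124M parameters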

Use

First, change into the directory that contains this model, then clone the generator repository and run its randomization script:

git clone https://github.com/ivanhe123/Model-Architecture-Generator
python -m randomnize_params -in "./NullGPT2" -out path_model_out

Here, path_model_out is the output path for the newly randomized model.
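
Assuming the script writes a standard Hugging Face checkpoint directory, the randomized model can then be loaded for training like any other GPT-2 checkpoint. The sketch below is illustrative; path_model_out stands in for whatever output path you passed above.

# Load the randomized checkpoint (assumed to be a standard transformers checkpoint).
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

model = GPT2LMHeadModel.from_pretrained("path_model_out")  # hypothetical output path from the step above
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")      # standard GPT-2 tokenizer; the vocabulary is unchanged

# The model can now be trained from scratch with any ordinary training loop or the Trainer API.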

Model size
124M params · F32 · Safetensors
