Things I noticed with GPT-J-6B-Skein-4bit compared to OPT-6B-nerys-v2:
    -Outputs special characters even if the previous context doesn't contain any
    -The original text style isn't always reflected in newly generated text
    -Misspells names, e.g. Arktinu -> Arkkinu or Semuanya -> Semua
    -Uses wrong pronouns even when this information is in World Info
    -Sometimes splits actions, e.g. "I go home. I open the box." rather than "I go home and open the box.", even in story mode
    -Does not generate dialogue tags
    -Throws "LayerNormKernelImpl" not implemented for 'Half' (see the sketch below)
 
Quantization Details

#GTX 1080 8GB
$ time python gptj.py ${MODEL_DIR} c4 --nsamples 48 --wbits 4 --groupsize 128 --save ${OUT_DIR}/GPT-J-6B-Skein-4bit-128g/4bit-128g.pt
#real	25m0.999s
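
As a rough sanity check on why 4 bits matter on an 8GB card, here is a back-of-the-envelope weight-size estimate (weights only; activations, KV cache and group-size overhead are ignored, and the ~6B parameter count is an approximation):

    params = 6e9                                   # approximate parameter count of GPT-J-6B
    for bits in (16, 8, 4):
        print(f"{bits:2d}-bit weights: ~{params * bits / 8 / 1024**3:.1f} GiB")
    # ~11.2 GiB at fp16 vs ~2.8 GiB at 4-bit, which is why far more layers
    # fit on the GTX 1080 in the benchmarks below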

Benchmarks
NVIDIA GTX 1080 8GB, Ryzen 9 5900X, Trident Z 4x8GB 3200MHz CL16

0cc4m's KoboldAI fork
UI2 experimental
Game mode Story
output length 78
context tokens 2048
top_p 1 top_k 0 tail_free 0.95
rep_pen 1.19
rep_pen_range
RNG seed 1
Full Determinism On (despite this, only the first few sentences are identical after a restart)
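
For reference, a rough equivalent of these sampling settings outside KoboldAI, using Hugging Face transformers (a sketch only: tail_free and rep_pen_range are KoboldAI features with no direct transformers counterpart, and the full story context is not reproduced here):

    from transformers import AutoTokenizer, AutoModelForCausalLM, set_seed

    set_seed(1)                                    # RNG seed 1
    tok = AutoTokenizer.from_pretrained("KoboldAI/GPT-J-6B-Skein")
    model = AutoModelForCausalLM.from_pretrained("KoboldAI/GPT-J-6B-Skein")

    prompt = 'The morning I give the herbs to Kura. "You might find use for these." I say.'
    out = model.generate(
        **tok(prompt, return_tensors="pt"),
        do_sample=True,
        max_new_tokens=78,                         # output length 78
        top_p=1.0,
        top_k=0,
        repetition_penalty=1.19,                   # rep_pen 1.19
    )
    print(tok.decode(out[0], skip_special_tokens=True))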

Highest tokens per second:
    model               - layers GPU/disk - tokens per second
    OPT-2.7B-Nerys-v2   - 29/3 - 8.44
    OPT-6B-nerys-v2     - 8/24 - 1.09
    GPT-J-6B-Skein      - 7/21 - 1.18
    GPT-J-6B-Skein-4bit - 26/2 - 5.06

OPT-6B-nerys-v2-4bit - ran out of VRAM and could not fall back to RAM: "Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!"
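
That message is PyTorch's generic device-mismatch error: part of the model (or its inputs) stayed on the CPU while the rest was on cuda:0. A minimal sketch of the failure mode, not the KoboldAI code itself (requires a CUDA device):

    import torch

    layer = torch.nn.Linear(4, 4).to("cuda:0")     # weights on the GPU
    x = torch.randn(1, 4)                          # input left on the CPU
    try:
        layer(x)                                   # RuntimeError: Expected all tensors to be on the same device ...
    except RuntimeError as e:
        print(e)
    print(layer(x.to("cuda:0")))                   # works once inputs and weights share a device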

Text generation
Generated in a 1000+ action story with in-depth World Info, so the 2048 context tokens are definitely used up.

Prompt: The morning I give the herbs to Kura. "You might find use for these." I say.

 OPT-2.7B-Nerys-v2
    "Thanks, I appreciate it."
    I go back inside, and grab my cloak and helmet.
    When I stand up I notice that Kura has already taken the helm.
    "Hey, I don't need it."
    "No, but I do. You are the leader now and I will follow you."
    "Well, I guess that makes sense."
    "

 OPT-6B-nerys-v2
    "Thank you, I appreciate it."
    I start walking towards the road.
    Kura says "Aren't you going to get revenge?"
    "No, I got nothing against her, she was just trying to scare me."
    "Good idea."
    "Yes, let's move."
    After a short walk I feel the ground shaking beneath my feet.
    I turn around

 GPT-J-6B-Skein
    "Thanks, I'll take care of the rest." He replies.
    "Good luck." I say.
    "Thank you."

    Kura departs from me and I watch him disappear into the horizon. I walk to the edge of the cliff and look out over the desert.
    I think that I could spend the rest of my days here.
    I wonder

 GPT-J-6B-Skein-4bit-128g
    "Thanks, I owe you for saving me. I promise to repay the debt."
    "No need to worry, I know I am not worth enough."
    "How much are you willing to pay?"
    "Nothing, but I wish to help you somehow."
    "That's very kind of you."
    "I shall think of something."

    After talking for



License: Apache License 2.0

Original Model: https://huggingface.co/KoboldAI/GPT-J-6B-Skein