LLongMA-2-7b-dolly-15k adapter
This is an instruction fine-tuned adapter for LLongMA-2-7b, trained at 8k context length using linear positional interpolation scaling. To run inference with this adapter, you will also need the base LLongMA-2-7b model.
The adapter was instruction fine-tuned with PEFT, using the dolly-15k dataset.
Training procedure
The following bitsandbytes quantization config was used during training (the equivalent BitsAndBytesConfig is sketched after the list):
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: bfloat16
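For reference, the same settings expressed as a transformers BitsAndBytesConfig (a direct translation of the values above; the llm_int8_* fields are library defaults and can be omitted):

import torch
from transformers import BitsAndBytesConfig

# Training-time quantization settings, as listed above.
training_bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.bfloat16,
)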
Framework versions
- PEFT 0.4.0
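Putting this together, here is a minimal sketch of how an adapter like this could be set up for QLoRA training with PEFT. The card does not state the LoRA hyperparameters, so r, lora_alpha, target_modules, and lora_dropout below are illustrative assumptions, not the values actually used:

import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Load the base model in 4-bit, matching the training config above.
base = AutoModelForCausalLM.from_pretrained(
    "conceptofmind/LLongMA-2-7b",
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_compute_dtype=torch.bfloat16,
    ),
    trust_remote_code=True,
)
base = prepare_model_for_kbit_training(base)

lora_config = LoraConfig(
    r=16,                                 # assumption
    lora_alpha=32,                        # assumption
    target_modules=["q_proj", "v_proj"],  # assumption
    lora_dropout=0.05,                    # assumption
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()

# dolly-15k instruction dataset (the exact preprocessing used is not stated).
dataset = load_dataset("databricks/databricks-dolly-15k")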
Inference format and examples:
### Human: Write an essay on the effects and impacts of social media.
### Assistant
Social Media has become a major part in today's society, with over half of the world population having some form of it. It is used for entertainment purposes as well as businesses to promote their products or services. Social Media can also be addictive due to its ability to keep users engaged for long periods of time which leads them spending more time online than doing other activities such as reading books or going outside into nature. This lack of physical activity combined with constant exposure to negative news stories can lead people feeling stressed out and anxious about what’s happening around them at all times - something that wasn't true before when we relied solely on newspapers/television broadcasting etc..
There are many benefits associated with using these platforms too; they allow us connect easily with friends & family members living far away from us while sharing photos/videos/stories quickly without any delay! Additionally there are lots opportunities available through networking via LinkedIn etc., allowing individuals access new career paths faster than ever before (provided they possess relevant skills). In conclusion , although there exist certain risks involved within this type technology still remains one most powerful tools available today helping millions achieve goals set forth by themselves every single day . Overall , despite being criticized heavily sometimes even considered dangerous depending upon how user uses it properly always remain safe harmless majority cases .
### Human: Write a poem about love that features sunshine and rainbows. It should rhyme.
### Assistant
When the Sun shines bright
It warms my heart so tight,
But when it rains I feel sadly,
As all of my joy dissipates.
The storms pass quickly though,
And soon enough there'll be sun once more,
I know this because of what I've learned -
That good times always follow bad ones before.
So even if now things are tough for me,
Soon they will get better as sure as you see!
My spirits lift with every ray of light,
Even on cloudy days I find something to write.
These ups and downs won't last forever long,
They come around from time to time but never stay strong.
Love is patient and kindness too,
This I know despite how hard life can go.
No matter what happens in future years,
Remember these words and don't forget your fears.
You have strength within yourself; believe it or not,
Your spirit knows best where its path leads us to.
Don't let anyone tell you otherwise,
Or try to convince you that you aren't worth much.
Beautiful souls like yours deserve nothing less than peace,
So keep fighting until you reach happiness.
Life isn't easy sometimes but we must persevere,
We learn most through our struggles which makes them dear.
Our hearts may break along way,
Yet still hope remains true regardless of what we say.
Therefore no one else has power over who we become,
For only ourselves decide what matters most at home.
With each passing day comes new opportunities,
To grow stronger while also learning some humility.
Everyone goes through trials big & small,
Which means everyone needs patience and compassion all.
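The format is plain text: the instruction goes after "### Human:" and the model's reply is generated after "### Assistant". A small helper keeps the template consistent (the name build_prompt is just for illustration):

def build_prompt(instruction: str) -> str:
    # Wrap an instruction in the "### Human / ### Assistant" template the
    # adapter was trained on; generation continues after "### Assistant".
    return f"### Human: {instruction}\n### Assistant"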
Usage
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig, GenerationConfig
torch.manual_seed(100)
model_name = "conceptofmind/LLongMA-2-7b"
adapter_name = "dreaming-ai/LLongMA-2-7b-dolly-15k"
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    trust_remote_code=True
)
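# The KV cache is disabled here (as during training); for pure inference,
# leaving use_cache enabled is typically faster.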
model.config.use_cache = False
generation_config = GenerationConfig.from_pretrained(model_name)
# Temperature of 1.0 is the default.
generation_config.temperature = 0.95
# Repetition penalty: between 1.0 and infinity; 1.0 (the default) means no penalty.
generation_config.repetition_penalty = 1.2
# Exponential length penalty; defaults to 1.0 and only affects beam-based generation.
generation_config.length_penalty = 2.0
# top_p: nucleus filtering; sample only from the smallest set of tokens whose cumulative probability exceeds top_p (1.0 disables filtering).
generation_config.top_p = 0.6
print('generation_config:', generation_config)
# apply adapter
model = PeftModel.from_pretrained(model, adapter_name)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
text = '''### Human: Write an essay on the effects and impacts of social media.
### Assistant'''
text2 = '''### Human: Write a poem about love that features sunshine and rainbows. It should rhyme.
### Assistant'''
device = "cuda:0"
inputs = tokenizer(text, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=1024, generation_config=generation_config)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
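The second prompt, text2, is defined above but never run; it can be generated the same way:

inputs2 = tokenizer(text2, return_tensors="pt").to(device)
outputs2 = model.generate(**inputs2, max_new_tokens=1024, generation_config=generation_config)
print(tokenizer.decode(outputs2[0], skip_special_tokens=True))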