# Uploaded model
- Developed by: circlelee
- License: apache-2.0
- Finetuned from model: unsloth/gemma-2-2b-it-bnb-4bit
This Gemma 2 model was trained 2x faster with Unsloth and Hugging Face's TRL library.
# Model Information
Summary description and brief definition of inputs and outputs.
## Description
This model is based on Gemma 2 and is fine-tuned to generate SQL queries from natural language.
## Usage
Below we share some code snippets on how to quickly get started with running the model. First, install the Transformers library with:
```bash
pip install -U transformers
```
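If the checkpoint is loaded in 4-bit (the base model is an Unsloth bnb-4bit checkpoint, so this may apply here; treat it as an assumption rather than a confirmed requirement), you will also need `bitsandbytes` and `accelerate` installed:

```bash
pip install -U bitsandbytes accelerate
```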
...
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the fine-tuned model and its tokenizer
model = AutoModelForCausalLM.from_pretrained("circlelee/gemma-2-2b-it-nl2sql")
tokenizer = AutoTokenizer.from_pretrained("circlelee/gemma-2-2b-it-nl2sql", trust_remote_code=True)

# Table schema(s) and the natural-language request to translate into SQL
table_schemas = "CREATE TABLE person ( name VARCHAR, age INTEGER, address VARCHAR )"
user_query = "people whose ages are older than 27 and name starts with letter 'k'"

messages = [
    {"role": "user", "content": f"""Use the below SQL tables schemas paired with instruction that describes a task. make SQL query that appropriately completes the request for the provided tables. And make SQL query according the steps.
{table_schemas}
step 1. check columns that I want.
step 2. check condition that I want.
step 3. make SQL query to get every information that I want.
{user_query}
"""}
]

# Build the prompt string with the chat template, then tokenize it
formatted_prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(formatted_prompt, return_tensors="pt")

# Generate the SQL query
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0]))
```
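Note that `tokenizer.decode(outputs[0])` returns the prompt together with the completion. A minimal sketch for printing only the generated SQL, assuming the `inputs` and `outputs` variables from the snippet above:

```python
# Slice off the prompt tokens so only the newly generated SQL is decoded
generated_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(generated_tokens, skip_special_tokens=True))
```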