mav23 commited on
Commit
46ec2b6
1 Parent(s): ff2f17e

Upload folder using huggingface_hub

Browse files
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ code-llama-2-13b-instruct-text2sql.Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,137 @@
+ ---
+ license: llama2
+ datasets:
+ - bugdaryan/sql-create-context-instruction
+ language:
+ - en
+ pipeline_tag: text-generation
+
+ widget:
+ - text: "[INST] Write SQLite query to answer the following question given the database schema. Please wrap your code answer using ```: Schema: CREATE TABLE head (age INTEGER) Question: How many heads of the departments are older than 56 ? [/INST] Here is the SQLite query to answer to the question: How many heads of the departments are older than 56 ?: ```"
+   example_title: "Example 1"
+ - text: "[INST] Write SQLite query to answer the following question given the database schema. Please wrap your code answer using ```: Schema: CREATE TABLE people (first_name VARCHAR) Question: List the first names of people in alphabetical order? [/INST] Here is the SQLite query to answer to the question: List the first names of people in alphabetical order?: ```"
+   example_title: "Example 2"
+ - text: "[INST] Write SQLite query to answer the following question given the database schema. Please wrap your code answer using ```: Schema: CREATE TABLE weather (zip_code VARCHAR, mean_sea_level_pressure_inches INTEGER) Question: What is the zip code in which the average mean sea level pressure is the lowest? [/INST] Here is the SQLite query to answer to the question: What is the zip code in which the average mean sea level pressure is the lowest?: ```"
+   example_title: "Example 3"
+ - text: "[INST] Write SQLite query to answer the following question given the database schema. Please wrap your code answer using ```: Schema: CREATE TABLE weather (date VARCHAR, mean_temperature_f VARCHAR, mean_humidity VARCHAR, max_gust_speed_mph VARCHAR) Question: What are the date, mean temperature and mean humidity for the top 3 days with the largest max gust speeds? [/INST] Here is the SQLite query to answer to the question: What are the date, mean temperature and mean humidity for the top 3 days with the largest max gust speeds?: ```"
+   example_title: "Example 4"
+ - text: "[INST] Write SQLite query to answer the following question given the database schema. Please wrap your code answer using ```: Schema: CREATE TABLE trip (end_station_id VARCHAR); CREATE TABLE station (id VARCHAR, city VARCHAR) Question: Count the number of trips that did not end in San Francisco city. [/INST] Here is the SQLite query to answer to the question: Count the number of trips that did not end in San Francisco city.: ```"
+   example_title: "Example 5"
+
+ ---
+ # **Code-Llama-2-13B-instruct-text2sql Model Card**
+
+ **Model Name**: Code-Llama-2-13B-instruct-text2sql
+
+ **Description**: This model is a fine-tuned version of Code Llama 2 with 13 billion parameters, specifically tailored for text-to-SQL tasks. It has been trained to generate SQL queries given a database schema and a natural language question.
+
+ ## Model Information
+
+ - **Base Model**: [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf)
+ - **Finetuning Dataset**: [bugdaryan/sql-create-context-instruction](https://huggingface.co/datasets/bugdaryan/sql-create-context-instruction)
+ - **Training Time**: Approximately 4 hours on 2 V100 32GB GPUs
+
+ ## LoRA Parameters
+
+ - **lora_r**: 64
+ - **lora_alpha**: 16
+ - **lora_dropout**: 0.1
+
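The LoRA settings listed above map onto a `peft` `LoraConfig` roughly as follows. This is a sketch only: `bias`, `task_type`, and `target_modules` are assumptions (common defaults for causal-LM fine-tuning), not values stated in this card.

```python
from peft import LoraConfig  # peft assumed installed

lora_config = LoraConfig(
    r=64,                         # lora_r
    lora_alpha=16,                # lora_alpha
    lora_dropout=0.1,             # lora_dropout
    bias="none",                  # assumption: common default
    task_type="CAUSAL_LM",        # assumption for a causal-LM fine-tune
)
```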
+ ## bitsandbytes Parameters
+
+ - **use_4bit**: True
+ - **bnb_4bit_compute_dtype**: float16
+ - **bnb_4bit_quant_type**: nf4
+ - **use_nested_quant**: False
+
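In `transformers`, these quantization settings would typically be expressed as a `BitsAndBytesConfig`; a sketch of the equivalent configuration, assuming `transformers` and `bitsandbytes` are installed:

```python
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # use_4bit
    bnb_4bit_compute_dtype=torch.float16,  # bnb_4bit_compute_dtype
    bnb_4bit_quant_type="nf4",             # bnb_4bit_quant_type
    bnb_4bit_use_double_quant=False,       # use_nested_quant
)
```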
+ ## Training Parameters
+
+ - **Number of Training Epochs**: 1
+ - **Mixed-Precision Training (fp16/bf16)**: False
+ - **Batch Size per GPU for Training**: 32
+ - **Batch Size per GPU for Evaluation**: 4
+ - **Gradient Accumulation Steps**: 1
+ - **Gradient Checkpointing**: True
+ - **Maximum Gradient Norm (Gradient Clipping)**: 0.3
+ - **Initial Learning Rate**: 2e-4
+ - **Weight Decay**: 0.001
+ - **Optimizer**: paged_adamw_32bit
+ - **Learning Rate Scheduler Type**: cosine
+ - **Max Steps**: -1
+ - **Warmup Ratio**: 0.03
+ - **Group Sequences by Length**: True
+ - **Save Checkpoint Every X Update Steps**: 0
+ - **Log Every X Update Steps**: 25
+
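For reference, the hyperparameters above correspond roughly to the following `transformers` `TrainingArguments`. This is a sketch: the output directory is hypothetical, and the card does not confirm that `TrainingArguments` was actually used.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",            # hypothetical output path
    num_train_epochs=1,
    fp16=False,
    bf16=False,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=1,
    gradient_checkpointing=True,
    max_grad_norm=0.3,
    learning_rate=2e-4,
    weight_decay=0.001,
    optim="paged_adamw_32bit",
    lr_scheduler_type="cosine",
    max_steps=-1,
    warmup_ratio=0.03,
    group_by_length=True,
    save_steps=0,
    logging_steps=25,
)
```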
+ ## License
+
+ This model is governed by Meta's custom commercial license for Code Llama. For details, please visit: [Custom Commercial License](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
+
+ ## Intended Use
+
+ **Intended Use Cases**: This model is intended for commercial and research use in English. It is designed for text-to-SQL tasks, enabling users to generate SQL queries from natural language questions.
+
+ **Out-of-Scope Uses**: Any use that violates applicable laws or regulations, use in languages other than English, or any other use prohibited by the Acceptable Use Policy and Licensing Agreement for Code Llama and its variants.
+
+ ## Model Capabilities
+
+ - Code completion.
+ - Infilling.
+ - Instructions / chat.
+
+ ## Model Architecture
+
+ Code-Llama-2-13B-instruct-text2sql is an auto-regressive language model that uses an optimized transformer architecture.
+
+ ## Model Dates
+
+ This model was trained between January 2023 and July 2023.
+
+ ## Ethical Considerations and Limitations
+
+ Code-Llama-2-13B-instruct-text2sql is a powerful language model, but it may produce inaccurate or objectionable responses in some instances. Safety testing and tuning are recommended before deploying this model in any specific application.
+
+ ## Hardware and Software
+
+ - **Training Libraries**: Custom training libraries
+ - **Training Hardware**: 2 V100 32GB GPUs
+ - **Carbon Footprint**: Training all Code Llama models required 400K GPU hours on A100-80GB hardware, with emissions offset by Meta's sustainability program.
+
+ ## Training Data
+
+ The base model was trained on the same data as Llama 2 with different weights; this variant was then fine-tuned on the [bugdaryan/sql-create-context-instruction](https://huggingface.co/datasets/bugdaryan/sql-create-context-instruction) dataset.
+
+ ## Evaluation Results
+
+ For evaluation results, please refer to Section 3 (and the safety evaluations in Section 4) of the Code Llama research paper.
+
+ ## Example Code
+
+ You can use the Code-Llama-2-13B-instruct-text2sql model to generate SQL queries from natural language questions, as demonstrated in the following snippet:
+
+ ```python
+ from transformers import (
+     AutoModelForCausalLM,
+     AutoTokenizer,
+     pipeline
+ )
+
+ model_name = 'bugdaryan/Code-Llama-2-13B-instruct-text2sql'
+
+ # Load the fine-tuned model and its tokenizer (device_map='auto' requires accelerate)
+ model = AutoModelForCausalLM.from_pretrained(model_name, device_map='auto')
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
+
+ pipe = pipeline('text-generation', model=model, tokenizer=tokenizer)
+
+ # Example schema: sales, product_suppliers, customers and salespeople tables
+ table = "CREATE TABLE sales ( sale_id number PRIMARY KEY, product_id number, customer_id number, salesperson_id number, sale_date DATE, quantity number, FOREIGN KEY (product_id) REFERENCES products(product_id), FOREIGN KEY (customer_id) REFERENCES customers(customer_id), FOREIGN KEY (salesperson_id) REFERENCES salespeople(salesperson_id)); CREATE TABLE product_suppliers ( supplier_id number PRIMARY KEY, product_id number, supply_price number, FOREIGN KEY (product_id) REFERENCES products(product_id)); CREATE TABLE customers ( customer_id number PRIMARY KEY, name text, address text ); CREATE TABLE salespeople ( salesperson_id number PRIMARY KEY, name text, region text );"
+
+ question = 'Find the salesperson who made the most sales.'
+
+ # The prompt must follow the exact instruction format the model was fine-tuned on
+ prompt = f"[INST] Write SQLite query to answer the following question given the database schema. Please wrap your code answer using ```: Schema: {table} Question: {question} [/INST] Here is the SQLite query to answer to the question: {question}: ``` "
+
+ # The generated query is the text between the second and third ``` markers
+ ans = pipe(prompt, max_new_tokens=100)
+ print(ans[0]['generated_text'].split('```')[2])
+ ```
+
+ This snippet demonstrates how to use the model to generate a SQL query from a provided database schema and a natural language question.
code-llama-2-13b-instruct-text2sql.Q4_0.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e4a18e8d485e4f1146d52a3842574fd7d6be0208820de3e72579586091887546
+ size 7365949120
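The Q4_0 GGUF file added in this commit can also be run locally without `transformers`. A minimal sketch assuming `llama-cpp-python` is installed and the `.gguf` file has been downloaded to the working directory (the path and generation settings are illustrative, not prescribed by this repo):

```python
from llama_cpp import Llama  # llama-cpp-python assumed installed

# Hypothetical local path to the ~7.4 GB file uploaded in this commit
llm = Llama(
    model_path="code-llama-2-13b-instruct-text2sql.Q4_0.gguf",
    n_ctx=2048,        # context window; adjust to fit the schema length
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

schema = "CREATE TABLE head (age INTEGER)"
question = "How many heads of the departments are older than 56 ?"
prompt = (
    "[INST] Write SQLite query to answer the following question given the "
    f"database schema. Please wrap your code answer using ```: Schema: {schema} "
    f"Question: {question} [/INST] Here is the SQLite query to answer to the "
    f"question: {question}: ```"
)

# Stop at the closing fence so only the SQL query is returned
out = llm(prompt, max_tokens=100, stop=["```"])
print(out["choices"][0]["text"].strip())
```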