sreevidya16
commited on
Commit
•
981bfd0
1
Parent(s):
ac408be
Upload 8 files
Browse files- .env +1 -0
- .gitignore +60 -0
- LICENSE +21 -0
- README.md +89 -14
- app.py +249 -0
- rag_basics.ipynb +537 -0
- requirements.txt +10 -0
- sample.txt +13 -0
.env
ADDED
@@ -0,0 +1 @@
|
|
|
|
|
1 |
+
OPENAI_API_KEY=<API_KEY_HERE>
|
.gitignore
ADDED
@@ -0,0 +1,60 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
1 |
+
# Logs
|
2 |
+
logs
|
3 |
+
*.log
|
4 |
+
npm-debug.log*
|
5 |
+
yarn-debug.log*
|
6 |
+
yarn-error.log*
|
7 |
+
|
8 |
+
# Compiled binary addons (http://nodejs.org/api/addons.html)
|
9 |
+
build/Release
|
10 |
+
|
11 |
+
# Dependency directories
|
12 |
+
node_modules/
|
13 |
+
jspm_packages/
|
14 |
+
|
15 |
+
# Distribution directories
|
16 |
+
dist/
|
17 |
+
|
18 |
+
# Typescript v1 declaration files
|
19 |
+
typings/
|
20 |
+
|
21 |
+
# Optional npm cache directory
|
22 |
+
.npm
|
23 |
+
|
24 |
+
# Optional eslint cache
|
25 |
+
.eslintcache
|
26 |
+
|
27 |
+
# Optional REPL history
|
28 |
+
.node_repl_history
|
29 |
+
|
30 |
+
# Output of 'npm pack'
|
31 |
+
*.tgz
|
32 |
+
|
33 |
+
# Yarn Integrity file
|
34 |
+
.yarn-integrity
|
35 |
+
|
36 |
+
# IDE directories and files
|
37 |
+
.idea/
|
38 |
+
.vscode/
|
39 |
+
|
40 |
+
# OS generated files
|
41 |
+
.DS_Store
|
42 |
+
|
43 |
+
# Build artifacts
|
44 |
+
build
|
45 |
+
|
46 |
+
# Package lock files
|
47 |
+
# Note: package-lock.json is intended to be checked into source control.
|
48 |
+
# However, if you're using yarn, you might want to ignore yarn.lock
|
49 |
+
package-lock.json
|
50 |
+
|
51 |
+
# Environment variables
|
52 |
+
.env
|
53 |
+
|
54 |
+
# Test snapshots
|
55 |
+
__snapshots__/
|
56 |
+
|
57 |
+
# Lerna debug logs
|
58 |
+
lerna-debug.log
|
59 |
+
|
60 |
+
|
LICENSE
ADDED
@@ -0,0 +1,21 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
1 |
+
MIT License
|
2 |
+
|
3 |
+
Copyright (c) 2024 Malhar Kuldeep Sandesh
|
4 |
+
|
5 |
+
Permission is hereby granted, free of charge, to any person obtaining a copy
|
6 |
+
of this software and associated documentation files (the "Software"), to deal
|
7 |
+
in the Software without restriction, including without limitation the rights
|
8 |
+
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
9 |
+
copies of the Software, and to permit persons to whom the Software is
|
10 |
+
furnished to do so, subject to the following conditions:
|
11 |
+
|
12 |
+
The above copyright notice and this permission notice shall be included in all
|
13 |
+
copies or substantial portions of the Software.
|
14 |
+
|
15 |
+
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
16 |
+
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
17 |
+
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
18 |
+
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
19 |
+
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
20 |
+
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
21 |
+
SOFTWARE.
|
README.md
CHANGED
@@ -1,14 +1,89 @@
|
|
1 |
-
|
2 |
-
|
3 |
-
|
4 |
-
|
5 |
-
|
6 |
-
|
7 |
-
|
8 |
-
|
9 |
-
|
10 |
-
|
11 |
-
|
12 |
-
|
13 |
-
|
14 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
1 |
+
|
2 |
+
# VideoChad🗿 : RAG based Youtube Video Conversational Chat-Bot 📤📺
|
3 |
+
|
4 |
+
### *Got Bored watching Long Youtube Videos ? Here's a Full Stack App that*
|
5 |
+
- ⭐ **Generates Smart Summary** ⭐
|
6 |
+
- ↪ **Provides BackLinks (Reference) to the Video (No hallucination)** ↪
|
7 |
+
- 🗣 **(ChatBot) Enables you to have Conversation with Video** 🗣
|
8 |
+
- 🧠**Generates MindMap** 🧠
|
9 |
+
|
10 |
+
## Demo
|
11 |
+
|
12 |
+
[![Thumbnail](https://img.youtube.com/vi/_fflcGaQjBM/0.jpg)](https://www.youtube.com/watch?v=_fflcGaQjBM)
|
13 |
+
|
14 |
+
|
15 |
+
## Features
|
16 |
+
|
17 |
+
- **Automated Video Summarization**: The application generates concise summaries of video content, allowing users to grasp key concepts efficiently and identify areas requiring deeper focus.
|
18 |
+
- **Real-time Chat Interaction**: Users can engage in conversation with the video content, fostering a deeper understanding of the subject matter by asking questions and receiving instant responses.
|
19 |
+
- **Video Backlinking**: The application incorporates a backlinking feature that enables users to seek relevant timestamps in the video player by clicking on provided reference links.
|
20 |
+
- **MindMap**: Generates a interactive mindmap using the important keywords from the video content's essence!
|
21 |
+
- **Transcript Download**: Users can download a text file containing the transcript of the processed video for future reference.
|
22 |
+
|
23 |
+
|
24 |
+
## Technologies Used
|
25 |
+
|
26 |
+
- **Flask**: A lightweight Python web framework used for building the backend API.
|
27 |
+
- **React**: A JavaScript library for building the user interface on the frontend.
|
28 |
+
- **Large Language Models (LLMs)**: Specifically, the OpenAI ChatGPT 3.5 (gpt-3.5-turbo) model is employed for generating contextual responses.
|
29 |
+
- **Retrieval-Augmented Generation (RAG)**: This approach combines a retriever and a language model, allowing for efficient retrieval of relevant information from the video transcript.
|
30 |
+
- **LangChain**: A framework for building applications with large language models, facilitating the integration of the RAG approach.
|
31 |
+
- **Vector Database (Chroma)**: A vector database used for storing and efficiently searching the embeddings of the video transcript.
|
32 |
+
- **OpenAI Embeddings API**: Utilized for converting textual data into high-dimensional vector representations.
|
33 |
+
- **YouTube API**: Employed for fetching video transcripts and metadata.
|
34 |
+
|
35 |
+
## Getting Started
|
36 |
+
|
37 |
+
To get started with this application, follow these steps:
|
38 |
+
|
39 |
+
1. Clone the repository:
|
40 |
+
```
|
41 |
+
git clone https://github.com/foolmalhar/VideoChad.git
|
42 |
+
```
|
43 |
+
|
44 |
+
2. Install the required dependencies:
|
45 |
+
```
|
46 |
+
cd VideoChad
|
47 |
+
pip install -r requirements.txt
|
48 |
+
```
|
49 |
+
|
50 |
+
3. Set up the necessary environment variables:
|
51 |
+
- `OPENAI_API_KEY`: Your OpenAI API key for accessing the language models. [OpenAi Platform](https://platform.openai.com/account/api-keys) | [Account Setup](https://platform.openai.com/docs/quickstart/account-setup)
|
52 |
+
|
53 |
+
4. Start the Flask backend:
|
54 |
+
```
|
55 |
+
python app.py
|
56 |
+
```
|
57 |
+
|
58 |
+
5. In a separate terminal, start the React frontend: (optional)
|
59 |
+
```
|
60 |
+
cd VideoChad
|
61 |
+
npm install
|
62 |
+
npm run dev
|
63 |
+
```
|
64 |
+
|
65 |
+
6. Access the application in your web browser at `http://localhost:5000`.
|
66 |
+
( if you don't use static pre-built files and are running node on the VideoChad frontend, then port might be :3000, check terminal )
|
67 |
+
|
68 |
+
## Usage
|
69 |
+
|
70 |
+
1. Enter a valid YouTube video link in the provided input field. (Link must have English Transcript available on Youtube )
|
71 |
+
2. The application will fetch the video transcript and generate a summary.
|
72 |
+
3. Interact with the video content by asking questions in the chat interface.
|
73 |
+
4. Click on the provided reference links to seek relevant timestamps in the video player.
|
74 |
+
5. explore !
|
75 |
+
|
76 |
+
## Contributing
|
77 |
+
|
78 |
+
Contributions to this project are welcome! If you encounter any issues or have suggestions for improvements, please open an issue or submit a pull request.
|
79 |
+
|
80 |
+
## License
|
81 |
+
|
82 |
+
This project is licensed under the [MIT License](LICENSE).
|
83 |
+
|
84 |
+
## Acknowledgments
|
85 |
+
|
86 |
+
- [DeepLearning.ai Short Course](https://www.deeplearning.ai/short-courses/langchain-chat-with-your-data/) Understand RAG with Langchain and Chromadb
|
87 |
+
- [LangChain](https://www.langchain.com/) The Tool!
|
88 |
+
- [OpenAI](https://openai.com/) for their powerful language models and APIs.
|
89 |
+
- [Chroma](https://www.trychroma.com/) for their vector database solution.
|
app.py
ADDED
@@ -0,0 +1,249 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
1 |
+
|
2 |
+
from flask import Flask, request, jsonify, send_from_directory
|
3 |
+
# from flask_session import Session
|
4 |
+
from flask_cors import CORS # <-- New import here
|
5 |
+
from flask_cors import cross_origin
|
6 |
+
import openai
|
7 |
+
import os
|
8 |
+
from pytube import YouTube
|
9 |
+
import re
|
10 |
+
from langchain_openai.chat_models import ChatOpenAI
|
11 |
+
from langchain.chains import ConversationalRetrievalChain
|
12 |
+
from langchain_openai import OpenAIEmbeddings
|
13 |
+
from langchain.text_splitter import RecursiveCharacterTextSplitter
|
14 |
+
from langchain_community.document_loaders import TextLoader
|
15 |
+
from langchain_community.vectorstores import Chroma
|
16 |
+
from youtube_transcript_api import YouTubeTranscriptApi
|
17 |
+
from dotenv import load_dotenv
|
18 |
+
|
19 |
+
load_dotenv()
|
20 |
+
|
21 |
+
app = Flask(__name__, static_folder="./dist") # requests in the dist folder are being sent to http://localhost:5000/<endpoint>
|
22 |
+
CORS(app, resources={r"/*": {"origins": "*"}})
|
23 |
+
openai.api_key = os.environ["OPENAI_API_KEY"]
|
24 |
+
llm_name = "gpt-3.5-turbo"
|
25 |
+
qna_chain = None
|
26 |
+
|
27 |
+
|
28 |
+
@app.route('/', defaults={'path': ''})
|
29 |
+
@app.route('/<path:path>')
|
30 |
+
def serve(path):
|
31 |
+
if path != "" and os.path.exists(app.static_folder + '/' + path):
|
32 |
+
return send_from_directory(app.static_folder, path)
|
33 |
+
else:
|
34 |
+
return send_from_directory(app.static_folder, 'index.html')
|
35 |
+
|
36 |
+
def load_db(file, chain_type, k):
|
37 |
+
"""
|
38 |
+
Central Function that:
|
39 |
+
- Loads the database
|
40 |
+
- Creates the retriever
|
41 |
+
- Creates the chatbot chain
|
42 |
+
- Returns the chatbot chain
|
43 |
+
- A Dictionary containing
|
44 |
+
-- question
|
45 |
+
-- llm answer
|
46 |
+
-- chat history
|
47 |
+
-- source_documents
|
48 |
+
-- generated_question
|
49 |
+
s
|
50 |
+
Usage: question_answer_chain = load_db(file, chain_type, k)
|
51 |
+
response = question_answer_chain({"question": query, "chat_history": chat_history}})
|
52 |
+
"""
|
53 |
+
|
54 |
+
transcript = TextLoader(file).load()
|
55 |
+
|
56 |
+
text_splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=70)
|
57 |
+
docs = text_splitter.split_documents(transcript)
|
58 |
+
|
59 |
+
embeddings = OpenAIEmbeddings()
|
60 |
+
|
61 |
+
db = Chroma.from_documents(docs, embeddings)
|
62 |
+
|
63 |
+
retriever = db.as_retriever(search_type="similarity", search_kwargs={"k": k})
|
64 |
+
|
65 |
+
# create a chatbot chain. Memory is managed externally.
|
66 |
+
qa = ConversationalRetrievalChain.from_llm(
|
67 |
+
llm = ChatOpenAI(temperature=0), #### Prompt Template is yet to be created
|
68 |
+
chain_type=chain_type,
|
69 |
+
retriever=retriever,
|
70 |
+
return_source_documents=True,
|
71 |
+
return_generated_question=True,
|
72 |
+
# memory=memory
|
73 |
+
)
|
74 |
+
|
75 |
+
return qa
|
76 |
+
|
77 |
+
|
78 |
+
def buffer(history, buff):
|
79 |
+
"""
|
80 |
+
Buffer the history.
|
81 |
+
Keeps only buff recent chats in the history
|
82 |
+
|
83 |
+
Usage: history = buffer(history, buff)
|
84 |
+
"""
|
85 |
+
|
86 |
+
if len(history) > buff :
|
87 |
+
print(len(history)>buff)
|
88 |
+
return history[-buff:]
|
89 |
+
return history
|
90 |
+
|
91 |
+
|
92 |
+
def is_valid_yt(link):
|
93 |
+
"""
|
94 |
+
Check if a link is a valid YouTube link.
|
95 |
+
|
96 |
+
Usage: boolean, video_id = is_valid_yt(youtube_string)
|
97 |
+
"""
|
98 |
+
|
99 |
+
pattern = r'^(?:https?:\/\/)?(?:www\.)?(?:youtube\.com\/watch\?v=|youtu\.be\/)([\w\-_]{11})(?:\S+)?$'
|
100 |
+
match = re.match(pattern, link)
|
101 |
+
if match:
|
102 |
+
return True, match.group(1)
|
103 |
+
else:
|
104 |
+
return False, None
|
105 |
+
|
106 |
+
|
107 |
+
def get_metadata(video_id) -> dict:
|
108 |
+
"""Get important video information.
|
109 |
+
|
110 |
+
Components are:
|
111 |
+
- title
|
112 |
+
- description
|
113 |
+
- thumbnail url,
|
114 |
+
- publish_date
|
115 |
+
- channel_author
|
116 |
+
- and more.
|
117 |
+
|
118 |
+
Usage: get_metadata(id)->dict
|
119 |
+
"""
|
120 |
+
|
121 |
+
try:
|
122 |
+
from pytube import YouTube
|
123 |
+
|
124 |
+
except ImportError:
|
125 |
+
raise ImportError(
|
126 |
+
"Could not import pytube python package. "
|
127 |
+
"Please install it with `pip install pytube`."
|
128 |
+
)
|
129 |
+
yt = YouTube(f"https://www.youtube.com/watch?v={video_id}")
|
130 |
+
video_info = {
|
131 |
+
"title": yt.title or "Unknown",
|
132 |
+
"description": yt.description or "Unknown",
|
133 |
+
"view_count": yt.views or 0,
|
134 |
+
"thumbnail_url": yt.thumbnail_url or "Unknown",
|
135 |
+
"publish_date": yt.publish_date.strftime("%Y-%m-%d %H:%M:%S")
|
136 |
+
if yt.publish_date
|
137 |
+
else "Unknown",
|
138 |
+
"length": yt.length or 0,
|
139 |
+
"author": yt.author or "Unknown",
|
140 |
+
}
|
141 |
+
return video_info
|
142 |
+
|
143 |
+
|
144 |
+
def save_transcript(video_id):
|
145 |
+
"""
|
146 |
+
Saves the transcript of a valid yt video to a text file.
|
147 |
+
"""
|
148 |
+
|
149 |
+
try:
|
150 |
+
transcript = YouTubeTranscriptApi.get_transcript(video_id)
|
151 |
+
except Exception as e:
|
152 |
+
print(f"Error fetching transcript for video {video_id}: {e}")
|
153 |
+
return None
|
154 |
+
if transcript:
|
155 |
+
with open('transcript.txt', 'w') as file:
|
156 |
+
for entry in transcript:
|
157 |
+
file.write(f"~{int(entry['start'])}~{entry['text']} ")
|
158 |
+
print(f"Transcript saved to: transcript.txt")
|
159 |
+
|
160 |
+
@app.route('/init', methods=['POST'])
|
161 |
+
@cross_origin()
|
162 |
+
def initialize():
|
163 |
+
"""
|
164 |
+
Initialize the qna_chain for a user.
|
165 |
+
"""
|
166 |
+
global qna_chain
|
167 |
+
|
168 |
+
qna_chain = 0
|
169 |
+
|
170 |
+
# NEED to authenticate the user here
|
171 |
+
yt_link = request.json.get('yt_link', '')
|
172 |
+
valid, id = is_valid_yt(yt_link)
|
173 |
+
if valid:
|
174 |
+
metadata = get_metadata(id)
|
175 |
+
try:
|
176 |
+
os.remove('./transcript.txt')
|
177 |
+
except:
|
178 |
+
print("No transcript file to remove.")
|
179 |
+
|
180 |
+
save_transcript(id)
|
181 |
+
|
182 |
+
# Initialize qna_chain for the user
|
183 |
+
qna_chain = load_db("./transcript.txt", 'stuff', 5)
|
184 |
+
|
185 |
+
# os.remove('./transcript.txt')
|
186 |
+
|
187 |
+
return jsonify({"status": "success",
|
188 |
+
"message": "qna_chain initialized.",
|
189 |
+
"metadata": metadata,
|
190 |
+
})
|
191 |
+
else:
|
192 |
+
return jsonify({"status": "error", "message": "Invalid YouTube link."})
|
193 |
+
|
194 |
+
|
195 |
+
@app.route('/response', methods=['POST'])
|
196 |
+
def response():
|
197 |
+
"""
|
198 |
+
- Expects youtube Video Link and chat-history in payload
|
199 |
+
- Returns response on the query.
|
200 |
+
"""
|
201 |
+
global qna_chain
|
202 |
+
|
203 |
+
req = request.get_json()
|
204 |
+
raw = req.get('chat_history', [])
|
205 |
+
|
206 |
+
# raw is a list of list containing two strings convert that into a list of tuples
|
207 |
+
if len(raw) > 0:
|
208 |
+
chat_history = [tuple(x) for x in raw]
|
209 |
+
else:
|
210 |
+
chat_history = []
|
211 |
+
# print(f"Chat History: {chat_history}")
|
212 |
+
|
213 |
+
memory = chat_history
|
214 |
+
query = req.get('query', '')
|
215 |
+
# print(f"Query: {query}")
|
216 |
+
|
217 |
+
if memory is None:
|
218 |
+
memory = []
|
219 |
+
|
220 |
+
if qna_chain is None:
|
221 |
+
return jsonify({"status": "error", "message": "qna_chain not initialized."}), 400
|
222 |
+
|
223 |
+
response = qna_chain({'question': query, 'chat_history': buffer(memory,7)})
|
224 |
+
|
225 |
+
if response['source_documents']:
|
226 |
+
pattern = r'~(\d+)~'
|
227 |
+
backlinked_docs = [response['source_documents'][i].page_content for i in range(len(response['source_documents']))]
|
228 |
+
timestamps = list(map(lambda s: int(re.search(pattern, s).group(1)) if re.search(pattern, s) else None, backlinked_docs))
|
229 |
+
|
230 |
+
return jsonify(dict(timestamps=timestamps, answer=response['answer']))
|
231 |
+
|
232 |
+
return jsonify(response['answer'])
|
233 |
+
|
234 |
+
@app.route('/transcript', methods=['POST'])
|
235 |
+
@cross_origin()
|
236 |
+
def send_transcript():
|
237 |
+
"""
|
238 |
+
Send the transcript of the video.
|
239 |
+
"""
|
240 |
+
try:
|
241 |
+
with open('transcript.txt', 'r') as file:
|
242 |
+
transcript = file.read()
|
243 |
+
return jsonify({"status": "success", "transcript": transcript})
|
244 |
+
except:
|
245 |
+
return jsonify({"status": "error", "message": "Transcript not found."})
|
246 |
+
|
247 |
+
|
248 |
+
if __name__ == '__main__':
|
249 |
+
app.run(debug=True)
|
rag_basics.ipynb
ADDED
@@ -0,0 +1,537 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
1 |
+
{
|
2 |
+
"cells": [
|
3 |
+
{
|
4 |
+
"cell_type": "code",
|
5 |
+
"execution_count": 1,
|
6 |
+
"metadata": {},
|
7 |
+
"outputs": [],
|
8 |
+
"source": [
|
9 |
+
"import os\n",
|
10 |
+
"import openai\n",
|
11 |
+
"\n",
|
12 |
+
"from langchain_openai.chat_models import ChatOpenAI\n",
|
13 |
+
"from langchain.chains import ConversationalRetrievalChain\n",
|
14 |
+
"from langchain_openai import OpenAIEmbeddings\n",
|
15 |
+
"from langchain.text_splitter import RecursiveCharacterTextSplitter\n",
|
16 |
+
"from langchain_community.document_loaders import TextLoader\n",
|
17 |
+
"from langchain_community.vectorstores import Chroma\n",
|
18 |
+
"from langchain.memory import ConversationBufferWindowMemory"
|
19 |
+
]
|
20 |
+
},
|
21 |
+
{
|
22 |
+
"cell_type": "code",
|
23 |
+
"execution_count": 2,
|
24 |
+
"metadata": {},
|
25 |
+
"outputs": [],
|
26 |
+
"source": [
|
27 |
+
"from dotenv import load_dotenv, find_dotenv\n",
|
28 |
+
"_ = load_dotenv(find_dotenv()) # read local .env file\n",
|
29 |
+
"\n",
|
30 |
+
"openai.api_key = os.environ['OPENAI_API_KEY']\n",
|
31 |
+
"\n",
|
32 |
+
"llm_name = \"gpt-3.5-turbo\""
|
33 |
+
]
|
34 |
+
},
|
35 |
+
{
|
36 |
+
"cell_type": "code",
|
37 |
+
"execution_count": 3,
|
38 |
+
"metadata": {},
|
39 |
+
"outputs": [],
|
40 |
+
"source": [
|
41 |
+
"def load_db(file, chain_type, k):\n",
|
42 |
+
" # load transcript\n",
|
43 |
+
" transcript = TextLoader(file).load()\n",
|
44 |
+
" # split documents\n",
|
45 |
+
" text_splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=70)\n",
|
46 |
+
" docs = text_splitter.split_documents(transcript)\n",
|
47 |
+
" # define embedding\n",
|
48 |
+
" embeddings = OpenAIEmbeddings() # will be different for Cohere AI\n",
|
49 |
+
" # create vector database from data\n",
|
50 |
+
" db = Chroma.from_documents(docs, embeddings)\n",
|
51 |
+
" # define retriever\n",
|
52 |
+
" retriever = db.as_retriever(search_type=\"similarity\", search_kwargs={\"k\": k})\n",
|
53 |
+
" # retrieval_chain = create_retrieval_chain(retriever, combine_docs_chain)\n",
|
54 |
+
"\n",
|
55 |
+
" # create a chatbot chain. Memory is managed externally.\n",
|
56 |
+
" qa = ConversationalRetrievalChain.from_llm(\n",
|
57 |
+
" llm = ChatOpenAI(temperature=0), #### Prompt Template is yet to be created\n",
|
58 |
+
" chain_type=chain_type, #### Refine \n",
|
59 |
+
" retriever=retriever, \n",
|
60 |
+
" return_source_documents=True,\n",
|
61 |
+
" return_generated_question=True,\n",
|
62 |
+
" # memory=memory\n",
|
63 |
+
" )\n",
|
64 |
+
" \n",
|
65 |
+
" return qa "
|
66 |
+
]
|
67 |
+
},
|
68 |
+
{
|
69 |
+
"cell_type": "code",
|
70 |
+
"execution_count": 4,
|
71 |
+
"metadata": {},
|
72 |
+
"outputs": [],
|
73 |
+
"source": [
|
74 |
+
"def buffer(history, buff):\n",
|
75 |
+
" # basically a filter function\n",
|
76 |
+
" # takes the chat history, ensures only buff latest elements exist\n",
|
77 |
+
" if len(history) > buff :\n",
|
78 |
+
" print(len(history)>buff)\n",
|
79 |
+
" return history[-buff:]\n",
|
80 |
+
" else:\n",
|
81 |
+
" return history"
|
82 |
+
]
|
83 |
+
},
|
84 |
+
{
|
85 |
+
"cell_type": "code",
|
86 |
+
"execution_count": 5,
|
87 |
+
"metadata": {},
|
88 |
+
"outputs": [],
|
89 |
+
"source": [
|
90 |
+
"chat_history = []"
|
91 |
+
]
|
92 |
+
},
|
93 |
+
{
|
94 |
+
"cell_type": "code",
|
95 |
+
"execution_count": 6,
|
96 |
+
"metadata": {},
|
97 |
+
"outputs": [
|
98 |
+
{
|
99 |
+
"name": "stderr",
|
100 |
+
"output_type": "stream",
|
101 |
+
"text": [
|
102 |
+
"/home/maxx/.conda/envs/vcdeploy/lib/python3.10/site-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.\n",
|
103 |
+
" warn_deprecated(\n"
|
104 |
+
]
|
105 |
+
}
|
106 |
+
],
|
107 |
+
"source": [
|
108 |
+
"qa = load_db(\"./sample.txt\", 'stuff', 4)\n",
|
109 |
+
"query = \"Namaste! I need your help\"\n",
|
110 |
+
"result = qa({'question': query, 'chat_history': chat_history}) #######################################"
|
111 |
+
]
|
112 |
+
},
|
113 |
+
{
|
114 |
+
"cell_type": "code",
|
115 |
+
"execution_count": 7,
|
116 |
+
"metadata": {},
|
117 |
+
"outputs": [
|
118 |
+
{
|
119 |
+
"data": {
|
120 |
+
"text/plain": [
|
121 |
+
"langchain.chains.conversational_retrieval.base.ConversationalRetrievalChain"
|
122 |
+
]
|
123 |
+
},
|
124 |
+
"execution_count": 7,
|
125 |
+
"metadata": {},
|
126 |
+
"output_type": "execute_result"
|
127 |
+
}
|
128 |
+
],
|
129 |
+
"source": [
|
130 |
+
"type(qa)"
|
131 |
+
]
|
132 |
+
},
|
133 |
+
{
|
134 |
+
"cell_type": "code",
|
135 |
+
"execution_count": 7,
|
136 |
+
"metadata": {},
|
137 |
+
"outputs": [
|
138 |
+
{
|
139 |
+
"data": {
|
140 |
+
"text/plain": [
|
141 |
+
"\"Namaste! I'm here to help. What do you need assistance with?\""
|
142 |
+
]
|
143 |
+
},
|
144 |
+
"execution_count": 7,
|
145 |
+
"metadata": {},
|
146 |
+
"output_type": "execute_result"
|
147 |
+
}
|
148 |
+
],
|
149 |
+
"source": [
|
150 |
+
"answer = result['answer']\n",
|
151 |
+
"answer"
|
152 |
+
]
|
153 |
+
},
|
154 |
+
{
|
155 |
+
"cell_type": "code",
|
156 |
+
"execution_count": 8,
|
157 |
+
"metadata": {},
|
158 |
+
"outputs": [
|
159 |
+
{
|
160 |
+
"data": {
|
161 |
+
"text/plain": [
|
162 |
+
"{'question': 'Namaste! I need your help',\n",
|
163 |
+
" 'chat_history': [],\n",
|
164 |
+
" 'answer': \"Namaste! I'm here to help. What do you need assistance with?\",\n",
|
165 |
+
" 'source_documents': [Document(page_content='~160~Aurora nodded and waved her magic wand, casting a spell that spread throughout the forest. From that day on, the animals frolicked and played in harmony, their hearts filled with joy and gratitude.\\n~180~As Sammy climbed down from the Golden Acorn Tree, he felt a warm glow of happiness inside. He may have only found one golden acorn, but he had discovered something even more precious – the power of kindness and the magic of friendship.', metadata={'source': './sample.txt'}),\n",
|
166 |
+
" Document(page_content='~40~One sunny morning, as Sammy scampered through the trees, he stumbled upon a mysterious path he had never seen before. Intrigued, he decided to follow it. The path led him deeper and deeper into the forest until he reached a clearing filled with colorful flowers and fluttering butterflies.', metadata={'source': './sample.txt'}),\n",
|
167 |
+
" Document(page_content='~120~\"Yes, indeed! And I am here to grant you a wish for finding the Golden Acorn Tree,\" said Aurora with a smile.\\n~140~Sammy thought for a moment, his little squirrel brain buzzing with ideas. Finally, he knew exactly what he wanted. \"I wish for all the animals in the forest to be happy and healthy forever,\" he declared.', metadata={'source': './sample.txt'}),\n",
|
168 |
+
" Document(page_content='~10~Once upon a time in the lush green forest,\\n~13~there lived a curious little squirrel named Sammy.\\n~25~Sammy had soft, brown fur and big, bright eyes that sparkled with excitement. He loved exploring every nook and cranny of the forest, searching for new adventures.', metadata={'source': './sample.txt'})],\n",
|
169 |
+
" 'generated_question': 'Namaste! I need your help'}"
|
170 |
+
]
|
171 |
+
},
|
172 |
+
"execution_count": 8,
|
173 |
+
"metadata": {},
|
174 |
+
"output_type": "execute_result"
|
175 |
+
}
|
176 |
+
],
|
177 |
+
"source": [
|
178 |
+
"result"
|
179 |
+
]
|
180 |
+
},
|
181 |
+
{
|
182 |
+
"cell_type": "code",
|
183 |
+
"execution_count": 19,
|
184 |
+
"metadata": {},
|
185 |
+
"outputs": [],
|
186 |
+
"source": [
|
187 |
+
"chat_history.extend([(query, result['answer'])])\n",
|
188 |
+
"chat_history = buffer(chat_history,3)"
|
189 |
+
]
|
190 |
+
},
|
191 |
+
{
|
192 |
+
"cell_type": "code",
|
193 |
+
"execution_count": 20,
|
194 |
+
"metadata": {},
|
195 |
+
"outputs": [],
|
196 |
+
"source": [
|
197 |
+
"query = \"What is the moral of the story?\"\n",
|
198 |
+
"result = qa({\"question\": query, \"chat_history\": chat_history}) #######################################"
|
199 |
+
]
|
200 |
+
},
|
201 |
+
{
|
202 |
+
"cell_type": "code",
|
203 |
+
"execution_count": 21,
|
204 |
+
"metadata": {},
|
205 |
+
"outputs": [
|
206 |
+
{
|
207 |
+
"data": {
|
208 |
+
"text/plain": [
|
209 |
+
"{'question': 'What is the moral of the story?',\n",
|
210 |
+
" 'chat_history': [('Namaste! I need your help',\n",
|
211 |
+
" \"Namaste! I'm here to help. What do you need assistance with?\")],\n",
|
212 |
+
" 'answer': 'The moral of the story is that kindness, friendship, courage, and love can lead to magical adventures and wonderful experiences.',\n",
|
213 |
+
" 'source_documents': [Document(page_content='~160~Aurora nodded and waved her magic wand, casting a spell that spread throughout the forest. From that day on, the animals frolicked and played in harmony, their hearts filled with joy and gratitude.\\n~180~As Sammy climbed down from the Golden Acorn Tree, he felt a warm glow of happiness inside. He may have only found one golden acorn, but he had discovered something even more precious – the power of kindness and the magic of friendship.', metadata={'source': './sample.txt'}),\n",
|
214 |
+
" Document(page_content='~200~And so, with a twinkle in his eye and a skip in his step, Sammy the squirrel continued his adventures in the enchanted forest, knowing that with a little bit of courage and a lot of love, anything was possible.\\n~220~The end.', metadata={'source': './sample.txt'}),\n",
" Document(page_content='~40~One sunny morning, as Sammy scampered through the trees, he stumbled upon a mysterious path he had never seen before. Intrigued, he decided to follow it. The path led him deeper and deeper into the forest until he reached a clearing filled with colorful flowers and fluttering butterflies.', metadata={'source': './sample.txt'}),\n",
" Document(page_content='~80~Excitedly, Sammy climbed up the tree and plucked a golden acorn from a branch. Suddenly, he heard a tiny voice coming from the acorn. \"Hello there, young squirrel! I am Aurora, the guardian of the Golden Acorn Tree,\" said the voice.\\n~100~Sammy looked around in amazement until he spotted a tiny fairy perched on the golden acorn. \"Wow, you\\'re a fairy!\" exclaimed Sammy, his eyes shining with delight.', metadata={'source': './sample.txt'})],\n",
" 'generated_question': 'What is the moral of the story?'}"
]
},
"execution_count": 21,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"result"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'The moral of the story is that kindness, friendship, courage, and love can lead to magical adventures and wonderful experiences.'"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"answer = result['answer']\n",
"answer"
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {},
"outputs": [],
"source": [
"chat_history.extend([(query,result['answer'])])\n",
"chat_history = buffer(chat_history,3)"
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [],
"source": [
"query = \"Can you give a relevant title to the story?\"\n",
"result = qa({\"question\": query, \"chat_history\": chat_history}) #######################################"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'question': 'Can you give a relevant title to the story?',\n",
" 'chat_history': [('Namaste! I need your help',\n",
" \"Namaste! I'm here to help. What do you need assistance with?\"),\n",
" ('What is the moral of the story?',\n",
" 'The moral of the story is that kindness, friendship, courage, and love can lead to magical adventures and wonderful experiences.')],\n",
" 'answer': 'A relevant title for the story could be \"Sammy\\'s Enchanted Forest Adventure.\"',\n",
" 'source_documents': [Document(page_content='~40~One sunny morning, as Sammy scampered through the trees, he stumbled upon a mysterious path he had never seen before. Intrigued, he decided to follow it. The path led him deeper and deeper into the forest until he reached a clearing filled with colorful flowers and fluttering butterflies.', metadata={'source': './sample.txt'}),\n",
" Document(page_content='~160~Aurora nodded and waved her magic wand, casting a spell that spread throughout the forest. From that day on, the animals frolicked and played in harmony, their hearts filled with joy and gratitude.\\n~180~As Sammy climbed down from the Golden Acorn Tree, he felt a warm glow of happiness inside. He may have only found one golden acorn, but he had discovered something even more precious – the power of kindness and the magic of friendship.', metadata={'source': './sample.txt'}),\n",
" Document(page_content='~200~And so, with a twinkle in his eye and a skip in his step, Sammy the squirrel continued his adventures in the enchanted forest, knowing that with a little bit of courage and a lot of love, anything was possible.\\n~220~The end.', metadata={'source': './sample.txt'}),\n",
" Document(page_content='~10~Once upon a time in the lush green forest,\\n~13~there lived a curious little squirrel named Sammy.\\n~25~Sammy had soft, brown fur and big, bright eyes that sparkled with excitement. He loved exploring every nook and cranny of the forest, searching for new adventures.', metadata={'source': './sample.txt'})],\n",
" 'generated_question': 'What is a relevant title for the story?'}"
]
},
"execution_count": 25,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"result"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'A relevant title for the story could be \"Sammy\\'s Enchanted Forest Adventure.\"'"
]
},
"execution_count": 26,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"answer = result['answer']\n",
"answer"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [],
"source": [
"chat_history.extend([(query,result['answer'])])\n",
"chat_history = buffer(chat_history,3)"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [],
"source": [
"query = \"which timestamp resembles moral of the story?\"\n",
"result = qa({\"question\": query, \"chat_history\": chat_history}) #######################################"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'The timestamp ~180~ where Sammy discovers the power of kindness and the magic of friendship best represents the moral of the story.'"
]
},
"execution_count": 29,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"answer = result['answer']\n",
"answer"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"True\n"
]
}
],
"source": [
"chat_history.extend([(query,result['answer'])])\n",
"chat_history = buffer(chat_history,3)"
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[('What is the moral of the story?',\n",
" 'The moral of the story is that kindness, friendship, courage, and love can lead to magical adventures and wonderful experiences.'),\n",
" ('Can you give a relevant title to the story?',\n",
" 'A relevant title for the story could be \"Sammy\\'s Enchanted Forest Adventure.\"'),\n",
" ('which timestamp resembles moral of the story?',\n",
" 'The timestamp ~180~ where Sammy discovers the power of kindness and the magic of friendship best represents the moral of the story.')]"
]
},
"execution_count": 31,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"chat_history"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"---"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**The default Buffer Memory stores the entire chat and provides it as context to the next conversation** ⬆"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": []
},
{
"cell_type": "markdown",
"metadata": {},
"source": []
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"- Memory : Conv Buffer Window Memory (k=6) / Conv Summ Buff Memory (max_token_limit=650)\n",
"- Chain Type : Refine\n",
"- Prompt : Template Fine-Tuning\n",
"- Deployment : Gunicorn / Streamlit ? "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"References: \n",
"https://python.langchain.com/docs/use_cases/question_answering/"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"https://python.langchain.com/docs/modules/memory/types/token_buffer"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'~40~One sunny morning, as Sammy scampered through the trees, he stumbled upon a mysterious path he had never seen before. Intrigued, he decided to follow it. The path led him deeper and deeper into the forest until he reached a clearing filled with colorful flowers and fluttering butterflies.'"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"dict(result['source_documents'][1])['page_content']"
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {},
"outputs": [],
"source": [
"import re\n",
"pattern = r'~(\\d+)~'"
]
},
{
"cell_type": "code",
"execution_count": 38,
"metadata": {},
"outputs": [],
"source": [
"backlinked_docs = [result['source_documents'][i].page_content for i in range(len(result['source_documents']))]\n",
"timestamps = list(map(lambda s: int(re.search(pattern, s).group(1)) if re.search(pattern, s) else None, backlinked_docs))\n"
]
},
{
"cell_type": "code",
"execution_count": 39,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'timestamps': [160, 40, 120, 10],\n",
" 'answer': \"Namaste! I'm here to help. What do you need assistance with?\"}"
]
},
"execution_count": 39,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"dict(timestamps=timestamps, answer=result['answer'])"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "vchad",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.13"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
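The notebook cells above lean on two small pieces defined earlier in the notebook (outside this excerpt): a `buffer(chat_history, k)` helper that trims the chat history, and the `~(\d+)~` regex used to backlink answers to timestamps. A minimal sketch of both — assuming `buffer` simply keeps the last `k` (question, answer) turns, which matches how the cells use it but is not shown in this diff:

```python
import re

def buffer(chat_history, k):
    """Hypothetical stand-in for the notebook's buffer() helper:
    keep only the most recent k (question, answer) turns."""
    return chat_history[-k:]

# Mirrors cells 34-38: pull the leading ~NN~ marker out of each
# retrieved chunk so the answer can be backlinked to a timestamp.
PATTERN = r'~(\d+)~'

def extract_timestamps(chunks):
    """Return the first ~NN~ marker in each chunk as an int, else None."""
    return [int(m.group(1)) if (m := re.search(PATTERN, text)) else None
            for text in chunks]

history = [("q1", "a1"), ("q2", "a2"), ("q3", "a3"), ("q4", "a4")]
history = buffer(history, 3)          # drops the oldest turn
chunks = ["~160~Aurora nodded...", "~40~One sunny morning...", "no marker"]
stamps = extract_timestamps(chunks)   # [160, 40, None]
```

This is the same windowing idea as LangChain's `ConversationBufferWindowMemory`, done by hand on the `(query, answer)` tuples the chain expects.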
requirements.txt
ADDED
@@ -0,0 +1,10 @@
Flask==3.0.2
Flask_Cors==4.0.0
langchain==0.1.8
langchain_community==0.0.21
langchain_openai==0.0.6
openai==1.12.0
python-dotenv==1.0.1
pytube==15.0.0
youtube_transcript_api==0.6.2
chromadb==0.4.22
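The notebook's to-do list mentions `Conv Summ Buff Memory (max_token_limit=650)` and links the LangChain token-buffer docs. A dependency-free sketch of the underlying idea — drop the oldest turns until the history fits a token budget — where the function name is illustrative and a whitespace word count stands in for a real tokenizer (LangChain's implementation counts model tokens):

```python
def count_tokens(text):
    # Crude stand-in for a real tokenizer: whitespace-separated words.
    return len(text.split())

def trim_to_token_limit(chat_history, max_token_limit):
    """Keep the most recent (question, answer) turns whose combined
    word count stays within max_token_limit."""
    kept, total = [], 0
    for q, a in reversed(chat_history):
        cost = count_tokens(q) + count_tokens(a)
        if total + cost > max_token_limit:
            break
        kept.append((q, a))
        total += cost
    return list(reversed(kept))

history = [
    ("What is the moral of the story?", "Kindness and friendship lead to magic."),
    ("Give a title?", "Sammy's Enchanted Forest Adventure."),
    ("Which timestamp?", "~180~ best represents the moral."),
]
trimmed = trim_to_token_limit(history, max_token_limit=15)  # keeps the last two turns
```

Unlike the fixed `k=3` window used in the notebook, this bounds prompt size directly, which is what matters when the context window, not the turn count, is the constraint.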
sample.txt
ADDED
@@ -0,0 +1,13 @@
~10~Once upon a time in the lush green forest,
~13~there lived a curious little squirrel named Sammy.
~25~Sammy had soft, brown fur and big, bright eyes that sparkled with excitement. He loved exploring every nook and cranny of the forest, searching for new adventures.
~40~One sunny morning, as Sammy scampered through the trees, he stumbled upon a mysterious path he had never seen before. Intrigued, he decided to follow it. The path led him deeper and deeper into the forest until he reached a clearing filled with colorful flowers and fluttering butterflies.
~60~In the center of the clearing stood a magical tree with branches that reached up to the sky. Hanging from the branches were shiny, golden acorns that glistened in the sunlight. Sammy's eyes widened with wonder as he realized he had discovered the legendary Golden Acorn Tree.
~80~Excitedly, Sammy climbed up the tree and plucked a golden acorn from a branch. Suddenly, he heard a tiny voice coming from the acorn. "Hello there, young squirrel! I am Aurora, the guardian of the Golden Acorn Tree," said the voice.
~100~Sammy looked around in amazement until he spotted a tiny fairy perched on the golden acorn. "Wow, you're a fairy!" exclaimed Sammy, his eyes shining with delight.
~120~"Yes, indeed! And I am here to grant you a wish for finding the Golden Acorn Tree," said Aurora with a smile.
~140~Sammy thought for a moment, his little squirrel brain buzzing with ideas. Finally, he knew exactly what he wanted. "I wish for all the animals in the forest to be happy and healthy forever," he declared.
~160~Aurora nodded and waved her magic wand, casting a spell that spread throughout the forest. From that day on, the animals frolicked and played in harmony, their hearts filled with joy and gratitude.
~180~As Sammy climbed down from the Golden Acorn Tree, he felt a warm glow of happiness inside. He may have only found one golden acorn, but he had discovered something even more precious – the power of kindness and the magic of friendship.
~200~And so, with a twinkle in his eye and a skip in his step, Sammy the squirrel continued his adventures in the enchanted forest, knowing that with a little bit of courage and a lot of love, anything was possible.
~220~The end.
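`sample.txt` interleaves `~NN~` second markers with transcript text — the same convention the notebook's regex keys on. A small sketch of splitting such a transcript into `(seconds, text)` segments (the function name is illustrative, not part of the repo):

```python
import re

def split_transcript(raw):
    """Split a '~NN~text' transcript into (seconds, text) pairs."""
    pairs = []
    # Each segment is a ~digits~ marker followed by text up to the next marker.
    for seconds, text in re.findall(r'~(\d+)~([^~]*)', raw):
        pairs.append((int(seconds), text.strip()))
    return pairs

sample = (
    "~10~Once upon a time in the lush green forest,\n"
    "~13~there lived a curious little squirrel named Sammy.\n"
    "~220~The end."
)
segments = split_transcript(sample)
# [(10, 'Once upon a time in the lush green forest,'),
#  (13, 'there lived a curious little squirrel named Sammy.'),
#  (220, 'The end.')]
```

Chunking on these boundaries keeps one timestamp per chunk, which is what lets the notebook map a retrieved chunk back to a single playback position.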