---
license: apache-2.0
---
## Installation
To install the necessary dependencies, run:
```bash
pip install huggingface_hub torch transformers datasets
```
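To confirm the environment is set up, the imports can be checked directly. This is a minimal sketch; the project does not pin specific versions, so it only prints whatever is installed:
```python
# Quick sanity check that the core dependencies are importable.
import datasets
import huggingface_hub
import torch
import transformers

for module in (huggingface_hub, torch, transformers, datasets):
    print(f"{module.__name__}: {module.__version__}")
```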
### Download the Repository
Use the `huggingface_hub` library to download the repository:
```python
from huggingface_hub import snapshot_download
# Download the repository
repo_path = snapshot_download("RobbiePasquale/lightbulb")
print(f"Repository downloaded to: {repo_path}")
```
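The returned path is a local cache directory. If you want to see which scripts and files were fetched, they can be listed from Python; this is a minimal sketch using only the call shown above and the standard library:
```python
from pathlib import Path

from huggingface_hub import snapshot_download

# snapshot_download returns the local cache path; list the files it contains.
repo_path = Path(snapshot_download("RobbiePasquale/lightbulb"))
for path in sorted(repo_path.rglob("*")):
    if path.is_file():
        print(path.relative_to(repo_path))
```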
### 1. Train a Web Search Agent
**Description:**
Trains the web search agent that is later used to process queries, perform web searches, and summarize results.
**Usage:**
```bash
python main_menu.py --task train_agent
```
### 2. Use a Web Search Agent (Inference)
**Description:**
Utilizes the trained web search agent to process queries, perform web searches, and generate summarized responses.
**Usage:**
```bash
python main_menu.py --task test_agent
```
**Options:**
- **Interactive Mode:**
```bash
python main_menu.py --task test_agent
```
- **Single Query Mode:**
```bash
python main_menu.py --task test_agent --query "Your query here"
```
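The agent can also be driven from another script. The following is a minimal sketch that simply shells out to `main_menu.py` with the documented `--task` and `--query` flags; it assumes the script is in the current working directory and returns whatever the agent writes to stdout:
```python
import subprocess
import sys


def run_search_agent(query: str) -> str:
    """Invoke the web search agent CLI on a single query and return its stdout."""
    result = subprocess.run(
        [sys.executable, "main_menu.py", "--task", "test_agent", "--query", query],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout


if __name__ == "__main__":
    print(run_search_agent("What are the impacts of renewable energy on global sustainability?"))
```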
### 3. Train Language Model
**Usage:**
```bash
python main_menu.py --task train_llm_world --model_name gpt2 --dataset_name wikitext --num_epochs 5 --batch_size 8 --max_length 256
```
**Key Arguments:**
- `--model_name`: Pretrained model (e.g., `gpt2`, `bert`).
- `--dataset_name`: Dataset from Hugging Face (e.g., `wikitext`).
- `--num_epochs`: Number of training epochs.
- `--batch_size`: Number of samples per batch.
- `--max_length`: Maximum sequence length.
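`--model_name` and `--dataset_name` are standard Hugging Face identifiers. The sketch below shows what they resolve to outside the training script; it assumes the `wikitext-2-raw-v1` configuration of `wikitext`, which may differ from the configuration `main_menu.py` uses internally:
```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

# --model_name resolves to a pretrained checkpoint on the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# --dataset_name resolves to a Hub dataset; wikitext also needs a config name.
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")

# --max_length corresponds to truncating tokenized sequences to a fixed length.
encoded = tokenizer(dataset[1]["text"], truncation=True, max_length=256)
print(len(encoded["input_ids"]))
```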
### 4. Inference Using Language Model
**Usage:**
```bash
python main_menu.py --task inference_llm --query "Your query here"
```
### 5. Train World Model
**Description:**
Trains a World Model that combines state representation, dynamics, and prediction networks to simulate and predict state transitions within the Tree of Thought framework.
**Usage:**
```bash
python main_menu.py --task train_world_model --additional_args
```
Here, `--additional_args` is a placeholder for any further training options you pass to the script.
### 6. Inference with Language World Model
**Usage:**
```bash
python main_menu.py --task inference_world_model --query "Your query here"
```
### 7. Advanced Inference
**Usage:**
```bash
python main_menu.py --task advanced_inference --query "Your complex query here"
```
## Quick Examples
### Training the World Model
```bash
python main_menu.py --task train_llm_world --model_name gpt2 --dataset_name wikitext --num_epochs 5 --batch_size 8 --max_length 256
```
### Training the Web Search Agent
```bash
python main_menu.py --task train_agent
```
### Use the Web Search Agent in Interactive Mode
```bash
python main_menu.py --task test_agent
```
### Use the Web Search Agent with a Single Query
```bash
python main_menu.py --task test_agent --query "What are the impacts of renewable energy on global sustainability?"
```
### Inference with World Model and Tree of Thought
```bash
python main_menu.py --task advanced_inference --query "Analyze the economic effects of artificial intelligence in the next decade."
```
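To chain the quick examples into a single run, the commands can be executed back to back. This is a minimal sketch that shells out to the CLI with exactly the arguments shown above and stops if any step fails:
```python
import subprocess
import sys

# The quick-example commands above, run in sequence.
steps = [
    ["--task", "train_llm_world", "--model_name", "gpt2", "--dataset_name", "wikitext",
     "--num_epochs", "5", "--batch_size", "8", "--max_length", "256"],
    ["--task", "train_agent"],
    ["--task", "advanced_inference", "--query",
     "Analyze the economic effects of artificial intelligence in the next decade."],
]

for args in steps:
    subprocess.run([sys.executable, "main_menu.py", *args], check=True)
```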
## Citation
If you use LightBulb in your research, please cite the author:
```bibtex
@misc{RobbiePasquale_lightbulb,
  author       = {Robbie Pasquale},
  title        = {LightBulb: An Autonomous Web Search and Language Model Framework},
  year         = {2024},
  publisher    = {Huggingface},
  howpublished = {\url{https://huggingface.co/RobbiePasquale/lightbulb}},
}
```
## License
This project is licensed under the Apache 2.0 License.