---
license: apache-2.0
---

## Installation

To install the necessary dependencies, run:

```bash
pip install huggingface_hub torch transformers datasets
```

### Download the Repository

Use the `huggingface_hub` library to download the repository:

```python
from huggingface_hub import snapshot_download

# Download the repository
repo_path = snapshot_download("RobbiePasquale/lightbulb")

print(f"Repository downloaded to: {repo_path}")
```

## Usage

### 1. Train a Web Search Agent

**Usage:**

```bash
python main_menu.py --task train_agent
```

### 2. Use a Web Search Agent (Inference)

**Description:**
Uses the trained web search agent to process queries, perform web searches, and generate summarized responses.

**Usage:**

```bash
python main_menu.py --task test_agent
```

**Options:**

- **Interactive Mode:**

  ```bash
  python main_menu.py --task test_agent
  ```

- **Single Query Mode:**

  ```bash
  python main_menu.py --task test_agent --query "Your query here"
  ```

### 3. Train Language Model

**Usage:**

```bash
python main_menu.py --task train_llm_world --model_name gpt2 --dataset_name wikitext --num_epochs 5 --batch_size 8 --max_length 256
```

**Key Arguments:**

- `--model_name`: Pretrained model (e.g., `gpt2`, `bert`).
- `--dataset_name`: Dataset from Hugging Face (e.g., `wikitext`).
- `--num_epochs`: Number of training epochs.
- `--batch_size`: Number of samples per batch.
- `--max_length`: Maximum sequence length.

### 4. Inference Using Language Model

**Usage:**

```bash
python main_menu.py --task inference_llm --query "Your query here"
```

A programmatic (non-CLI) sketch of loading a model and generating text is included in the appendix at the end of this card.

### 5. Train World Model

**Description:**
Develops a comprehensive World Model that encapsulates state representations, dynamics, and prediction networks to simulate and predict state transitions within the Tree of Thought framework.

**Usage:**

```bash
python main_menu.py --task train_world_model --additional_args
```

### 6. Inference with Language World Model

**Usage:**

```bash
python main_menu.py --task inference_world_model --query "Your query here"
```

### 7. Advanced Inference

**Usage:**

```bash
python main_menu.py --task advanced_inference --query "Your complex query here"
```

## Examples

### Training the World Model

```bash
python main_menu.py --task train_llm_world --model_name gpt2 --dataset_name wikitext --num_epochs 5 --batch_size 8 --max_length 256
```

### Training the Web Search Agent

```bash
python main_menu.py --task train_agent
```

### Use the Web Search Agent in Interactive Mode

```bash
python main_menu.py --task test_agent
```

### Use the Web Search Agent with a Single Query

```bash
python main_menu.py --task test_agent --query "What are the impacts of renewable energy on global sustainability?"
```

### Inference with World Model and Tree of Thought

```bash
python main_menu.py --task advanced_inference --query "Analyze the economic effects of artificial intelligence in the next decade."
```

## Citation

If you use LightBulb in your research, please cite the author:

```bibtex
@misc{RobbiePasquale_lightbulb,
  author       = {Robbie Pasquale},
  title        = {LightBulb: An Autonomous Web Search and Language Model Framework},
  year         = {2024},
  publisher    = {Huggingface},
  howpublished = {\url{https://huggingface.co/RobbiePasquale/lightbulb}},
}
```

## License

This project is licensed under the Apache 2.0 License.
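
## Appendix: Programmatic Generation (Sketch)

For reference, here is a minimal sketch of downloading the repository and running generation directly in Python, without the `main_menu.py` CLI. It assumes a standard `transformers`-compatible checkpoint; the `gpt2` base model, prompt, and generation settings below are illustrative placeholders, not part of the LightBulb CLI.

```python
from huggingface_hub import snapshot_download
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download the repository (same call as in the Installation section above).
repo_path = snapshot_download("RobbiePasquale/lightbulb")

# Load the base language model used in the training examples (gpt2 here);
# point from_pretrained at a fine-tuned checkpoint directory inside
# repo_path instead, if one is available (assumption, layout not documented).
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Generate a short continuation for a sample query.
inputs = tokenizer("Analyze the economic effects of artificial intelligence.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```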