---
language: en
tags:
- llama
- llama-3.2
- function-calling
- instruction-tuning
- conversational
license: llama3.2
datasets:
- 0xroyce/NeuralTau-With-Functions-chat
base_model:
- unsloth/Llama-3.2-3B-Instruct
---

# NeuralTau Functions 3B v1

![image/png](https://cdn-uploads.huggingface.co/production/uploads/63bb3f12595fa15f4e4cb368/aubcH-Fg6tibGfJg5bjwV.png)

NeuralTau Functions 3B v1 is the inaugural model in the NeuralTau series, designed to deliver specialized, expert AI capabilities. This pilot model explores the potential for creating AI teammates and autonomous AI-driven businesses.

## Key Features

- **Complexity and Expertise**
  NeuralTau Functions 3B v1 is engineered with advanced complexity to tackle niche and specialized tasks, making it ideal for applications requiring deep expertise.

- **Purpose-Driven Development**
  This version serves as a pilot to evaluate performance and usability, laying the groundwork for future iterations aimed at building AI teammates and autonomous AI systems.

- **Usability**
  Designed for developers seeking to integrate specialized AI solutions, the model supports applications requiring autonomous functionality or expert-level knowledge.

## Future Vision

This model represents the first step in the NeuralTau journey. Future iterations will build upon insights gained from v1 to create more refined, efficient, and specialized AI models, continually enhancing performance and usability.

## Model Variants Available

- 16-bit full model
- GGUF Q4_K_M quantized version (recommended for most use cases)
- GGUF Q8_0 quantized version (higher quality, larger size)

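The GGUF variants can be run locally with llama.cpp or its Python bindings. The sketch below uses `llama-cpp-python`; the repository id and the `*Q4_K_M.gguf` filename pattern are assumptions, so check the repository's file listing for the exact names.

```python
# Minimal sketch, assuming the repo id and GGUF filename pattern below;
# verify both against the actual repository before use.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="0xroyce/NeuralTau-Functions-3B-v1",  # assumed repo id
    filename="*Q4_K_M.gguf",                      # glob for the recommended quant
    n_ctx=4096,
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Which functions can you call?"}],
    max_tokens=256,
)
print(result["choices"][0]["message"]["content"])
```
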
## Training Details

- Base Model: unsloth/Llama-3.2-3B-Instruct
- Training Dataset: [0xroyce/NeuralTau-With-Functions-chat](https://huggingface.co/datasets/0xroyce/NeuralTau-With-Functions-chat)

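To inspect the fine-tuning data, the dataset can be loaded directly from the Hub with the `datasets` library; the `train` split name below is an assumption.

```python
from datasets import load_dataset

# Load the chat-formatted fine-tuning data (a "train" split is assumed)
ds = load_dataset("0xroyce/NeuralTau-With-Functions-chat", split="train")
print(ds[0])  # one conversation, including any tool-call turns
```
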
## Usage

The model follows the Llama 3.2 chat format. You can interact with it using:

```python
messages = [
    {"role": "user", "content": "Your instruction or question here"},
]
```

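For a full generation loop with Hugging Face Transformers, a minimal sketch is shown below; the repository id is assumed from the model name and should be adjusted to the actual repo.

```python
# Minimal sketch, assuming the repo id below matches this model's repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "0xroyce/NeuralTau-Functions-3B-v1"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user", "content": "Your instruction or question here"},
]

# Apply the chat template and generate a reply
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```
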
Function calling example:

```
>>> how do i do a function for weather? use <tool_call> </tool_call>
<tool_call>
{"arguments": {"location": "Los Angeles", "time_period": "current"}, "name": "get_weather_data"}
</tool_call>
```

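Tool calls come back as JSON wrapped in `<tool_call>` tags, so the calling application typically extracts and dispatches them itself. A minimal parsing sketch follows; `extract_tool_calls` is a hypothetical helper, not part of the model or any library.

```python
import json
import re

def extract_tool_calls(text: str) -> list:
    """Pull the JSON payloads out of <tool_call> ... </tool_call> blocks."""
    calls = []
    for payload in re.findall(r"<tool_call>\s*(.*?)\s*</tool_call>", text, re.DOTALL):
        try:
            calls.append(json.loads(payload))
        except json.JSONDecodeError:
            continue  # skip malformed payloads
    return calls

response = (
    '<tool_call>\n'
    '{"arguments": {"location": "Los Angeles", "time_period": "current"}, '
    '"name": "get_weather_data"}\n'
    '</tool_call>'
)
for call in extract_tool_calls(response):
    print(call["name"], call["arguments"])
```
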
## Model Capabilities

- Understanding and following complex instructions
- Providing detailed explanations and analysis
- Breaking down complex topics into understandable components
- Function (tool) calling and systematic problem-solving
- Maintaining context in multi-turn conversations
- Generating clear and structured responses

## License

This model is subject to the Llama 3.2 Community License, inherited from the Llama-3.2-3B-Instruct base model.