Update README.md
README.md CHANGED
````diff
@@ -49,16 +49,16 @@ model = AutoModelForCausalLM.from_pretrained(
     torch_dtype=torch.bfloat16
 )
 ```
-Currently, we support LLM endpoint generation, where you need to send a post request to the generation endpoint (we recommend using Text Generation Inference by HuggingFace)
+Currently, we support LLM endpoint generation, where you need to send a post request to the generation endpoint (we recommend using Text Generation Inference by HuggingFace).
 
-
-
-Then you can use it inside via GOAT-Storytelling-Agent:
+Here is how you can utilize the model via GOAT-Storytelling-Agent:
 
 ```python
-from goat_storytelling_agent import
+from goat_storytelling_agent.storytelling_agent import StoryAgent
 
-
+backend_uri = # Text generation endpoint
+writer = StoryAgent(backend_uri, form='novel')
+novel_scenes = writer.generate_story('treasure hunt in a jungle')
 ```
 
 ## License
````
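The new text says generation goes through a POST request to an LLM endpoint, with Text Generation Inference (TGI) as the recommended server. As a minimal sketch of what that raw request might look like, the snippet below targets TGI's `/generate` route; the server URL, prompt, and generation parameters are placeholders chosen for illustration, not values from the README.

```python
# Sketch of a raw POST request to a Text Generation Inference (TGI) server.
# The URL, prompt, and parameters are hypothetical placeholders.
import requests

backend_uri = "http://localhost:8080"  # hypothetical TGI deployment

response = requests.post(
    f"{backend_uri}/generate",
    json={
        "inputs": "Write the opening scene of a treasure hunt in a jungle.",
        "parameters": {"max_new_tokens": 512, "temperature": 0.8},
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["generated_text"])  # TGI returns the completion here
```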
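The snippet added to the README leaves `backend_uri` unassigned, so it will not run as written. A runnable version, assuming a hypothetical TGI endpoint URL and otherwise using only the names shown in the diff (`StoryAgent`, `form='novel'`, `generate_story`), might look like this:

```python
# Runnable version of the README snippet above.
# The endpoint URL is a placeholder; point it at your own TGI deployment.
from goat_storytelling_agent.storytelling_agent import StoryAgent

backend_uri = "http://localhost:8080"  # hypothetical text generation endpoint
writer = StoryAgent(backend_uri, form='novel')
novel_scenes = writer.generate_story('treasure hunt in a jungle')
```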