Update app.py
app.py
CHANGED
@@ -193,7 +193,7 @@ with gr.Blocks(css=MODEL_SELECTION_CSS, theme='gradio/soft') as demo:
         "• inspecting the actual prompt that the model sees. The underlying Large Language Model is the [Meta AI](https://ai.meta.com/)'s "
         "[LLaMA2-70B](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf) which is hosted as [Hugging Face Inference API](https://huggingface.co/inference-api), "
         "and [Text Generation Inference](https://github.com/huggingface/text-generation-inference) is the underlying serving framework. ",
-        elem_classes=["center"]
+        elem_classes=["center"]
     )
     gr.Markdown(
         "***NOTE:*** If you are subscribing [PRO](https://huggingface.co/pricing#pro), you can simply duplicate this space and use your "
@@ -429,4 +429,4 @@ with gr.Blocks(css=MODEL_SELECTION_CSS, theme='gradio/soft') as demo:
         _js=GET_LOCAL_STORAGE,
     )

-demo.queue().launch()
+demo.queue(concurrency_count=5, max_size=256).launch()