Hosting as Inference Endpoint doesn't work
I tried to host it as an Inference Endpoint, but I keep receiving this error:
{"error":"Translation requires a src_lang
and a tgt_lang
for this model"}
I pass those in as part of the JSON body, so I'm not sure why this happens.
Any help would be appreciated.
Thank you.
Maybe @ArthurZ or @philschmid can help!
I have the same issue here when running it on SageMaker:
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (400) from primary with message "{
  "code": 400,
  "type": "InternalServerException",
  "message": "Translation requires a src_lang and a tgt_lang for this model"
}
Hey @charliekocsis, @Yippi, I'm not sure you can pass these parameters through the UI for testing your model directly; however, here's how you can do it from your Python runtime:
import requests

API_URL = "https://endpoint-url.us-east-1.aws.endpoints.huggingface.cloud"  # Replace this
headers = {
    "Authorization": "Bearer <XXX>",  # Replace this
    "Content-Type": "application/json",
}

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# src_lang and tgt_lang go under "parameters", not at the top level of the body
output = query({
    "inputs": "My name is Sarah Jessica Parker but you can call me Jessica",
    "parameters": {"src_lang": "eng_Latn", "tgt_lang": "fra_Latn"},
})
print(output)
which returns
[{'translation_text': "Je m' appelle Sarah Jessica Parker mais tu peux m'appeler Jessica"}]
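If you're calling the endpoint from SageMaker like @Yippi, the payload shape is the same. Here's a minimal sketch using boto3, assuming an already-deployed endpoint (the endpoint name below is a placeholder):

import json
import boto3

ENDPOINT_NAME = "my-translation-endpoint"  # hypothetical name, replace with yours

client = boto3.client("sagemaker-runtime")

# Same structure as above: src_lang and tgt_lang under "parameters"
payload = {
    "inputs": "My name is Sarah Jessica Parker but you can call me Jessica",
    "parameters": {"src_lang": "eng_Latn", "tgt_lang": "fra_Latn"},
}

response = client.invoke_endpoint(
    EndpointName=ENDPOINT_NAME,
    ContentType="application/json",
    Body=json.dumps(payload),
)
print(json.loads(response["Body"].read()))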
The language tags are those from the FLORES-200 dataset, which you can find here: https://github.com/facebookresearch/flores/tree/main/flores200#languages-in-flores-200
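As an illustration, swapping in another FLORES-200 tag (deu_Latn for German) reuses the same query helper from above:

output = query({
    "inputs": "My name is Sarah Jessica Parker but you can call me Jessica",
    "parameters": {"src_lang": "eng_Latn", "tgt_lang": "deu_Latn"},  # English to German
})
print(output)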