onnx
python -m scripts.convert --quantize --model_id dima806/fairface_gender_image_detection
from https://github.com/huggingface/transformers.js
@dima806 Any reason why this was closed?
@fishcharlie Any reason why this was opened?
@dima806 I mean it would be great to have an ONNX version of the model. That way it can be used easily in transformers.js.
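For context, client-side usage would look roughly like the sketch below. This is a minimal sketch, not a working example: it assumes ONNX weights have actually been added to the repo, and the image URL is a placeholder.

```js
// Minimal sketch of in-browser/Node inference with transformers.js.
// Assumes an ONNX export of dima806/fairface_gender_image_detection exists in the repo.
import { pipeline } from '@xenova/transformers';

// Load an image-classification pipeline backed by the ONNX export.
const classifier = await pipeline(
  'image-classification',
  'dima806/fairface_gender_image_detection'
);

// Classify an image by URL (placeholder URL).
const output = await classifier('https://example.com/face.jpg');
console.log(output); // e.g. [{ label: '...', score: 0.99 }, ...]
```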
@fishcharlie Shouldn't it be possible to use the model directly with huggingface.js/inference, without any conversion?
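For reference, that route would look roughly like the sketch below. The token and image URL are placeholders, and note that the image is sent to Hugging Face's hosted Inference API rather than processed locally.

```js
// Rough sketch of calling the hosted Inference API via @huggingface/inference.
// HF_TOKEN and the image URL are placeholders; the image data goes to Hugging Face's servers.
import { HfInference } from '@huggingface/inference';

const hf = new HfInference(process.env.HF_TOKEN);

// Fetch an image and send it to the hosted API for classification.
const imageBlob = await (await fetch('https://example.com/face.jpg')).blob();
const result = await hf.imageClassification({
  model: 'dima806/fairface_gender_image_detection',
  data: imageBlob,
});
console.log(result); // [{ label: '...', score: ... }, ...]
```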
@dima806 There are some major benefits to doing this directly in transformers.js. For one, privacy: you don't need to send your data to Hugging Face's Inference API in order to run inference. Additionally, you can run the model locally on clients' systems, reducing the dependency on Hugging Face and reducing latency. Finally, transformers.js is a much more cost-effective option for running this.