---
license: mit
language:
  - en
  - ru
  - ar
  - zh
  - fr
  - de
  - it
  - ja
  - ko
  - nl
  - pl
  - pt
  - es
  - th
  - tr
library_name: sentence-transformers
pipeline_tag: feature-extraction
tags:
  - mteb
  - Sentence Transformers
  - sentence-similarity
  - arxiv:1803.11175
  - arxiv:1907.04307
---

# Convert MUSE from TensorFlow to PyTorch

This repository contains code for using the mUSE (Multilingual Universal Sentence Encoder) transformer model from TF Hub with PyTorch.

The PyTorch model can be used not only for inference, but also for additional training and fine-tuning!

Read more about the project on GitHub.

## Usage

The model is available on the Hugging Face Hub and can be used directly with torch (currently without native support in the transformers library).
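
For example, the converted weights can be fetched from the Hub with the `huggingface_hub` library. A minimal sketch, assuming the filename matches the `model.pt` used in the usage code; the `repo_id` is a placeholder for this repository's id:

```python
from huggingface_hub import hf_hub_download

# download the converted PyTorch weights from the Hub
path_to_pt_model = hf_hub_download(
    repo_id="<namespace>/<this-model-repo>",  # placeholder, replace with this repo's id
    filename="model.pt",
)
```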

Model initialization and usage code:

```python
import torch
from functools import partial
from architecture import MUSE
from tokenizer import get_tokenizer, tokenize

PATH_TO_PT_MODEL = "model.pt"
PATH_TO_TF_MODEL = "universal-sentence-encoder-multilingual-large-3"

# build the tokenizer from the original TF Hub checkpoint (see the note below)
tokenizer = get_tokenizer(PATH_TO_TF_MODEL)
tokenize = partial(tokenize, tokenizer=tokenizer)

# instantiate the MUSE architecture and load the converted PyTorch weights
model_torch = MUSE(
    num_embeddings=128010,
    embedding_dim=512,
    d_model=512,
    num_heads=8,
)
model_torch.load_state_dict(
    torch.load(PATH_TO_PT_MODEL)
)

# encode a sentence into a fixed-size sentence embedding
sentence = "Hello, world!"
res = model_torch(tokenize(sentence))
```
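
The resulting embeddings can be compared directly, for example with cosine similarity. A minimal sketch continuing from the code above (the second sentence is an illustrative example, not part of the repository):

```python
import torch.nn.functional as F

# encode two sentences and compare their embeddings with cosine similarity
emb_en = model_torch(tokenize("Hello, world!"))
emb_ru = model_torch(tokenize("Привет, мир!"))

similarity = F.cosine_similarity(emb_en, emb_ru, dim=-1)
print(similarity)
```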

Currently, the tokenizer is built from the checkpoint of the original TF Hub model, which is why that checkpoint is also loaded in the code above.
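
As noted above, the converted model behaves like a regular PyTorch module, so it can also be trained further. Below is a minimal sketch of a single fine-tuning step that continues from the usage code; the sentence pair and the cosine-based objective are illustrative assumptions, not part of this repository:

```python
import torch
import torch.nn.functional as F

model_torch.train()
optimizer = torch.optim.Adam(model_torch.parameters(), lr=1e-5)

# illustrative positive (paraphrase) pair: pull the two embeddings together
emb_a = model_torch(tokenize("How old are you?"))
emb_b = model_torch(tokenize("What is your age?"))

loss = 1 - F.cosine_similarity(emb_a, emb_b, dim=-1).mean()

optimizer.zero_grad()
loss.backward()
optimizer.step()
```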