CESI-LINEACT-Laboratory2023 committed
Commit fe9e5b2
1 Parent(s): 2e95d24

Upload Evaluation Performance for EPT-MoE Models Transformers_ Report https___avestia.com_EECSS2024_Proceedings_files_paper_MVML_MVML_1.zip


The Mixture-of-Experts (MoE) is a widely known deep neural architecture in which an ensemble of specialized sub-models (a group of experts) improves overall performance at a constant computational cost. We conduct extensive scaling and performance evaluation experiments across several variants of the EPT-MoE model family to show the advantages of combining parallel transformer encoders with multiple Mixture-of-Experts layers in a single transformer model for 3D hand gesture recognition.
https://avestia.com/EECSS2024_Proceedings/files/paper/MVML/MVML_105.pdf
https://hal-lara.archives-ouvertes.fr/CESI/hal-04711525v1
Accelerating Training and Recognition Accuracy of EPT-MoE Models for 3D Hand Gesture Recognition
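The EPT-MoE architecture itself is described in the linked report; as a rough illustration of the idea summarized above (parallel transformer encoders whose fused features feed Mixture-of-Experts layers), the following PyTorch sketch combines two encoder branches with a generic top-k gated MoE feed-forward block. All names, shapes, expert counts, and the 14-class output are hypothetical choices for the example, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MoEFeedForward(nn.Module):
    """Generic MoE feed-forward block: a gating network routes each token to its
    top-k expert MLPs, so capacity grows with the number of experts while the
    per-token compute stays roughly constant."""
    def __init__(self, d_model, d_ff, num_experts=4, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):                          # x: (batch, seq, d_model)
        scores = self.gate(x)                      # (batch, seq, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)          # normalize over the selected experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = (idx[..., k] == e)          # tokens routed to expert e at rank k
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

class ParallelEncoderMoE(nn.Module):
    """Two parallel transformer encoder branches (e.g. one per input stream of a
    3D hand gesture sequence); their fused features pass through an MoE
    feed-forward layer before classification."""
    def __init__(self, d_model=64, num_classes=14):
        super().__init__()
        make_layer = lambda: nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.branch_a = nn.TransformerEncoder(make_layer(), num_layers=2)
        self.branch_b = nn.TransformerEncoder(make_layer(), num_layers=2)
        self.moe = MoEFeedForward(2 * d_model, d_ff=128)
        self.head = nn.Linear(2 * d_model, num_classes)

    def forward(self, xa, xb):                     # each: (batch, seq, d_model)
        fused = torch.cat([self.branch_a(xa), self.branch_b(xb)], dim=-1)
        return self.head(self.moe(fused).mean(dim=1))   # pooled class logits

# Hypothetical shapes: 8 gesture sequences, 32 frames, 64-dim features per branch.
logits = ParallelEncoderMoE()(torch.randn(8, 32, 64), torch.randn(8, 32, 64))
```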

Evaluation Performance for EPT-MoE Models Transformers_ Report https___avestia.com_EECSS2024_Proceedings_files_paper_MVML_MVML_1.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a345b69e090c23466fab487b6a03455d7ea06b18c2d3130a8b81e98c32ff7a5c
+ size 8931388