Merge pull request #30 from LightricksResearch/fix-no-flash-attention 05cb3e4 Sapir Weissbuch committed on Nov 7, 2024
model: fix flash attention enabling - do not check the device type at this point (it can be CPU) 5940103 eitanrich committed on Nov 7, 2024
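
The gist of this fix, as a minimal hedged sketch (the helper name and structure are assumptions, not the repository's actual code): decide whether flash attention is available from the installed libraries alone, and defer any device check to runtime, since the model may still live on CPU when it is constructed and only be moved to GPU afterwards.

```python
import importlib.util

def flash_attention_available() -> bool:
    # Hypothetical helper: gate flash attention on library availability
    # only. Checking `param.device.type == "cuda"` here would be wrong,
    # because the model can still be on CPU at construction time and get
    # moved to the GPU later.
    return importlib.util.find_spec("flash_attn") is not None
```
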
Feature: Add mixed precision support and direct bfloat16 support. 1940326 daniel shalem committed on Oct 31, 2024
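
For context, a minimal sketch of the two modes this commit names, using standard PyTorch APIs (the stand-in module is an assumption; the real change touches the repository's own model code):

```python
import torch
import torch.nn as nn

model = nn.Linear(16, 16)   # stand-in for the actual model (assumption)
x = torch.randn(2, 16)

# Direct bfloat16: cast the weights themselves and run in bf16 end to end.
bf16_model = model.to(torch.bfloat16)
y = bf16_model(x.to(torch.bfloat16))

# Mixed precision: keep fp32 weights and autocast the forward pass.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = model(x)
```
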
transformer3d: fix the 'xora' init mode never triggering because the comparison requires a lowercase value. a3498bb dudumoshe committed on Oct 8, 2024
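
A hypothetical reconstruction of the bug this commit fixes (the surrounding names and values are assumptions; only the case-sensitivity issue is taken from the message): the init-mode string was compared without normalizing case, so a differently-cased value never matched.

```python
init_mode = "Xora"  # example config value (assumption)

# Buggy: a case-sensitive comparison never matches "Xora".
if init_mode == "xora":
    print("xora init")  # unreachable for "Xora"

# Fixed: lowercase the value before comparing.
if init_mode.lower() == "xora":
    print("xora init")  # now triggers as intended
```
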