smangrul posted an update Feb 19
πŸš€ Exciting news from πŸ€— PEFT!

We are introducing new merging methods for LoRA adapters. These methods retain the unique capabilities of the individual LoRAs while also combining their strengths: https://huggingface.co/blog/peft_merging

We previously explored merging LoRA adapters in the context of a personal code copilot πŸš€πŸ‘Ύβœ¨. Please check out the thread on it below: https://x.com/sourab_m/status/1718008115726283004?s=20

The new merging methods ties, dare, and magnitude_prune are introduced alongside the existing methods cat, linear, and svd; the blog post details each one. These methods can be applied on the fly at inference time instead of merging offline, enabling a great developer UX. ✨

How do I merge my LoRA adapters?
Easy: use the class method add_weighted_adapter(). For example, below you can see how to combine three LoRA adapters using the ties method. We can observe that the merged adapter retains the capabilities of the individual adapters!
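Here is a minimal sketch of what that looks like, assuming a TinyLlama base model and three hypothetical adapter repos (the repo IDs, adapter names, weights, and density below are illustrative, not prescriptive):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Hypothetical base model and adapter repo IDs, for illustration only.
base = AutoModelForCausalLM.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0")

# Load the first LoRA, then attach two more adapters to the same model.
model = PeftModel.from_pretrained(base, "your-org/lora-norobots", adapter_name="norobots")
model.load_adapter("your-org/lora-adcopy", adapter_name="adcopy")
model.load_adapter("your-org/lora-sql", adapter_name="sql")

# Merge the three adapters with the ties method; `density` is the fraction of
# parameters kept when trimming each task vector before the merge.
model.add_weighted_adapter(
    adapters=["norobots", "adcopy", "sql"],
    weights=[2.0, 0.3, 0.7],
    adapter_name="merged",
    combination_type="ties",
    density=0.2,
)

# Activate the merged adapter for on-the-fly inference.
model.set_adapter("merged")
```

The weights let you emphasize one adapter over the others, and the merged adapter lives alongside the originals, so you can switch back to any individual LoRA with set_adapter() at any time.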

Now that we have seen that a merged adapter can retain the capabilities of individual LoRAs, what about use cases where we need the capabilities of multiple LoRAs combined at the same time? Below is an application in the text-to-image domain. πŸ–ΌοΈ
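As a rough sketch of the idea in 🧨 Diffusers (the style LoRA repo IDs below are hypothetical, and set_adapters() performs a weighted combination of the loaded LoRAs):

```python
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Hypothetical style LoRAs, for illustration only.
pipe.load_lora_weights("your-org/sdxl-toy-lora", adapter_name="toy")
pipe.load_lora_weights("your-org/sdxl-pixel-lora", adapter_name="pixel")

# Combine both LoRAs so a single prompt can draw on both styles.
pipe.set_adapters(["toy", "pixel"], adapter_weights=[1.0, 0.8])

image = pipe("a toy robot in pixel art style", num_inference_steps=30).images[0]
```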

Kudos to @prateeky2806 (TIES author) and Le Yu (DARE author) for their kind and generous guidance on the PRs! Also, if you want to explore full model merging, refer to super cool projects like https://github.com/arcee-ai/mergekit/tree/main, https://github.com/Gryphe/BlockMerge_Gradient and https://github.com/yule-BUAA/MergeLM/tree/main.

Excited to see what the community creates on top of this! πŸš€βœ¨ #LetsBuildTogether