Lee Park

gogo8232

AI & ML interests

None yet

Recent Activity

Organizations

None yet

gogo8232's activity

upvoted an article about 2 months ago

From DeepSpeed to FSDP and Back Again with Hugging Face Accelerate

upvoted an article 5 months ago

Our Transformers Code Agent beats the GAIA benchmark!

Reacted to yushun0410's post with 🔥 5 months ago
Hi Huggingfacers!

Thrilled to introduce Adam-mini, an optimizer that achieves on-par or better performance than AdamW with a 45% to 50% smaller memory footprint. Adam-mini can also achieve 49.5% higher throughput than AdamW on Llama2-7B pre-training.

The design of Adam-mini is inspired by certain Hessian structures we observed on Transformers.

Feel free to try it out! Switch to Adam-mini with the same hyperparameters as AdamW, and it will work with only half the memory. Hope Adam-mini can help save time, cost, and energy in your tasks!

Paper: "Adam-mini: Use Fewer Learning Rates To Gain More" https://arxiv.org/abs/2406.16793

Code: https://github.com/zyushun/Adam-mini
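A minimal sketch of the drop-in swap described in the post, assuming the package from the linked repo installs as `adam_mini` and exposes an `Adam_mini` class taking `named_parameters` plus the usual Adam hyperparameters; check the repo README for the exact constructor signature before relying on it.

```python
# Sketch: replacing AdamW with Adam-mini while keeping the same hyperparameters.
# The Adam_mini import path and constructor arguments below are assumptions
# based on https://github.com/zyushun/Adam-mini and may differ from the repo.
import torch
import torch.nn as nn

try:
    from adam_mini import Adam_mini  # assumed import path; see the repo
    HAVE_ADAM_MINI = True
except ImportError:
    HAVE_ADAM_MINI = False

model = nn.Linear(128, 10)

if HAVE_ADAM_MINI:
    # Reuse the hyperparameters you would otherwise pass to AdamW.
    optimizer = Adam_mini(
        named_parameters=model.named_parameters(),  # assumed keyword
        lr=1e-3,
        betas=(0.9, 0.999),
        weight_decay=0.01,
    )
else:
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)

# A standard training step: the optimizer swap requires no other changes.
x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```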

upvoted 2 articles 5 months ago

BM25 for Python: Achieving high performance while simplifying dependencies with *BM25S*⚡

By xhluca
upvoted an article 5 months ago
New activity in maywell/ko_youtube_transcription_sample 6 months ago

Less than 1 minute

#2 opened 6 months ago by gogo8232
New activity in maywell/gpt4_evol_1.3k 6 months ago

Based on my knowledge cutoff

#2 opened 6 months ago by gogo8232