

DoRA: Weight-Decomposed Low-Rank Adaptation
Feb 19, 2024
Exploring DoRA, a parameter-efficient fine-tuning method that decomposes pretrained weights into magnitude and direction components and applies a LoRA-style low-rank update to the direction. DoRA outperforms LoRA on downstream tasks such as commonsense reasoning and image understanding while maintaining inference efficiency, since the decomposed weights can be merged back into a single matrix after training. The podcast discusses the implementation of DoRA, how it compares with LoRA, and its potential beyond the language and vision domains.
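For intuition, here is a minimal PyTorch sketch of the weight decomposition described in the episode: the frozen pretrained weight plus a LoRA-style low-rank update is normalized column-wise to give a direction, and a learnable magnitude vector rescales each column. The `DoRALinear` class, its `rank` argument, and the initialization choices below are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DoRALinear(nn.Module):
    """Illustrative DoRA-style linear layer (hypothetical sketch, not the official code).

    The adapted weight is m * (W0 + B A) / ||W0 + B A||_c, where W0 is the frozen
    pretrained weight, B A is a LoRA-style low-rank update, ||.||_c is the per-column
    L2 norm, and m is a learnable magnitude vector with one entry per column.
    """

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        out_features, in_features = base.weight.shape
        # Frozen pretrained weight and (reused) bias
        self.weight = nn.Parameter(base.weight.detach().clone(), requires_grad=False)
        self.bias = base.bias
        # LoRA factors: A is small random, B starts at zero so training begins from W0
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
        # Magnitude initialized to the column norms of W0, so the layer initially equals W0
        self.magnitude = nn.Parameter(self.weight.norm(p=2, dim=0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Merge the frozen weight with the low-rank update, then split into
        # direction (unit-norm columns) and magnitude (trainable rescaling)
        merged = self.weight + self.lora_B @ self.lora_A
        direction = merged / merged.norm(p=2, dim=0, keepdim=True)
        adapted_weight = self.magnitude * direction
        return F.linear(x, adapted_weight, self.bias)


# Tiny usage example: wrap an existing linear layer; only the LoRA factors
# and the magnitude vector are trainable.
base = nn.Linear(64, 128)
layer = DoRALinear(base, rank=4)
y = layer(torch.randn(2, 64))   # -> shape (2, 128)
```

After fine-tuning, the magnitude and direction can be multiplied back into a single weight matrix, which is why the approach adds no extra cost at inference time.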
Chapters
Introduction
00:00 • 4min
Efficient Fine-Tuning with Weight Decomposition and PEFT Methods
04:09 • 3min
SVD Decomposition and Low-Rank Adaptation in Federated Learning
07:15 • 7min
Introduction of DoRA for Weight-Decomposed Low-Rank Adaptation
13:47 • 7min
DoRA Performance Comparison
20:44 • 14min
Exploring Advancements in Visual Question Answering and Transfer Learning
35:12 • 5min