dEBORA: Efficient Bilevel Optimization-based low-Rank Adaptation
Emanuele Zangrando,
Sara Venturini,
Francesco Rinaldi,
Francesco Tudisco
In: International Conference on Learning Representations (ICLR), 2025
Abstract
Low-rank adaptation methods are a popular approach for parameter-efficient fine-tuning of large-scale neural networks. However, selecting the optimal rank for each layer remains a challenging problem that significantly affects both performance and efficiency. In this paper, we introduce a novel bilevel optimization strategy that simultaneously trains both matrix and tensor low-rank adapters, dynamically selecting the optimal rank for each layer. Our method avoids the use of implicit differentiation in the computation of the hypergradient, and integrates a stochastic away-step variant of the Frank-Wolfe algorithm, eliminating the need for projection and providing identifiability guarantees of the optimal rank structure. This results in a highly efficient and cost-effective training scheme that adaptively allocates the parameter budget across the network layers. On top of a detailed theoretical analysis of the method, we present a range of numerical experiments showcasing its effectiveness.
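For readers unfamiliar with low-rank adapters, the sketch below illustrates the generic idea the paper builds on (a LoRA-style matrix adapter), not the paper's bilevel method itself: a frozen weight W is augmented with a trainable update B @ A of rank r, so the number of trainable parameters scales with r rather than with the full weight size. All dimensions and initializations here are illustrative assumptions.

```python
import numpy as np

# Illustrative LoRA-style adapter (not the paper's bilevel scheme):
# the frozen weight W is augmented with a trainable low-rank update
# B @ A of rank r << min(d_out, d_in).
rng = np.random.default_rng(0)
d_out, d_in, r = 512, 512, 8  # illustrative sizes

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero init
                                            # so the adapter starts as a no-op

def adapted_forward(x):
    # Effective weight is W + B @ A, but it is never materialized:
    # the adapter adds only r * (d_in + d_out) trainable parameters.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
full_params = d_out * d_in
adapter_params = r * (d_in + d_out)
fraction = adapter_params / full_params  # here 8 * 1024 / 512**2 = 0.03125
```

Choosing r per layer is exactly the rank-selection problem the paper addresses: a larger r increases capacity but also the parameter budget, and the bilevel strategy allocates this budget across layers automatically.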
Please cite this paper as:
@inproceedings{zangrando2025debora,
title={dEBORA: Efficient Bilevel Optimization-based low-Rank Adaptation},
author={Zangrando, Emanuele and Venturini, Sara and Rinaldi, Francesco and Tudisco, Francesco},
booktitle={International Conference on Learning Representations (ICLR)},
year={2025}
}
Links:
doi
Keywords:
fine-tuning
peft
deep learning
neural networks
low-rank
pruning
compression