
Low-Rank Adaptation Use

Cameron Rohn · Category: frameworks_and_exercises

Apply low-rank adaptation (LoRA) for cheap, narrow-task fine-tuning when you genuinely need to specialize a general model: LoRA freezes the pretrained weights and trains only small low-rank update matrices, cutting trainable parameters and optimizer memory by orders of magnitude compared with full fine-tuning.
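The mechanism can be sketched in a few lines. This is a minimal NumPy illustration, not a training loop: the base weight `W` stays frozen, and the adapter is the product of two trainable low-rank matrices `B` and `A` scaled by `alpha / r`. The dimensions, scaling, and initialization choices here (zero-init `B` so the adapter starts as a no-op) follow common LoRA practice, but the specific numbers are arbitrary.

```python
import numpy as np

# Minimal LoRA sketch: freeze base weight W, train only low-rank A and B.
d_out, d_in, r = 64, 128, 4    # rank r is much smaller than d_out and d_in
alpha = 8                      # LoRA scaling hyperparameter

rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))       # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01    # trainable, small random init
B = np.zeros((d_out, r))                 # trainable, zero init: adapter starts at 0

def lora_forward(x):
    # y = W x + (alpha / r) * B (A x); gradients would flow only to A and B.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))
# With B zero-initialized, the adapted model initially matches the base model.
assert np.allclose(lora_forward(x), W @ x)

# Trainable parameters: r * (d_in + d_out) instead of d_in * d_out.
print(r * (d_in + d_out), "vs", d_in * d_out)
```

Because only `A` and `B` are updated, a per-task adapter is a tiny file that can be swapped in and out of the same base model, which is what makes LoRA attractive for narrow-task specialization.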