Results for "low-rank adaptation"
LoRA
Intermediate
Parameter-efficient fine-tuning (PEFT) method that injects trainable low-rank matrices into existing layers while keeping the pretrained weights frozen, enabling efficient fine-tuning.
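A minimal sketch of the idea behind LoRA (not the official `peft` library API): the pretrained linear layer is frozen and a trainable low-rank product B·A is added on top, so only r·(d_in + d_out) extra parameters are updated.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, d_in: int, d_out: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = nn.Linear(d_in, d_out)  # stands in for the pretrained layer
        for p in self.base.parameters():
            p.requires_grad_(False)          # frozen: no gradients for base weights
        self.A = nn.Parameter(torch.randn(r, d_in) * 0.01)  # down-projection
        self.B = nn.Parameter(torch.zeros(d_out, r))         # up-projection, zero init
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Output = frozen base layer + scaled low-rank correction B(Ax)
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)
```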
Dropout
Intermediate
Randomly zeroing activations during training to reduce co-adaptation and overfitting.
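A small sketch of inverted dropout, the commonly used variant: each activation is zeroed with probability p during training and the survivors are rescaled by 1/(1-p), so no adjustment is needed at inference time.

```python
import torch

def dropout(x: torch.Tensor, p: float = 0.5, training: bool = True) -> torch.Tensor:
    if not training or p == 0.0:
        return x                              # identity at inference time
    mask = (torch.rand_like(x) > p).float()   # keep each activation with prob 1-p
    return x * mask / (1.0 - p)               # rescale survivors to preserve expectation
```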
Learning Rate
Intermediate
Controls the size of parameter updates; too high causes divergence, too low trains slowly or gets stuck in poor minima.
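A toy illustration (all numbers made up) of how the learning rate scales each gradient step, theta ← theta − lr·grad:

```python
import numpy as np

theta = np.array([1.0, -2.0])   # current parameters
grad = np.array([0.4, -0.1])    # gradient of the loss at theta

for lr in (0.01, 0.1, 10.0):
    step = lr * grad
    print(f"lr={lr}: step norm {np.linalg.norm(step):.3f}, new theta {theta - step}")
```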
Inter-Annotator Agreement
Intermediate
Measure of consistency across labelers; low agreement indicates ambiguous tasks or poor guidelines.
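One common agreement statistic is Cohen's kappa, which corrects observed agreement for agreement expected by chance; the sketch below computes it for two annotators on illustrative labels.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    n = len(labels_a)
    # Observed agreement: fraction of items with identical labels
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: product of each annotator's marginal label frequencies
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(labels_a) | set(labels_b))
    return (p_o - p_e) / (1 - p_e)

print(cohens_kappa(["pos", "neg", "pos", "pos"], ["pos", "neg", "neg", "pos"]))  # 0.5
```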
Online Inference
Intermediate
Serving predictions on demand, one request at a time, with low latency.
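A minimal sketch of a per-request serving endpoint, assuming FastAPI as the serving stack and a placeholder scoring function in place of a real model:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Features(BaseModel):
    values: list[float]

def predict(values: list[float]) -> float:
    # Placeholder: a fixed linear scorer stands in for a loaded model
    return sum(0.1 * v for v in values)

@app.post("/predict")
def score(features: Features) -> dict:
    # Each request is scored individually and returned immediately
    return {"prediction": predict(features.values)}
```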
High-Frequency Trading
Intermediate
Algorithmic trading executed at extremely short timescales, requiring ultra-low-latency systems.
Rank
Advanced
Number of linearly independent rows or columns of a matrix; equivalently, the dimension of its column space.
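A quick numerical illustration: in the matrix below, the second row is a multiple of the first, so only two rows are linearly independent and the rank is 2.

```python
import numpy as np

M = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # 2 * row 0, linearly dependent
              [0.0, 1.0, 1.0]])
print(np.linalg.matrix_rank(M))  # -> 2
```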