Results for "shortcut learning"
Deep learning: A branch of ML that uses multi-layer neural networks to learn hierarchical representations, often excelling in vision, speech, and language.
Semi-supervised learning: Training with a small labeled dataset plus a larger unlabeled one, leveraging assumptions such as smoothness or cluster structure.
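One common semi-supervised recipe is self-training. The sketch below is a hypothetical 1-D toy (the threshold classifier, the 2.0 confidence margin, and all data points are made up for illustration): fit on labeled data, pseudo-label confident unlabeled points, then refit on both.

```python
# Self-training sketch on a 1-D toy problem (all values illustrative).
def fit_threshold(points):
    # decision threshold = midpoint between the two class means
    zeros = [x for x, y in points if y == 0]
    ones = [x for x, y in points if y == 1]
    return (sum(zeros) / len(zeros) + sum(ones) / len(ones)) / 2

labeled = [(0.0, 0), (1.0, 0), (9.0, 1), (10.0, 1)]
unlabeled = [0.5, 9.5, 5.2]

t = fit_threshold(labeled)                              # initial threshold
confident = [x for x in unlabeled if abs(x - t) > 2.0]  # margin-based confidence
pseudo = [(x, int(x > t)) for x in confident]           # pseudo-label them
t2 = fit_threshold(labeled + pseudo)                    # refit on the larger set
```

The ambiguous point near the threshold (5.2) is left unlabeled, which is the key to not amplifying early mistakes.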
Multi-task learning: Training one model on multiple tasks simultaneously to improve generalization through shared structure.
Meta-learning: Methods that learn training procedures or initializations so models can adapt quickly to new tasks with little data.
Learning rate: Controls the size of parameter updates; too high and training diverges, too low and training is slow or gets stuck.
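The learning-rate trade-off can be seen in a toy gradient-descent run on f(x) = x², where the update is x ← x − lr·2x (the specific learning rates below are illustrative choices):

```python
# Gradient descent on f(x) = x^2; the learning rate scales each update.
def gradient_descent(lr, steps=50, x0=5.0):
    x = x0
    for _ in range(steps):
        grad = 2 * x       # derivative of x^2
        x = x - lr * grad  # update shrinks x when |1 - 2*lr| < 1
    return x

small = gradient_descent(lr=0.01)  # converges, but slowly
good = gradient_descent(lr=0.1)    # converges quickly toward the minimum at 0
big = gradient_descent(lr=1.1)     # |1 - 2*lr| > 1, so updates diverge
```

With lr = 1.1 each step multiplies x by −1.2, so the iterates oscillate with exploding magnitude.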
In-context (few-shot) learning: Achieving task performance by providing a small number of examples inside the prompt, without weight updates.
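The mechanism is just prompt construction; a hypothetical few-shot translation prompt (the task and example pairs are made up) looks like this:

```python
# Few-shot prompt sketch: demonstrations live inside the prompt itself;
# no gradient step or weight update ever happens.
prompt = (
    "Translate English to French.\n"
    "sea -> mer\n"
    "dog -> chien\n"
    "cat -> "
)
```

The model is expected to continue the pattern established by the in-prompt examples; swapping the examples changes the "learned" task with no retraining.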
Active learning: Selecting the most informative samples to label (e.g., by uncertainty sampling) to reduce labeling cost.
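For a binary classifier, uncertainty sampling queries the point whose predicted probability is closest to 0.5. A minimal sketch (the pool probabilities below are made up):

```python
# Uncertainty sampling: query the unlabeled point the model is least sure about.
def most_uncertain(probs):
    # uncertainty = closeness of P(y=1) to 0.5
    return min(range(len(probs)), key=lambda i: abs(probs[i] - 0.5))

pool_probs = [0.95, 0.52, 0.10, 0.80]      # model's P(y=1) per unlabeled item
query_index = most_uncertain(pool_probs)   # the 0.52 item is queried for a label
```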
Curriculum learning: Ordering training samples from easier to harder to improve convergence or generalization.
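The ordering step is a sort by some difficulty score; here sequence length stands in as an assumed difficulty proxy (real curricula use task-specific scores):

```python
# Curriculum sketch: present short (easy) examples before long (hard) ones.
samples = ["a cat sat", "the quick brown fox jumps over it", "hi", "dogs bark"]
curriculum = sorted(samples, key=len)  # easiest (shortest) first
```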
Federated learning: Training across many devices or silos without centralizing raw data; only model updates are aggregated, never the data itself.
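The standard aggregation step is a size-weighted average of client updates (federated averaging). A minimal sketch with two toy clients (weights and dataset sizes are made up):

```python
# Federated averaging: the server combines client weight vectors, weighted by
# each client's local dataset size; raw data never leaves the client.
def fed_avg(client_weights, client_sizes):
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[j] * n for w, n in zip(client_weights, client_sizes)) / total
        for j in range(dim)
    ]

# two clients: local weights and local dataset sizes (10 vs 30 examples)
global_w = fed_avg([[1.0, 2.0], [3.0, 4.0]], [10, 30])
```

The larger client dominates the average, e.g. the first coordinate is (1·10 + 3·30) / 40 = 2.5.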
Computational learning theory: A theoretical framework analyzing which classes of functions can be learned, how efficiently, and with what guarantees.
PAC learning: A concept class is PAC-learnable if a learner can, with high probability, output an approximately correct hypothesis from a finite number of samples.
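For a finite hypothesis class H in the realizable setting, the standard bound says m ≥ (1/ε)(ln|H| + ln(1/δ)) samples suffice to get error ≤ ε with probability ≥ 1 − δ. A small calculator (the |H|, ε, δ values below are illustrative):

```python
# PAC sample-complexity bound for a finite hypothesis class (realizable case):
# m >= (1/eps) * (ln|H| + ln(1/delta)).
import math

def pac_sample_bound(h_size, eps, delta):
    return math.ceil((math.log(h_size) + math.log(1 / delta)) / eps)

m = pac_sample_bound(h_size=1000, eps=0.1, delta=0.05)
```

Note the bound grows only logarithmically in |H| and 1/δ, but linearly in 1/ε.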
Inverse reinforcement learning: Inferring a reward function from an agent's observed behavior.
Preference learning: Inferring human preferences from feedback and aligning model behavior with them.