Results for "parameter sensitivity"
Ability of a test to correctly identify individuals who have the disease; equivalently, the true positive rate.
Small prompt changes cause large output changes.
Using same parameters across different parts of a model.
A narrow minimum often associated with poorer generalization.
Popular optimizer combining momentum and per-parameter adaptive step sizes via first/second moment estimates.
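As a minimal sketch of the update this entry describes (assuming NumPy; hyperparameter names and default values are the commonly used ones, not taken from this entry):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: EMAs of the gradient (first moment) and squared
    gradient (second moment), with bias correction for the zero init."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad**2       # second-moment estimate
    m_hat = m / (1 - beta1**t)                  # bias-corrected moments
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5
theta, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.05)
print(theta)
```

The per-parameter step size lr / (sqrt(v_hat) + eps) is what makes Adam "adaptive": coordinates with persistently large gradients get smaller effective steps.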
Measures how much information an observable random variable carries about unknown parameters.
Bayesian parameter estimation using the mode of the posterior distribution.
Belief before observing data.
Probability of data given parameters.
Error due to sensitivity to fluctuations in the training dataset.
Plots true positive rate vs false positive rate across thresholds; summarizes separability.
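A small sketch of how those curve points are computed, assuming NumPy (function names here are illustrative, not a standard API):

```python
import numpy as np

def roc_points(scores, labels):
    """TPR and FPR at every threshold, sweeping scores in descending order."""
    order = np.argsort(-scores)
    labels = labels[order]
    tps = np.cumsum(labels)          # true positives above each cutoff
    fps = np.cumsum(1 - labels)      # false positives above each cutoff
    return fps / (1 - labels).sum(), tps / labels.sum()

def auc(fpr, tpr):
    """Area under the ROC curve via the trapezoidal rule, from (0, 0)."""
    fpr = np.concatenate(([0.0], fpr))
    tpr = np.concatenate(([0.0], tpr))
    return np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)

scores = np.array([0.9, 0.8, 0.7, 0.4, 0.3, 0.1])
labels = np.array([1, 1, 0, 1, 0, 0])
fpr, tpr = roc_points(scores, labels)
print(auc(fpr, tpr))  # 8/9: fraction of (positive, negative) pairs ranked correctly
```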
Sensitivity of a function to input perturbations.
A conceptual framework describing error as the sum of systematic error (bias) and sensitivity to data (variance).
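The tradeoff can be seen numerically with a toy Monte Carlo comparison of two mean estimators (the shrinkage factor 0.8 and the seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
true_mu = 0.5

# Over many resampled datasets, compare the sample mean (unbiased,
# higher variance) with a mean shrunk toward zero (biased, lower variance).
plain, shrunk = [], []
for _ in range(5_000):
    sample = rng.normal(true_mu, 1.0, size=10)
    plain.append(sample.mean())
    shrunk.append(0.8 * sample.mean())

plain, shrunk = np.array(plain), np.array(shrunk)
bias2_plain, var_plain = (plain.mean() - true_mu) ** 2, plain.var()
bias2_shrunk, var_shrunk = (shrunk.mean() - true_mu) ** 2, shrunk.var()
print(bias2_plain + var_plain, bias2_shrunk + var_shrunk)
```

Here the shrunken estimator trades a little bias for a larger variance reduction, illustrating the decomposition of expected error into bias squared plus variance.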
Uses an exponential moving average of gradients to speed convergence and reduce oscillation.
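A minimal sketch of that heavy-ball update, assuming NumPy (the coefficient names lr and beta are conventional, not from this entry):

```python
import numpy as np

def momentum_step(theta, grad, velocity, lr=0.05, beta=0.9):
    """Momentum: the velocity accumulates an exponentially decaying sum of
    past gradients, damping oscillation across steep directions."""
    velocity = beta * velocity + grad
    theta = theta - lr * velocity
    return theta, velocity

# Toy usage: minimize f(x) = x^2 starting from x = 5
theta, v = np.array([5.0]), np.zeros(1)
for _ in range(200):
    grad = 2 * theta
    theta, v = momentum_step(theta, grad, v)
print(theta)
```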
Controls the size of parameter updates; set too high, training diverges, set too low, it trains slowly or gets stuck.
Techniques that fine-tune small additional components rather than all weights to reduce compute and storage.
Estimating parameters by maximizing likelihood of observed data.
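For a Gaussian model, the maximization has a closed form, which makes a compact illustration (assuming NumPy; the true parameters 3.0 and 2.0 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=2.0, size=10_000)

# Maximizing the Gaussian log-likelihood over (mu, sigma^2) yields
# the sample mean and the (biased, 1/n) sample variance.
mu_hat = data.mean()
var_hat = ((data - mu_hat) ** 2).mean()
print(mu_hat, var_hat)  # near the true values 3.0 and 4.0
```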
The shape of the loss function over parameter space.
A wide basin often correlated with better generalization.
Updated belief after observing data.
Visualization of optimization landscape.
Methods, such as Adam, that adjust per-parameter learning rates dynamically during training.
Of actual positives, the fraction correctly identified; sensitive to false negatives.
Often more informative than ROC on imbalanced datasets; focuses on positive class performance.
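A small sketch of the two quantities behind a precision-recall curve at a single threshold (the function name and toy labels are illustrative):

```python
def precision_recall(y_true, y_pred):
    """Precision = TP / (TP + FP); recall (sensitivity) = TP / (TP + FN)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fp), tp / (tp + fn)

y_true = [1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0, 0, 0]
p, r = precision_recall(y_true, y_pred)
print(p, r)  # 2/3 and 2/3
```

Neither quantity involves true negatives, which is why these metrics stay informative when negatives dominate the dataset.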
Framework for identifying, measuring, and mitigating model risks.
Categorizing AI applications by impact and regulatory risk.
Matrix of first-order derivatives for vector-valued functions.
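A forward-difference approximation makes the definition concrete (assuming NumPy; the helper name and test function are illustrative):

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian: J[i, j] = d f_i / d x_j."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(f(x))
    J = np.zeros((f0.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        J[:, j] = (np.asarray(f(x + step)) - f0) / eps
    return J

# f(x, y) = (x*y, x + y) has analytic Jacobian [[y, x], [1, 1]]
f = lambda v: np.array([v[0] * v[1], v[0] + v[1]])
J = numerical_jacobian(f, [2.0, 3.0])
print(J)  # approx [[3, 2], [1, 1]]
```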
Alternative formulation providing bounds.
Classifying models by impact level.
AI applied to X-rays, CT, MRI, ultrasound, pathology slides.