Results for "true negative rate"
Specificity (true negative rate): of actual negatives, the fraction correctly identified as negative.
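A minimal sketch of specificity in Python (the counts are illustrative):

```python
def specificity(tn: int, fp: int) -> float:
    """True negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Of 100 actual negatives, 90 were correctly rejected.
print(specificity(tn=90, fp=10))  # 0.9
```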
Learning rate schedule: adjusting the learning rate over the course of training to improve convergence.
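One common schedule is step decay; a toy sketch (the constants are arbitrary):

```python
def step_decay(step: int, base_lr: float = 0.1, drop: float = 0.5, every: int = 30) -> float:
    """Multiply the learning rate by `drop` every `every` steps."""
    return base_lr * (drop ** (step // every))

print(step_decay(0), step_decay(30), step_decay(60))  # 0.1 0.05 0.025
```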
ROC curve: plots true positive rate against false positive rate across decision thresholds; summarizes how well the classifier separates the classes.
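A sketch of how ROC points arise, assuming scores in [0, 1] and binary labels:

```python
def roc_points(scores, labels, thresholds):
    """For each threshold t, classify score >= t as positive and
    record the resulting (false positive rate, true positive rate)."""
    p = sum(labels)          # actual positives
    n = len(labels) - p      # actual negatives
    pts = []
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        pts.append((fp / n, tp / p))
    return pts

scores, labels = [0.9, 0.8, 0.4, 0.3], [1, 1, 0, 0]
print(roc_points(scores, labels, [0.0, 0.5, 1.0]))
# [(1.0, 1.0), (0.0, 1.0), (0.0, 0.0)]
```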
Sensitivity: the ability of a test to correctly detect disease when it is present.
False negative: failure to detect disease that is present.
Learning rate: controls the size of parameter updates; too high and training diverges, too low and it trains slowly or gets stuck.
Warmup: gradually increasing the learning rate at the start of training to avoid early divergence.
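A linear warmup ramp might look like this (constants illustrative):

```python
def warmup_lr(step: int, base_lr: float = 0.1, warmup_steps: int = 100) -> float:
    """Ramp linearly from 0 to base_lr, then hold."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr

print(warmup_lr(0), warmup_lr(50), warmup_lr(500))  # 0.0 0.05 0.1
```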
Gradient descent: an iterative method that updates parameters in the direction of the negative gradient to minimize a loss function.
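The gradient-descent update x ← x − η∇f(x) in a few lines, on a toy quadratic (names are illustrative):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Minimize (x - 3)^2; its gradient is 2 * (x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # converges toward 3.0
```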
Confusion matrix: a table summarizing classification outcomes (true/false positives and negatives), foundational for metrics such as precision, recall, and specificity.
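A binary confusion matrix built from raw labels, as a sketch:

```python
def confusion_matrix(y_true, y_pred):
    """Counts of the four binary outcomes."""
    m = {"tp": 0, "fp": 0, "fn": 0, "tn": 0}
    for t, p in zip(y_true, y_pred):
        if t == 1 and p == 1:
            m["tp"] += 1
        elif t == 0 and p == 1:
            m["fp"] += 1
        elif t == 1 and p == 0:
            m["fn"] += 1
        else:
            m["tn"] += 1
    return m

cm = confusion_matrix([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
print(cm)  # {'tp': 2, 'fp': 1, 'fn': 1, 'tn': 1}
```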
Recall (sensitivity): of actual positives, the fraction correctly identified; sensitive to false negatives.
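Recall from raw counts (counts illustrative):

```python
def recall(tp: int, fn: int) -> float:
    """Sensitivity: TP / (TP + FN)."""
    return tp / (tp + fn)

# Each missed positive (false negative) pulls recall down.
print(recall(tp=8, fn=2))  # 0.8
print(recall(tp=8, fn=4))  # ≈ 0.667
```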
Takeoff speed: the rate at which AI capabilities improve.
Cross-entropy: measures the divergence between the true and predicted probability distributions.
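Cross-entropy for discrete distributions; a sketch (the small `eps` guards against log(0)):

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_i p_i * log(q_i), with p the true distribution."""
    return -sum(pi * math.log(qi + eps) for pi, qi in zip(p, q))

# One-hot target; the model puts 0.7 on the correct class.
print(cross_entropy([1.0, 0.0, 0.0], [0.7, 0.2, 0.1]))  # ≈ -log(0.7) ≈ 0.357
```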
AUC: a scalar summary of the ROC curve; measures ranking ability, not calibration.
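AUC equals the probability that a randomly chosen positive is ranked above a randomly chosen negative; a brute-force sketch:

```python
def auc(scores, labels):
    """AUC as the probability that a random positive outranks a random
    negative (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc([0.9, 0.8, 0.4, 0.3], [1, 1, 0, 0]))  # 1.0: perfect ranking
```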
Perplexity: the exponential of the average negative log-likelihood; lower means a better predictive fit, though not necessarily better downstream utility.
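Perplexity from per-token probabilities, as a sketch:

```python
import math

def perplexity(token_probs):
    """exp of the average negative log-likelihood of the observed tokens."""
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

# A model that assigns probability 0.25 to every token has perplexity 4.
print(perplexity([0.25, 0.25, 0.25]))  # ≈ 4.0
```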
Alpha: investment returns above a benchmark.
AI adoption: organizational uptake of AI technologies.
Adaptive learning rate: methods such as Adam that adjust per-parameter learning rates dynamically during training.
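A scalar sketch of an Adam-style update with the standard defaults; this is an illustration, not a full optimizer:

```python
import math

def adam_step(grad, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One bias-corrected Adam update for a scalar parameter.
    Returns (delta, m, v); t is the 1-based step count."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad * grad
    m_hat = m / (1 - b1 ** t)   # bias correction for the first moment
    v_hat = v / (1 - b2 ** t)   # bias correction for the second moment
    return -lr * m_hat / (math.sqrt(v_hat) + eps), m, v

d, m, v = adam_step(grad=2.0, m=0.0, v=0.0, t=1)
print(d)  # ≈ -0.1: early steps move by about lr, regardless of gradient scale
```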
Throughput: the maximum rate at which a system can process work.
Simpson's paradox: a trend that reverses when data from distinct subgroups is aggregated improperly.
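The widely cited kidney-stone figures illustrate the reversal (success counts per subgroup):

```python
# Treatment A wins in both subgroups yet loses overall.
a = {"small": (81, 87), "large": (192, 263)}   # (successes, patients)
b = {"small": (234, 270), "large": (55, 80)}

def rate(s, n):
    return s / n

for group in ("small", "large"):
    assert rate(*a[group]) > rate(*b[group])   # A better within each group

a_total = rate(sum(s for s, _ in a.values()), sum(n for _, n in a.values()))
b_total = rate(sum(s for s, _ in b.values()), sum(n for _, n in b.values()))
print(a_total < b_total)  # True: the aggregate trend reverses
```

The reversal happens because the harder "large" cases were assigned disproportionately to treatment A.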
Spurious correlation: the model relies on irrelevant signals that happen to correlate with the target.
Precision: of predicted positives, the fraction that are truly positive; sensitive to false positives.
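Precision from raw counts (counts illustrative):

```python
def precision(tp: int, fp: int) -> float:
    """TP / (TP + FP)."""
    return tp / (tp + fp)

# Each false alarm (false positive) pulls precision down.
print(precision(tp=8, fp=2))  # 0.8
print(precision(tp=8, fp=8))  # 0.5
```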
Precision-recall curve: often more informative than ROC on imbalanced datasets; focuses on positive-class performance.
Calibration: the degree to which predicted probabilities match observed frequencies (e.g., predictions made with confidence 0.8 should be correct about 80% of the time).
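A binned calibration check, as a sketch (bin count and data are illustrative):

```python
def bin_calibration(probs, labels, n_bins=5):
    """Bucket predictions by confidence and compare the mean predicted
    probability with the observed positive rate in each bucket."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        i = min(int(p * n_bins), n_bins - 1)
        bins[i].append((p, y))
    report = []
    for b in bins:
        if b:
            mean_p = sum(p for p, _ in b) / len(b)
            pos_rate = sum(y for _, y in b) / len(b)
            report.append((mean_p, pos_rate))
    return report

# Well calibrated: of five predictions at confidence 0.8, four are positive.
print(bin_calibration([0.8] * 5, [1, 1, 1, 1, 0]))  # ≈ [(0.8, 0.8)]
```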
Miscalibration: predicted probabilities do not reflect true correctness rates.
Bias: systematic error introduced by the simplifying assumptions of a learning algorithm.
ReLU: the activation max(0, x); improves gradient flow and training speed in deep networks.
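The ReLU definition is one line:

```python
def relu(x: float) -> float:
    """max(0, x): identity for positives, zero for negatives."""
    return max(0.0, x)

print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5]])  # [0.0, 0.0, 0.0, 1.5]
```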
KL divergence: measures how one probability distribution diverges from another.
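KL divergence for discrete distributions, as a sketch; note that it is asymmetric:

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p, q = [0.5, 0.5], [0.9, 0.1]
print(kl_divergence(p, q))  # > 0 whenever p != q
print(kl_divergence(p, p))  # 0.0: a distribution never diverges from itself
```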
Mutual information: quantifies the information shared between two random variables.
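Mutual information from a discrete joint distribution, as a sketch (the joint table format is illustrative):

```python
import math

def mutual_information(joint):
    """I(X; Y) = sum_xy p(x,y) * log(p(x,y) / (p(x) p(y))) for a joint
    distribution given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent variables share no information.
indep = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(mutual_information(indep))  # 0.0
```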
Saddle point: a point where the gradient is zero but which is neither a maximum nor a minimum; common in deep-network loss surfaces.
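The textbook saddle f(x, y) = x² − y² makes this concrete:

```python
def f(x, y):
    """The classic saddle surface x^2 - y^2."""
    return x * x - y * y

def grad(x, y):
    return (2 * x, -2 * y)

print(grad(0.0, 0.0))                 # zero gradient at the origin
print(f(0.1, 0.0) > 0 > f(0.0, 0.1))  # True: up along x, down along y
```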
Hessian: the matrix of second derivatives, describing the local curvature of the loss.
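A 1-D analogue of a Hessian entry via central finite differences, as a sketch:

```python
def second_derivative(f, x, h=1e-5):
    """Central finite difference: (f(x+h) - 2 f(x) + f(x-h)) / h^2.
    Positive values mean locally convex (upward curvature)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

print(second_derivative(lambda x: x * x, 1.0))  # ≈ 2.0
```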