Results for "expectation error"
Average value of a random variable under a probability distribution.
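This entry appears to describe the expected value. A minimal sketch for the discrete case (function name hypothetical):

```python
# Expected value of a discrete random variable: sum of value * probability.
def expectation(values, probs):
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(v * p for v, p in zip(values, probs))

# Fair six-sided die: E[X] = (1 + 2 + ... + 6) / 6 = 3.5
die_mean = expectation([1, 2, 3, 4, 5, 6], [1 / 6] * 6)
```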
A conceptual framework describing error as the sum of systematic error (bias) and sensitivity to data (variance).
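This matches the bias-variance decomposition. A Monte Carlo sketch (estimator and parameters chosen for illustration): a shrunken sample mean trades increased bias for reduced variance.

```python
import random

# Monte Carlo illustration of bias^2 vs. variance for a toy estimator:
# the shrunken sample mean c * x_bar of a Gaussian with known true mean.
def simulate(c, true_mean=2.0, n=10, trials=20000, seed=0):
    rng = random.Random(seed)
    estimates = [
        c * sum(rng.gauss(true_mean, 1.0) for _ in range(n)) / n
        for _ in range(trials)
    ]
    avg = sum(estimates) / trials
    bias_sq = (avg - true_mean) ** 2                       # systematic error
    var = sum((e - avg) ** 2 for e in estimates) / trials  # data sensitivity
    return bias_sq, var

b1, v1 = simulate(c=1.0)  # unbiased, higher variance
b2, v2 = simulate(c=0.8)  # shrunk: biased, but lower variance
```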
Measures how much information an observable random variable carries about unknown parameters.
Probabilistic model for sequential data with latent states.
Sampling from an easier proposal distribution and reweighting the samples to recover expectations under the target distribution.
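This describes importance sampling. A minimal sketch (distributions chosen for illustration): estimating the mean of an Exponential(1) target using draws from an Exponential(0.5) proposal.

```python
import random, math

# Importance sampling: estimate E_p[f(X)] from draws x ~ q by weighting
# each sample with the density ratio p(x) / q(x).
def importance_estimate(f, p_pdf, q_pdf, q_sample, n, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = q_sample(rng)
        total += f(x) * p_pdf(x) / q_pdf(x)
    return total / n

# Target p: Exponential(rate=1), so E_p[X] = 1. Proposal q: Exponential(rate=0.5).
p = lambda x: math.exp(-x)
q = lambda x: 0.5 * math.exp(-0.5 * x)
est = importance_estimate(lambda x: x, p, q,
                          lambda rng: rng.expovariate(0.5), n=200_000)
```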
Applying patterns learned in one setting to cases where they do not hold.
Systematic error introduced by simplifying assumptions in a learning algorithm.
Using a system's output to adjust its future inputs.
Measures a hypothesis class's ability to fit random noise; used to bound generalization error.
A function measuring prediction error (and sometimes calibration), guiding gradient-based optimization.
When a model cannot capture underlying structure, performing poorly on both training and test data.
Average of squared residuals; common regression objective.
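A minimal sketch of mean squared error (values chosen for illustration):

```python
# Mean squared error: average of squared residuals between predictions
# and targets.
def mse(y_pred, y_true):
    return sum((p - t) ** 2 for p, t in zip(y_pred, y_true)) / len(y_true)

# Residuals 0.5, 0.5, 0.0 -> MSE = (0.25 + 0.25 + 0) / 3
err = mse([2.5, 0.0, 2.0], [3.0, -0.5, 2.0])
```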
Error due to sensitivity to fluctuations in the training dataset.
Model that compresses its input into a latent space and then reconstructs it.
Predicting future values from past observations.
Classical controller balancing responsiveness and stability.
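This appears to describe a PID controller. A minimal discrete-time sketch against a toy integrator plant (gains and plant are illustrative, not tuned for any real system):

```python
# Discrete PID loop: proportional term for responsiveness, integral term
# to remove steady-state error, derivative term for damping.
def run_pid(setpoint, kp, ki, kd, steps, dt=0.1):
    x, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - x
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        prev_err = err
        x += u * dt  # toy first-order plant: state integrates the control signal
    return x

final = run_pid(setpoint=1.0, kp=2.0, ki=0.5, kd=0.1, steps=200)
```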
Learning by minimizing prediction error.
A learning paradigm where an agent interacts with an environment and learns to choose actions to maximize cumulative reward.
A scalar measure optimized during training, typically expected loss over data, sometimes with regularization terms.
A proper scoring rule measuring squared error of predicted probabilities for binary outcomes.
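This matches the Brier score. A minimal sketch (values chosen for illustration):

```python
# Brier score: mean squared difference between predicted probabilities
# and the binary outcomes (0 or 1); lower is better.
def brier(probs, outcomes):
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# (0.01 + 0.04 + 0.09) / 3
score = brier([0.9, 0.2, 0.7], [1, 0, 1])
```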
Controls the size of parameter updates; set too high, training diverges; set too low, training converges slowly or gets stuck.
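This describes the learning rate. A minimal sketch on f(x) = x^2, where gradient descent with step size above 1.0 provably diverges (values chosen for illustration):

```python
# Gradient descent on f(x) = x^2 (gradient 2x). The update is
# x <- x * (1 - 2 * lr), so |1 - 2 * lr| > 1 means divergence.
def descend(lr, x0=1.0, steps=50):
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x
    return abs(x)

slow = descend(lr=0.01)  # converges, but slowly
good = descend(lr=0.4)   # converges quickly
bad  = descend(lr=1.1)   # diverges: |x| grows every step
```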
Reducing numeric precision of weights/activations to speed inference and reduce memory with acceptable accuracy loss.
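A minimal sketch of symmetric int8 quantization (the scheme and names are illustrative; real toolchains also handle zero points, per-channel scales, and calibration):

```python
# Symmetric int8 quantization: map floats in [-max|w|, max|w|] onto
# integers in [-127, 127], then dequantize to inspect the rounding error.
def quantize(weights):
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [qi * scale for qi in q]

w = [0.5, -1.0, 0.25, 0.9]
q, scale = quantize(w)
w_hat = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(w, w_hat))  # bounded by scale / 2
```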
Inputs crafted to cause model errors or unsafe behavior, often imperceptible in vision or subtle in text.
A concept class is PAC-learnable if a learner can, with high probability, output an approximately correct hypothesis from finitely many samples.
Converting audio speech into text, often using encoder-decoder or transducer architectures.
Models evaluating and improving their own outputs.
Diffusion model trained to remove noise step by step.
Simultaneous Localization and Mapping for robotics.
Two-network setup in which a generator learns to fool a discriminator.
Recovering 3D structure from images.