Results for "linear Gaussian"
Optimal state estimator for linear dynamical systems with Gaussian noise.
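This reads as a description of the Kalman filter. A minimal scalar predict/update sketch; all model parameters (A, Q, H, R) here are illustrative, not from the source:

```python
import numpy as np

def kalman_step(x, P, z, A=1.0, Q=0.01, H=1.0, R=0.1):
    """One predict/update cycle of a scalar Kalman filter."""
    # Predict: propagate the state estimate and its variance through the dynamics.
    x_pred = A * x
    P_pred = A * P * A + Q
    # Update: blend the prediction with the measurement z via the Kalman gain.
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1 - K * H) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0
for z in [1.0, 1.1, 0.9, 1.0]:
    x, P = kalman_step(x, P, z)
```

With repeated measurements near 1.0, the estimate converges toward 1.0 and the posterior variance P shrinks.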
Inferring the agent’s internal state from noisy sensor data.
Mathematical foundation for ML involving vector spaces, matrices, and linear transformations.
Diffusion model trained to remove noise step by step.
Nonzero vector whose direction is unchanged (only scaled) by a linear transformation.
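A quick numerical check of the defining property, A v = λ v, using an illustrative matrix:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
vals, vecs = np.linalg.eig(A)   # eigenvalues and eigenvectors (as columns)
v = vecs[:, 0]                  # eigenvector paired with vals[0]
# Applying A only scales v by its eigenvalue; the direction is unchanged.
assert np.allclose(A @ v, vals[0] * v)
```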
Covariance normalized by the product of the two standard deviations, yielding a value in [-1, 1].
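A small sketch of the normalization, checked against numpy's built-in correlation (the data here is made up):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 8.0])
# Correlation = covariance divided by the product of the standard deviations.
r = np.cov(x, y)[0, 1] / (np.std(x, ddof=1) * np.std(y, ddof=1))
```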
Optimal control for linear systems with quadratic cost.
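This describes the linear-quadratic regulator (LQR). A minimal discrete-time sketch that finds the optimal feedback gain by iterating the Riccati recursion to a fixed point; the double-integrator system and cost weights are illustrative:

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500):
    """Iterate the discrete-time Riccati recursion, then return the
    optimal state-feedback gain K for the control law u = -K x."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # double-integrator dynamics
B = np.array([[0.0],
              [1.0]])
K = dlqr_gain(A, B, Q=np.eye(2), R=np.array([[1.0]]))
```

The resulting closed loop A - B K is stable: all its eigenvalues lie inside the unit circle.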
Controls amount of noise added at each diffusion step.
Monte Carlo method for state estimation.
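This describes the particle filter. A minimal bootstrap variant for a 1-D state; the Gaussian dynamics and sensor model, and all noise scales, are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, z, obs_std=0.5, proc_std=0.1):
    """One bootstrap-particle-filter step: propagate, weight, resample."""
    # Propagate each particle through the (noisy) dynamics.
    particles = particles + rng.normal(0.0, proc_std, size=particles.shape)
    # Weight by the likelihood of observation z under a Gaussian sensor model.
    w = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
    w /= w.sum()
    # Resample particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

particles = rng.normal(0.0, 2.0, size=1000)   # broad prior over the state
for z in [1.0, 1.0, 1.0]:
    particles = particle_filter_step(particles, z)
```

After a few observations near 1.0, the particle cloud concentrates around the true state.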
Number of linearly independent rows or columns.
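A quick illustration with a matrix containing a dependent row:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # twice the first row: not independent
              [0.0, 1.0, 1.0]])
r = np.linalg.matrix_rank(A)     # only 2 linearly independent rows
```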
Estimating parameters by maximizing likelihood of observed data.
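For a Gaussian, maximizing the likelihood has a closed form; a tiny sketch with made-up data:

```python
import numpy as np

data = np.array([2.0, 2.5, 3.0, 3.5, 4.0])
# The Gaussian likelihood is maximized by the sample mean and the
# (biased, divide-by-N) sample variance.
mu_hat = data.mean()
var_hat = ((data - mu_hat) ** 2).mean()
```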
Autoencoder using probabilistic latent variables and KL regularization.
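The KL regularizer mentioned here has a closed form when the latent posterior is a diagonal Gaussian and the prior is standard normal; a minimal sketch:

```python
import numpy as np

def kl_to_standard_normal(mu, logvar):
    """Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ),
    the regularization term in the VAE objective."""
    return 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar)

# When the posterior already equals the prior, the KL term is zero.
kl = kl_to_standard_normal(np.zeros(4), np.zeros(4))
```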
Generative model that learns to reverse a gradual noise process.
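The gradual noise process can be sketched as the forward step of a diffusion model: sampling x_t directly from x_0 under a cumulative noise schedule. The linear beta schedule and step count below are illustrative, not from the source:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
betas = np.linspace(1e-4, 0.02, T)       # illustrative linear noise schedule
alphas_bar = np.cumprod(1.0 - betas)     # cumulative signal-retention factor

def q_sample(x0, t):
    """Sample x_t ~ N(sqrt(a_bar_t) * x0, (1 - a_bar_t) * I)."""
    eps = rng.normal(size=x0.shape)
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * eps

x0 = np.ones(10_000)
xT = q_sample(x0, T - 1)   # by the final step, nearly pure noise
```

The generative model then learns to reverse this process step by step, starting from noise.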
Changing speaker characteristics while preserving content.
Predicts next state given current state and action.
AI applied to scientific problems.
Fast approximation of costly simulations.
A formal privacy framework ensuring outputs do not reveal much about any single individual’s data contribution.
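One standard way to achieve this guarantee is the Laplace mechanism: add noise scaled to the query's sensitivity divided by the privacy budget epsilon. The count, sensitivity, and epsilon below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value plus Laplace(sensitivity / epsilon) noise,
    giving epsilon-differential privacy for a query with that sensitivity."""
    return true_value + rng.laplace(0.0, sensitivity / epsilon)

# Counting query: any single individual changes the count by at most 1.
private_count = laplace_mechanism(true_value=42, sensitivity=1.0, epsilon=0.5)
```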
A subfield of AI where models learn patterns from data to make predictions or decisions, improving with experience rather than explicit rule-coding.
Activation max(0, x); improves gradient flow and training speed in deep nets.
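The definition in one line of numpy:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: elementwise max(0, x)."""
    return np.maximum(0.0, x)

out = relu(np.array([-2.0, -0.5, 0.0, 1.5]))   # negatives clamp to zero
```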
Systematic error introduced by simplifying assumptions in a learning algorithm.
Set of vectors closed under addition and scalar multiplication.
Control that remains stable under model uncertainty.
Incremental capability growth.
A branch of ML using multi-layer neural networks to learn hierarchical representations, often excelling in vision, speech, and language.
Learning a function from input-output pairs (labeled data), optimizing performance on predicting outputs for unseen inputs.
A parameterized mapping from inputs to outputs; includes architecture + learned parameters.
Networks using convolution operations with weight sharing and locality, effective for images and signals.
Nonlinear functions enabling networks to approximate complex mappings; ReLU variants dominate modern DL.
Local surrogate explanation method approximating model behavior near a specific input.
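This describes the idea behind LIME-style explanations. A minimal sketch: sample points near the input, query the model, and fit a proximity-weighted linear surrogate whose coefficients act as local feature importances. The `black_box` function and all settings are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def black_box(X):
    """Stand-in model (hypothetical): nonlinear in its two inputs."""
    return np.sin(X[:, 0]) + X[:, 1] ** 2

def local_surrogate(x0, n=500, scale=0.1):
    """Fit a weighted linear model to black_box near x0 and return
    its slopes, i.e. local feature importances."""
    X = x0 + rng.normal(0.0, scale, size=(n, len(x0)))
    y = black_box(X)
    # Weight samples by proximity to x0 (Gaussian kernel).
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * scale ** 2))
    # Weighted least squares with an intercept column.
    A = np.hstack([np.ones((n, 1)), X - x0])
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)
    return coef[1:]   # drop the intercept, keep per-feature slopes

grads = local_surrogate(np.array([0.0, 1.0]))
```

At (0, 1) the surrogate's slopes approximate the model's local gradient (cos(0) = 1 and 2·1 = 2).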