Results for "sampling"
Sampling
Intermediate
Stochastic generation strategies that trade determinism for diversity; key knobs include temperature and nucleus sampling.
Sampling is like choosing a flavor of ice cream from a menu: instead of always picking the most popular flavor, you might want to try something new and different. In AI, sampling generates text or sequences by randomly selecting from a range of possible options rather than always taking the single most likely one. There are different ways to sample, such as top-k and nucleus (top-p) sampling.
Top-k sampling: Samples from the k highest-probability tokens to limit unlikely outputs.
Top-p (nucleus) sampling: Samples from the smallest set of tokens whose probabilities sum to p, adapting the set size to context.
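The two truncation rules above (top-k and top-p) can be sketched in a few lines of NumPy. The toy probability vector and function names here are illustrative, not from any particular library:

```python
import numpy as np

def top_k_filter(probs, k):
    """Keep only the k highest-probability tokens, zero the rest, renormalize."""
    keep = np.argsort(probs)[-k:]        # indices of the k largest probabilities
    out = np.zeros_like(probs)
    out[keep] = probs[keep]
    return out / out.sum()

def top_p_filter(probs, p):
    """Keep the smallest set of tokens whose cumulative probability reaches p."""
    order = np.argsort(probs)[::-1]      # most probable first
    cumulative = np.cumsum(probs[order])
    # Number of tokens needed to reach mass p (always at least one).
    cutoff = np.searchsorted(cumulative, p) + 1
    out = np.zeros_like(probs)
    out[order[:cutoff]] = probs[order[:cutoff]]
    return out / out.sum()

probs = np.array([0.4, 0.3, 0.15, 0.1, 0.05])
print(top_k_filter(probs, 2))    # mass only on the two most likely tokens
print(top_p_filter(probs, 0.8))  # smallest prefix covering at least 80% of mass
```

Note that top-k keeps a fixed number of tokens, while top-p keeps however many the context requires, which is why nucleus sampling adapts better to both peaked and flat distributions.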
Importance sampling: Samples from an easier proposal distribution and reweights the draws to correct for the mismatch.
Monte Carlo estimation: Approximates expectations via random sampling.
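The two entries above combine naturally in a toy Monte Carlo estimate: draw from a wider Gaussian proposal and reweight by the density ratio to recover an expectation under a standard normal. The distributions and names here are made up for the sketch:

```python
import math
import random

random.seed(0)

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of a normal distribution, used for the importance weights."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Goal: estimate E_p[x^2] where p = N(0, 1), but sample from the easier,
# wider proposal q = N(0, 2) and reweight each draw by p(x) / q(x).
n = 100_000
total = 0.0
for _ in range(n):
    x = random.gauss(0.0, 2.0)                    # sample from proposal q
    w = normal_pdf(x) / normal_pdf(x, 0.0, 2.0)   # importance weight p(x)/q(x)
    total += w * x * x
estimate = total / n
print(round(estimate, 2))  # ≈ 1.0, the variance of a standard normal
```

Without the weights, the raw average of x² under the proposal would be about 4 (the proposal's variance); the reweighting is what recovers the target expectation.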
Sampling-based motion planning: Planners (such as RRT or PRM) that find paths by randomly sampling the robot's configuration space.
Active learning: Selects the most informative samples to label (e.g., via uncertainty sampling) to reduce labeling cost.
Temperature: Scales logits before sampling; higher values increase randomness and diversity, lower values increase determinism.
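Temperature scaling as described above is a one-line change to the softmax. A minimal sketch, with the function name and toy logits chosen for illustration:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by T before softmax: T > 1 flattens, T < 1 sharpens."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                        # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.0]
print(softmax_with_temperature(logits, 1.0))  # baseline distribution
print(softmax_with_temperature(logits, 0.5))  # sharper: closer to greedy
print(softmax_with_temperature(logits, 2.0))  # flatter: more diverse samples
```

In the limit T → 0 this approaches greedy decoding (all mass on the top token); as T grows it approaches a uniform distribution.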
Off-policy learning: Learns from data generated by a policy other than the one being improved.
Self-consistency: Samples multiple outputs and selects the consensus answer.
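The consensus idea above reduces to a majority vote over sampled answers. The sampled strings here are hypothetical stand-ins for repeated stochastic decodings of one prompt:

```python
from collections import Counter

def self_consistency(answers):
    """Return the most common answer among several sampled outputs."""
    return Counter(answers).most_common(1)[0][0]

# Hypothetical answers sampled from the same prompt at nonzero temperature.
samples = ["42", "41", "42", "42", "40"]
print(self_consistency(samples))  # → "42"
```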
Latent space: The internal space where learned representations live; operations here often correlate with semantics or generative factors.
Bias: Systematic differences in model outcomes across groups; arises from data, labels, and deployment context.
Synthetic data: Artificially created data used to train and test models; helpful for privacy and coverage, risky if unrealistic.
Beam search: A decoding algorithm that keeps the top-k partial sequences at each step; can improve likelihood but reduce diversity.
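Beam search as defined above can be sketched on a toy model whose per-step token probabilities are fixed tables, a simplifying assumption (real models condition on the prefix); the vocabulary and probabilities are invented:

```python
import math

# Toy "model": per-step log-probabilities over a 3-token vocabulary,
# independent of history (purely illustrative assumption).
STEP_LOGPROBS = [
    {"a": math.log(0.5), "b": math.log(0.3), "c": math.log(0.2)},
    {"a": math.log(0.2), "b": math.log(0.5), "c": math.log(0.3)},
]

def beam_search(beam_width):
    beams = [([], 0.0)]  # (token sequence, cumulative log-probability)
    for step_lp in STEP_LOGPROBS:
        # Expand every surviving partial sequence by every token.
        candidates = [
            (seq + [tok], score + lp)
            for seq, score in beams
            for tok, lp in step_lp.items()
        ]
        # Keep only the beam_width highest-scoring partial sequences.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams

for seq, score in beam_search(beam_width=2):
    print("".join(seq), round(math.exp(score), 2))
```

With beam_width=1 this degenerates to greedy decoding; wider beams trade compute for a better chance of finding a high-likelihood sequence, at the cost of the diversity that sampling would give.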
Datasheets for datasets: Structured dataset documentation covering collection, composition, recommended uses, biases, and maintenance.
Logits: Raw model outputs before conversion to probabilities; manipulated during decoding and calibration.
Gradient noise: Variability introduced by minibatch sampling during SGD.
Exploration vs. exploitation: Balancing learning new behaviors against exploiting known rewards.
Generative models: Models that learn to generate samples resembling their training data.
Score-based models: Learn the score function (∇ log p(x)) to drive generative sampling.
Diffusion models: Generative models that learn to reverse a gradual noising process.
Variational autoencoder (VAE): An autoencoder with probabilistic latent variables and KL regularization.
Normalizing flows: Exact-likelihood generative models built from invertible transforms.
Domain randomization: Randomizes simulation parameters to improve real-world transfer.
Random variable: A variable whose values depend on chance.
Configuration space: The space of all possible robot configurations.
Performance disparity: Unequal performance across demographic groups.
Legal document review: AI-assisted review of legal documents.