Results for "temperature"
Temperature
Intermediate
Temperature in AI is like adjusting the heat when cooking. If you turn up the heat, things get more mixed up and unpredictable, leading to more exciting flavors. In text generation, a higher temperature means the model will take more risks and produce varied outputs, while a lower temperature makes the outputs more focused and predictable.
Scales logits before sampling; higher increases randomness/diversity, lower increases determinism.
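As a sketch, temperature scaling divides the logits by T before the softmax; T > 1 flattens the distribution and T < 1 sharpens it. The function name `apply_temperature` and the example logits below are illustrative, not from any particular library:

```python
import math

def apply_temperature(logits, temperature):
    """Return softmax probabilities of logits scaled by 1/temperature."""
    scaled = [l / temperature for l in logits]
    # Subtract the max before exponentiating for numerical stability.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
print(apply_temperature(logits, 0.5))  # sharper: top token dominates
print(apply_temperature(logits, 2.0))  # flatter: probabilities more even
```

At T = 1 this reduces to a plain softmax, which is why T = 1 is the usual default.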
Sampling
Stochastic generation strategies that trade determinism for diversity; key knobs include temperature and nucleus (top-p) sampling.
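Nucleus (top-p) sampling keeps the smallest set of top tokens whose cumulative probability reaches p, then samples from that set. A minimal sketch, assuming `probs` is already a normalized distribution (the function name `top_p_sample` is illustrative):

```python
import random

def top_p_sample(probs, p=0.9, rng=random):
    """Sample a token index from the smallest prefix of high-probability
    tokens whose cumulative probability is at least p."""
    # Token indices sorted by descending probability.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    cumulative, nucleus = 0.0, []
    for i in order:
        nucleus.append(i)
        cumulative += probs[i]
        if cumulative >= p:
            break
    # Renormalize over the nucleus and draw one index.
    total = sum(probs[i] for i in nucleus)
    r = rng.random() * total
    for i in nucleus:
        r -= probs[i]
        if r <= 0:
            return i
    return nucleus[-1]
```

For example, with `probs = [0.6, 0.3, 0.1]` and `p = 0.8`, the nucleus is the first two tokens, so the 0.1-probability tail token can never be drawn.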
Feature
A measurable property or attribute used as model input (raw or engineered), such as age, pixel intensity, or token ID.
Top-k sampling
Samples from only the k highest-probability tokens, truncating the long tail of unlikely outputs.
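A minimal sketch of top-k sampling over raw logits: keep the k highest-logit tokens, apply a softmax over just those, and sample. The function name `top_k_sample` is illustrative:

```python
import math
import random

def top_k_sample(logits, k, rng=random):
    """Sample a token index from the k highest-logit tokens."""
    # Indices of the k largest logits.
    order = sorted(range(len(logits)), key=lambda i: -logits[i])[:k]
    # Softmax restricted to those k tokens (max-subtracted for stability).
    m = max(logits[i] for i in order)
    exps = [math.exp(logits[i] - m) for i in order]
    total = sum(exps)
    r = rng.random() * total
    for idx, e in zip(order, exps):
        r -= e
        if r <= 0:
            return idx
    return order[-1]
```

With `k = 1` this degenerates to greedy decoding; larger k admits progressively less likely tokens.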
PID controller
Classical feedback controller that balances responsiveness and stability by combining proportional, integral, and derivative terms.