Temperature

Intermediate

Scales logits before sampling; higher values increase randomness and diversity, lower values increase determinism.


Why It Matters

Temperature is a key factor in controlling the creativity and coherence of AI-generated content. By adjusting this parameter, developers can fine-tune models for specific applications, whether they need more diverse outputs for creative writing or more precise responses for technical tasks.

Temperature is a hyperparameter used in the sampling process of probabilistic models, particularly in natural language generation. It controls the randomness of predictions by scaling the logits before applying the softmax function. Mathematically, the temperature T modifies the logits z as follows: z' = z / T, where z' are the adjusted logits. A higher temperature (T > 1) results in a flatter probability distribution, increasing randomness and diversity in the generated outputs, while a lower temperature (T < 1) sharpens the distribution, favoring the most likely outcomes and reducing variability.

The choice of temperature is critical in balancing the trade-off between creativity and coherence in generated sequences, impacting the overall quality of the model's outputs and its applicability in various contexts.
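The scaling described above can be sketched in a few lines of NumPy. This is a minimal illustration, not any particular library's API; the function name `sample_with_temperature` is hypothetical.

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from temperature-scaled logits.

    Illustrative sketch: scales logits as z' = z / T, applies
    softmax, then draws one index from the resulting distribution.
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=np.float64) / temperature  # z' = z / T
    scaled -= scaled.max()  # subtract max for numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()  # softmax
    return rng.integers(0, 1) * 0 + rng.choice(len(probs), p=probs), probs

# Same logits, two temperatures (example values, chosen arbitrarily)
logits = [2.0, 1.0, 0.1]
_, p_low = sample_with_temperature(logits, temperature=0.5)   # sharper
_, p_high = sample_with_temperature(logits, temperature=2.0)  # flatter
```

With T = 0.5 the most likely token gets an even larger share of the probability mass than with T = 2.0, which spreads mass more evenly across all tokens, matching the flatter-versus-sharper behavior described above.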

