Results for "data → model"
Knowledge distillation: training a smaller “student” model to mimic a larger “teacher,” often improving efficiency while retaining performance.
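A minimal NumPy sketch of the usual distillation loss, the temperature-scaled KL divergence between teacher and student output distributions (function names and the temperature value are illustrative):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = z / T
    e = np.exp(z - np.max(z))
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # as in the standard distillation formulation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is combined with the ordinary cross-entropy on the hard labels.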
Early stopping: halting training when validation performance stops improving, to reduce overfitting.
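The patience-based variant of early stopping can be sketched as a small stateful helper (class and attribute names are illustrative):

```python
class EarlyStopping:
    # Halt when validation loss has not improved for `patience`
    # consecutive checks; `min_delta` sets the improvement threshold.
    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.bad_checks = 0

    def step(self, val_loss):
        # Call once per validation run; returns True when training should stop.
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_checks = 0
        else:
            self.bad_checks += 1
        return self.bad_checks >= self.patience
```

A typical loop checks `step()` after each epoch and also restores the weights from the best checkpoint.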
Stochastic gradient descent (SGD): a gradient method using random minibatches for efficient training on large datasets.
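A self-contained sketch of minibatch SGD on a toy linear-regression problem (all data and hyperparameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)

w = np.zeros(3)
lr, batch = 0.1, 32
for epoch in range(50):
    idx = rng.permutation(len(X))          # reshuffle each epoch
    for start in range(0, len(X), batch):
        b = idx[start:start + batch]
        # Gradient of mean squared error on the minibatch only.
        grad = 2.0 * X[b].T @ (X[b] @ w - y[b]) / len(b)
        w -= lr * grad
```

Each update uses a random minibatch rather than the full dataset, which is what makes the method scale to large data.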
In-context (few-shot) learning: achieving task performance by providing a small number of examples inside the prompt, without weight updates.
Activation functions: nonlinear functions enabling networks to approximate complex mappings; ReLU variants dominate modern deep learning.
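Three common activations sketched in NumPy; the GELU here uses the widely used tanh approximation rather than the exact Gaussian CDF form:

```python
import numpy as np

def relu(x):
    # Zero for negative inputs, identity for positive ones.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small negative slope avoids fully "dead" units.
    return np.where(x > 0, x, alpha * x)

def gelu(x):
    # tanh approximation of GELU, common in transformer implementations.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))
```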
Curriculum learning: ordering training samples from easier to harder to improve convergence or generalization.
Latency: time from request to response; critical for real-time inference and user experience.
Throughput: the number of requests or tokens processed per unit time; affects scalability and cost.
Automatic speech recognition (ASR): converting spoken audio into text, often using encoder-decoder or transducer architectures.
Bottleneck layer: a narrow hidden layer that forces compact representations.
Supply chain attacks: compromising AI systems via tampered third-party libraries, pretrained models, or datasets.
Heterogeneous graphs: graphs containing multiple node or edge types with different semantics.
Graph convolution: an extension of convolution to graph domains that uses the adjacency structure to aggregate neighbor features.
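A minimal sketch of one graph convolution layer in the symmetric-normalization style popularized by GCNs, assuming a dense adjacency matrix `A`, node features `X`, and weights `W` (all names illustrative):

```python
import numpy as np

def gcn_layer(A, X, W):
    # One graph convolution: H = ReLU(D^{-1/2} (A + I) D^{-1/2} X W),
    # where D is the degree matrix of A with self-loops added.
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.maximum(0.0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W)
```

Real implementations use sparse adjacency representations; the dense form above only shows the aggregation logic.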
Time series forecasting: predicting future values from past observations.
Particle filter: a Monte Carlo method for state estimation.
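A bootstrap particle filter for a toy 1-D random-walk state with Gaussian observation noise (model, noise levels, and names are all illustrative assumptions):

```python
import numpy as np

def particle_filter(observations, n_particles=1000, proc_std=0.1, obs_std=0.5, seed=0):
    # Bootstrap filter: propagate particles through the motion model,
    # weight them by the observation likelihood, then resample.
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)   # prior over the state
    estimates = []
    for z in observations:
        particles += rng.normal(0.0, proc_std, n_particles)     # propagate
        w = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)     # likelihood
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)         # resample
        particles = particles[idx]
        estimates.append(particles.mean())                      # posterior mean
    return np.array(estimates)
```

With repeated observations near a fixed value, the posterior mean converges toward it.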
Online inference: serving low-latency predictions per request.
Probability distribution: describes the likelihood of each possible outcome of a random variable.
Variance: a measure of spread around the mean; the expected squared deviation from the mean.
Correlation: covariance normalized by the product of the two standard deviations, bounded in [-1, 1].
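The two statistics above can be computed directly from their definitions; a small NumPy sketch with toy data (population, not sample, normalization):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])   # y = 2x, so correlation is exactly 1

# Variance: mean squared deviation from the mean.
var_x = np.mean((x - x.mean()) ** 2)
var_y = np.mean((y - y.mean()) ** 2)

# Correlation: covariance divided by the product of standard deviations.
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
corr = cov_xy / np.sqrt(var_x * var_y)
```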
Catastrophic forgetting: loss of previously learned knowledge when learning new tasks.
Auditability: the ability to inspect and verify AI decisions.
Patient stratification: grouping patients by predicted outcomes.
Fairness (performance disparity): unequal model performance across demographic groups.
AI-driven discovery: using AI to discover new compounds and materials.
Credit scoring: predicting borrower default risk.
State estimation: inferring the agent’s internal state from noisy sensor data.
Vocabulary: the set of tokens a model can represent; it impacts efficiency, multilinguality, and the handling of rare strings.
Dropout: randomly zeroing activations during training to reduce co-adaptation and overfitting.
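The standard "inverted dropout" formulation can be sketched in a few lines of NumPy (function name and keep-probability handling are illustrative):

```python
import numpy as np

def dropout(x, p=0.5, training=True, seed=0):
    # Inverted dropout: zero each activation with probability p and
    # scale survivors by 1/(1-p), so the expected activation is unchanged
    # and no rescaling is needed at inference time.
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng(seed)
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)
```

At evaluation time the function is the identity, which is why trained networks need no special handling at inference.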
Attention: a mechanism that computes context-aware mixtures of representations; it scales well and captures long-range dependencies.
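The core computation is scaled dot-product attention, sketched here for single-head, unbatched inputs (shapes and names are illustrative):

```python
import numpy as np

def attention(Q, K, V):
    # softmax(Q K^T / sqrt(d)) V: each output row is a convex
    # combination of the rows of V, weighted by query-key similarity.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)             # rows sum to 1
    return w @ V
```

Because the weights form a probability distribution over positions, any position can contribute to any output, which is what gives attention its long-range reach.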
Observability: the broader capability to infer internal system state from telemetry, crucial for AI services and agents.