Results for "false inference"

21 results

Precision (Intermediate)

Of predicted positives, the fraction that are truly positive; sensitive to false positives.

Foundations & Theory
Recall (Intermediate)

Of true positives, the fraction correctly identified; sensitive to false negatives.

Foundations & Theory
F1 Score (Intermediate)

Harmonic mean of precision and recall; useful when balancing false positives/negatives matters.

Foundations & Theory
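Precision, recall, and F1 come straight from confusion-matrix counts; a minimal pure-Python sketch (function name is illustrative):

```python
def precision_recall_f1(tp, fp, fn):
    """Compute precision, recall, and F1 from confusion-matrix counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# 8 true positives, 2 false positives, 4 false negatives
p, r, f1 = precision_recall_f1(tp=8, fp=2, fn=4)
# p == 0.8, r ≈ 0.667, f1 ≈ 0.727
```

Note how false positives pull down precision only, false negatives pull down recall only, and F1 penalizes either.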
ROC Curve (Intermediate)

Plots true positive rate vs false positive rate across thresholds; summarizes separability.

Foundations & Theory
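A ROC curve is just (FPR, TPR) pairs swept over score thresholds; a small sketch (helper name and data are illustrative):

```python
def roc_points(scores, labels, thresholds):
    """(FPR, TPR) pairs for a binary classifier at the given thresholds."""
    pos = sum(labels)               # number of true positives available
    neg = len(labels) - pos         # number of true negatives available
    points = []
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))
    return points

scores = [0.9, 0.8, 0.6, 0.4, 0.3, 0.1]
labels = [1, 1, 0, 1, 0, 0]
points = roc_points(scores, labels, thresholds=[0.5, 0.2])
```

Lowering the threshold moves up and to the right along the curve: more true positives caught, at the price of more false positives.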
Secure Inference (Intermediate)

Methods to protect model/data during inference (e.g., trusted execution environments) from operators/attackers.

Foundations & Theory
Latency (Intermediate)

Time from request to response; critical for real-time inference and UX.

Foundations & Theory
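Per-request latency is measured by timing the call itself; a minimal sketch using `time.perf_counter` (helper name is illustrative):

```python
import time

def timed_call(fn, *args, **kwargs):
    """Return (result, wall-clock latency in milliseconds) for one call."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    latency_ms = (time.perf_counter() - start) * 1000.0
    return result, latency_ms

result, ms = timed_call(lambda x: x + 1, 41)   # result == 42, ms >= 0
```

In practice one reports percentiles (p50, p95, p99) over many such measurements rather than a single number.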
Compute (Intermediate)

Hardware resources used for training/inference; constrained by memory bandwidth, FLOPs, and parallelism.

Foundations & Theory
Quantization (Intermediate)

Reducing numeric precision of weights/activations to speed inference and reduce memory with acceptable accuracy loss.

Foundations & Theory
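The idea can be sketched as symmetric int8 quantization of a weight vector (pure Python, illustrative helper names):

```python
def quantize_int8(weights):
    """Symmetric per-tensor int8 quantization: w ≈ scale * q, q in [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 codes."""
    return [scale * v for v in q]

w = [0.5, -1.2, 0.03, 0.9]
q, scale = quantize_int8(w)
max_err = max(abs(a - b) for a, b in zip(w, dequantize(q, scale)))
# rounding error is bounded by scale / 2 (≈ 0.0047 for this tensor)
```

Storing `q` instead of `w` cuts memory 4x versus float32, and integer arithmetic is typically faster on supporting hardware; the bounded rounding error is the "acceptable accuracy loss" in the definition above.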
Causal Mask (Intermediate)

Prevents attention to future tokens during training/inference.

AI Economics & Strategy
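Concretely, a causal mask is a lower-triangular boolean matrix applied to attention scores before softmax; a minimal sketch (helper names are illustrative):

```python
import math

def causal_mask(n):
    """n x n boolean mask: position i may attend only to positions j <= i."""
    return [[j <= i for j in range(n)] for i in range(n)]

def apply_mask(scores, mask):
    """Set disallowed scores to -inf so softmax assigns them zero weight."""
    return [[s if keep else -math.inf for s, keep in zip(row, keep_row)]
            for row, keep_row in zip(scores, mask)]

mask = causal_mask(3)   # row 0 sees only itself; row 2 sees all three positions
```

The -inf trick is why masking composes cleanly with softmax: exp(-inf) = 0, so future tokens receive exactly zero attention weight.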
Instrumental Variable (Advanced)

A variable correlated with the treatment but affecting the outcome only through it, enabling causal inference despite unobserved confounding.

Causal AI & Interpretability
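With a single instrument, the classic Wald estimator illustrates the idea: the causal effect is recovered as cov(z, y) / cov(z, x). A toy sketch with synthetic data (values chosen so the instrument is uncorrelated with the confounder):

```python
def iv_estimate(z, x, y):
    """Wald/IV estimator with one instrument: cov(z, y) / cov(z, x)."""
    def cov(a, b):
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)
    return cov(z, y) / cov(z, x)

# toy data: u is an unobserved confounder; the true causal effect of x on y is 2
z = [0, 0, 1, 1]                 # instrument
u = [1, -1, 1, -1]               # confounder, uncorrelated with z by construction
x = [zi + ui for zi, ui in zip(z, u)]
y = [2 * xi + 3 * ui for xi, ui in zip(x, u)]
effect = iv_estimate(z, x, y)    # recovers 2.0 despite confounding
```

A naive regression of y on x here would be biased by u; the instrument sidesteps that because z moves x without touching y through any other path.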
Exposure Bias (Intermediate)

Mismatch between training and inference conditions, e.g. a sequence model trained on ground-truth prefixes but conditioned on its own (possibly erroneous) outputs at inference.

Model Failure Modes
Token Budgeting (Intermediate)

Capping the tokens a request, user, or application may consume to control inference cost and latency.

AI Economics & Strategy
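One common realization is a hard cap on cumulative token usage; a minimal sketch (class name is hypothetical):

```python
class TokenBudget:
    """Track cumulative token usage against a hard cap."""

    def __init__(self, limit):
        self.limit = limit
        self.used = 0

    def allow(self, tokens):
        """Admit the request only if it fits the remaining budget."""
        if self.used + tokens > self.limit:
            return False
        self.used += tokens
        return True

budget = TokenBudget(limit=1000)
budget.allow(800)    # True: 800 of 1000 used
budget.allow(300)    # False: would exceed the cap
```

Real systems layer refinements on this (per-window resets, per-user quotas, soft limits that degrade to cheaper models), but the admit-or-reject check is the core.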
False Negative (Intermediate)

A test or model result that misses disease actually present in the patient.

AI in Healthcare
Causal Inference (Intermediate)

Framework for reasoning about cause-effect relationships beyond correlation, often using structural assumptions and experiments.

Foundations & Theory
Bayesian Inference (Intermediate)

Updating beliefs about parameters using observed evidence and prior distributions.

AI Economics & Strategy
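The beta-binomial model is the standard minimal example: a Beta prior updated with success/failure counts stays Beta. A sketch:

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update: Beta(alpha, beta) prior + binomial evidence -> Beta posterior."""
    return alpha + successes, beta + failures

# uniform Beta(1, 1) prior; observe 7 successes in 10 trials
a, b = beta_binomial_update(1, 1, successes=7, failures=3)
posterior_mean = a / (a + b)    # 8/12 ≈ 0.667
```

The posterior mean sits between the prior mean (0.5) and the observed rate (0.7), and shifts further toward the data as evidence accumulates.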
Inference Pipeline (Intermediate)

The path a request follows through model execution in production, typically preprocessing, the model forward pass, and postprocessing.

MLOps & Infrastructure
Batch Inference (Intermediate)

Running predictions on large datasets periodically.

MLOps & Infrastructure
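In practice this usually means chunking a dataset and scoring each chunk offline; a minimal sketch with a toy callable standing in for the model:

```python
def batches(items, batch_size):
    """Yield successive fixed-size chunks of the dataset."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

def run_batch_inference(model, dataset, batch_size=256):
    """Score an entire dataset offline; `model` is any callable over a batch."""
    predictions = []
    for batch in batches(dataset, batch_size):
        predictions.extend(model(batch))
    return predictions

# toy "model" that doubles its inputs
preds = run_batch_inference(lambda b: [2 * x for x in b],
                            list(range(10)), batch_size=4)
# preds == [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

Batching amortizes per-call overhead and keeps accelerators saturated, which is why throughput-oriented jobs prefer it over per-request serving.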
Online Inference (Intermediate)

Serving a prediction for each incoming request with low latency, typically behind an API.

MLOps & Infrastructure
Inference Cost (Intermediate)

Ongoing cost of serving models in production, driven by per-request compute, hardware, and traffic volume.

AI Economics & Strategy
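A back-of-envelope model illustrates the drivers; the prices and volumes below are purely illustrative assumptions, not real rates:

```python
def monthly_inference_cost(requests_per_day, tokens_per_request, price_per_1k_tokens):
    """Back-of-envelope monthly serving cost from token volume (30-day month)."""
    tokens_per_month = requests_per_day * 30 * tokens_per_request
    return tokens_per_month / 1000 * price_per_1k_tokens

# 10,000 requests/day at 500 tokens each, $0.002 per 1k tokens (illustrative)
cost = monthly_inference_cost(10_000, 500, 0.002)   # 300.0 dollars/month
```

The formula makes the levers explicit: cost scales linearly with traffic, with tokens per request, and with unit price, which is why quantization, caching, and token budgeting all show up as cost controls.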
Edge Inference (Intermediate)

Running models on-device (phones, browsers, embedded hardware) rather than in the cloud, reducing latency and keeping data local.

AI Economics & Strategy
Active Inference (Frontier)

Framework in which agents both perceive and act so as to minimize expected surprise (variational free energy).

World Models & Cognition