Results for "spatial memory"
Extending agents with long-term memory stores.
Mechanisms for retaining context across turns/sessions: scratchpads, vector memories, structured stores.
Internal representation of environment layout.
Hardware resources used for training/inference; constrained by memory bandwidth, FLOPs, and parallelism.
Extension of convolution to graph domains using adjacency structure.
Recovering 3D structure from images.
An RNN variant using gates to mitigate vanishing gradients and capture longer context.
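The gating idea in the entry above can be sketched as a single GRU-style step (all weight names and sizes here are illustrative, not from this page): an update gate decides how much old state to keep, and a reset gate decides how much of it feeds the candidate state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    # One GRU step: update gate z blends old state with the candidate;
    # reset gate r controls how much old state the candidate sees.
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)
    r = sigmoid(Wr @ x + Ur @ h)
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(0)
d_x, d_h = 3, 4  # toy input/state sizes (assumed)
params = [rng.normal(scale=0.1, size=s)
          for s in [(d_h, d_x), (d_h, d_h)] * 3]

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_x)):  # run over a short sequence
    h = gru_step(x, h, params)
```

Because the new state is a convex combination of the old state and a bounded candidate, gradients through the `(1 - z) * h` path are less prone to vanishing than in a plain RNN.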
Reducing numeric precision of weights/activations to speed inference and reduce memory with acceptable accuracy loss.
A system that perceives state, selects actions, and pursues goals—often combining LLM reasoning with tools and memory.
Networks using convolution operations with weight sharing and locality, effective for images and signals.
Using the same parameters across different parts of a model.
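Convolution is the canonical case of the two entries above: one small kernel (the shared weights) slides over every position of the input. A minimal 1D sketch, with an illustrative kernel:

```python
import numpy as np

def conv1d(x, kernel):
    # The same kernel is applied at every position:
    # locality (small window) plus weight sharing (one set of parameters).
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel)
                     for i in range(len(x) - k + 1)])

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
kernel = np.array([1.0, 0.0, -1.0])  # simple difference/edge filter
out = conv1d(x, kernel)
# Three parameters cover the whole signal, however long it is.
```

The parameter count depends only on the kernel size, not the input size, which is why CNNs scale to large images.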
Neural networks that operate on graph-structured data by propagating information along edges.
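One propagation step along edges, as described above, can be sketched on a toy 4-node graph (the adjacency matrix and mixing weights are illustrative):

```python
import numpy as np

# Toy 4-node graph: 0-1, 0-2, 1-3, 2-3.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
deg = A.sum(axis=1, keepdims=True)

H = np.array([[1.0], [0.0], [0.0], [0.0]])  # one scalar feature per node

# One propagation step: each node mixes its own feature with the
# mean of its neighbours' features (a 50/50 blend, chosen arbitrarily).
H_next = 0.5 * H + 0.5 * (A @ H) / deg
# Node 0's feature has now reached its neighbours 1 and 2, but not 3.
```

Stacking k such steps lets information travel k hops, which is how deeper GNNs capture longer-range graph structure.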
Transformer applied to image patches.
Pixel motion estimation between frames.
CNNs applied to time series.
Devices measuring physical quantities (vision, lidar, force, IMU, etc.).
External sensing of surroundings (vision, audio, lidar).
Detecting and avoiding obstacles.
AI predicting crime patterns (highly controversial).
Deep learning system for protein structure prediction.
Number of samples per gradient update; impacts compute efficiency, generalization, and stability.
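The trade-off above — each update averages the gradient over a batch, so larger batches give lower-variance steps at higher per-step cost — can be sketched with mini-batch gradient descent on a toy linear regression (all sizes and the learning rate are illustrative):

```python
import numpy as np

def mse_grad(w, X, y):
    # Gradient of mean squared error, averaged over the batch:
    # the batch size controls the variance of this estimate.
    return 2.0 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(256, 2))
y = X @ w_true  # noise-free targets for the sketch

w = np.zeros(2)
batch_size = 32  # samples per gradient update
for _ in range(200):
    idx = rng.choice(len(X), size=batch_size, replace=False)
    w -= 0.1 * mse_grad(w, X[idx], y[idx])
# w converges toward w_true
```

Shrinking `batch_size` makes each step cheaper but noisier; growing it uses hardware parallelism better but can hurt generalization at extreme sizes.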
PEFT method injecting trainable low-rank matrices into layers, enabling efficient fine-tuning.
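The low-rank injection described above can be sketched in NumPy (dimensions, rank, and initialization scale are illustrative): the pretrained weight stays frozen while two small factors carry all the trainable parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 64, 64, 4  # rank r much smaller than the layer width

W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(scale=0.01, size=(r, d_in))  # trainable down-projection
B = np.zeros((d_out, r))                # trainable up-projection, zero-init

def lora_forward(x):
    # Frozen path plus low-rank adapter path: W x + B (A x).
    return W @ x + B @ (A @ x)

x = rng.normal(size=d_in)
# With B initialized to zero, the adapted layer exactly matches the base
# layer at the start of fine-tuning; only A and B receive gradients.
```

Here the adapter holds r * (d_in + d_out) = 512 trainable values versus 4096 in the frozen matrix, which is the source of the efficiency gain.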
Networks with recurrent connections for sequences; largely supplanted by Transformers for many tasks.
Identifying speakers in audio.
Predicting future values from past observations.
Temporary working space for intermediate reasoning steps, often hidden from the end user.
Allocating compute resources dynamically at runtime, based on demand.
Running models locally.
System-level design for general intelligence.