ReLU

Intermediate

Activation function defined as max(0, x); improves gradient flow and training speed in deep networks.

Full Definition

The Rectified Linear Unit (ReLU) is an activation function defined as f(x) = max(0, x): it passes positive inputs through unchanged and outputs zero for negative inputs. Because its gradient is 1 for positive inputs, it avoids the saturation that causes vanishing gradients with sigmoid or tanh activations, which typically improves gradient flow and training speed in deep networks. A known drawback is the "dying ReLU" problem, where units that only receive negative inputs output zero and stop updating.
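
A minimal sketch (assuming NumPy as the dependency) of how max(0, x) and its gradient can be computed elementwise:

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x): passes positive values through, zeroes out negatives.
    return np.maximum(0, x)

def relu_grad(x):
    # Gradient is 1 where x > 0 and 0 elsewhere (undefined at exactly 0; 0 used here).
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```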
