2015 · Deep Learning · Optimization

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

Batch Normalization normalizes each layer's inputs over the mini-batch, which accelerates training, permits higher learning rates, and reduces sensitivity to weight initialization.
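The transform itself is simple: standardize each feature using the mini-batch mean and variance, then apply a learned scale (gamma) and shift (beta). A minimal NumPy sketch of the training-time forward pass (function name and shapes are illustrative, not from the paper):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Per-feature statistics computed over the mini-batch (axis 0).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Standardize; eps guards against division by zero.
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learned scale and shift restore representational power.
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 8))  # batch of 64, 8 features
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
print(y.mean(axis=0))  # each feature has mean near 0
print(y.std(axis=0))   # each feature has std near 1
```

At inference time the paper replaces the batch statistics with running averages accumulated during training, so outputs do not depend on the other examples in the batch.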
