Mechanisms of dimensionality reduction and decorrelation in deep neural networks

Haiping Huang
Phys. Rev. E 98, 062313 – Published 14 December 2018

Abstract

Deep neural networks are widely used across many domains. However, the nature of the computation performed at each layer of a deep network is far from well understood, so increasing the interpretability of deep neural networks is important. Here, we construct a mean-field framework to understand how compact representations develop across layers, not only in deterministic deep networks with random weights but also in generative deep networks in which unsupervised learning is carried out. Our theory shows that the deep computation implements a dimensionality reduction while maintaining a finite level of weak correlations between neurons for possible feature extraction. Mechanisms of dimensionality reduction and decorrelation are unified in the same framework. This work may pave the way for understanding how a sensory hierarchy works.
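The dimensionality-reduction claim for deterministic random-weight networks can be probed numerically. The sketch below is an illustration, not the paper's mean-field calculation: it propagates Gaussian inputs through a deep network with i.i.d. random weights (variance 1/N, a standard mean-field scaling) and tanh activations, tracking an effective dimensionality via the participation ratio of the layer covariance spectrum, together with the mean pairwise correlation between neurons. All variable names and parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def participation_ratio(acts):
    """Effective dimensionality of a representation:
    PR = (sum_i lambda_i)^2 / sum_i lambda_i^2, where lambda_i are
    eigenvalues of the covariance matrix of neural activities."""
    lam = np.linalg.eigvalsh(np.cov(acts.T))
    return lam.sum() ** 2 / (lam ** 2).sum()

def mean_abs_correlation(acts):
    """Average |Pearson correlation| over distinct neuron pairs."""
    c = np.corrcoef(acts.T)
    n = c.shape[0]
    return (np.abs(c).sum() - n) / (n * (n - 1))

# Deterministic deep network with i.i.d. Gaussian weights of
# variance 1/N and tanh nonlinearity (illustrative choices).
N, depth, samples = 200, 6, 2000
x = rng.standard_normal((samples, N))   # input ensemble
prs, corrs = [participation_ratio(x)], [mean_abs_correlation(x)]
for ell in range(depth):
    W = rng.standard_normal((N, N)) / np.sqrt(N)
    x = np.tanh(x @ W)
    prs.append(participation_ratio(x))
    corrs.append(mean_abs_correlation(x))
    print(f"layer {ell + 1}: PR = {prs[-1]:.1f}, "
          f"mean |corr| = {corrs[-1]:.3f}")
```

In this contracting regime, repeated multiplication by random matrices spreads the singular-value spectrum, so the participation ratio shrinks with depth while the off-diagonal correlations stay small but nonzero — a crude numerical counterpart of the "dimensionality reduction with weak residual correlations" picture described in the abstract.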

  • Figures: 5 (not shown)
  • Received 30 October 2017
  • Revised 26 November 2018

DOI: https://doi.org/10.1103/PhysRevE.98.062313

©2018 American Physical Society

Physics Subject Headings (PhySH)

Interdisciplinary Physics; Statistical Physics & Thermodynamics

Authors & Affiliations

Haiping Huang*

  • School of Physics, Sun Yat-sen University, Guangzhou 510275, People's Republic of China and Laboratory for Neural Computation and Adaptation, RIKEN Center for Brain Science, Wako-shi, Saitama 351-0198, Japan

  • *physhuang@gmail.com; www.labxing.com/hphuang2018

Article Text (Subscription Required)


References (Subscription Required)

Issue

Vol. 98, Iss. 6 — December 2018
