Generalization of back-propagation to recurrent neural networks

Fernando J. Pineda
Phys. Rev. Lett. 59, 2229 – Published 9 November 1987

Abstract

An adaptive neural network with asymmetric connections is introduced. This network is related to the Hopfield network with graded neurons and uses a recurrent generalization of the δ rule of Rumelhart, Hinton, and Williams to modify adaptively the synaptic weights. The new network bears a resemblance to the master/slave network of Lapedes and Farber but it is architecturally simpler.
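The scheme the abstract describes can be sketched in code. The sketch below is an illustrative reconstruction, not the paper's own listing: a graded-neuron network with asymmetric weights relaxes to a fixed point, a linear adjoint system relaxes to propagate output errors backward, and the weights receive a delta-rule-like update from the two relaxed states. All function and variable names (`relax_forward`, `relax_adjoint`, `train_step`, the sigmoid `g`, the step counts and learning rate) are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    """Graded (sigmoid) neuron activation."""
    return 1.0 / (1.0 + np.exp(-x))

def g_prime(x):
    s = g(x)
    return s * (1.0 - s)

def relax_forward(W, I, steps=200, dt=0.1):
    """Relax dx/dt = -x + W g(x) + I to its fixed point x*."""
    x = np.zeros(len(I))
    for _ in range(steps):
        x += dt * (-x + W @ g(x) + I)
    return x

def relax_adjoint(W, x_star, e, steps=200, dt=0.1):
    """Relax the linear adjoint system dy/dt = -y + g'(x*) * (W^T y) + e,
    whose fixed point carries the back-propagated error signals."""
    y = np.zeros_like(e)
    gp = g_prime(x_star)
    for _ in range(steps):
        y += dt * (-y + gp * (W.T @ y) + e)
    return y

def train_step(W, I, target, out_idx, lr=0.5):
    """One adaptive step: relax forward, relax the adjoint,
    then apply the delta-rule-like weight update (in place)."""
    x_star = relax_forward(W, I)
    e = np.zeros_like(x_star)
    e[out_idx] = target - x_star[out_idx]    # error on output units only
    y = relax_adjoint(W, x_star, e)
    W += lr * np.outer(y, g(x_star))         # Delta(w_rs) ~ y_r * g(x_s*)
    return 0.5 * np.sum(e**2)
```

A short usage sketch: with small random asymmetric weights, repeated calls to `train_step` on a fixed input/target pair should drive the output-unit error down, since the update follows the gradient of the squared error at the fixed point.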

  • Received 10 June 1987

DOI: https://doi.org/10.1103/PhysRevLett.59.2229

©1987 American Physical Society

Authors & Affiliations

Fernando J. Pineda

  • Johns Hopkins University, Applied Physics Laboratory, Johns Hopkins Road, Laurel, Maryland 20707

Issue

Vol. 59, Iss. 19 — 9 November 1987
