Showing the Equivalence of Two Training Algorithms - Part 1

Fischer I, Koch M, Berthold MR (1998)


Publication Language: English

Publication Type: Conference contribution, Original article

Publication year: 1998

Publisher: IEEE

Book Volume: 1

Page Range: 441-446 (vol. 1)

Conference Proceedings Title: Proceedings of the IEEE International Joint Conference on Neural Networks

Event Location: Anchorage, AK, USA

ISBN: 0-7803-4859-1

URI: http://www2.informatik.uni-erlangen.de/publication/download/ijcnn98b.ps.gz

DOI: 10.1109/IJCNN.1998.682308

Abstract

Graph transformations offer a powerful way to formally specify neural networks and their corresponding training algorithms. This formalism can be used to prove properties of these algorithms. In this paper, graph transformations are used to show the equivalence of two training algorithms for recurrent neural networks: backpropagation through time and a variant of real-time backpropagation. In addition to this proof, a whole class of related training algorithms emerges from the formalism used.

How to cite

APA:

Fischer, I., Koch, M., & Berthold, M.R. (1998). Showing the Equivalence of Two Training Algorithms - Part 1. In Proceedings of the IEEE International Joint Conference on Neural Networks (pp. 441-446 (vol. 1)). Anchorage, AK, US: IEEE.

MLA:

Fischer, Ingrid, Manuel Koch, and Michael R. Berthold. "Showing the Equivalence of Two Training Algorithms - Part 1." Proceedings of the IEEE International Joint Conference on Neural Networks, Anchorage, AK: IEEE, 1998. 441-446 (vol. 1).
