Bull. Coll. Med. Sci. Tohoku Univ. 11(3): 253-259, 2002

Superimposing Learning for Backpropagation
Neural Networks


Noriyasu HOMMA*,** and Madan M. GUPTA**
*Department of Radiological Technology, College of Medical Sciences, Tohoku University
**Intelligent Systems Research Laboratory, College of Engineering, University of Saskatchewan



Key words: Neural networks, Learning, Structural adaptation, Long-term memory,
Short-term memory, Incremental learning, Backpropagation


      In this paper, a new neural network model is presented for incremental learning tasks, in which a network is required to learn new knowledge without forgetting the old. The essential core of the proposed network structure is its dynamically and spatially changing weights (DSCWs). A learning scheme is developed for the formulation of the dynamically changing weights, while structural adaptation is formulated through the spatially changing connection weights. To avoid disturbing past knowledge when new connections are created, a restoration mechanism is introduced using the DSCWs. The usefulness of the proposed model is demonstrated on a system identification task.
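
      As an illustration of the idea summarized above, the following minimal Python sketch assumes one simple interpretation, not the exact DSCW formulation developed in the body of the paper: structural adaptation is modeled as appending new hidden units to a backpropagation network, and the restoration mechanism is modeled by giving each new unit zero-valued outgoing weights, so that the previously learned input-output mapping is reproduced exactly at the moment of growth. All class and variable names (GrowingMLP, grow, etc.) are hypothetical.

# Illustrative sketch only; not the paper's exact DSCW formulation.
# Assumption: spatial change = adding hidden units; restoration = zero
# outgoing weights for new units, so the old response is undisturbed.
import numpy as np

rng = np.random.default_rng(0)

class GrowingMLP:
    """One-hidden-layer backpropagation network that can grow new hidden units."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1):
        self.lr = lr
        self.W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))   # input -> hidden
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))  # hidden -> output
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        self.h = np.tanh(self.W1 @ x + self.b1)
        return self.W2 @ self.h + self.b2

    def backprop(self, x, t):
        # One gradient-descent step on the squared error 0.5*||y - t||^2.
        y = self.forward(x)
        e = y - t
        dW2 = np.outer(e, self.h)
        dh = (self.W2.T @ e) * (1.0 - self.h ** 2)   # tanh derivative
        dW1 = np.outer(dh, x)
        self.W2 -= self.lr * dW2
        self.b2 -= self.lr * e
        self.W1 -= self.lr * dW1
        self.b1 -= self.lr * dh
        return 0.5 * float(e @ e)

    def grow(self, n_new=1):
        # Spatial change: append hidden units whose outgoing weights are zero,
        # so the network's current input-output mapping is preserved exactly.
        n_in = self.W1.shape[1]
        n_out = self.W2.shape[0]
        self.W1 = np.vstack([self.W1, rng.normal(scale=0.5, size=(n_new, n_in))])
        self.b1 = np.concatenate([self.b1, np.zeros(n_new)])
        self.W2 = np.hstack([self.W2, np.zeros((n_out, n_new))])

if __name__ == "__main__":
    net = GrowingMLP(n_in=2, n_hidden=3, n_out=1)
    x, t = np.array([0.2, -0.4]), np.array([0.1])
    before = net.forward(x).copy()
    net.grow(2)                                  # add capacity for new knowledge
    assert np.allclose(before, net.forward(x))   # old response is preserved

      In this sketch, growing the network never changes its current response; only subsequent backpropagation updates recruit the new units, which mirrors the goal of acquiring new knowledge without disturbing the old.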