Paper
28 March 2005

Learning of dynamic variations of N-dimension patterns in a noniterative neural network
Chia-Lun John Hu, Sirikahlaya Chanekasit
Abstract
In a set of preprocessed N-dimensional analog pattern vectors {Um, m = 1 to M, where each Um represents a distinct pattern}, if N >> M, then a one-layered sign-function neural network (OLNN) is sufficient to carry out a very robust, yet very accurate, noniterative learning of all patterns. After the learning is done, the OLNN will accurately identify an untrained test pattern even when the test pattern varies within a certain dynamic range of a particular standard pattern learned during the noniterative learning process. The analytical foundation that makes this dynamic neural-network pattern recognition possible is the following. If we know that a standard pattern Um will vary gradually among K boundary patterns Um1 to UmK, then we can train the neural network noniteratively to learn JUST THE BOUNDARY vectors {Umi, i = 1 to K} for each pattern Um. Then, due to a distinctive property of noniterative learning, for a test input pattern Ut equal to any gradual change within the boundaries (i.e., Ut = any CONVEX combination of the boundary set {Umi, i = 1 to K, m fixed}), the OLNN can still automatically recognize this changed pattern, even though none of these gradually changed patterns was learned step by step in the noniterative learning.
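The convex-combination property described in the abstract can be illustrated with a minimal sketch. The dimensions, class count, and the use of a minimum-norm least-squares solve are illustrative assumptions, not the paper's exact procedure: when N is much larger than the number of boundary vectors, the weight equations can be satisfied exactly in one noniterative step, and the sign output is then invariant over any convex combination of one class's boundary set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): N = 50 dimensions, two classes,
# K = 3 boundary vectors per class. N >> 2K, so an exact solution exists.
N, K = 50, 3
boundaries_A = rng.normal(size=(K, N))   # boundary vectors Um1..UmK, class A
boundaries_B = rng.normal(size=(K, N))   # boundary vectors, class B

# Noniterative "learning": solve for w in one linear-algebra step so that
# w . u = +1 for every class-A boundary and -1 for every class-B boundary.
U = np.vstack([boundaries_A, boundaries_B])      # (2K, N) training matrix
t = np.concatenate([np.ones(K), -np.ones(K)])    # target outputs
w = np.linalg.lstsq(U, t, rcond=None)[0]         # minimum-norm exact solution

# Because N >> 2K, the underdetermined system is solved exactly:
assert np.allclose(U @ w, t)

# Any convex combination of class-A boundaries is still recognized as A:
# w . (sum_i a_i Umi) = sum_i a_i (w . Umi) = sum_i a_i * (+1) = +1 > 0.
alpha = rng.dirichlet(np.ones(K))                # random convex weights
u_test = alpha @ boundaries_A                    # untrained interior pattern
print(np.sign(w @ u_test))                       # -> 1.0 (recognized as A)
```

The sign function is what makes the untrained interior patterns come for free: the network output before thresholding interpolates linearly between the (all-positive) boundary responses, so the thresholded output cannot change within the convex hull.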
© (2005) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Chia-Lun John Hu and Sirikahlaya Chanekasit "Learning of dynamic variations of N-dimension patterns in a noniterative neural network", Proc. SPIE 5816, Optical Pattern Recognition XVI, (28 March 2005); https://doi.org/10.1117/12.602946
KEYWORDS: Neural networks, Analog electronics, Lithium, Chemical elements, Eye, Pattern recognition, Binary data