Abstract
Recurrent neural networks that are trained to behave like deterministic finite-state automata (DFAs) can show deteriorating performance when tested on long strings. This deteriorating performance can be attributed to the instability of the internal representation of the learned DFA states. The use of a sigmoidal discriminant function together with the recurrent structure contributes to this instability. We prove that a simple algorithm can construct second-order recurrent neural networks with a sparse interconnection topology and sigmoidal discriminant function such that the internal DFA state representations are stable, that is, the constructed network correctly classifies strings of arbitrary length. The algorithm is based on encoding strengths of weights directly into the neural network. We derive a relationship between the weight strength and the number of DFA states for robust string classification. For a DFA with n states and m input alphabet symbols, the constructive algorithm generates a ``programmed'' neural network with O(n) neurons and O(mn) weights. We compare our algorithm to other methods proposed in the literature.
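The construction described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: it assumes a second-order network with one neuron per DFA state, programs each weight to +H when it realizes a transition and -H otherwise, and uses a bias of -H/2 so that exactly the target state's neuron saturates high after each input symbol. The function names and the choice H = 10 are assumptions for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def program_weights(delta, n_states, n_symbols, H=10.0):
    """Encode a DFA transition table into second-order weights.

    W[j][i][k] connects state neuron i under input symbol k to neuron j:
    +H programs the transition delta[i][k] == j; -H suppresses all others.
    This yields O(n) neurons and O(mn) nonzero weight groups, one per
    (state, symbol) pair, matching the orders stated in the abstract.
    """
    return [[[H if delta[i][k] == j else -H
              for k in range(n_symbols)]
             for i in range(n_states)]
            for j in range(n_states)]

def classify(W, accept, string, n_states, H=10.0):
    """Run the programmed network on a string of symbol indices."""
    # Start with the neuron of the start state (state 0) fully on.
    x = [1.0 if i == 0 else 0.0 for i in range(n_states)]
    for sym in string:
        # Second-order update: neuron activity gated by the current symbol.
        x = [sigmoid(sum(W[j][i][sym] * x[i] for i in range(n_states)) - H / 2)
             for j in range(n_states)]
    # Accept if the neuron of some accepting state is high.
    return max(x[q] for q in accept) > 0.5

# Example DFA: even parity of 1s over the alphabet {0, 1}.
delta = [[0, 1],   # from state 0: '0' -> 0, '1' -> 1
         [1, 0]]   # from state 1: '0' -> 1, '1' -> 0
W = program_weights(delta, n_states=2, n_symbols=2)
```

With a sufficiently large H relative to the number of states, the low neurons saturate near 0 and the high neuron near 1 after every step, so the state representation does not drift even on long strings; for instance, `classify(W, {0}, [1] * 200, 2)` still classifies correctly. The paper's result makes this relationship between H and n precise.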
Categories and Subject Descriptors: B.2.2 [Arithmetic and Logic Structures]: Performance Analysis and Design Aids -- simulation, verification; F.1.1 [Computation by Abstract Devices]: Models of Computation -- automata, relations among models, self-modifying machines; G.1.0 [Numerical Analysis] -- stability; G.1.2 [Numerical Analysis]: Approximation -- nonlinear approximation; I.2.4 [Artificial Intelligence]: Knowledge Representation Formalisms and Methods -- representations; I.5.1 [Pattern Recognition]: Models -- neural nets
General Terms: Algorithms, Theory, Verification
Additional Key Words and Phrases: Automata, connectionism, knowledge encoding, neural networks, nonlinear dynamics, recurrent neural networks, rules, stability
Selected references
- Noga Alon, A. K. Dewdney, and Teunis J. Ott. Efficient simulation of finite automata by neural nets. Journal of the ACM, 38(2):495-514, April 1991.
- Christian W. Omlin and C. Lee Giles. Rule revision with recurrent neural networks. IEEE Transactions on Knowledge and Data Engineering, 8(1):183-188, February 1996.