
Connectionist learning of regular graph grammars


This paper presents a new connectionist approach to grammatical inference. Using only positive examples, the algorithm learns regular graph grammars, representing two-dimensional iterative structures drawn on a discrete Cartesian grid. This work is intended as a case study in connectionist symbol processing and geometric concept formation. A grammar is represented by a self-configuring connectionist network that is analogous to a transition diagram except that it can deal with graph grammars as easily as string grammars. Learning starts with a trivial grammar, expressing no grammatical knowledge, which is then refined, by a process of successive node splitting and merging, into a grammar adequate to describe the population of input patterns. In conclusion, I argue that the connectionist style of computation is, in some ways, better suited than sequential computation to the task of representing and manipulating recursive structures.
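The sketch below is a purely illustrative toy, not the paper's algorithm: it shows the general idea of representing a grammar as a transition diagram and refining a trivial, overly general grammar by splitting a node, using ordinary string grammars rather than graph grammars for brevity. All names here (TransitionDiagram, split_state, the (ab)^n example) are hypothetical choices for this sketch.

```python
# A minimal, hypothetical sketch (not the paper's algorithm): a regular
# string grammar represented as a transition diagram, refined by splitting
# a node.  The paper does this for graph grammars with a connectionist
# network; strings are used here only to keep the example short.

from collections import defaultdict


class TransitionDiagram:
    """A nondeterministic transition diagram: numbered states, labelled edges."""

    def __init__(self):
        self.start = 0
        self.accepting = {0}
        self.edges = defaultdict(set)   # (state, symbol) -> set of successor states
        self.n_states = 1

    @classmethod
    def trivial(cls, alphabet):
        """The trivial grammar: one state looping on every symbol, accepting everything."""
        d = cls()
        for symbol in alphabet:
            d.edges[(0, symbol)].add(0)
        return d

    def accepts(self, string):
        """Standard NFA-style acceptance test."""
        current = {self.start}
        for symbol in string:
            current = {q for p in current for q in self.edges.get((p, symbol), ())}
            if not current:
                return False
        return bool(current & self.accepting)

    def split_state(self, state, symbols):
        """Split `state` in two: the new copy takes the outgoing edges labelled
        by `symbols`; incoming edges are duplicated so both copies stay reachable."""
        new = self.n_states
        self.n_states += 1
        if state in self.accepting:
            self.accepting.add(new)
        for (p, symbol) in list(self.edges):
            if p == state and symbol in symbols:
                self.edges[(new, symbol)] = self.edges.pop((p, symbol))
        for targets in self.edges.values():
            if state in targets:
                targets.add(new)
        return new


if __name__ == "__main__":
    positives = ["ab", "abab", "ababab"]          # an iterative pattern, (ab)^n
    g = TransitionDiagram.trivial({"a", "b"})
    print([g.accepts(s) for s in positives])      # trivial grammar accepts all of them
    g.split_state(0, {"b"})                       # refine: 'b'-edges now leave a separate copy
    print([g.accepts(s) for s in positives])      # the positives are still accepted
```

In the paper itself, splitting and merging operate on nodes of a self-configuring connectionist network over graph grammars, and the refinement is driven by the whole population of positive examples rather than the single hand-picked split shown here.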

Keywords: GRAMMATICAL INFERENCE; GRAPH GRAMMARS; NEURAL NETWORKS; PARALLEL PARSING; REGULAR GRAMMARS; STOCHASTIC GRAMMARS; SYMBOL PROCESSING; UNSUPERVISED LEARNING

Document Type: Research Article

Publication date: June 1, 2001
