Connectionist learning of regular graph grammars

Abstract:
This paper presents a new connectionist approach to grammatical inference. Using only positive examples, the algorithm learns regular graph grammars, representing two-dimensional iterative structures drawn on a discrete Cartesian grid. This work is intended as a case study in connectionist symbol processing and geometric concept formation. A grammar is represented by a self-configuring connectionist network that is analogous to a transition diagram except that it can deal with graph grammars as easily as string grammars. Learning starts with a trivial grammar, expressing no grammatical knowledge, which is then refined, by a process of successive node splitting and merging, into a grammar adequate to describe the population of input patterns. In conclusion, I argue that the connectionist style of computation is, in some ways, better suited than sequential computation to the task of representing and manipulating recursive structures.
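The refinement process described in the abstract is carried out in the paper by a self-configuring connectionist network operating on graph grammars. The sketch below is not that algorithm: it is a loose, string-only illustration of the node-merging idea, written in Python, which starts from a prefix-tree acceptor built from the positive examples (unlike the paper's learner, which starts from a trivial, maximally general grammar) and merges nodes that accept identical suffix sets. All function names and data structures here are hypothetical.

    # Purely illustrative: a string-only stand-in for the "node merging" step,
    # not the paper's connectionist graph-grammar learner.
    from collections import defaultdict


    def build_prefix_tree(examples):
        """Most-specific acceptor: one node per distinct prefix of the examples."""
        edges = {}            # (node, symbol) -> node
        accepting = set()
        next_id = 1           # node 0 is the start node
        for example in examples:
            node = 0
            for symbol in example:
                if (node, symbol) not in edges:
                    edges[(node, symbol)] = next_id
                    next_id += 1
                node = edges[(node, symbol)]
            accepting.add(node)
        return edges, accepting, next_id


    def residual(node, edges, accepting):
        """Set of suffixes accepted from `node` (finite, because the tree is finite)."""
        suffixes = set()
        stack = [(node, ())]
        while stack:
            current, path = stack.pop()
            if current in accepting:
                suffixes.add(path)
            for (src, symbol), dst in edges.items():
                if src == current:
                    stack.append((dst, path + (symbol,)))
        return frozenset(suffixes)


    def merge_equivalent_nodes(edges, accepting, n_nodes):
        """Merge nodes with identical residuals.  This only compresses the
        acceptor; a real learner would merge (and split) more aggressively,
        guided by the statistics of the input population."""
        classes = defaultdict(list)
        for node in range(n_nodes):
            classes[residual(node, edges, accepting)].append(node)
        representative = {}
        for members in classes.values():
            for node in members:
                representative[node] = members[0]
        merged_edges = {(representative[src], symbol): representative[dst]
                        for (src, symbol), dst in edges.items()}
        merged_accepting = {representative[node] for node in accepting}
        return merged_edges, merged_accepting


    if __name__ == "__main__":
        positives = [("a", "b"), ("a", "a", "b"), ("a", "a", "a", "b")]
        edges, accepting, n_nodes = build_prefix_tree(positives)
        merged_edges, merged_accepting = merge_equivalent_nodes(edges, accepting, n_nodes)
        merged_nodes = ({src for src, _ in merged_edges}
                        | set(merged_edges.values())
                        | merged_accepting)
        print(n_nodes, "prefix-tree nodes ->", len(merged_nodes), "after merging")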

Keywords: GRAMMATICAL INFERENCE; GRAPH GRAMMARS; NEURAL NETWORKS; PARALLEL PARSING; REGULAR GRAMMARS; STOCHASTIC GRAMMARS; SYMBOL PROCESSING; UNSUPERVISED LEARNING

Document Type: Research Article

Publication date: 01 June 2001
