
Perceptrons with polynomial post-processing


We introduce tensor product neural networks, composed of a layer of univariate neurons followed by a net of polynomial post-processing. We examine the general approximation properties of these networks, observing in particular their relationship to the Stone-Weierstrass theorem for uniform function algebras. Implementing the post-processing as a two-layer network with logarithmic and exponential neurons leads to potentially important 'generalized' product networks, which, however, require a complex approximation theory of Müntz-Szász-Ehrenpreis type. A back-propagation algorithm for product networks is presented and used in three computational experiments. In particular, approximation by a sigmoid product network is compared with that of a single-layer radial basis network and a multilayer sigmoid network. An additional experiment, based on an operational system, further demonstrates the versatility of the architecture.
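As a minimal illustrative sketch (not the authors' code; all names and parameter shapes here are hypothetical), the product-unit post-processing described above can be realized with logarithmic and exponential neurons on top of a sigmoid layer, using the identity exp(Σᵢ wᵢ log hᵢ) = Πᵢ hᵢ^wᵢ:

```python
# Hypothetical sketch of a single product unit built from log/exp neurons,
# applied to the outputs of a layer of univariate sigmoid neurons.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_product_network(x, W1, b1, w2, eps=1e-12):
    """Forward pass of a sigmoid network with product-unit post-processing.

    x  : (d,) input vector
    W1 : (m, d) first-layer weights (sigmoid neurons)
    b1 : (m,) first-layer biases
    w2 : (m,) exponents of the product unit
    Returns prod_i h_i ** w2_i, computed via logarithmic and exponential neurons.
    """
    h = sigmoid(W1 @ x + b1)      # layer of univariate (sigmoid) neurons, outputs in (0, 1)
    log_h = np.log(h + eps)       # "logarithmic" neurons (eps guards against log(0))
    return np.exp(w2 @ log_h)     # "exponential" neuron: equals the weighted product

# Tiny usage example with random parameters (illustrative only).
rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1, b1, w2 = rng.normal(size=(4, 3)), rng.normal(size=4), rng.normal(size=4)
print(sigmoid_product_network(x, W1, b1, w2))
```

Because the product unit is a smooth composition of log and exp, its exponents can be trained by ordinary gradient descent, which is the basis of the back-propagation algorithm for product networks described in the abstract.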

Keywords: APPROXIMATION; LOCAL RECEPTIVE FIELD; NEURAL NETWORK; POLYNOMIAL POSTPROCESSING; PRODUCT UNIT

Document Type: Research Article

Publication date: January 1, 2000
