Journal article

Multi-Valued and Universal Binary Neurons: Theory, Learning, and Applications

Abstract

The reviewed book is dedicated to an extension of the perceptron, which in its initial form can correctly classify only linearly separable patterns. In their seminal book Perceptrons (published in 1969), Minsky and Papert suggested that this serious shortcoming might be surmounted in two different ways: the first introduces so-called higher-order input activities, represented by products of single input activities, e.g., $x_{1}\cdot x_{2}$; the second employs hidden neurons. Minsky and Papert rejected both of these simple and straightforward extensions of perceptron theory, mainly because no proper learning algorithm existed for them. The reviewed book extensively discusses an alternative way to generalize perceptrons so that they can classify patterns that are not linearly separable. The idea is very simple: the authors postulate that the weight coefficients may be complex numbers and that the corresponding activation function is determined as follows
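The abstract breaks off before stating the activation function itself. As an illustration of the general idea of a complex-valued, multi-valued activation, the following is a minimal sketch: the weighted sum $z$ is mapped onto one of $k$ unit-circle values according to the angular sector in which $\arg z$ falls. The function name and this particular discretization are assumptions for illustration, not quoted from the review or the book.

```python
import cmath
import math

def mvn_activation(z: complex, k: int) -> complex:
    """Sketch of a k-valued activation: map the complex weighted sum z
    to exp(i*2*pi*j/k), where j indexes the angular sector containing arg(z).
    (Illustrative only; the book's exact definition is not reproduced here.)"""
    # Normalize the argument into [0, 2*pi); note arg(0) is taken as 0 here.
    phi = cmath.phase(z) % (2 * math.pi)
    # Sector index j in 0..k-1.
    j = int(phi * k / (2 * math.pi))
    return cmath.exp(1j * 2 * math.pi * j / k)
```

For $k = 2$ this collapses to a binary output ($+1$ or $-1$ depending on the half-plane of $z$), corresponding to the binary case discussed in the book's title.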

Authors

Chen K; Kvasnicka V; Kanen PC; Haykin S

Journal

IEEE Transactions on Neural Networks and Learning Systems, Vol. 12, No. 3, pp. 647–647

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

May 1, 2001

DOI

10.1109/tnn.2001.925572

ISSN

2162-237X
