A general insight into the effect of neuron structure on classification
Authors: Hadi Sadoghi Yazdi, Alireza Rowhanimanesh, Hamidreza Modares
Year: 2012
Abstract: This paper gives a general insight into how the neuron structure in a multilayer
perceptron (MLP) affects the ability of neurons to deal with classification. Most common
neuron structures are based on monotonic activation functions and linear input mappings. In
comparison, the proposed neuron structure utilizes a nonmonotonic activation function and/or
a nonlinear input mapping to increase the power of a neuron. An MLP built from these
higher-power neurons usually requires fewer hidden nodes than a conventional MLP for solving
classification problems. Fewer neurons means fewer network weights that must be optimally
determined by a learning algorithm, and reducing the number of weights, i.e., the dimension
of the search space, usually improves the performance of learning: it helps the algorithm
escape local optima and increases its convergence speed regardless of which learning
algorithm is used. Several two-dimensional examples, constructed by hand, visualize how the
number of neurons can be reduced by choosing an appropriate neuron structure. Moreover, to
show the efficiency of the proposed scheme on real-world classification problems, the Iris
data classification problem is solved using an MLP whose neurons are equipped with
nonmonotonic activation functions, and the result is compared with the results for two
well-known monotonic activation functions.
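For intuition about why a nonmonotonic activation can cut the neuron count, the following is a minimal sketch (not the paper's construction; the Gaussian activation, the hand-set weights, and the toy data below are illustrative assumptions): thresholding a Gaussian of the usual linear input z = w·x + b accepts a band between two parallel lines, a decision region that no single monotonic (sigmoid) neuron can realize.

import numpy as np

def sigmoid(z):
    # Monotonic activation: thresholding its output yields a single
    # half-plane decision region per neuron.
    return 1.0 / (1.0 + np.exp(-z))

def gaussian(z):
    # Nonmonotonic activation (an illustrative choice): the output rises
    # then falls, so thresholding it accepts a *band* between two parallel
    # lines instead of a half-plane.
    return np.exp(-z ** 2)

# Four 2-D points; class 1 lies in a narrow stripe around the line x + y = 0.
X = np.array([[0.0, 0.1], [0.2, -0.1], [1.0, 1.0], [-1.0, -1.0]])
y = np.array([1, 1, 0, 0])

w, b = np.array([1.0, 1.0]), 0.0  # hand-set linear input mapping z = w.x + b
z = X @ w + b

pred_gauss = (gaussian(2.0 * z) > 0.5).astype(int)  # one neuron suffices
pred_sigm = (sigmoid(z) > 0.5).astype(int)          # one half-plane only

print(pred_gauss)  # [1 1 0 0] -- matches y with a single neuron
print(pred_sigm)   # [1 1 1 0] -- a lone sigmoid neuron cannot fit the stripe

In the same spirit, the nonlinear input mappings mentioned in the abstract replace the linear z = w·x + b with a nonlinear function of the inputs, again enlarging the family of decision regions available to a single neuron.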
Keywords: Neuron structure; Nonmonotonic activation function; Nonlinear input mapping; Classification; Multilayer perceptron (MLP); Iris data classification
Full item record:
contributor author | Hadi Sadoghi Yazdi | en |
contributor author | Alireza Rowhanimanesh | en |
contributor author | Hamidreza Modares | en |
date accessioned | 2020-06-06T14:35:57Z | |
date available | 2020-06-06T14:35:57Z | |
date issued | 2012 | |
identifier uri | http://libsearch.um.ac.ir:80/fum/handle/fum/3403447?locale-attribute=en | |
language | English | |
title | A general insight into the effect of neuron structure on classification | en |
type | Journal Paper | |
contenttype | External Fulltext | |
subject keywords | Neuron structure; Nonmonotonic activation function; Nonlinear input mapping; Classification; Multilayer perceptron (MLP); Iris data classification | en |
journal title | Knowledge and Information Systems | en |
pages | 1-20 | |
journal volume | 28 | |
journal issue | 1 | |
identifier link | https://profdoc.um.ac.ir/paper-abstract-1022797.html | |
identifier articleid | 1022797 |