
Paper: IPM / Cognitive / 13146
School of Cognitive Sciences
Title: Improving combination method of NCL experts using gating network
Author(s):
1. R. Ebrahimpour
2. A. Abbaszadeh
3. S. Masoudnia
Status: Published
Journal: Neural Computing & Applications
Vol.: 22
Year: 2013
Pages: 95-101
Supported by: IPM
Abstract:
Negative Correlation Learning (NCL) is a popular combining method that employs a special error function for the simultaneous training of base neural network (NN) experts. In this article, we propose an improved version of the NCL method in which the capability of a gating network, the combining part of the Mixture of Experts method, is used to combine the base NNs in the NCL ensemble method. The special error function of the NCL method encourages each NN expert to learn different parts or aspects of the training data. Thus, the local competence of the experts should be considered in the combining approach. The gating network provides a way to support this needed functionality for combining the NCL experts, so the proposed method is called Gated NCL. The improved ensemble method is compared with the previous approaches used for combining NCL experts, including winner-take-all (WTA) and average (AVG) combining techniques, on several classification problems from the UCI machine learning repository. The experimental results show that our proposed ensemble method significantly improves performance over the previous combining approaches.
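
The following is a minimal sketch, not the authors' code, of the two ideas the abstract refers to: the NCL penalty term that decorrelates the errors of the base experts, and a softmax gating network (Mixture-of-Experts style) that weights experts per input instead of simple averaging. Function names, the penalty strength lambda, and the use of precomputed expert outputs are illustrative assumptions.

import numpy as np

def ncl_losses(expert_outputs, targets, lam=0.5):
    """Per-expert NCL loss: squared error plus the negative-correlation penalty.

    expert_outputs: (n_experts, n_samples) array of expert predictions F_i(x).
    targets:        (n_samples,) array of desired outputs d(x).
    lam:            penalty strength lambda (an assumed hyperparameter).
    """
    mean_out = expert_outputs.mean(axis=0)              # ensemble mean F_bar
    sq_err = 0.5 * (expert_outputs - targets) ** 2      # individual squared error
    # NCL penalty p_i = (F_i - F_bar) * sum_{j != i} (F_j - F_bar) = -(F_i - F_bar)^2
    penalty = -(expert_outputs - mean_out) ** 2
    return (sq_err + lam * penalty).mean(axis=1)        # mean loss per expert

def gated_combination(expert_outputs, gating_logits):
    """Combine expert outputs with per-sample softmax gating weights."""
    # gating_logits: (n_experts, n_samples) raw scores from a gating network
    w = np.exp(gating_logits - gating_logits.max(axis=0, keepdims=True))
    w /= w.sum(axis=0, keepdims=True)                   # per-sample softmax
    return (w * expert_outputs).sum(axis=0)             # gated ensemble output

# Toy usage: three experts, five samples, with stand-in outputs and targets.
rng = np.random.default_rng(0)
F = rng.normal(size=(3, 5))
d = rng.normal(size=5)
print(ncl_losses(F, d))
print(gated_combination(F, rng.normal(size=(3, 5))))

In this sketch the gating weights replace the uniform weights of the AVG combiner and the hard 0/1 weights of the WTA combiner, which is the change the abstract describes.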
