
Paper: IPM / Cognitive Sciences / 13144
School of Cognitive Sciences
Title: Boost-wise pre-loaded mixture of experts for classification tasks
Author(s):
1. R. Ebrahimpour
2. N. Sadeghnejad
3. S. Abbaszadeh Arani
4. N. Mohammadi
Status: Published
Journal: Neural Computing & Applications
Vol.: 22
Year: 2013
Pages: 365-377
Supported by: IPM
  Abstract:
This paper presents a modified version of the Boosted Mixture of Experts (BME). While previous related works, namely BME, attempt to improve performance by incorporating the complementary features of a hybrid combining framework, they suffer from several drawbacks. Analyzing the problems of these approaches suggested several modifications, which led us to propose a new method called Boost-wise Pre-loaded Mixture of Experts (BPME). We present a modification of the pre-loading (initialization) procedure of the mixture of experts (ME), which addresses the earlier problems and overcomes them by employing a two-stage pre-loading procedure. In this approach, both error and confidence measures are used as the difficulty criteria in the boost-wise partitioning of the problem space.
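
To make the abstract's idea concrete, the following is a minimal sketch of boost-wise pre-loading, assuming scikit-learn MLP classifiers as experts. The helper names, the equal weighting of the two difficulty criteria, and the hardest-half hand-off rule are illustrative assumptions, not the authors' exact procedure; BPME additionally trains a gating network over the pre-loaded experts, which is not shown here.

# Sketch: pre-load experts on boost-wise partitions of the problem
# space, ranked by the two difficulty criteria named in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

def difficulty_scores(clf, X, y):
    # Error criterion: 0/1 classification loss.
    # Confidence criterion: a low maximum posterior marks a hard sample.
    proba = clf.predict_proba(X)
    error = (proba.argmax(axis=1) != y).astype(float)
    confidence = proba.max(axis=1)
    return error + (1.0 - confidence)  # equal weighting is an assumption

X, y = make_classification(n_samples=600, n_features=20, random_state=0)

n_experts = 3
remaining = np.arange(len(y))  # indices of the current partition
experts = []
for _ in range(n_experts):
    # Pre-load (initialize) an expert on the current partition.
    expert = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                           random_state=0).fit(X[remaining], y[remaining])
    experts.append(expert)
    # Boost-wise partitioning: hand the hardest half of the current
    # partition on to the next expert.
    scores = difficulty_scores(expert, X[remaining], y[remaining])
    remaining = remaining[np.argsort(scores)[len(remaining) // 2:]]

# Illustrative combiner only (each partition is assumed to contain all
# classes); BPME would train a gating network over the experts instead.
avg_proba = np.mean([e.predict_proba(X) for e in experts], axis=0)
print("combined accuracy: %.3f" % (avg_proba.argmax(axis=1) == y).mean())

The design intent mirrors boosting: each successive expert is initialized on a region of comparable, increasing difficulty, so that hard samples (high error, low confidence) receive dedicated capacity before the mixture is trained as a whole.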
