Parsimonious Side Propagation

Authors

Mangasarian, O.L.
Bradley, P.S.

Type

Technical Report

Abstract

A fast parsimonious linear-programming-based algorithm for training neural networks is proposed that suppresses redundant features while using a minimal number of hidden units. This is achieved by propagating sideways to newly added hidden units the task of separating successive groups of unclassified points. Computational results show improvements of 26.53% and 19.76% in tenfold cross-validation test correctness over a parsimonious perceptron on two publicly available datasets.
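The abstract's incremental idea can be illustrated with a toy sketch. This is not the authors' linear-programming formulation; it substitutes a plain perceptron update rule for the LP, and the function names (`train_perceptron`, `side_propagation`) and parameters (`epochs`, `lr`, `max_units`) are illustrative assumptions. The point it shows is the "sideways propagation" loop: each newly added hidden unit is trained only on the points the earlier units left unclassified.

```python
# Illustrative sketch only: each new "hidden unit" is a simple perceptron
# (standing in for the paper's LP subproblem) trained on the points that
# remain unclassified by the units added so far.

def train_perceptron(points, labels, epochs=50, lr=0.1):
    """Train one linear unit w.x + b with the classic perceptron rule."""
    dim = len(points[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in zip(points, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b


def side_propagation(points, labels, max_units=5):
    """Add hidden units one at a time; each one takes over the task of
    separating the group of points the previous units misclassified."""
    units = []
    remaining = list(range(len(points)))
    while remaining and len(units) < max_units:
        sub_pts = [points[i] for i in remaining]
        sub_lab = [labels[i] for i in remaining]
        w, b = train_perceptron(sub_pts, sub_lab)
        units.append((w, b))
        # Keep only the points the newest unit still misclassifies.
        still = [
            i for i in remaining
            if (1 if sum(wi * xi for wi, xi in zip(w, points[i])) + b > 0
                else -1) != labels[i]
        ]
        if len(still) == len(remaining):  # no progress: stop adding units
            break
        remaining = still
    return units
```

On a linearly separable set a single unit suffices, so no further units are added; on harder data each pass hands the leftover points sideways to a fresh unit, keeping the hidden layer small.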

Citation

97-11
