Symbolic and Neural Learning Algorithms: An Experimental Comparison

dc.contributor.author: Shavlik, Jude W.
dc.contributor.author: Mooney, Raymond J.
dc.contributor.author: Towell, Geoffrey G.
dc.date.accessioned: 2012-03-15T16:50:50Z
dc.date.available: 2012-03-15T16:50:50Z
dc.date.created: 1989
dc.date.issued: 1989
dc.description.abstract: Although many symbolic and neural network (connectionist) learning algorithms address the same problem of learning from classified examples, little is known about their comparative strengths and weaknesses. Experiments comparing the ID3 symbolic learning algorithm with the perceptron and back-propagation neural learning algorithms were performed using several large real-world data sets. Back-propagation performs about as well as the other two algorithms in terms of classification correctness on new examples, but takes much longer to train. The effects of the amount of training data, of imperfect training examples, and of the encoding of the desired outputs are also analyzed empirically. Suggestions for handling imperfect data sets are described and empirically justified. The symbolic and neural approaches work equally well in the presence of noise, while back-propagation does better when examples are incompletely specified. Back-propagation is better able to exploit a distributed output encoding, although ID3 can also take advantage of this representation style.
dc.format.mimetype: application/pdf
dc.identifier.citation: TR857
dc.identifier.uri: http://digital.library.wisc.edu/1793/59144
dc.publisher: University of Wisconsin-Madison Department of Computer Sciences
dc.title: Symbolic and Neural Learning Algorithms: An Experimental Comparison
dc.type: Technical Report
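The abstract compares ID3 with the perceptron and back-propagation. As a hedged illustration of the classic perceptron update rule referenced there (a minimal sketch on a toy AND dataset; this is not the report's implementation, and all names here are invented for illustration):

```python
# Minimal perceptron sketch (illustrative only; not the report's code).
# Learns logical AND, which is linearly separable, via the standard
# error-driven weight update: w <- w + lr * (target - output) * x.

def train_perceptron(examples, epochs=25, lr=0.1):
    """examples: list of (input tuple, target) pairs with targets in {0, 1}."""
    n = len(examples[0][0])
    w = [0.0] * n          # weights, initialized to zero
    b = 0.0                # bias term
    for _ in range(epochs):
        for x, t in examples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y    # error signal drives the weight update
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy training set: logical AND of two binary inputs.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

By the perceptron convergence theorem, the loop reaches a separating hyperplane on this linearly separable set; back-propagation, by contrast, trains a multi-layer network by gradient descent and is not restricted to linearly separable problems.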

Files

Original bundle (1 file):
TR857.pdf — 3.96 MB, Adobe Portable Document Format