Classification of Paper Quality Using Neural Networks
Now try a feedforward neural network. As with RBF networks, a sigmoidal nonlinearity
is applied at the output so that the
outputs are constrained to the interval 0 to 1. Because the
initialization and training processes are random, the results will vary each time
the commands are evaluated.
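The role of the output sigmoid can be sketched in a few lines of Python (an illustrative sketch, not the package's implementation): the logistic function squashes any real-valued network output into the interval (0, 1).

```python
import math

def sigmoid(x):
    """Logistic function: maps any real x into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Raw network outputs of any magnitude end up between 0 and 1.
for raw in (-10.0, 0.0, 10.0):
    print(raw, "->", sigmoid(raw))
```

Because the sigmoid saturates, very large positive or negative raw outputs map to values near 1 or 0, which is what makes the outputs usable as class memberships.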
Load the Neural Networks application package and the data.
Initialize a feedforward network without any hidden neurons.
Train the initialized network for 10 iterations.
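In spirit, a feedforward network with no hidden neurons is a single sigmoid unit per output, and training adjusts its weights by gradient descent. The following Python sketch (toy one-dimensional data and names of our own choosing, not the package's commands) trains such a unit for 10 iterations:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy training data: 1-D inputs with binary targets (hypothetical, for illustration).
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]

# A network without hidden neurons is one sigmoid unit: out = sigmoid(w*x + b).
random.seed(0)
w = random.uniform(-0.1, 0.1)
b = random.uniform(-0.1, 0.1)

lr = 0.5
for _ in range(10):                # train for 10 iterations, as in the example
    for x, y in zip(xs, ys):
        out = sigmoid(w * x + b)
        err = out - y              # cross-entropy gradient w.r.t. the pre-activation
        w -= lr * err * x          # gradient-descent weight update
        b -= lr * err

# After training, negative inputs map below 0.5 and positive inputs above it.
print(sigmoid(w * -2.0 + b), sigmoid(w * 2.0 + b))
```

The random initialization mirrors the remark above: rerunning without the fixed seed gives slightly different weights each time.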
The trained network can now be used to classify input vectors by applying the network to them.
Classify sample 27 of the validation data.
A crisp classification is obtained by setting all output values greater than 0.5 to True.
The 27th sample is correctly classified as class 3.
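Thresholding at 0.5 can be sketched as follows (the output values and the 1-based class indexing here are hypothetical, for illustration only):

```python
# Hypothetical network outputs for one sample: one value per class.
outputs = [0.12, 0.31, 0.87]

# Crisp classification: every output greater than 0.5 becomes True.
crisp = [o > 0.5 for o in outputs]
print(crisp)                        # → [False, False, True]

# The position of the True entry gives the predicted class (1-based here).
predicted_class = crisp.index(True) + 1
print(predicted_class)              # → 3
```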
As with VQ and RBF networks, classification with feedforward networks can be
illustrated by a bar chart, with
correctly classified data on the diagonal and incorrectly classified data on the off-diagonal bars.
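The bar chart is essentially a confusion matrix displayed as bars. A minimal Python sketch of the underlying counts, using hypothetical true and predicted labels:

```python
# Hypothetical true and predicted class labels for a small validation set.
true_labels = [1, 1, 2, 2, 3, 3, 3, 1]
pred_labels = [1, 2, 2, 2, 3, 1, 3, 1]

n = 3  # number of classes
# confusion[i][j] counts samples of true class i+1 predicted as class j+1.
confusion = [[0] * n for _ in range(n)]
for t, p in zip(true_labels, pred_labels):
    confusion[t - 1][p - 1] += 1

# Diagonal entries are correct classifications; off-diagonal entries are errors.
correct = sum(confusion[i][i] for i in range(n))
print(correct, "of", len(true_labels), "correctly classified")
```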
By choosing OutputNonlinearity -> UnitStep, the sigmoids at the outputs
are replaced by a discrete step, which gives a crisp classification directly.
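The step nonlinearity can be sketched like this (an illustrative Python analogue of the UnitStep option, not the package's code):

```python
def unit_step(x):
    """Discrete step: 0 for negative arguments, 1 otherwise."""
    return 0 if x < 0 else 1

# Unlike the sigmoid, the step maps every raw output directly to 0 or 1,
# so no separate 0.5 thresholding is needed for a crisp classification.
print([unit_step(x) for x in (-3.0, -0.1, 0.0, 2.5)])  # → [0, 0, 1, 1]
```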
Plot the classification result on the validation data.
Next, plot the classification performance improvement during training for each class.
Plot the progress of the classifier on the validation data.
You can repeat the example using different options for the neural network structure.
For example, you can introduce a layer of hidden neurons.
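A hidden layer inserts a vector of sigmoid units between the inputs and the output. A forward-pass sketch in Python (the weights below are arbitrary illustrative values, not trained ones):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, W1, b1, w2, b2):
    """One hidden sigmoid layer followed by a single sigmoid output unit."""
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sigmoid(sum(w * h for w, h in zip(w2, hidden)) + b2)

# Two inputs, three hidden neurons, one output (arbitrary example weights).
W1 = [[0.5, -0.3], [1.0, 0.2], [-0.7, 0.9]]
b1 = [0.1, -0.2, 0.0]
w2 = [0.8, -0.5, 0.3]
b2 = 0.05
print(forward([1.0, 2.0], W1, b1, w2, b2))
```

The hidden layer makes the decision boundary nonlinear, but it also introduces more parameters, which is why local minima become more of a concern, as discussed below.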
Three types of neural networks have been used to classify the paper quality data:
VQ, RBF, and feedforward.
What conclusion can you draw from the comparison?
As mentioned before, RBF networks often have problems with local
minima, especially if the dimension of the input space is high. To reduce this problem,
only three of the 15 available
dimensions were used. Of course, when 12 dimensions are neglected, there is a danger
that the remaining three
do not contain enough information to separate the classes. Hence, RBF networks
are not well suited to this problem.
A feedforward network may also have problems with local minima, especially if you
change the network to include a hidden
layer, but these problems are not as severe as they are for the RBF network.
You can test for problems by reevaluating the
example a few times.
Though the VQ network does not have any problems with local
minima in this example, it may have them with other data.
It is hard to say which classifier is the best, but the VQ network was the easiest one to train.