Inheritance diagram for prapi::neuro::SOM< I, C >:
Typically, a SOM is trained in two phases. First, a "rough" ordering pass is performed with a relatively large learning rate (~0.05), a relatively large radius (e.g. 10, depending on the size of the map), and a relatively small number of iterations (depending on the number of training samples). Then, fine-tuning is performed with a smaller learning rate (~0.02), a smaller radius (e.g. 3), and a larger number of iterations.
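The two-phase schedule can be sketched with a deliberately tiny, self-contained 1-D SOM. This is a conceptual illustration only: the real `prapi::neuro::SOM` is two-dimensional and templated, and exposes the schedule through `setLearningRate()`, `setRadius()`, `setLearningLength()`, and `train()` rather than through the hypothetical `trainPhase` helper below.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Conceptual sketch, not the prapi API: one SOM training phase on a 1-D
// map with a fixed learning rate and a bubble-style integer radius.
std::vector<double> trainPhase(std::vector<double> codebook,
                               const std::vector<double>& data,
                               double rate, int radius, long length) {
    for (long t = 0; t < length; ++t) {
        double x = data[t % data.size()];
        // Winning node: the code vector closest to the sample.
        int win = 0;
        for (std::size_t i = 1; i < codebook.size(); ++i)
            if (std::fabs(codebook[i] - x) < std::fabs(codebook[win] - x))
                win = static_cast<int>(i);
        // Adapt the winner and its neighbors within the radius.
        int lo = std::max(0, win - radius);
        int hi = std::min(static_cast<int>(codebook.size()) - 1, win + radius);
        for (int i = lo; i <= hi; ++i)
            codebook[i] += rate * (x - codebook[i]);
    }
    return codebook;
}
```

A rough pass would then be `trainPhase(cb, data, 0.05, 10, n)` followed by a fine-tuning pass such as `trainPhase(cb, data, 0.02, 3, 4 * n)`, mirroring the parameter values suggested above.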
|SOM (int xSize=1, int ySize=1)|
|Create a new SOM with the given size.|
|void||initialize (const util::List< Sample< double, I, C > > &initialCodebook) throw (InvalidArgumentException&)|
|Initialize the code book. |
|void||train (const util::List< Sample< double, I, C > > &trainingSamples) throw (ClassificationException&)|
|Train the SOM with the given training data.|
|void||label (const util::List< Sample< double, I, C > > &trainingSamples) throw (ClassificationException&)|
|Give class labels to SOM nodes. |
|void||setRadius (double radius)|
|Set the learning radius. |
|double||getRadius () const|
|Get the learning radius.|
|void||setTopology (SOMTopology topology)|
|Set the topology of the network.|
|SOMTopology||getTopology () const|
|Get the topology of the network.|
|void||setLearningLength (long length)|
|Set the number of iterations in training.|
|void||setLearningRate (double rate)|
|Set learning rate.|
|void||setLearningRateFunction (SOMLearningRate func)|
|Set the type of learning rate change in training.|
|long||getLearningLength () const|
|Get the number of iterations in training.|
|double||getLearningRate () const|
|Get learning rate.|
|SOMLearningRate||getLearningRateFunction () const|
|Get the type of learning rate change in training.|
|int||getWidth () const|
|Get the width of the SOM.|
|int||getHeight () const|
|Get the height of the SOM.|
|void||writeSOM_PAKFile (std::string fileName, std::string neighborhoodName="bubble", const util::List< std::string > &classNames=util::List< std::string >(0)) throw (util::io::IOException&)|
|Write a SOM_PAK formatted SOM description to a file. |
|virtual void||adaptVector (Sample< double, I, C > &code, const Sample< double, I, C > &sample, double alpha)|
|Adapt a code vector towards a sample. |
Adapt a code vector towards a sample.
The default implementation calculates a weighted average of the sample and code vectors.
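The weighted average amounts to the standard SOM update rule, code = (1 - alpha) * code + alpha * sample. The sketch below applies it element-wise; the `std::vector` signature is an illustrative stand-in for prapi's `Sample< double, I, C >` type.

```cpp
#include <vector>

// Sketch of the default adaptVector behaviour: move the code vector
// towards the sample by the fraction alpha, element by element.
// Illustrative signature; prapi uses Sample<double, I, C>, not std::vector.
void adaptVector(std::vector<double>& code,
                 const std::vector<double>& sample, double alpha) {
    for (std::size_t i = 0; i < code.size(); ++i)
        code[i] = (1.0 - alpha) * code[i] + alpha * sample[i];
}
```

With alpha = 0 the code vector is unchanged; with alpha = 1 it is replaced by the sample outright.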
Initialize the code book.
The number of code vectors must be equal to the size of the SOM, i.e. width*height. Typically, the initial code book consists of samples selected randomly from training data.
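Random selection from the training data can be sketched as below. The `randomCodebook` name and signature are hypothetical, not part of the prapi API; the point is only that exactly width * height vectors are drawn.

```cpp
#include <random>
#include <vector>

// Hypothetical helper: draw width * height initial code vectors at
// random (with replacement) from the training data.
std::vector<std::vector<double>> randomCodebook(
        const std::vector<std::vector<double>>& training,
        int width, int height, unsigned seed) {
    std::mt19937 gen(seed);
    std::uniform_int_distribution<std::size_t> pick(0, training.size() - 1);
    std::vector<std::vector<double>> codebook;
    codebook.reserve(static_cast<std::size_t>(width) * height);
    for (int i = 0; i < width * height; ++i)
        codebook.push_back(training[pick(gen)]);
    return codebook;
}
```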
Give class labels to SOM nodes.
A simple voting procedure is used: each training sample is assigned to its closest node, and each node is labeled with the most common class among the samples assigned to it.
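The voting idea can be illustrated on a 1-D map of scalar nodes. Both function names and signatures below are illustrative, not the prapi interface; nodes that receive no samples are left with the sentinel label -1 here, which is purely an assumption of this sketch.

```cpp
#include <cmath>
#include <map>
#include <vector>

// Nearest node to a scalar sample (illustrative, 1-D).
int nearestNode(const std::vector<double>& nodes, double sample) {
    int best = 0;
    for (std::size_t i = 1; i < nodes.size(); ++i)
        if (std::fabs(nodes[i] - sample) < std::fabs(nodes[best] - sample))
            best = static_cast<int>(i);
    return best;
}

// Label each node with the most common class among its assigned samples.
// Nodes with no samples keep the sentinel label -1 (sketch-only choice).
std::vector<int> labelNodes(const std::vector<double>& nodes,
                            const std::vector<double>& samples,
                            const std::vector<int>& classes) {
    std::vector<std::map<int, int>> votes(nodes.size());
    for (std::size_t s = 0; s < samples.size(); ++s)
        ++votes[nearestNode(nodes, samples[s])][classes[s]];
    std::vector<int> labels(nodes.size(), -1);
    for (std::size_t n = 0; n < nodes.size(); ++n) {
        int bestCount = 0;
        for (const auto& kv : votes[n])
            if (kv.second > bestCount) {
                bestCount = kv.second;
                labels[n] = kv.first;
            }
    }
    return labels;
}
```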
Set the learning radius.
The radius affects the number of code vectors that are adapted in the neighborhood of the winning node.
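One common interpretation, used for instance by SOM_PAK's "bubble" neighborhood, is that a node is adapted exactly when its grid distance to the winning node is within the radius. The helper below assumes Euclidean grid distance; the precise rule used by prapi may differ.

```cpp
#include <cmath>

// Bubble-style neighborhood test (assumption: Euclidean grid distance):
// node (x, y) is adapted when it lies within `radius` of winner (wx, wy).
bool inNeighborhood(int x, int y, int wx, int wy, double radius) {
    double dx = x - wx;
    double dy = y - wy;
    return std::sqrt(dx * dx + dy * dy) <= radius;
}
```

A larger radius therefore drags more of the map towards each sample, which is why the rough phase above uses a radius of ~10 and the fine-tuning phase only ~3.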
Write a SOM_PAK formatted SOM description to a file.
SOM_PAK is a well-known C implementation of the SOM from the Helsinki University of Technology.