
prapi::neuro::SOM< I, C > Class Template Reference

#include <SOM.h>

Inheritance diagram for prapi::neuro::SOM< I, C >:

prapi::VQClassifier< I, C >, prapi::VectorQuantizer, prapi::Classifier< double, I, C >, Object, EventSource< ClassificationEvent< double, I, C > >, Object

Detailed Description

template<class I = std::string, class C = int>
class prapi::neuro::SOM< I, C >

An implementation of the self-organizing map (Kohonen map).

Typically, a SOM is trained in two phases. First, a "rough" training pass is performed with a relatively large learning rate (~0.05), a relatively large radius (e.g. 10, depending on the size of the map), and a relatively low number of iterations (depending on the number of training samples). Then, fine-tuning is performed with a smaller learning rate (~0.02), a smaller radius (e.g. 3), and a larger number of iterations.
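As a hedged illustration of this two-phase scheme, the sketch below strings together the methods documented on this page. The fully qualified names prapi::util::List and prapi::Sample are assumptions (only the unqualified names appear here), the training list and initial codebook are presumed to have been built elsewhere, and the iteration counts are placeholders.

  #include <SOM.h>

  // Sketch only: prapi::util::List and prapi::Sample are assumed qualified names.
  typedef prapi::Sample<double, std::string, int> TrainingSample;
  typedef prapi::util::List<TrainingSample> SampleList;

  void trainTwoPhase(prapi::neuro::SOM<>& som,
                     const SampleList& trainingSamples,
                     const SampleList& initialCodebook)
  {
    // The codebook must contain exactly width*height vectors (see initialize()).
    som.initialize(initialCodebook);

    // Phase 1: rough ordering -- large rate, large radius, few iterations.
    som.setLearningRate(0.05);
    som.setRadius(10);
    som.setLearningLength(10000);      // placeholder value
    som.train(trainingSamples);

    // Phase 2: fine-tuning -- small rate, small radius, more iterations.
    som.setLearningRate(0.02);
    som.setRadius(3);
    som.setLearningLength(100000);     // placeholder value
    som.train(trainingSamples);

    // Finally, attach class labels to the map nodes by voting.
    som.label(trainingSamples);
  }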


Public Methods

 SOM (int xSize=1, int ySize=1)
 Create a new SOM with the given size.

void initialize (const util::List< Sample< double, I, C > > &initialCodebook) throw (InvalidArgumentException&)
 Initialize the code book.

void train (const util::List< Sample< double, I, C > > &trainingSamples) throw (ClassificationException&)
 Train the SOM with the given training data.

void label (const util::List< Sample< double, I, C > > &trainingSamples) throw (ClassificationException&)
 Give class labels to SOM nodes.

void setRadius (double radius)
 Set the learning radius.

double getRadius () const
 Get the learning radius.

void setTopology (SOMTopology topology)
 Set the topology of the network.

SOMTopology getTopology () const
 Get the topology of the network.

void setLearningLength (long length)
 Set the number of iterations in training.

void setLearningRate (double rate)
 Set learning rate.

void setLearningRateFunction (SOMLearningRate func)
 Set the type of learning rate change in training.

long getLearningLength () const
 Get the number of iterations in training.

double getLearningRate () const
 Get learning rate.

double getLearningRateFunction () const
 Get the type of learning rate change in training.

int getWidth () const
 Get the width of the SOM.

int getHeight () const
 Get the height of the SOM.

virtual ~SOM ()
void writeSOM_PAKFile (std::string fileName, std::string neighborhoodName="bubble", const util::List< std::string > &classNames=util::List< std::string >(0)) throw (util::io::IOException&)
 Write a SOM_PAK formatted SOM description to a file.


Protected Methods

virtual void adaptVector (Sample< double, I, C > &code, const Sample< double, I, C > &sample, double alpha)
 Adapt a code vector towards a sample.


Member Function Documentation

template<class I, class C>
void prapi::neuro::SOM< I, C >::adaptVector (Sample< double, I, C > & code, const Sample< double, I, C > & sample, double alpha)  [protected, virtual]

Adapt a code vector towards a sample.

The default implementation calculates a weighted average of the sample and code vectors.

Parameters:
code  the code vector to be adapted
sample  the training sample according to which the code vector is to be adapted
alpha  adaptation factor, obtained by multiplying the current learning rate by the value of the neighborhood function at the position of the code vector.
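For intuition, a minimal stand-alone sketch of such a weighted average follows; it uses a plain std::vector<double> in place of the feature part of Sample< double, I, C > and is not the library's code.

  #include <vector>
  #include <cstddef>

  // Illustration only: std::vector<double> stands in for the feature vector of
  // a Sample. The code vector is moved a fraction alpha towards the sample.
  void adaptTowards(std::vector<double>& code,
                    const std::vector<double>& sample,
                    double alpha)
  {
    for (std::size_t i = 0; i < code.size(); ++i)
      code[i] = (1.0 - alpha) * code[i] + alpha * sample[i];  // weighted average
  }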

template<class I, class C>
void prapi::neuro::SOM< I, C >::initialize (const util::List< Sample< double, I, C > > & initialCodebook)  throw (InvalidArgumentException&)

Initialize the code book.

The number of code vectors must be equal to the size of the SOM, i.e. width*height. Typically, the initial code book consists of samples selected randomly from training data.
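As a sketch of one way to build such an initial codebook (plain std::vector is used in place of util::List and Sample; this is not part of the library):

  #include <vector>
  #include <random>
  #include <algorithm>
  #include <cstddef>

  // Pick width*height vectors at random from the training data to seed the
  // codebook. Assumes the training set holds at least width*height samples.
  std::vector<std::vector<double> >
  makeInitialCodebook(const std::vector<std::vector<double> >& trainingData,
                      int width, int height, std::mt19937& rng)
  {
    std::vector<std::vector<double> > codebook(trainingData);
    std::shuffle(codebook.begin(), codebook.end(), rng);
    codebook.resize(static_cast<std::size_t>(width) * height);
    return codebook;
  }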

template<class I, class C>
void prapi::neuro::SOM< I, C >::label (const util::List< Sample< double, I, C > > & trainingSamples)  throw (ClassificationException&)

Give class labels to SOM nodes.

A simple voting procedure is used here: each training sample is assigned to the closest node, and each node gets a class label of the most common class in it.
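The stand-alone sketch below illustrates the idea of that voting scheme with plain std::vector containers (it is not the library implementation): each sample votes for the node whose code vector is closest, and each node takes the majority class.

  #include <vector>
  #include <map>
  #include <limits>
  #include <cstddef>

  // Squared Euclidean distance is enough for finding the nearest code vector.
  std::size_t nearestNode(const std::vector<std::vector<double> >& codebook,
                          const std::vector<double>& x)
  {
    std::size_t best = 0;
    double bestDist = std::numeric_limits<double>::max();
    for (std::size_t n = 0; n < codebook.size(); ++n) {
      double d = 0.0;
      for (std::size_t i = 0; i < x.size(); ++i) {
        const double diff = codebook[n][i] - x[i];
        d += diff * diff;
      }
      if (d < bestDist) { bestDist = d; best = n; }
    }
    return best;
  }

  // Each sample votes for its nearest node; each node takes its majority class.
  // Nodes that receive no samples keep the label -1.
  std::vector<int> labelByVoting(const std::vector<std::vector<double> >& codebook,
                                 const std::vector<std::vector<double> >& samples,
                                 const std::vector<int>& classes)
  {
    std::vector<std::map<int, int> > votes(codebook.size());
    for (std::size_t s = 0; s < samples.size(); ++s)
      ++votes[nearestNode(codebook, samples[s])][classes[s]];

    std::vector<int> labels(codebook.size(), -1);
    for (std::size_t n = 0; n < codebook.size(); ++n) {
      int bestCount = 0;
      for (std::map<int, int>::const_iterator it = votes[n].begin();
           it != votes[n].end(); ++it)
        if (it->second > bestCount) { bestCount = it->second; labels[n] = it->first; }
    }
    return labels;
  }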

template<class I = std::string, class C = int>
void prapi::neuro::SOM< I, C >::setRadius (double radius)  [inline]

Set the learning radius.

The radius affects the number of code vectors that are adapted in the neighborhood of the winning node.
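As an illustration only (the exact neighborhood function of this class is not documented on this page; "bubble" appears here merely as the default name in writeSOM_PAKFile()), a bubble-style neighborhood could look like the sketch below: nodes within the radius of the winner receive the full adaptation factor, nodes outside it receive none. The product of this value and the learning rate would be the alpha passed to adaptVector().

  #include <cmath>

  // Hypothetical bubble neighborhood: 1 inside the radius, 0 outside.
  double bubbleNeighborhood(int winnerX, int winnerY,
                            int nodeX, int nodeY, double radius)
  {
    const double dx = nodeX - winnerX;
    const double dy = nodeY - winnerY;
    return std::sqrt(dx * dx + dy * dy) <= radius ? 1.0 : 0.0;
  }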

template<class I, class C>
void prapi::neuro::SOM< I, C >::writeSOM_PAKFile (std::string fileName, std::string neighborhoodName = "bubble", const util::List< std::string > & classNames = util::List< std::string >(0))  throw (util::io::IOException&)

Write a SOM_PAK formatted SOM description to a file.

SOM_PAK is a well-known SOM software package developed at Helsinki University of Technology.

Parameters:
fileName  the name of the file to write into
neighborhoodName  the type of the neighborhood function, given as text. Change the default only if you know what you are doing.
classNames  a list of textual names for the class indices. If this list is empty, class indices will be used as class names.
Exceptions:
IOException  if the file cannot be written to
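
A hedged usage sketch follows; the qualified exception name prapi::util::io::IOException and the output file name are assumptions, and the map is presumed to have been trained and labeled already.

  #include <SOM.h>

  // Export a trained and labeled map; with no class name list given, the
  // numeric class indices are written as such.
  void exportMap(prapi::neuro::SOM<>& som)
  {
    try {
      som.writeSOM_PAKFile("map.cod");           // default "bubble" neighborhood
    } catch (prapi::util::io::IOException&) {
      // The file could not be written to; handle or report the error here.
    }
  }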


The documentation for this class was generated from the following file: SOM.h
Documentation generated on 11.09.2003 with Doxygen.
The documentation is copyrighted material.
Copyright © Topi Mäenpää 2003. All rights reserved.