#include <ProximityMeasure.h>
Inheritance diagram for prapi::JDDistance< T >:
It is defined as JD = sum_{i=1..N}(S_{i}ln(2*S_{i}/(S_{i}+M_{i})) + M_{i}ln(2*M_{i}/(S_{i}+M_{i}))), where S and M are the sample and model distributions, respectively, and N is the length of the distributions. If the normalization flag is set to true, the distributions are normalized prior to the distance calculation.
Public Methods  
JDDistance ()  
Construct a new Jeffrey's Divergence proximity measure.  
void  setMinValue (double value) throw (InvalidArgumentException&) 
Set the minimum value for a distribution entry that will be encountered.  
double  getMinValue (void) const 
Get the minimum value that is substituted for zero-valued feature vector entries.  
void  setNormalized (bool normalize) 
Set the normalized state.  
bool  isNormalized (void) 
Get the normalized state.  
double  getProximity (const util::List< T > &sample, const util::List< T > &model, double stopAfter=MAXDOUBLE) const throw (ProximityException&) 
Proximity between two lists. 

Construct a new Jeffrey's Divergence proximity measure. The measure is normalized by default. 

Proximity between two lists.
Implements prapi::ProximityMeasure< T >. 

Set the minimum value for a distribution entry that will be encountered. All zeros in histograms will be replaced with this value in proximity calculations, because otherwise the logarithm of zero would have to be evaluated. Setting this value is particularly important with sparse distributions. If your distribution is a histogram calculated from a matrix, you may want to set this value to 1/(rows*cols).


Set the normalized state. If true, then all feature vectors given to this proximity measure are normalized prior to calculating the proximity. If false, then the feature vectors are assumed to be normalized a priori. In the latter case, the computational performance of the measure will be significantly higher. The default value is true. 