k-Nearest-Neighbour classifier.
This is a simple classifier that bases its decision on the distances
between the training dataset samples and the test sample(s). Distances
are computed using a customizable distance function. A certain number
(k) of nearest neighbours is selected based on the smallest distances,
and the labels of these neighbouring samples are fed into a voting
function to determine the label of the test sample.
Training a kNN classifier is extremely quick, as no actual training
is performed: the training dataset is simply stored in the
classifier. All computations are done during prediction.
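The procedure described above can be sketched as follows. This is a minimal, self-contained illustration of the prediction step, not PyMVPA's actual implementation; the helper names `squared_euclidean_distance` (matching the default `dfx`) and `knn_predict` are chosen for this example.

```python
import numpy as np

def squared_euclidean_distance(train_X, test_x):
    """Squared Euclidean distance from one test sample to each training row."""
    diff = train_X - test_x
    return np.sum(diff * diff, axis=-1)

def knn_predict(train_X, train_y, test_x, k=2, dfx=squared_euclidean_distance):
    """Predict the label of one test sample by simple majority vote
    over its k nearest training samples (ties broken by label order)."""
    dists = dfx(train_X, test_x)        # distance to every stored sample
    knn_ids = np.argsort(dists)[:k]     # indices of the k smallest distances
    labels, counts = np.unique(train_y[knn_ids], return_counts=True)
    return labels[np.argmax(counts)]    # most frequent neighbour label
```

Note that all the work happens here at prediction time; "training" would amount to storing `train_X` and `train_y`.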
__init__(self, k=2, dfx=squared_euclidean_distance, voting='weighted', **kwargs)
    Cheap initialization.
getMajorityVote(self, knn_ids)
    Simple voting by choosing the majority of class neighbours.
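Majority voting as described for getMajorityVote can be sketched like this. The function name and signature are illustrative, not the class's actual method; it takes the labels of the selected neighbours directly.

```python
from collections import Counter

def get_majority_vote(knn_labels):
    """Return the label occurring most often among the neighbours.
    most_common breaks ties by first occurrence in the input."""
    return Counter(knn_labels).most_common(1)[0][0]
```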
getWeightedVote(self, knn_ids)
    Vote with classes weighted by the number of samples per class.
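One plausible reading of the weighted vote, sketched below under the assumption that each neighbour's vote is scaled by the inverse frequency of its class in the training set, so over-represented classes do not dominate. The exact weighting used by getWeightedVote may differ; the function name and signature are illustrative.

```python
from collections import Counter

def get_weighted_vote(knn_labels, train_labels):
    """Weighted vote: each neighbour contributes 1 / (number of training
    samples of its class), compensating for class imbalance (assumption)."""
    class_counts = Counter(train_labels)
    scores = {}
    for lab in knn_labels:
        scores[lab] = scores.get(lab, 0.0) + 1.0 / class_counts[lab]
    return max(scores, key=scores.get)
```

With a heavily imbalanced training set, this can overturn a raw majority: two neighbours from a class with many training samples may be outvoted by one neighbour from a rare class.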
Inherited from base.Classifier:
    clone, getSensitivityAnalyzer, isTrained, predict, repredict,
    retrain, summary, train, trained
Inherited from misc.state.ClassWithCollections:
    __getattribute__, __new__, __setattr__, reset
Inherited from object:
    __delattr__, __hash__, __reduce__, __reduce_ex__