Multivariate Pattern Analysis in Python
k-Nearest-Neighbour classifier.
The comprehensive API documentation for this module, including all technical details, is available in the Epydoc-generated API reference for mvpa.clfs.knn (for developers).
Bases: mvpa.clfs.base.Classifier
k-Nearest-Neighbour classifier.
This is a simple classifier that bases its decision on the distances between the training dataset samples and the test sample(s). Distances are computed using a customizable distance function. A certain number (k) of nearest neighbours is selected based on the smallest distances, and the labels of these neighbouring samples are fed into a voting function to determine the label of the test sample.
Training a kNN classifier is extremely quick because no actual training is performed: the training dataset is simply stored in the classifier. All computations are done during classifier prediction.
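The procedure described above can be sketched in plain Python. This is an illustrative toy implementation under assumed simplifications (Euclidean distance, simple majority voting), not the actual PyMVPA code:

```python
# Minimal k-nearest-neighbour prediction sketch (illustrative only):
# "training" is just storing the samples; all work happens at prediction time.
import math
from collections import Counter

def knn_predict(train_samples, train_labels, test_sample, k=3):
    """Predict a label by majority vote among the k nearest training samples."""
    # Distance from the test sample to every training sample
    # (Euclidean here; PyMVPA lets you customize the distance function).
    dists = [
        (math.dist(sample, test_sample), label)
        for sample, label in zip(train_samples, train_labels)
    ]
    # Select the k samples with the smallest distances and vote on their labels.
    dists.sort(key=lambda pair: pair[0])
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

For example, with training samples `[(0, 0), (0, 1), (5, 5), (6, 5)]` labelled `['a', 'a', 'b', 'b']`, a test sample near the origin is assigned `'a'` and one near `(5, 5)` is assigned `'b'`.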
Note
If enabled, kNN stores the votes per class in the ‘values’ state after calling predict().
Note
Available state variables:
- feature_ids: Feature IDs which were used for the actual training.
- predicting_time+: Time (in seconds) the classifier took to predict
- predictions+: Most recent set of predictions
- trained_dataset: The dataset it has been trained on
- trained_labels+: Set of unique labels it has been trained on
- training_confusion: Confusion matrix of learning performance
- training_time+: Time (in seconds) the classifier took to train
- values+: Internal classifier values on which the most recent predictions are based
(States enabled by default are listed with +)
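As noted above, kNN can record the per-class votes behind each prediction. A hedged sketch of how such a voting function might expose both the winning label and the vote counts (the function name and return shape are assumptions for illustration, not the PyMVPA API):

```python
# Illustrative sketch: a voting function that returns the winning label
# together with the per-class vote counts, analogous to what the 'values'
# state variable records after predict().
from collections import Counter

def vote(neighbour_labels):
    """Return (winning_label, votes_per_class) for the k selected neighbours."""
    votes = Counter(neighbour_labels)
    winner, _ = votes.most_common(1)[0]
    return winner, dict(votes)
```

Calling `vote(['a', 'b', 'a'])` yields `('a', {'a': 2, 'b': 1})`.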
See also
Derived classes might provide additional methods via their base classes. Please refer to the list of base classes (if it exists) at the beginning of the kNN documentation.
Full API documentation of kNN in module mvpa.clfs.knn.