
Local outlier factor
In anomaly detection, the local outlier factor (LOF) is an algorithm proposed by Markus M. Breunig, Hans-Peter Kriegel, Raymond T. Ng and Jörg Sander in 2000 for finding anomalous data points by measuring the local deviation of a given data point with respect to its neighbours.[1]
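For illustration, the following is a minimal sketch of LOF-style outlier scoring, assuming the scikit-learn library is available; the synthetic data, the choice of 20 neighbours, and the printed diagnostics are illustrative only.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

# Synthetic data: a dense cluster of inliers plus a few scattered points.
rng = np.random.default_rng(0)
X_inliers = rng.normal(0.0, 0.5, size=(100, 2))
X_outliers = rng.uniform(-4.0, 4.0, size=(5, 2))
X = np.vstack([X_inliers, X_outliers])

# LOF compares each point's local density with that of its k nearest
# neighbours; points in substantially sparser regions get a high factor.
lof = LocalOutlierFactor(n_neighbors=20)
labels = lof.fit_predict(X)              # -1 marks points flagged as outliers
scores = -lof.negative_outlier_factor_   # LOF values; values well above 1 indicate outliers

print(labels[-5:])   # the scattered points should mostly be labelled -1
print(scores[-5:])
```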
LOF shares some concepts with DBSCAN and OPTICS, such as "core distance" and "reachability distance", which are used for local density estimation.[2]
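As a sketch of one of these shared concepts, the reachability distance of a point A from a point B can be written as follows, where k-distance(B) denotes the distance from B to its k-th nearest neighbour and d(A, B) is the underlying distance; the precise definitions follow the original paper:

$$\text{reachability-dist}_k(A, B) = \max\{\text{k-distance}(B),\ d(A, B)\}$$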