The Fréchet mean prototype is the point of a metric space $(M, d)$ that minimizes the sum of squared distances to all data points:

$$\mu_F = \arg\min_{y \in M} \sum_{i=1}^{n} d(x_i, y)^2$$

When the data being processed are not simple vectors (e.g., complex matrices, probability distributions, points on a curved surface) and cannot be directly summed and divided by $n$, the Fréchet mean becomes necessary.
The core value of the Fréchet mean prototype is that it generalizes the "averaging" operation from flat vector spaces to arbitrary curved spaces or complex objects, enabling many classical statistical methods and machine learning algorithms to handle non-vector data.
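For intuition, on the unit circle with geodesic (arc-length) distance, the Fréchet mean can be approximated by brute force. This is a sketch for illustration only; the grid search and the function name are assumptions, not a standard algorithm:

```python
import numpy as np

def frechet_mean_circle(angles, grid_size=100_000):
    """Fréchet mean of points on the unit circle, by brute-force search.

    Each data point is an angle in radians.  The metric is the geodesic
    (arc-length) distance, and we pick the candidate angle that
    minimizes the sum of squared geodesic distances.
    """
    candidates = np.linspace(0.0, 2 * np.pi, grid_size, endpoint=False)
    # Wrap each angular difference into [-pi, pi] to get the arc length.
    diff = candidates[:, None] - np.asarray(angles)[None, :]
    geodesic = np.abs((diff + np.pi) % (2 * np.pi) - np.pi)
    cost = (geodesic ** 2).sum(axis=1)
    return candidates[np.argmin(cost)]

# Angles clustered near 0, one of them expressed as a "negative" angle.
m = frechet_mean_circle([0.1, -0.1, 0.2])   # about 0.0667
```

Note that naively averaging the raw angle values $0.1$, $2\pi - 0.1$, $0.2$ lands far from the cluster, while the Fréchet mean under geodesic distance stays at about $0.067$, which is exactly the kind of failure of plain vector averaging the definition above addresses.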
Mean Prototype
The Mean Prototype of a set of vectors refers to a new vector obtained by arithmetically averaging each corresponding component of the vectors.
More specifically: suppose we have $n$ $d$-dimensional vectors

$$x_1, x_2, \ldots, x_n$$

where each vector $x_i = (x_{i1}, x_{i2}, \ldots, x_{id}) \in \mathbb{R}^d$.

The mean prototype is simply defined as:

$$\mu = \frac{1}{n} \sum_{i=1}^{n} x_i$$

That is:

$$\mu_j = \frac{1}{n} \sum_{i=1}^{n} x_{ij}, \quad j = 1, \ldots, d$$
Intuitively, the mean vector is the "center point" or "average position" of the set of vectors. In data analysis and machine learning (e.g., clustering, PCA), it is commonly used to represent the overall trend of a dataset. The mean prototype can be seen as the optimal representative point under Euclidean geometry and Gaussian assumptions; if the samples live on a unit sphere or follow a von Mises–Fisher (vMF) distribution, the Euclidean mean is no longer optimal.
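In NumPy, computing the mean prototype and checking its defining property (that it minimizes the total squared distance) takes a few lines; the data here are illustrative:

```python
import numpy as np

# Three 2-dimensional vectors, one per row.
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Mean prototype: component-wise arithmetic average.
mu = X.mean(axis=0)   # -> array([3., 4.])

def sum_sq_dist(y, X):
    """Sum of squared Euclidean distances from y to every row of X."""
    return ((X - y) ** 2).sum()

# The mean attains a lower total squared distance than a perturbed point.
assert sum_sq_dist(mu, X) <= sum_sq_dist(mu + 0.5, X)
```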
Geometric Median Prototype
The Geometric Median Prototype refers to: For a set of vectors (data points), find a point that minimizes the sum of Euclidean distances to all data points. This point is the geometric median of the set of vectors and serves as the "prototype" representing the entire dataset.
Formal definition: given $n$ $d$-dimensional vectors $x_1, \ldots, x_n \in \mathbb{R}^d$, the geometric median $m$ satisfies:

$$m = \arg\min_{y \in \mathbb{R}^d} \sum_{i=1}^{n} \|x_i - y\|_2$$

where $\|x_i - y\|_2$ is the Euclidean distance under the $L_2$ norm.
Intuitively, in 1D the geometric median is the ordinary median (minimizing the sum of absolute deviations). Extended to multiple dimensions, it is the point in space that minimizes the total straight-line distance to all data points: imagine choosing a gathering point among several cities that minimizes the total travel distance to all of them.
The Geometric Median Prototype is highly robust, which is its most important characteristic — even if the data contains a few outliers, the geometric median still reflects the center of the "majority" of points. It is equivariant under rotation and translation: if the data is rotated or translated, the geometric median changes accordingly. It can lie anywhere between data points (unlike K-medoids, which can only select actual data points).
Since there is no closed-form solution, the Weiszfeld algorithm (an iteratively reweighted average) is commonly used:

$$y^{(t+1)} = \frac{\displaystyle\sum_{i=1}^{n} \frac{x_i}{\|x_i - y^{(t)}\|_2}}{\displaystyle\sum_{i=1}^{n} \frac{1}{\|x_i - y^{(t)}\|_2}}$$

The initial value can be the mean vector, iterating until convergence. Note: special handling is required when $y^{(t)}$ coincides exactly with a data point (skip that point).
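A minimal NumPy sketch of the iteration. One assumption to flag: instead of skipping points that coincide with the estimate, this version floors distances at a small `eps`, a common practical variant of that special-case handling; the function name and defaults are illustrative:

```python
import numpy as np

def geometric_median(X, tol=1e-8, max_iter=1000, eps=1e-12):
    """Weiszfeld's iteratively reweighted averaging (a sketch).

    X : (n, d) array, one data point per row.  Starts from the mean
    vector; each step re-weights the points by the inverse of their
    distance to the current estimate.  Distances are floored at `eps`
    to avoid division by zero at coinciding points.
    """
    y = X.mean(axis=0)
    for _ in range(max_iter):
        d = np.maximum(np.linalg.norm(X - y, axis=1), eps)
        w = 1.0 / d
        y_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < tol:
            return y_new
        y = y_new
    return y
```

For four points at $(\pm 1, 0)$ and $(0, \pm 1)$ the iteration returns the origin immediately; with three points at the origin and one far outlier, the estimate stays essentially at the origin, illustrating the robustness property described below.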
When the data contain outliers or noise, it is used in place of the mean to represent "typical samples"; for the same reason, it serves as the cluster center in some clustering algorithms (more robust than K-means).
In video background modeling, the geometric median of multiple frames can serve as the background (unaffected by foreground moving objects). It is also used to find the optimal service point among multiple demand points (e.g., hospital, warehouse location).
The Geometric Median Prototype is the "spatial center" of data points, minimizing the total Euclidean distance to all points, and is more robust to outliers than the mean vector.
Euclidean Norm
The Euclidean norm is the most familiar notion of "vector distance". Different norms essentially define what distance or magnitude means, and different definitions directly change the geometric structure, the optimization behavior, and even a model's preferences. Generally, we refer to the $L_p$ norm, where $p \ge 1$ is a parameter.
For an $n$-dimensional vector $v = (v_1, v_2, \ldots, v_n)$, the $L_p$ norm is defined as:

$$\|v\|_p = \left( \sum_{i=1}^{n} |v_i|^p \right)^{1/p}$$

Different values of $p$ yield distances under different norms. When $p = 1$, we obtain the distance under the $L_1$ norm:

$$\|v\|_1 = \sum_{i=1}^{n} |v_i|$$

Intuitively, it is the sum of the absolute values of the components. In the 2D plane, the $L_1$ distance is the Manhattan distance (moving only along the coordinate axes). For example, $v = (3, -4)$ has $L_1$ norm $|3| + |-4| = 7$.

When $p = 2$, the distance under the $L_2$ norm is defined as:

$$\|v\|_2 = \sqrt{\sum_{i=1}^{n} v_i^2}$$

Intuitively, it is the straight-line distance from the origin to that point. For example, $v = (3, -4)$ has $L_2$ norm $\sqrt{9 + 16} = 5$. Geometrically, the vectors with $\|v\|_2 = 1$ lie on the unit circle.

In summary, the distance under the $L_1$ norm is the sum of absolute values per coordinate (Manhattan distance), which tends to yield sparse solutions. The $L_2$ norm is the square root of the sum of squares (Euclidean distance), smooth and easy to optimize. The general formula takes the $p$-th root of the sum of the $p$-th powers of the absolute components, approaching the maximum component $\max_i |v_i|$ as $p \to \infty$.
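These norms are easy to compare numerically; a short sketch with an illustrative vector:

```python
import numpy as np

v = np.array([3.0, -4.0])

l1 = np.abs(v).sum()              # L1 (Manhattan) norm: 7.0
l2 = np.sqrt((v ** 2).sum())      # L2 (Euclidean) norm: 5.0
p = 4
lp = (np.abs(v) ** p).sum() ** (1.0 / p)   # general Lp norm, here p = 4
linf = np.abs(v).max()            # limit as p -> infinity: 4.0

# numpy's built-in norm agrees with the formulas above.
assert np.isclose(l1, np.linalg.norm(v, 1))
assert np.isclose(l2, np.linalg.norm(v, 2))
# As p grows, the Lp norm shrinks toward the max-component norm.
assert linf < lp < l2
```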
Coordinate-wise Median Prototype
The Coordinate-wise Median Prototype refers to: For a multi-dimensional dataset, compute the median on each dimension (coordinate) separately, and then combine these medians into a new vector that serves as the "prototype" or central representative of the entire dataset.
Suppose we have $n$ $d$-dimensional vectors $x_1, \ldots, x_n$, where $x_i = (x_{i1}, x_{i2}, \ldots, x_{id})$.

The coordinate-wise median prototype is defined as:

$$m = \left( \underset{i}{\mathrm{median}}\; x_{i1},\; \underset{i}{\mathrm{median}}\; x_{i2},\; \ldots,\; \underset{i}{\mathrm{median}}\; x_{id} \right)$$

That is, each component independently takes the median of all data points on that dimension.
If the data are rotated by 45 degrees, the coordinate-wise median generally does not rotate with them (it is not rotation equivariant), whereas the geometric median does.
In computation, it only requires sorting each dimension ($O(n \log n)$) or using a linear-time selection algorithm such as quickselect ($O(n)$ per dimension), and it can resist up to 50% of the points being arbitrary extreme outliers.
However, the choice of coordinate axes affects the result: if the dimensions are correlated (e.g., a tilted ellipse of points), the coordinate-wise median can deviate markedly from the true center. It always lies within the value range of each dimension, so it remains within the axis-aligned bounding box of the data, but not necessarily within the convex hull of the data points (because each dimension is chosen independently). Its optimization objective, the sum of Manhattan ($L_1$) distances, treats each axis independently, unlike the geometric median, whose Euclidean objective is rotation invariant.
It is a good choice when the data contain outliers and the dimensions are relatively independent.
For example, in 3D space, take three vectors as the rows of the matrix

$$X = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 100 \end{pmatrix}$$

The median of the 1st dimension is $\mathrm{median}(1, 4, 7) = 4$, the 2nd dimension is $\mathrm{median}(2, 5, 8) = 5$, and the 3rd dimension is $\mathrm{median}(3, 6, 100) = 6$.

Its coordinate-wise median prototype is $(4, 5, 6)$. Note that the outlier $100$ in the 3rd dimension does not drag the prototype away, whereas the mean of that dimension would be about $36.3$.
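The per-dimension computation is a single NumPy call; a minimal sketch on an illustrative 3-point dataset with an outlier in the third dimension:

```python
import numpy as np

# Three 3-dimensional vectors, one per row; 100.0 is an extreme outlier.
X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 100.0]])

# Median of each column (dimension) independently.
proto = np.median(X, axis=0)   # -> array([4., 5., 6.])

# The mean, by contrast, is dragged by the outlier in the 3rd dimension.
mean = X.mean(axis=0)          # third component is about 36.3
```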
Extending to General Norms
Having defined the $L_p$ norms, let's revisit the definitions of the mean prototype, geometric median prototype, and coordinate-wise median prototype. Each can be written as the minimizer of a sum of normed distances.
The mean prototype is the point that minimizes the sum of squared Euclidean ($L_2$) distances to all points:

$$\mu = \arg\min_{y} \sum_{i=1}^{n} \|x_i - y\|_2^2$$

The geometric median prototype is the point that minimizes the sum of Euclidean ($L_2$) distances to all points:

$$m_{\mathrm{geo}} = \arg\min_{y} \sum_{i=1}^{n} \|x_i - y\|_2$$

The coordinate-wise median prototype is the point that minimizes the sum of Manhattan ($L_1$) distances to all points:

$$m_{\mathrm{med}} = \arg\min_{y} \sum_{i=1}^{n} \|x_i - y\|_1$$
At this point, we can unify all the above concepts in a single objective:

$$\arg\min_{y} \sum_{i=1}^{n} \|x_i - y\|_p^{\,q}$$

where $p$ selects the distance norm and $q$ is the power to which the distance under that norm is raised. $p$ determines how distance is measured, while $q$ determines the penalty on outliers: the larger $q$, the more sensitive to outliers. The mean prototype corresponds to $(p, q) = (2, 2)$, the geometric median to $(p, q) = (2, 1)$, and the coordinate-wise median to $(p, q) = (1, 1)$.
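To make the trade-off between $p$ and $q$ tangible, here is a brute-force grid-search sketch over a 2-D plane; the function `weber_prototype`, its grid bounds, and the dataset are all illustrative assumptions, and practical solvers would use Weiszfeld-type or gradient methods instead:

```python
import numpy as np

def weber_prototype(X, p, q, lo=-2.0, hi=12.0, steps=281):
    """Brute-force minimizer of sum_i ||x_i - y||_p ** q over a 2-D grid.

    A sketch for building intuition only, not a production solver.
    """
    ax = np.linspace(lo, hi, steps)
    gx, gy = np.meshgrid(ax, ax)
    cand = np.stack([gx.ravel(), gy.ravel()], axis=1)   # (m, 2) candidates
    diff = np.abs(cand[:, None, :] - X[None, :, :])     # (m, n, 2)
    dist = (diff ** p).sum(axis=2) ** (1.0 / p)         # Lp distances
    cost = (dist ** q).sum(axis=1)                      # generalized objective
    return cand[np.argmin(cost)]

# A tight cluster of three points plus one far outlier.
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0], [10.0, 10.0]])
mean_like   = weber_prototype(X, p=2, q=2)   # pulled toward the outlier
median_like = weber_prototype(X, p=2, q=1)   # stays near the cluster
```

With $q = 2$ the minimizer matches the arithmetic mean $(3.25, 2.75)$ up to grid resolution, while with $q = 1$ it stays near the three clustered points, illustrating how $q$ controls sensitivity to outliers.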
This formula also describes the core objective function of the Generalized Weber Location Problem.