tslearn.metrics.cdist_soft_dtw_normalized¶

tslearn.metrics.cdist_soft_dtw_normalized(dataset1, dataset2=None, gamma=1.0)[source]¶

Compute cross-similarity matrix using a normalized version of the Soft-DTW metric.
Soft-DTW was originally presented in [1] and is discussed in more detail in our user-guide page on DTW and its variants.
Soft-DTW is computed as:
\[\text{soft-DTW}_{\gamma}(X, Y) = \min_{\pi}{}^\gamma \sum_{(i, j) \in \pi} \|X_i - Y_j\|^2\]

where \(\min^\gamma\) is the soft-min operator of parameter \(\gamma\).
In the limit case \(\gamma = 0\), \(\min^\gamma\) reduces to a hard-min operator and soft-DTW is defined as the square of the DTW similarity measure.
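To make the soft-min recurrence concrete, here is a minimal plain-numpy sketch of the soft-DTW dynamic program (illustrative only, not tslearn's implementation; the function names are hypothetical). With a small \(\gamma\) the soft-min approaches a hard min, so the value approaches the squared-DTW score:

```python
import numpy as np

def softmin3(a, b, c, gamma):
    """Soft-min of three values: -gamma * log(sum(exp(-x / gamma)))."""
    z = -np.array([a, b, c]) / gamma
    zmax = z.max()  # log-sum-exp shift for numerical stability
    return -gamma * (zmax + np.log(np.exp(z - zmax).sum()))

def soft_dtw_sketch(X, Y, gamma=1.0):
    """Naive O(n*m) soft-DTW between two series given as (length, dim) arrays."""
    n, m = len(X), len(Y)
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.sum((X[i - 1] - Y[j - 1]) ** 2)  # squared Euclidean
            R[i, j] = cost + softmin3(R[i - 1, j], R[i, j - 1],
                                      R[i - 1, j - 1], gamma)
    return R[n, m]
```

Because the soft-min is a smooth lower bound on the hard min, this value never exceeds the squared-DTW score and converges to it as \(\gamma \to 0\).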
This normalized version is defined as:
\[\text{norm-soft-DTW}_{\gamma}(X, Y) = \text{soft-DTW}_{\gamma}(X, Y) - \frac{1}{2} \left(\text{soft-DTW}_{\gamma}(X, X) + \text{soft-DTW}_{\gamma}(Y, Y)\right)\]

and ensures that all returned values are positive and that \(\text{norm-soft-DTW}_{\gamma}(X, X) = 0\).
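The normalization is a one-line correction on top of any soft-DTW routine; the self-comparison terms cancel exactly when both inputs are the same series. A self-contained numpy sketch (illustrative names, not tslearn's code):

```python
import numpy as np

def soft_dtw(X, Y, gamma=1.0):
    """Naive soft-DTW DP with a log-sum-exp soft-min (sketch, not tslearn's code)."""
    n, m = len(X), len(Y)
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.sum((X[i - 1] - Y[j - 1]) ** 2)
            z = -np.array([R[i - 1, j], R[i, j - 1], R[i - 1, j - 1]]) / gamma
            zmax = z.max()
            R[i, j] = cost - gamma * (zmax + np.log(np.exp(z - zmax).sum()))
    return R[n, m]

def soft_dtw_normalized(X, Y, gamma=1.0):
    """norm-soft-DTW(X, Y) = soft-DTW(X, Y) - (soft-DTW(X, X) + soft-DTW(Y, Y)) / 2"""
    return soft_dtw(X, Y, gamma) - 0.5 * (soft_dtw(X, X, gamma)
                                          + soft_dtw(Y, Y, gamma))
```

For \(Y = X\) the correction subtracts exactly the raw value, so the normalized self-similarity is zero; this is what makes the normalized variant usable where a vanishing self-distance is expected (e.g. clustering).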
Parameters:
    dataset1
        A dataset of time series
    dataset2 (default: None)
        Another dataset of time series; if None, the self-similarity of dataset1 is computed
    gamma : float (default 1.)
        Gamma parameter for Soft-DTW

Returns:
    numpy.ndarray
        Cross-similarity matrix
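The shape and pairing semantics of the returned matrix can be illustrated with a generic sketch (hypothetical helper, arbitrary per-pair metric): entry (i, j) compares series i of dataset1 with series j of dataset2, and omitting dataset2 compares dataset1 against itself:

```python
import numpy as np

def cdist_sketch(dataset1, dataset2=None, metric=None):
    """Cross-similarity matrix: out[i, j] = metric(dataset1[i], dataset2[j]).

    Hypothetical helper for illustration; with dataset2=None the dataset is
    compared against itself, mirroring the documented default."""
    if metric is None:
        def metric(a, b):  # placeholder per-pair score: squared Euclidean
            return float(np.sum((a - b) ** 2))
    if dataset2 is None:
        dataset2 = dataset1
    out = np.empty((len(dataset1), len(dataset2)))
    for i, s1 in enumerate(dataset1):
        for j, s2 in enumerate(dataset2):
            out[i, j] = metric(s1, s2)
    return out
```

With two datasets of n1 and n2 series, the result has shape (n1, n2); with one dataset it is square with zeros on the diagonal for any metric whose self-score is zero.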
See also

soft_dtw
    Compute Soft-DTW
cdist_soft_dtw
    Cross-similarity matrix between time series datasets using the unnormalized version of Soft-DTW
References
[1] M. Cuturi, M. Blondel “Soft-DTW: a Differentiable Loss Function for Time-Series,” ICML 2017.

Examples
>>> import numpy
>>> from tslearn.metrics import cdist_soft_dtw_normalized
>>> time_series = numpy.random.randn(10, 15, 1)
>>> bool(numpy.all(cdist_soft_dtw_normalized(time_series) >= 0.))
True