Keogh’s Theorems on Estimating Error
Keogh claims in his recent paper that he gets an O(n) algorithm for piecewise linear approximation. The reason that linear approximation algorithms like Sliding Window and Bottom-Up are O(Ln) is that we need O(L) time to calculate the error of each segment, where L = n/k is the average length of the k segments. Keogh claims that the error of a merged segment can be calculated in constant time from the two original segments. I tried to do the calculation today. It becomes complex after a while, and I did not get the closed-form formula, but I think he is right. The error formula involves only SSxx = Sum(Xi^2) - n*Avg(X)^2, SSyy = Sum(Yi^2) - n*Avg(Y)^2, and SSxy = Sum(Xi*Yi) - n*Avg(X)*Avg(Y).
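For reference, these are exactly the terms in the standard least-squares identity (this is textbook regression algebra, not something I am taking from Keogh's paper): the residual error of the best-fit line over a segment is

SSE = SSyy - SSxy^2 / SSxx

so once the three SS terms are known, evaluating the error of a segment is O(1).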
Think about Avg(X) = Sum(Xi)/n. So the formula contains only terms like Sum(Xi) and Sum(Xi^2). For a merged segment, these terms can be calculated from the corresponding terms of the two original segments by simple addition, which takes constant time; a sketch in code follows.
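Here is a minimal sketch of the idea in Python. This is my own illustration, not Keogh's code; the SegmentStats class and its method names are made up for this post, and sse() uses the standard regression identity above.

```python
from dataclasses import dataclass

@dataclass
class SegmentStats:
    """Running sums for one segment; enough to recover the regression error."""
    n: int      # number of points
    sx: float   # Sum(Xi)
    sy: float   # Sum(Yi)
    sxx: float  # Sum(Xi^2)
    syy: float  # Sum(Yi^2)
    sxy: float  # Sum(Xi*Yi)

    @staticmethod
    def from_points(points):
        """O(L) scan, done once per initial segment."""
        xs, ys = zip(*points)
        return SegmentStats(len(xs), sum(xs), sum(ys),
                            sum(x * x for x in xs),
                            sum(y * y for y in ys),
                            sum(x * y for x, y in zip(xs, ys)))

    def merge(self, other):
        """O(1): the raw sums of the merged segment are just the sums added."""
        return SegmentStats(self.n + other.n, self.sx + other.sx,
                            self.sy + other.sy, self.sxx + other.sxx,
                            self.syy + other.syy, self.sxy + other.sxy)

    def sse(self):
        """O(1) residual error of the least-squares line: SSyy - SSxy^2/SSxx."""
        ss_xx = self.sxx - self.sx * self.sx / self.n
        ss_yy = self.syy - self.sy * self.sy / self.n
        ss_xy = self.sxy - self.sx * self.sy / self.n
        if ss_xx == 0:          # degenerate segment: all x values equal
            return ss_yy
        return ss_yy - ss_xy * ss_xy / ss_xx
```

Bottom-Up would then keep one SegmentStats per segment and score each candidate merge as left.merge(right).sse(), so every merge evaluation costs constant time instead of O(L).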