I'm searching for algorithms to reduce the LOD of polylines, i.e. lines of nodes (looped or not). In simple terms, I want to take high-resolution coastline data and reduce its LOD a hundred- or thousand-fold so it can be rendered at small scale.
I found polygon reduction algorithms (but they operate on triangle meshes) and Laplacian smoothing, but neither seems to be exactly what I need.
I've modified the code in culebrón's answer, removing the need for the Vec2D/Line classes and handling the points as a list of tuples instead.
The code is slightly less tidy, but shorter and a bit quicker (for 900 points, the original code took 2966 ms and this version takes 500 ms; still slower than I'd like, but an improvement).
def _vec2d_dist(p1, p2):
    return (p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2


def _vec2d_sub(p1, p2):
    return (p1[0] - p2[0], p1[1] - p2[1])


def _vec2d_mult(p1, p2):
    return p1[0] * p2[0] + p1[1] * p2[1]


def ramerdouglas(line, dist):
    """Does Ramer-Douglas-Peucker simplification of a curve with `dist`
    threshold.

    `line` is a list-of-tuples, where each tuple is a 2D coordinate

    Usage is like so:

    >>> myline = [(0.0, 0.0), (1.0, 2.0), (2.0, 1.0)]
    >>> simplified = ramerdouglas(myline, dist=1.0)
    """
    if len(line) < 3:
        return line

    # For a closed loop, use the second-to-last point as the far end of the
    # chord so it has non-zero length.
    (begin, end) = (line[0], line[-1]) if line[0] != line[-1] else (line[0], line[-2])

    # Squared perpendicular distance of each interior point from the chord.
    distSq = []
    for curr in line[1:-1]:
        tmp = (
            _vec2d_dist(begin, curr) -
            _vec2d_mult(_vec2d_sub(end, begin), _vec2d_sub(curr, begin)) ** 2 /
            _vec2d_dist(begin, end))
        distSq.append(tmp)

    maxdist = max(distSq)
    if maxdist < dist ** 2:
        return [begin, end]

    # Split at the farthest point and simplify each half recursively.
    pos = distSq.index(maxdist)
    return (ramerdouglas(line[:pos + 2], dist) +
            ramerdouglas(line[pos + 1:], dist)[1:])
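For what it's worth, here's a minimal way to try it out; the sine curve, point count and threshold below are made up for illustration and aren't from my benchmark:

import math

# 900 points sampled from a sine curve, as a stand-in for real coastline data.
line = [(x * 0.01, math.sin(x * 0.01)) for x in range(900)]
simplified = ramerdouglas(line, dist=0.05)
print(len(line), '->', len(simplified))  # the simplified list is much shorter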