ABSTRACT
The basic idea of edge detection comes from Marr and Hildreth [15], who proposed to convolve the image with Gaussians of increasing variance and then to define the edges at each scale as the set of points where the norm of the gradient of the resulting smoothed signal has a local maximum. This theory was improved by Witkin and Koenderink [20,12], among others, who noticed that the convolution of the signal with Gaussians at each scale is equivalent to solving the heat equation with the signal as initial datum:
$$\frac{\partial u}{\partial t} = \Delta u, \qquad u(\cdot,0) = u_0.$$
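This equivalence can be checked numerically. The following minimal sketch (in Python, assuming NumPy and SciPy are available, and using a synthetic noisy step signal as a stand-in for $u_0$) advances the one-dimensional heat equation by explicit finite differences and compares the result with a direct Gaussian convolution of the initial signal. With the normalization $\partial u/\partial t = \Delta u$, the smoothing kernel at time $T$ is the Gaussian of variance $2T$; other conventions absorb this factor into the equation or the time variable.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Synthetic initial datum u_0: a noisy step edge (illustrative only).
rng = np.random.default_rng(0)
n = 512
u0 = np.where(np.arange(n) < n // 2, 0.0, 1.0) + 0.05 * rng.standard_normal(n)

# Explicit Euler scheme for u_t = u_xx with grid spacing h = 1.
T = 4.0    # total diffusion time (scale)
dt = 0.2   # time step; dt <= h^2 / 2 is needed for stability
u = u0.copy()
for _ in range(int(round(T / dt))):
    lap = np.roll(u, -1) - 2.0 * u + np.roll(u, 1)   # periodic discrete Laplacian
    u += dt * lap

# Closed-form solution of u_t = u_xx at time T: convolution with a
# Gaussian of variance 2T (standard deviation sqrt(2T)).
u_gauss = gaussian_filter1d(u0, sigma=np.sqrt(2.0 * T), mode="wrap")

# The two results agree up to discretization error.
print("max |finite differences - Gaussian convolution| =",
      np.abs(u - u_gauss).max())
```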
The solution of the heat equation for an initial datum $u_0$ with bounded quadratic norm is $u = G_t * u_0$, where $G_t$ is the Gaussian function of variance $t$. It is well known that the edges at low scales give an inexact account of the boundaries.
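The multi-scale edge definition above can likewise be sketched in a few lines of Python (again assuming NumPy and SciPy, and using a synthetic disc image as test datum). As a simplification of the criterion stated above, a pixel is marked as an edge point at scale $t$ when the gradient norm of $G_t * u_0$ is a local maximum over a 3x3 neighbourhood and exceeds a small threshold; the exact formulation takes the maximum along the gradient direction.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def edges_at_scale(image: np.ndarray, t: float, threshold: float = 1e-3) -> np.ndarray:
    """Mark edge points of `image` at scale t (Gaussian of variance t).

    Simplified criterion: a pixel is kept when the gradient norm of the
    smoothed image is a local maximum over its 3x3 neighbourhood and
    exceeds `threshold`.
    """
    smoothed = gaussian_filter(image, sigma=np.sqrt(t))   # u = G_t * u_0
    gy, gx = np.gradient(smoothed)                        # discrete gradient
    norm = np.hypot(gx, gy)                               # |grad u|
    local_max = norm == maximum_filter(norm, size=3)      # 3x3 local maxima
    return local_max & (norm > threshold)

# Example: a synthetic disc, examined at increasing scales.
yy, xx = np.mgrid[0:128, 0:128]
disc = ((xx - 64) ** 2 + (yy - 64) ** 2 < 30 ** 2).astype(float)
for t in (1.0, 4.0, 16.0):
    print(f"t = {t:5.1f}: {edges_at_scale(disc, t).sum()} edge pixels")
```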