Uncertainty abounds in all aspects of computer vision. As a result, methods that explicitly model and manage that uncertainty have a better chance of producing meaningful results in such complex situations. Fuzzy set theory provides a framework for modeling the uncertainty in a vision problem, along with many methods for exploiting that model to produce realistic results. One very important tool, used extensively in pattern recognition and computer vision, is objective function based clustering. In this chapter, we review classical and novel clustering methods as they apply to computer vision and show examples of their utility. In particular, we focus on the use of clustering to detect and recognize regular boundaries of objects from images of edge magnitudes. Fitting an unknown number of boundary curves to edge magnitude data is one of the major challenges of computer vision. We will show that fuzzy clustering can be readily adapted to the problem of curve detection, and that a new possibilistic clustering method introduced by the authors can produce significantly more accurate results than either crisp or fuzzy clustering in noisy situations.
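To make the notion of objective function based clustering concrete, the following is a minimal sketch of the classical fuzzy c-means algorithm, which alternates between updating cluster centers and fuzzy memberships to minimize a weighted squared-error objective. This is an illustration of generic fuzzy clustering, not the possibilistic method introduced later in the chapter; the function name and default parameters are our own choices.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Sketch of fuzzy c-means: minimize
    J = sum_i sum_k u_ik^m * ||x_k - v_i||^2
    subject to sum_i u_ik = 1 for every data point x_k.

    X: (n, d) data array; c: number of clusters; m > 1: fuzzifier.
    Returns cluster centers V (c, d) and memberships U (c, n).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)  # memberships of each point sum to 1
    for _ in range(max_iter):
        W = U ** m
        # Centers are membership-weighted means of the data.
        V = (W @ X) / W.sum(axis=1, keepdims=True)
        # Squared distances from every center to every point.
        d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(axis=2)
        d2 = np.fmax(d2, 1e-12)  # guard against division by zero
        # Membership update: u_ik proportional to d_ik^(-2/(m-1)).
        U_new = d2 ** (-1.0 / (m - 1))
        U_new /= U_new.sum(axis=0)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return V, U
```

Because every point's memberships are constrained to sum to one, outliers still receive substantial membership in some cluster; relaxing this constraint is the motivation for the possibilistic approach discussed later.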