Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/177906
Title: EXTRACTION OF CONTOURS IN GREY-LEVEL IMAGES
Authors: CHUA KAH HEAN
Issue Date: 1997
Citation: CHUA KAH HEAN (1997). EXTRACTION OF CONTOURS IN GREY-LEVEL IMAGES. ScholarBank@NUS Repository.
Abstract: This thesis presents an algorithm for extracting contours in grey-level images. The detection of valid contours is based on contour closure; no a priori information in the form of a mathematical description of the target contour is required. The extraction proceeds in three stages. In the first stage, gradient derivatives are evaluated and the more reliable edge points, identified by the similarity of edge directions in their local neighbourhood, are selected to play a larger role in the contour tracking process. In the second stage, a contour tracking algorithm extracts contours and line segments from the image; the search for the next contour point is conducted in an annulus-shaped neighbourhood of the current contour point and is guided by the gradient orientation of the edge points. In the third stage, groups of line segments that can be joined to form a complete contour are detected, and the linking of line segments is carried out using grouping cues such as proximity and angular continuity. Results of extensive simulations on a wide variety of images (including noise-corrupted images under non-uniform illumination, tool images, text images, MRI images, face images and cell images) are presented to demonstrate the efficiency and effectiveness of the proposed contour extraction algorithm.
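The annulus-shaped neighbourhood search described in the second stage can be illustrated with a short sketch. The Python snippet below is a minimal illustration under stated assumptions, not the thesis's implementation: the function name next_contour_point, the radii r_min and r_max, and the orientation tolerance ori_tol are all assumed for the example, and it presumes precomputed gradient magnitude and orientation arrays.

    import numpy as np

    def next_contour_point(mag, ori, current, heading,
                           r_min=2, r_max=4, ori_tol=np.pi / 6):
        """Sketch: pick the strongest edge point in the annulus around
        `current` whose implied contour direction agrees with `heading`.
        All parameter values are illustrative assumptions."""
        rows, cols = mag.shape
        cy, cx = current
        best, best_score = None, 0.0
        for dy in range(-r_max, r_max + 1):
            for dx in range(-r_max, r_max + 1):
                r = np.hypot(dy, dx)
                if not (r_min <= r <= r_max):
                    continue  # outside the annulus-shaped neighbourhood
                y, x = cy + dy, cx + dx
                if not (0 <= y < rows and 0 <= x < cols):
                    continue  # off the image
                # The contour runs perpendicular to the gradient orientation.
                contour_dir = ori[y, x] + np.pi / 2
                diff = np.abs(np.arctan2(np.sin(contour_dir - heading),
                                         np.cos(contour_dir - heading)))
                # A contour direction is ambiguous modulo pi, so a difference
                # near pi also counts as agreement with the current heading.
                if min(diff, np.pi - diff) > ori_tol:
                    continue  # gradient orientation disagrees with the track
                if mag[y, x] > best_score:
                    best, best_score = (y, x), mag[y, x]
        return best

In a full tracker, the heading would presumably be updated from the last accepted step and the search repeated until the track closes on its start point or runs out of admissible candidates.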
URI: https://scholarbank.nus.edu.sg/handle/10635/177906
Appears in Collections:Master's Theses (Restricted)

Files in This Item:
File: b20614846.pdf, Size: 17.53 MB, Format: Adobe PDF, Access: Restricted