Please use this identifier to cite or link to this item: http://ddms.usim.edu.my:80/jspui/handle/123456789/9190
Full metadata record
DC Field | Value | Language
dc.contributor.author | A.S., Bin Samma | -
dc.contributor.author | R.A., Salam | -
dc.contributor.author | A.Z., Talib | -
dc.date.accessioned | 2015-08-25T04:21:17Z | -
dc.date.available | 2015-08-25T04:21:17Z | -
dc.date.issued | 2010 | -
dc.identifier.isbn | 978-1-4244-7167-6 | -
dc.identifier.uri | http://ddms.usim.edu.my/handle/123456789/9190 | -
dc.description.abstract | In this paper, an enhanced background subtraction approach for image segmentation is proposed in order to detect and represent the objects in images precisely. It is based on automatic detection of the background: the background is estimated and then subtracted from the original image. This step is incorporated into the background subtraction approach in order to reduce the computational cost and to cope with complex environments, such as underwater images, and with scenes containing many kinds of objects. The segmentation results of this enhanced approach are compared with recent background subtraction techniques in terms of speed and accuracy to demonstrate the efficiency and effectiveness of the proposed approach. © 2010 IEEE. | en_US
dc.language.iso | en_US | en_US
dc.subject | Automatic detection | en_US
dc.subject | Background subtraction | en_US
dc.subject | Computational costs | en_US
dc.subject | Segmentation results | en_US
dc.subject | Information science | en_US
dc.subject | Signal processing | en_US
dc.subject | Image segmentation | -
dc.subject | Signal detection | -
dc.subject | Underwater image | -
dc.subject | Original images | -
dc.subject | Background subtraction techniques | -
dc.title | Enhancement of background subtraction approach for image segmentation | en_US
dc.type | Conference Paper | en_US
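The abstract describes a generic pipeline: estimate the background automatically, subtract it from the original image, and keep the pixels that differ strongly as objects. The sketch below illustrates that general idea in Python with NumPy, using a box-filter (local mean) background estimate as an assumed stand-in for the paper's estimation step; the function names, window size `k`, and threshold are illustrative choices, not taken from the paper:

```python
import numpy as np

def estimate_background(img, k=15):
    """Estimate a smooth background as the k x k local mean (box filter),
    computed with an integral image; k must be odd. This is a generic
    stand-in for the paper's background-estimation step, not the
    authors' exact method."""
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    # Integral image with a leading row/column of zeros.
    ii = np.pad(np.cumsum(np.cumsum(padded, axis=0), axis=1), ((1, 0), (1, 0)))
    h, w = img.shape
    window_sums = (ii[k:k + h, k:k + w] - ii[:h, k:k + w]
                   - ii[k:k + h, :w] + ii[:h, :w])
    return window_sums / (k * k)

def subtract_background(img, k=15, thresh=40):
    """Segment foreground: subtract the estimated background from the
    original image and threshold the absolute difference."""
    diff = np.abs(img.astype(np.float64) - estimate_background(img, k))
    return (diff > thresh).astype(np.uint8)
```

On a synthetic grayscale image with a bright square on a flat background, `subtract_background` marks the square as foreground and the rest as background; real underwater imagery would need a more robust background model and threshold selection, which is where the paper's enhancements apply.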
Appears in Collections: 2014 International Conference on Information Science and Applications (ICISA)

Files in This Item:
File | Description | Size | Format
Enhancement of background subtraction approach for image segmentation.pdf | - | 180.25 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.