Please use this identifier to cite or link to this item: http://dspace.univ-bouira.dz:8080/jspui/handle/123456789/6230
Full metadata record
DC Field | Value | Language
dc.contributor.author | Benzaoui, Amir | -
dc.date.accessioned | 2019-11-12T09:25:54Z | -
dc.date.available | 2019-11-12T09:25:54Z | -
dc.date.issued | 2012 | -
dc.identifier.citation | Journal of Electronic Science and Technology (JEST) | en_US
dc.identifier.uri | http://dspace.univ-bouira.dz:8080/jspui/handle/123456789/6230 | -
dc.description.abstract | Automatic face detection is an important problem: the effectiveness of face-based biometric authentication depends largely on the method used to locate the face in the image. This paper presents a hybrid system for face detection in unconstrained conditions, where the illumination, pose, occlusion, and size of the face are uncontrolled. The proposed method combines a machine-learning technique based on the combined decision of three neural networks, an energy-compaction technique using the discrete cosine transform (DCT), and a segmentation technique based on human skin color. A set of images (faces and non-faces) is transformed into data vectors, which are used to train the neural networks to separate the two classes. The DCT is used to reduce the dimension of the vectors, eliminate redundant information, and concentrate the useful information in a minimum number of coefficients, while the skin-color segmentation is used to reduce the search space in the image. Experimental results show that this hybridization of methods yields a significant improvement in recognition rate, detection quality, and execution time. | en_US
dc.language.iso | en | en_US
dc.publisher | University of Bouira | en_US
dc.title | Hybrid System for Robust Faces Detection | en_US
dc.type | Article | en_US
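The abstract's DCT energy-compaction step can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 8x8 block size and the number of retained low-frequency coefficients are assumptions, and the paper's actual coefficient-selection scheme may differ.

```python
import numpy as np

def dct_matrix(n: int) -> np.ndarray:
    """Orthonormal DCT-II transform matrix of size n x n."""
    k = np.arange(n)[:, None]   # frequency index
    m = np.arange(n)[None, :]   # sample index
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    c[0, :] = 1.0 / np.sqrt(n)  # DC row gets the 1/sqrt(n) scaling
    return c

def dct2(block: np.ndarray) -> np.ndarray:
    """2-D DCT-II of a square image block (separable: rows then columns)."""
    c = dct_matrix(block.shape[0])
    return c @ block @ c.T

def compact_features(block: np.ndarray, keep: int = 4) -> np.ndarray:
    """Keep only the low-frequency keep x keep corner of the DCT,
    where the transform concentrates most of the image energy."""
    return dct2(block)[:keep, :keep].ravel()

# A flat (constant) 8x8 patch: all energy collapses into the DC coefficient.
patch = np.ones((8, 8))
coeffs = dct2(patch)
print(round(float(coeffs[0, 0]), 6))  # → 8.0; all other coefficients ≈ 0
```

Because the transform is orthonormal, total energy is preserved (Parseval), so truncating to the low-frequency corner discards only the small high-frequency terms; this is the sense in which the DCT stores the useful information in a minimum number of coefficients before the vectors are fed to the neural networks.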
Appears in Collections:Articles

Files in This Item:
File | Description | Size | Format
1202-14 (1).pdf | - | 833.85 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.