
dc.contributor.advisor: Resendiz, Maria D.
dc.contributor.author: Stewart, Anna
dc.date.accessioned: 2020-07-20T18:49:14Z
dc.date.available: 2020-07-20T18:49:14Z
dc.date.issued: 2020-05
dc.identifier.citation: Stewart, A. (2020). Emotions and emoticons: Facial expression recognition app accuracy (Unpublished thesis). Texas State University, San Marcos, Texas.
dc.identifier.uri: https://digital.library.txstate.edu/handle/10877/12127
dc.description: Presented to the Honors Committee of Texas State University in Partial Fulfillment of the Requirements for Graduation in the University Honors Program, May 2020.
dc.description.abstract:

Currently, there are apps available that can identify emotions. These applications use feature matching to compare photos within a dataset. However, this information cannot be generalized to new datasets or real-time photos. There is a gap in available apps that can generalize facial feature information and accommodate lighting and angle variance. A graduate student at Texas State University created an app for real-world use that addresses this gap by using deep machine learning to identify emotions from real-time photos across varied lighting conditions and angles. This iOS application detects a human face, processes the image with a deep learning model, identifies the facial expression, and displays a corresponding emoticon representing the identified emotion.
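The thesis does not publish the app's source code; as a rough illustration of the detect → classify → display pipeline the abstract describes, here is a minimal Python sketch. The function and label names are hypothetical, and the classifier is a stub — the actual app runs a trained deep learning model on iOS.

```python
# Sketch of the pipeline described above: detect a face, classify the
# expression, display a matching emoticon. All names are hypothetical.

# Hypothetical mapping from predicted emotion label to emoticon.
EMOTICONS = {
    "happy": "😊",
    "sad": "😢",
    "angry": "😠",
    "surprised": "😮",
    "neutral": "😐",
}

def classify_expression(face_image) -> str:
    """Stand-in for the app's deep learning classifier.

    A real implementation would run a trained network on the cropped
    face region; here we return a fixed label for illustration only.
    """
    return "happy"

def emoticon_for(face_image) -> str:
    """Full pipeline: classify the face, then look up its emoticon."""
    label = classify_expression(face_image)
    return EMOTICONS.get(label, "❓")

print(emoticon_for(None))  # stub classifier always yields the "happy" emoticon
```

The lookup-with-default (`EMOTICONS.get(label, "❓")`) keeps the display step robust if the classifier ever emits an unmapped label.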

This project aims to identify the level of agreement between the app’s emotion identification and people’s judgment. Participants were asked, via a survey, to indicate whether they agreed with the app’s identification of emotions. The survey included screenshots of real-time photos along with the emoticon the app generated for each, and participants selected one of three responses: “yes,” “no,” or “I do not know.” Preliminary data suggest some agreement between the app and people, though the level of agreement varied across emotions. Future studies are needed to assess whether adding an auditory component to the application would affect the percentage of agreement, and whether the app could be useful for people who have difficulty recognizing emotions, such as individuals with Autism Spectrum Disorder (ASD).
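The analysis described above amounts to tallying, per emotion, the percentage of “yes” responses. A minimal sketch of that tally — the example responses below are made up for illustration and are not the study’s data:

```python
from collections import Counter

# Hypothetical (emotion, answer) pairs, one per survey rating.
# Illustrative values only, not the study's actual responses.
responses = [
    ("happy", "yes"), ("happy", "yes"), ("happy", "no"),
    ("sad", "yes"), ("sad", "I do not know"),
]

def agreement_by_emotion(responses):
    """Percentage of 'yes' answers for each emotion."""
    totals, yeses = Counter(), Counter()
    for emotion, answer in responses:
        totals[emotion] += 1
        if answer == "yes":
            yeses[emotion] += 1
    return {e: 100 * yeses[e] / totals[e] for e in totals}

print(agreement_by_emotion(responses))
```

Counting “I do not know” in the denominator (as above) treats it as non-agreement; excluding it instead would raise the reported percentages, so a study would need to state which convention it uses.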

dc.format: Text
dc.format.extent: 45 pages
dc.format.medium: 1 file (.pdf)
dc.language.iso: en_US
dc.subject: Facial expression recognition
dc.subject: Speech-therapy
dc.subject: Deep learning
dc.subject: Artificial intelligence
dc.subject: App
dc.title: Emotions and Emoticons: Facial Expression Recognition App Accuracy
txstate.documenttype: Thesis
dc.contributor.committeeMember: Schwarz, Amy Louise
thesis.degree.department: Honors College
thesis.degree.discipline: Communication Disorders
thesis.degree.grantor: Texas State University
txstate.department: Honors College

