Vowel quality scoring on speech rehabilitation assistance

Dahnial Syauqy, Chao Min Wu, Onny Setyawati

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a tool to assist speech therapy and rehabilitation, focusing on analysis of vowel quality in speech. The tool extracts speech feature information to determine the vowel quality of the patient's utterance and compares it with a normal speech recording. To allow the assessment to be carried out by a basic user without particular knowledge of speech processing, the tool was designed with a simple interface; however, it also provides a deeper analysis of the speech that can be useful for the speech therapist. Two speech features, pitch and formants, were used as input information to classify the vowel of the voiced speech segment, which then served as the quantity compared against another speech template. A cepstrum-based pitch tracking algorithm was used to estimate the pitch. Two popular classification methods, K-Nearest Neighbor (K-NN) and Multilayer Perceptron (MLP), were then investigated and compared as the vowel classification algorithm. Finally, the vowel feature similarity between the two speech recordings was quantified and an overall score was produced. For vowel classification, the MLP method provided better accuracy (92.61% for men, 86.75% for women, and 83.75% for children) than the K-NN method (91.67%, 86.21%, and 80.69%) and was up to five times faster in computation time. The overall results also indicate the benefit of the tool for both patient and therapist through the provided simple and professional modes.
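As a rough illustration of the pitch estimation step mentioned in the abstract, the following Python sketch computes a cepstrum-based pitch estimate for a single voiced frame. It is a minimal sketch, assuming a 16 kHz mono signal; the frame length, pitch search range, and function name are illustrative and not taken from the paper.

    import numpy as np

    def cepstral_pitch(frame, fs=16000, f_min=50.0, f_max=400.0):
        """Estimate the pitch (Hz) of a voiced frame via the real cepstrum."""
        windowed = frame * np.hamming(len(frame))
        log_mag = np.log(np.abs(np.fft.rfft(windowed)) + 1e-12)
        # The real cepstrum is the inverse transform of the log magnitude spectrum;
        # a periodic spectrum (voiced speech) yields a peak at the pitch period.
        cepstrum = np.fft.irfft(log_mag)
        q_min, q_max = int(fs / f_max), int(fs / f_min)  # quefrency search range
        peak = q_min + np.argmax(cepstrum[q_min:q_max])
        return fs / peak

    # Sanity check on a synthetic 150 Hz harmonic tone (40 ms frame).
    t = np.arange(0, 0.04, 1 / 16000)
    frame = sum(np.sin(2 * np.pi * 150 * k * t) / k for k in range(1, 6))
    print(round(cepstral_pitch(frame), 1))  # approximately 150 Hz

In the tool described by the paper, estimates of this kind, together with formant features, would feed the K-NN or MLP vowel classifier; the classifier details above are only those reported in the abstract.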

Original language: English
Pages (from-to): 27199-27210
Number of pages: 12
Journal: International Journal of Applied Engineering Research
Volume: 9
Issue number: 24
State: Published - 2014

Keywords

  • Computer assisted speech therapy
  • Speech disorder
  • Speech processing
  • Vowel classification
