

Abstract

Background: Diabetes is a major health crisis for Hispanics and Asian Americans. In the United States, moreover, Spanish and Chinese speakers are more likely to have limited English proficiency. One potential tool for facilitating communication between diabetes patients and health care providers across language barriers is technology, specifically mobile phones.

Objective: Previous studies have assessed machine translation quality using only written (typed) input. To bridge this research gap, we conducted a pilot study evaluating the quality of a mobile language translation app (iTranslate) with a voice recognition feature for translating diabetes patient education material.

Methods: The pamphlet, “You are the heart of your family…take care of it,” is a health education sheet for diabetes patients that outlines three recommended questions for patients to ask their clinicians. Two professional translators translated the original English sentences into Spanish and Chinese. We recruited six certified medical translators (three Spanish and three Chinese) to conduct blinded evaluations of two versions of each sentence: (1) the version translated by iTranslate and (2) the version translated by a professional human translator. Evaluators rated each sentence on a 5-point scale (1-5) across four dimensions: Fluency, Adequacy, Meaning, and Severity. We performed descriptive analyses to examine the differences between the two versions.

Results: Cronbach alpha values indicated high interrater agreement within both evaluator groups: .920 for the Spanish raters and .971 for the Chinese raters. The readability scores for the three sentences, generated using MS Word’s Flesch-Kincaid Grade Level, were 0.0, 1.0, and 7.1. We found that iTranslate generally provided translation accuracy comparable to human translators on simple sentences. However, iTranslate made more errors when translating difficult sentences.
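Both statistics reported above follow standard published formulas. As an illustrative sketch only (the study’s raw rating data and exact tooling are not reproduced here, and the raters-as-items layout is an assumption), the two computations can be expressed in Python:

```python
import statistics

def cronbach_alpha(ratings):
    """Cronbach's alpha for interrater consistency.

    ratings: one list of scores per rater; all lists cover the same sentences
    in the same order. Raters are treated as the "items":
        alpha = k/(k-1) * (1 - sum(per-rater variances) / variance of totals)
    """
    k = len(ratings)
    rater_var_sum = sum(statistics.pvariance(r) for r in ratings)
    totals = [sum(scores) for scores in zip(*ratings)]  # per-sentence totals
    return k / (k - 1) * (1 - rater_var_sum / statistics.pvariance(totals))

def flesch_kincaid_grade(words, sentences, syllables):
    """Flesch-Kincaid Grade Level from raw text counts (the standard formula)."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
```

For example, three raters in perfect agreement yield an alpha of 1.0, so values of .920 and .971 sit near the top of the scale; a Grade Level of 0.0 corresponds to very short sentences of mostly one-syllable words.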

Conclusions: Although the evidence from our study supports iTranslate’s potential for supplementing professional human translators, further evidence is needed. For this reason, mobile language translation apps should be used with caution.

Details

Title
Machine or Human? Evaluating the Quality of a Language Translation Mobile App for Diabetes Education Material
Author
Chen, Xuewei; Acosta, Sandra; Barry, Adam E
Section
Diabetes Education and Elearning for Patients
Publication year
2017
Publication date
Jan-Jun 2017
Publisher
JMIR Publications
e-ISSN
2371-4379
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2512414358
Copyright
© 2017. This work is licensed under https://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.