
Abstract

Background: The idea of making science more accessible to nonscientists has prompted health researchers to involve patients and the public more actively in their research. This sometimes involves writing a plain language summary (PLS), a short summary intended to make research findings accessible to nonspecialists. However, whether PLSs satisfy the basic requirements of accessible language is unclear.

Objective: We aimed to assess the readability and level of jargon in the PLSs of research funded by the largest national clinical research funder in Europe, the United Kingdom’s National Institute for Health and Care Research (NIHR). We also aimed to assess whether readability and jargon were influenced by internal and external characteristics of research projects.

Methods: We downloaded the PLSs of all NIHR National Journals Library reports from mid-2014 to mid-2022 (N=1241) and analyzed them using the Flesch Reading Ease (FRE) formula and a jargon calculator (the De-Jargonizer). In our analysis, we included the following study characteristics of each PLS: research topic, funding program, project size, length, publication year, and readability and jargon scores of the original funding proposal.
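For reference, the FRE scores reported below come from the standard Flesch formula, FRE = 206.835 - 1.015 x (words per sentence) - 84.6 x (syllables per word); higher scores indicate easier text, and scores of roughly 60-70 are often described as plain English. The short Python sketch below only illustrates this calculation; it is not the authors' pipeline, its syllable counter is a naive heuristic, and the De-Jargonizer jargon scoring is a separate tool not reproduced here.

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count groups of consecutive vowels (an assumption for
    # illustration; real FRE tools use dictionary- or rule-based counters).
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    # Standard FRE formula:
    # 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

# Example: a short plain-English passage scores much higher (easier to read)
# than a dense research-style sentence.
print(flesch_reading_ease("We did the study. We asked people what they thought."))
print(flesch_reading_ease("A multicentre randomised controlled trial evaluated "
                          "the comparative effectiveness of the intervention."))
```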

Results: Readability scores ranged from 1.1 to 70.8, with an average FRE score of 39.0 (95% CI 38.4-39.7). Only 2.8% (35/1241) of the PLSs had an FRE score classified as “plain English” or better; none had readability scores in line with the average reading age of the UK population. Jargon scores ranged from 76.4 to 99.3, with an average score of 91.7 (95% CI 91.5-91.9); 21.7% (269/1241) of the PLSs had a jargon score suitable for general comprehension. Variables such as research topic, funding program, and project size significantly influenced readability and jargon scores. The biggest differences related to the original proposals: proposals whose application PLS was among the 20% most readable were almost 3 times more likely to have a more readable final PLS (incidence rate ratio 2.88, 95% CI 1.86-4.45). Those with the 20% least jargon in the original application were more than 10 times as likely to have low levels of jargon in the final PLS (incidence rate ratio 13.87, 95% CI 5.17-37.2). There was no observable trend over time.

Conclusions: Most of the PLSs published in the NIHR’s National Journals Library have poor readability due to their complexity and use of jargon. None were readable at a level in keeping with the average reading age of the UK population. There were significant variations in readability and jargon scores depending on the research topic, funding program, and other factors. Notably, the readability of the original funding proposal seemed to significantly impact the final report’s readability. Ways of improving the accessibility of PLSs are needed, as is greater clarity over who and what they are for.

Details

Title
Jargon and Readability in Plain Language Summaries of Health Research: Cross-Sectional Observational Study
Author
Lang, Iain A; King, Angela; Boddy, Kate; Stein, Ken; Asare, Lauren; Day, Jo; Liabo, Kristin
Publication title
Journal of Medical Internet Research
Volume
27
First page
e50862
Publication year
2025
Publication date
2025
Section
eHealth Literacy / Digital Literacy
Publisher
Gunther Eysenbach MD MPH, Associate Professor
Place of publication
Toronto
Country of publication
Canada
e-ISSN
1438-8871
Source type
Scholarly Journal
Language of publication
English
Document type
Journal Article
Publication history
Online publication date
2025-01-13
Milestone dates
2023-07-14 (Preprint first published); 2023-07-14 (Submitted); 2024-03-04 (Revised version received); 2024-09-23 (Accepted); 2025-01-13 (Published)
First posting date
13 Jan 2025
ProQuest document ID
3222367823
Document URL
https://www.proquest.com/scholarly-journals/jargon-readability-plain-language-summaries/docview/3222367823/se-2?accountid=208611
Copyright
© 2025. This work is licensed under https://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2025-11-07
Database
ProQuest One Academic