Research

The readability of general practice websites: a cross-sectional analysis of all general practice websites in Scotland

Guy Rughani, Peter Hanlon, Neave Corcoran and Frances S Mair
British Journal of General Practice 2021; 71 (706): e391-e398. DOI: https://doi.org/10.3399/BJGP.2020.0820
Guy Rughani, SCREDS clinical lecturer, Institute of Health and Wellbeing, University of Glasgow, Glasgow.
Peter Hanlon, academic fellow, Institute of Health and Wellbeing, University of Glasgow, Glasgow.
Neave Corcoran, academic fellow, Institute of Health and Wellbeing, University of Glasgow, Glasgow.
Frances S Mair, Norie Miller professor of general practice, Institute of Health and Wellbeing, University of Glasgow, Glasgow.

Abstract

Background General practice websites are an increasingly important point of interaction, but their readability is largely unexplored. One in four adults struggle with basic literacy, and there is a socioeconomic gradient. Readable content is a prerequisite to promoting health literacy.

Aim To assess general practice website readability by analysing text and design factors, and to assess whether practices adapted their website text to the likely literacy levels of their populations.

Design and setting Websites for all general practices across Scotland were analysed from March to December 2019, using a cross-sectional design.

Method Text was extracted from five webpages per website and eight text readability factors were measured, including the Flesch Reading Ease and the Flesch-Kincaid Grade Level. The relationship between readability and a practice population’s level of deprivation, measured using the Scottish Index of Multiple Deprivation (SIMD), was assessed. Overall, 10 design factors contributing to readability and accessibility were scored.

Results In total, 86.4% (n = 813/941) of Scottish practices had a website; 22.9% (n = 874/3823) of webpages were written at, or below, the government-recommended reading level for online content (9–14 years old), and the content of the remaining websites, 77.1% (n = 2949/3823), was suitable for a higher reading age. Of all webpages, 80.5% (n = 3077/3823) were above the recommended level for easy-to-understand ‘plain English’. There was no statistically significant association between webpage reading age and SIMD. Only 6.7% (n = 51/764) of websites achieved all design and accessibility recommendations.

Conclusion Changes to practice websites could improve readability and promote health literacy, but practices will need financial resources and ongoing technical support if this is to be achieved and maintained. Failure to provide readable and accessible websites may widen health inequalities; the topic will become increasingly important as online service use accelerates.

  • digital divide
  • general practice
  • health literacy
  • online systems
  • primary health care

INTRODUCTION

General practice websites are an increasingly important source of information and may provide the first point of interaction between patients and healthcare providers, yet, to the authors’ knowledge, there has been no large-scale research that assesses how understandable general practice websites are to their practice populations. Most practices in the UK have a website, and there is impetus from the UK and Scottish governments to increase the provision of services that GPs offer online, such as appointment booking and repeat prescription requests.1,2 In Scotland, it will soon be a requirement for all practices to make information and services available digitally;2 this process has been accelerated by the COVID-19 pandemic.3,4

The basis for general practice websites is commonly the practice leaflet, a contractually required document that provides information about the surgery’s services, opening times, appointments, prescriptions, data protection policy, and staff.2,5,6 A small number of website providers operate in the primary care market, with one company supplying nearly half of the general practice websites in the UK; consequently, many websites are similar in basic design and structure. As patients are sometimes required — and, increasingly, expect — to interact with health services via the internet, poorly produced websites can create a barrier to accessing health care.

The comprehensibility of text is often termed ‘readability’, which is defined as ‘the state or quality of being readable.’7 Text factors, such as word length or the number of syllables in a word, and design factors, such as line spacing and typeface, influence readability.8–10 Context is also important: familiar formatting — for example, having opening times written in table format — may help people understand complex information.9,11

Readability matters: in the latest major review of adult literacy, 16.4% of adults in England and Northern Ireland (around 5.8 million people) scored at the lowest level of proficiency in literacy (at or below Level 1); that is, they were only able to comprehend short sentences and identify single pieces of information when these were identical or synonymous with the information in a question;12 similar results were found in Scotland.13 Healthcare jargon and context add complexity, even for those with otherwise good literacy levels.14 It has been found that 43% of written health information is too complex for UK adults to fully understand, a figure that rises to 61% when numerical information is included.15 Health literacy has been defined as the skills of individuals to ‘gain access to, understand, and use information to promote and maintain good health’;16 however, in order to promote health literacy, text must be readable.17 Low basic literacy and low health literacy are associated with higher levels of socioeconomic deprivation.13,18 In Scotland, those living in the 15% of areas with the greatest levels of deprivation, according to the Scottish Index of Multiple Deprivation (SIMD), were twice as likely to reach only a basic level of literacy compared with those in all other areas.13 Computing literacy also varies with socioeconomic status: individuals in areas of greatest deprivation are the least likely to be able to use, have access to, or know about online services.19–21

The NHS Information Standard states that, when creating information, providers should take ‘into consideration the health literacy and/or accessibility needs of the population it is aimed at.’22 Ensuring information is understandable is vital to enabling equitable access to health services.

This study explores general practice website readability by analysing text and design factors, and assessing whether readability varied according to the SIMD measure of a practice population’s level of deprivation.

How this fits in

GPs are encouraged to make more services available online, yet websites that are poorly written or produced can inadvertently create a barrier to accessing healthcare and widen health inequalities. In the largest study on website readability to date, all 813 general practice websites in Scotland were reviewed and most (77.1%) were more difficult to read than UK government-recommended limits. Websites were not adapted to their local population’s likely literacy levels, and only 6.7% met design and accessibility recommendations. Websites should be written in language suitable for people aged 9–14 years; simple measures can be taken to improve design and accessibility, but practices will need financial resources and technical support on an ongoing basis if this is to be achieved and maintained.

METHOD

Data extraction

The authors used publicly available information from the Scottish Government — namely, the Information Services Division’s (ISD’s) list of all GP practices in Scotland, ranked by the percentage of each practice population’s level of multiple deprivation, as measured by the SIMD.23 The SIMD divides Scotland into 6976 neighbourhoods; each area is scored against 38 indicators of deprivation.24

Between January and July 2019, one author searched the internet to identify which of the 941 practices on the ISD/SIMD list had their own website. Directory-style entries on websites such as NHS Inform were not counted as independent general practice websites. Practice websites hosted by their local health board were included. If practices had merged with others and had a single group-practice website, the data were collected under the code of the practice whose physical site they shared; no data were collected for the relocated practice. Data were extracted from the webpages that, on discussion as a team, were judged likely to be visited most often: the homepage (the introductory page reached by clicking through from a search result) and those with information on the following:

  • appointments: how to make an appointment with a doctor;

  • clinics and services: description of the clinics or extra services offered;

  • repeat prescriptions: how to order repeat medicines; and

  • new patient information: how to register.

Text factors

The primary measures of text readability were the Flesch-Kincaid Grade Level (FKGL) and Flesch Reading Ease (FRE) scores.25 These are well-established tools, and proxies for gold-standard comprehension tests.18 They consider average sentence length and syllables per word,19 and both are widely used and freely available in word-processing software.19 Both formulae have correlation coefficients of >0.9 with comprehension tests.20

UK government website designers and literacy campaigners suggest that websites should be comprehensible by a 9–14 year old.8,25 Text should follow the principles of ‘plain English’ and:

  • use short, everyday words;

  • avoid jargon;

  • be written in the first person; and

  • use an active, rather than a passive, voice.22,23

The readability statistic target for ‘plain English’ is an FRE of ≥60/100.22
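Both formulae are simple arithmetic over word, sentence, and syllable counts. A minimal sketch in R (the software used later in this study), using the standard published coefficients; the counts in the example are hypothetical:

    # Flesch Reading Ease (FRE) and Flesch-Kincaid Grade Level (FKGL),
    # standard published coefficients
    flesch_reading_ease <- function(words, sentences, syllables) {
      206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)
    }
    flesch_kincaid_grade <- function(words, sentences, syllables) {
      0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    }

    # Hypothetical example: a 120-word passage, 8 sentences, 180 syllables
    flesch_reading_ease(120, 8, 180)   # ~64.7, meets the 'plain English' target of >=60
    flesch_kincaid_grade(120, 8, 180)  # ~8.0, US grade 8 (roughly age 13-14)

Shorter sentences and fewer syllables per word raise the FRE and lower the FKGL, which is why the ‘plain English’ principles above translate directly into better scores.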

Six secondary measures were recorded, in line with recommendations from a previous review of readability:

  • character count;

  • characters per word;

  • word count;

  • words per sentence;

  • sentences per paragraph; and

  • paragraph count.1

One author checked each of the five webpages for all of the general practice websites. If practices did not have a separate webpage for the different areas of information (for example, appointments), but had the relevant text on another part of the website, that text was extracted. Where possible, only the main area of webpage text (‘body text’) was extracted; navigation information, headers, and footers were not assessed.

Body text was manually selected, copied, and pasted into Microsoft Word (2016) and formatting elements such as bulleted or numbered lists, headings, titles, tables, figures, and paragraph breaks were discarded.

Where there had been a bulleted or numbered list, full stops were added to the end of each line; without full stops, the software calculated the whole list as a single sentence, which was misleading as the purpose of a list was, generally, to improve readability.
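The authors made this change manually; purely as an illustration, the same preprocessing step could be scripted, for example in R (the file names here are hypothetical):

    # Append a full stop to list lines lacking terminal punctuation so that
    # readability software counts each list item as a separate sentence
    lines <- trimws(readLines("appointments_page.txt"))   # hypothetical file
    needs_stop <- nzchar(lines) & !grepl("[.!?]$", lines)
    lines[needs_stop] <- paste0(lines[needs_stop], ".")
    writeLines(lines, "appointments_page_cleaned.txt")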

A different researcher checked the readability statistics for 10% of the websites, selected at random. This revealed 100% agreement, so the first researcher extracted the remaining data. With R software (version 3.6.2), linear regression was used to model the SIMD ranking for each practice against the FKGL score.26

A significant change in readability score was considered to be one grade level — that is, one UK school year.
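A minimal sketch of the modelling described above, assuming a hypothetical data frame, practices, with columns simd_rank (the ISD deprivation ranking) and fkgl (each practice's FKGL score); this is an illustration, not the authors' code:

    # Linear regression of readability against deprivation ranking
    fit <- lm(fkgl ~ simd_rank, data = practices)
    summary(fit)  # slope and P-value for the association
    # Fitted FKGL change across the full SIMD range, to compare with the
    # one-grade-level criterion above
    coef(fit)["simd_rank"] * diff(range(practices$simd_rank))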

Design factors

Design factors that contributed to readability and accessibility were also assessed. Informed by NHS England’s Information Standard, the UK government website’s design system, and recommendations from the Plain English Campaign, a 10-factor design score of desirable features was created (Box 1).10,11,22 Typeface size was not investigated because it adjusts automatically based on individual settings, making it difficult to record reliably. The appointments page (or equivalent section) was assessed, as it was thought that there would be appointments content on most websites and that it would be relevant to both new and existing patients. The body text (rather than the navigation information, header, or footer) was scored. A score out of eight was given if there were no images; a score out of 10 was given if images had been used. To allow a true comparison of webpage scores, a scaled design score was calculated: each score was divided by the maximum possible score for that webpage, giving a final score of 0.0–1.0 (a worked example of this scaling follows Box 1).

Basic factors (1 point each)
  • Use of sans-serif typeface in headings, main text, and captions

  • Use of a single typeface in headings, main text, and captions

  • ‘Scannable’ text — use of features such as subheadings, bullet lists, or paragraph breaks to divide information

  • Bold: used for emphasis only

  • No block capitals

  • No italicised text

  • Clear contrast between text and background colours

  • Optimised for smartphone browsers — the webpage must automatically detect it is being viewed on a smartphone screen and adjust to the screen ratio so that it is usable


Additional items if images present (1 point each)
  • Captioned illustrations — all images should be captioned

  • Alt text (a meaningful description of the image that screen-reading software can read aloud to aid users who are partially sighted) on illustrations

Box 1. Design score
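As an illustration of the scaling step described above (a sketch with hypothetical scores for three webpages, not the authors' code):

    # Scaled design score: points awarded divided by the maximum achievable
    score      <- c(7, 9, 10)                # hypothetical raw design scores
    has_images <- c(FALSE, TRUE, TRUE)       # image items only apply if present
    max_score  <- ifelse(has_images, 10, 8)  # out of 10 with images, else 8
    score / max_score                        # 0.875 0.900 1.000, comparable 0.0-1.0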

The first researcher scored each website’s appointments page or section, and the second scored a random 10% of pages. Discrepancies were discussed and a third researcher reviewed those that could not be resolved.

Whether design scores varied within, and between, website providers was assessed. R software (version 3.6.2) was used to calculate the scaled design scores, mean scaled score, and standard deviation for each provider, and the number of webpages that scored full marks.27 Using linear regression, the authors also assessed whether there was a correlation between the design score and the FRE readability statistic.
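A minimal sketch of these summaries in base R, assuming a hypothetical data frame, pages, with columns provider, scaled_design, and fre:

    # Per-provider summaries of the scaled design score
    tapply(pages$scaled_design, pages$provider, mean)  # mean scaled score
    tapply(pages$scaled_design, pages$provider, sd)    # spread within provider
    sum(pages$scaled_design == 1)                      # webpages with full marks
    # Association between design score and readability
    summary(lm(fre ~ scaled_design, data = pages))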

Sensitivity analyses

Sensitivity analyses were conducted for both the readability and scaled design-score investigations by excluding webpages with <150 words.
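Continuing the hypothetical pages data frame above (with an assumed word_count column), this exclusion is a single filtering step before the analyses are re-run:

    pages150 <- subset(pages, word_count >= 150)  # drop pages with <150 words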

RESULTS

Table 1 gives details of the websites and anonymised providers. Of all 941 practices, 813 had a functioning website, 122 did not have a website, and six practices had merged with other practices (their web address re-directed to the new joint practice website).

Table 1. Practice websites and website providers

Reliability

A random 10% of the ISD list of 941 practices (95 websites) were independently scored. There were no discrepancies in readability scoring. Of the 95 webpage design scores, seven had discrepancies of a maximum of 1 point (Cohen’s κ coefficient: 0.98); these discrepancies were resolved by discussion.
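For readers reproducing the agreement check, Cohen's κ can be computed directly from the paired scores; a minimal sketch, with rater1 and rater2 as hypothetical vectors of the two researchers' design scores for the same pages:

    # Cohen's kappa: chance-corrected agreement between two raters
    levs <- sort(unique(c(rater1, rater2)))
    tab  <- table(factor(rater1, levs), factor(rater2, levs))
    po   <- sum(diag(tab)) / sum(tab)                      # observed agreement
    pe   <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2  # agreement by chance
    (po - pe) / (1 - pe)                                   # kappa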

Readability statistics

If all 813 functioning websites had five webpages with extractable data, there would have been 4065 potential webpages to analyse. Readability statistics were calculable for 94.0% (n = 3823/4065) of possible webpages (see Supplementary Table S1). Of all 3823 webpages, 77.1% (n = 2949) featured information that exceeded the recommended 9–14-year-old reading age for online content; 22.9% (n = 874/3823) featured information at, or below, the recommended age range. Figure 1 presents the FKGL scores (converted from the US school-grade levels used in the score’s native context to UK reading-age equivalents) from all the practice websites, plotted by webpage type.

Figure 1. Flesch-Kincaid Grade Level by webpage. Grade level has been converted to reading-age range.

The FRE results were similar (Figure 2). In total, 80.5% (n = 3077/3823) of webpages scored below the recommended FRE cut-off of ≥60 for ‘plain English’ (data not shown).

Figure 2. Flesch Reading Ease score by webpage. Outlying peaks at ‘FRE = 0’ in ‘Clinics and services’ and ‘Homepage’ were removed when pages with <150 words were excluded, but the overall proportions above the FRE threshold remained consistent (see sensitivity analysis and Supplementary Figure S3). FRE = Flesch Reading Ease.

There was no statistically significant association between the webpage reading ages (FKGL converted to UK age ranges) and the practice population’s level of multiple deprivation (SIMD quintile) (data not shown); Supplementary Figure S1 presents the reading-age level for the five webpage types by SIMD quintile. Secondary readability statistics are reported in Supplementary Table S2.

Design and combined scores

In total, 94.0% (n = 764/813) of practice websites had an appointments section (Table 1), allowing a scaled design score to be calculated. Of these, 6.7% (n = 51/764) scored full marks for design and accessibility (data not shown).

There was a spread of scaled design scores within each provider (Figure 3), but mean scaled design scores were similar between website providers (see Supplementary Table S3). Figure 3 presents the scaled design score for each appointment webpage by that webpage’s readability (classified as ‘hard’ or ‘easy’ to read). There was no statistically significant association between the design score and readability (FRE statistic), with a correlation of 0.004 and a P-value from linear regression of 0.93 (data not shown).

Figure 3. Combined readability (Flesch Reading Ease) and scaled design scores by website provider/provider type. aDesigned by a web company providing <10 websites. FRE = Flesch Reading Ease.

Sensitivity analysis

Webpages with <150 words were excluded and the analyses were repeated; results are shown in Supplementary Figures S2–S5. This removed the most extreme outlying readability scores (for example, the ‘clinics and services’ peaks visible in Figure 2), but the proportions of pages above the thresholds remained consistent (see Supplementary Figure S3). A slightly stronger correlation (0.04) was noted between the design score and FRE, but the association remained statistically non-significant (P = 0.23). Supplementary Tables S3 and S4 provide further descriptions of the remaining outliers.

DISCUSSION

Summary

In Scotland, 86.4% of GP practices have a website; however, 77.1% (n = 2949/3823) of webpages include content that exceeds the recommended target reading age of 9–14 years old,8,27 and 80.5% (n = 3077/3823) include content that exceeds the target for ‘plain English’.26 There was no evidence that practice websites were adapted to meet the likely literacy levels of the populations they serve. Only 6.7% of websites met accessibility and design recommendations28 (Box 1). All website providers had evidence of suboptimal design in terms of readability and the market was dominated by a single provider using a limited number of website design templates; however, the spread in design scores across providers may demonstrate that practices retain some control over the readability of their output. Surprisingly, there was no association between text readability and the design scores, highlighting that a clear-looking website is not necessarily readable.

Strengths and limitations

To the authors’ knowledge, this is the most comprehensive assessment of general practice websites to date, and the first to analyse design factors. Variability in website production required judgement to decide which text should be analysed, but there was minimal variation in scoring between researchers.

The main limitation was the use of readability scoring tools. The FRE and FKGL measures were designed for a US context and, although they are reliable, other measures are arguably better adapted to health care;1,29,30 the Flesch formulae, however, are the only ones embedded in commonly available software and were, therefore, the only practical option for this study.

Readability formulae can be misled by low word counts and special characters1 — for example, telephone numbers score as single difficult words.1,31,32 As recommended, the authors were consistent in the formatting that was permitted, and performed sensitivity analyses.1 Formulae also ignore word meaning and the added complexity of numerical information, so are a proxy for comprehensibility.15,18,31 The high volume of health-related language on general practice websites may mean the formulae underestimate the impact of poor health literacy — as an example, ‘gastric’ and ‘tummy’ are two-syllable words that score equally, yet may be differently understood.1

The ease of website navigation can itself be a barrier, but it was not possible to establish a robust method of assessing it; seemingly readable websites may still be difficult to use. User testing would clarify the link between proxy scores and the real-world usability and comprehensibility of websites, but that process was beyond the scope of this study. The authors also did not have the capacity to investigate accessibility for people who do not speak or understand English.

Comparison with existing literature

General practice website readability has been under-researched. One small study assessed the readability of 10 English practice websites; it reported that ‘most’ websites had an FRE score of 50–60, suggesting that half of the UK adult population would struggle to fully understand the content.33

Patient information leaflets (PILs) and online condition-specific information have been more widely studied; poor readability has been universally reported.34–38 In comparison with Protheroe et al’s17 UK study of PILs in general practices, which used the same readability formulae, the study presented here found that a greater proportion of webpages featured content that exceeded the reading level of a 14 year old (77.1% for webpages versus 37.4% for PILs). Both studies found that a similar proportion fell within the respective readability targets (23.0% for webpages versus 24.3% for PILs).17

Implications for practice

The data presented here suggest most general practice websites across Scotland do not meet the standards recommended by the NHS, government, and literacy campaigners; in addition, there is no consistent evidence that practices in more socioeconomically deprived areas adjust the readability of their websites to meet the likely lower literacy levels of the populations that they serve, and vice versa.

It is possible that the hastened uptake of digital health due to the COVID-19 pandemic could exacerbate health inequalities, especially if literacy is not considered.39

In Scotland, a national template for practices to adapt is being considered. Although website design may improve, practices will need support to create accessible content; pre-population with user-tested accessible text could help, as could the development of an NHS style guide, like that developed by the Government Digital Service.25

While awaiting national efforts, practices can take steps to improve the situation: the authors’ simple 10-point design score (Box 1) could be used as a guide. Flesch readability scores are freely available, but readability can be improved by asking, while editing the website or creating the content, ‘could a 9-year-old child understand this?’. Removing medical terms only understood by those in the medical profession is critical.9

The authors’ assumption is that readability and design improvements promote comprehension and health literacy, but this can only be assessed by testing websites with their target users.14,31,40

Practices will require financial resources and technical support on an ongoing basis to enact and maintain such recommendations, but failure to do so may inadvertently widen health inequalities.

In a time of scarce resources, partnerships between patient participation groups, literacy charities/campaign bodies, government, and practices may be necessary to ensure digital changes are inclusive.

Acknowledgments

Thanks to Dr David Blane, clinical research fellow, General Practice and Primary Care, Institute of Health and Wellbeing, University of Glasgow, Glasgow, UK, for advice on Scottish Index of Multiple Deprivation data.

Notes

Funding

Peter Hanlon is funded through a Clinical Research Training Fellowship from the Medical Research Council (grant reference: MR/S021949/1).

Ethical approval

Not required.

Provenance

Freely submitted; externally peer reviewed.

Competing interests

The authors have declared no competing interests.

Discuss this article

Contribute and read comments about this article: bjgp.org/letters

  • Received September 3, 2020.
  • Revision requested November 13, 2020.
  • Accepted December 28, 2020.
  • © The Authors

This article is Open Access: CC BY 4.0 licence (http://creativecommons.org/licences/by/4.0/).

REFERENCES

1. Wang L-W, Miller MJ, Schmitt MR, Wen FK (2013) Assessing readability formula differences with written health information materials: application, results, and recommendations. Res Social Adm Pharm 9(5): 503–516.
2. Scottish Government (2017) The 2018 General Medical Services Contract in Scotland. https://www.gov.scot/binaries/content/documents/govscot/publications/advice-and-guidance/2017/11/2018-gms-contract-scotland/documents/00527530-pdf/00527530-pdf/govscot%3Adocument/00527530.pdf (accessed 25 Mar 2021).
3. Trethewey SP, Beck KJ, Symonds RF (2020) Video consultations in UK primary care in response to the COVID-19 pandemic. Br J Gen Pract, DOI: https://doi.org/10.3399/bjgp20X709505.
4. Royal College of General Practitioners (2020) General practice in the post COVID world: challenges and opportunities for general practice. https://www.rcgp.org.uk/-/media/Files/News/2020/general-practice-post-covid-rcgp.ashx (accessed 25 Mar 2021).
5. NHS England (2018) NHS England Standard General Medical Services Contract 2017/18. https://www.england.nhs.uk/wp-content/uploads/2018/01/17-18-gms-contract.pdf (accessed 25 Mar 2021).
6. NHS England (2018) NHS England Standard Personal Medical Services Agreement 2017/18. https://www.england.nhs.uk/wp-content/uploads/2018/01/17-18-pms-contract.pdf (accessed 25 Mar 2021).
7. Dictionary.com (2021) Readability. https://www.dictionary.com/browse/readability (accessed 25 Mar 2021).
8. Learning and Work Institute (2009) Readability: how to produce clear written materials for a range of readers. https://learningandwork.org.uk/resources/research-and-reports/readability-how-to-produce-clear-writtenmaterials-for-a-range-of-readers (accessed 25 Mar 2021).
9. Plain English Campaign (2001) How to write medical information in plain English. http://www.plainenglish.co.uk/files/medicalguide.pdf (accessed 25 Mar 2021).
10. Government Digital Service (2019) Government design principles. https://www.gov.uk/guidance/government-design-principles (accessed 25 Mar 2021).
11. Plain English Campaign. Guide to design and layout. http://www.plainenglish.co.uk/design-and-layout.html (accessed 25 Mar 2021).
12. Organisation for Economic Co-operation and Development (2012) Country note: Survey of Adult Skills first results, England & Northern Ireland (UK). http://www.oecd.org/skills/piaac/Country%20note%20-%20United%20Kingdom.pdf (accessed 25 Mar 2021).
13. St Clair R, Tett L, Maclachlan K (2010) Scottish Survey of Adult Literacies 2009: report of findings. http://eprints.hud.ac.uk/id/eprint/13416/1/TettScottish0102005.pdf (accessed 25 Mar 2021).
14. Protheroe J, Nutbeam D, Rowlands G (2009) Health literacy: a necessity for increasing participation in health care. Br J Gen Pract, DOI: https://doi.org/10.3399/bjgp09X472584.
15. Rowlands G, Protheroe J, Winkley J, et al. (2015) A mismatch between population health literacy and the complexity of health information: an observational study. Br J Gen Pract, DOI: https://doi.org/10.3399/bjgp15X685285.
16. Nutbeam D (2000) Health literacy as a public health goal: a challenge for contemporary health education and communication strategies into the 21st century. Health Promot Int 15(3): 259–267.
17. Protheroe J, Estacio EV, Saidy-Khan S (2015) Patient information materials in general practices and promotion of health literacy: an observational study of their effectiveness. Br J Gen Pract, DOI: https://doi.org/10.3399/bjgp15X684013.
18. Rowlands G, Nutbeam D (2013) Health literacy and the ‘inverse information law’. Br J Gen Pract, DOI: https://doi.org/10.3399/bjgp13X664081.
19. Department for Business, Innovation and Skills (2012) BIS research paper number 81. The 2011 Skills for Life Survey: a survey of literacy, numeracy and ICT levels in England. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/36000/12-p168-2011-skills-forlife-survey.pdf (accessed 25 Mar 2021).
20. Robinson L, Cotten SR, Ono H, et al. (2015) Digital inequalities and why they matter. Inf Commun Soc 18(5): 569–582.
21. Gomez-Cano M, Atherton H, Campbell J, et al. (2020) Awareness and use of online appointment booking in general practice: analysis of GP Patient Survey data. Br J Gen Pract, DOI: https://doi.org/10.3399/bjgp20X711365.
22. NHS England. The Information Standard principles: the Information Standard for health and care information production quality statements. https://www.england.nhs.uk/tis/about/the-info-standard (accessed 25 Mar 2021).
23. Scottish Government Information Services Division (2021) General practice — GP workforce and practice list sizes. Table 7: practice populations by deprivation status, 2010 to 2020. https://beta.isdscotland.org/find-publicationsand-data/health-services/primary-care/general-practice-gp-workforce-andpractice-list-sizes (accessed 25 Mar 2021).
24. Scottish Government (2016) Introducing the Scottish Index of Multiple Deprivation 2016. https://www.gov.scot/publications/scottish-index-multipledeprivation-2016 (accessed 25 Mar 2021).
25. The R Foundation. What is R? https://www.r-project.org/about.html (accessed 25 Mar 2021).
26. Flesch R. How to write in plain English. Chapter 2: let’s start with the formula. http://pages.stern.nyu.edu/~wstarbuc/Writing/Flesch.htm (accessed 25 Mar 2021).
27. Government Digital Service (2021) Content design: planning, writing and managing content. https://www.gov.uk/guidance/content-design/writing-for-gov-uk (accessed 25 Mar 2021).
28. HM Government. Making online public services accessible. https://accessibility.campaign.gov.uk (accessed 25 Mar 2021).
29. Beaunoyer E, Arsenault M, Lomanowska AM, Guitton MJ (2017) Understanding online health information: evaluation, tools, and strategies. Patient Educ Couns 100(2): 183–189.
30. Meade CD, Smith CF (1991) Readability formulas: cautions and criteria. Patient Educ Couns 17(2): 153–158.
31. Janan D, Wray D (2012) Readability: the limitations of an approach through formulae. http://www.leeds.ac.uk/educol/documents/213296.pdf (accessed 25 Mar 2021).
32. Zhou S, Jeong H, Green PA (2017) How consistent are the best-known readability equations in estimating the readability of design standards? IEEE Trans Prof Commun 60(1): 97–111.
33. Aslanyan D, McDonald P (2018) GP websites: are they readable? Health Literacy UK.
34. Graber MA, Roller CM, Kaeble B (1999) Readability levels of patient education material on the world wide web. J Fam Pract 48(1): 58–61.
35. Ley P, Florio T (1996) The use of readability formulas in health care. Psychol Health Med 1(1): 7–28.
36. Huang G, Fang CH, Agarwal N, et al. (2015) Assessment of online patient education materials from major ophthalmologic associations. JAMA Ophthalmol 133(4): 449–454.
37. Rooney MK, Sachdev S, Byun J, et al. (2019) Readability of patient education materials in radiation oncology: are we improving? Pract Radiat Oncol 9(6): 435–440.
38. Kher A, Johnson S, Griffith R (2017) Readability assessment of online patient education material on congestive heart failure. Adv Prev Med 2017: 9780317.
39. Roberts ET, Mehrotra A (2020) Assessment of disparities in digital access among Medicare beneficiaries and implications for telemedicine. JAMA Intern Med 180(10): 1386–1389.
40. Sydes M, Hartley J (1997) A thorn in the Flesch: observations on the unreliability of computer-based readability formulae. Br J Educ Technol 28(2): 143–145.