Summary
The rapid uptake of digital healthcare channels offers huge benefits, but evidence also suggests a close correlation between digital exclusion and social disadvantage. People with protected characteristics under the Equality Act are among those least likely to have access to the internet and the skills needed to use it.
Experts from across health and care came together to contribute to "Access Denied", a new whitepaper on digital health inequalities. The whitepaper sets out recommendations to ensure that those innovating in digital healthcare can do so in a way that addresses healthcare inequalities.
Content
Recommendations
The report makes the following recommendations for designers and developers of digital tools, and the NHS organisations who select and implement them:
- Work with digital innovations that meet the highest standards for accessibility and usability;
- Test digital products and services thoroughly with a cross-section of patients, providers and commissioners;
- Use data to optimise delivery, improve outcomes and minimise exclusion over time;
- Understand how different people may need specific channels of delivery at different times or for different services;
- Ensure you capture data so you can measure and compare outcomes and experience by channel;
- Don't plan care pathways for the majority alone – ensure they are optimised for those from minority backgrounds too;
- Consider the support needed to move people to digital pathways;
- Ensure equality impact assessments for transforming care pathways treat digital exclusion as a potential source of inequality.
The report also calls for designers, developers and the NHS to work together in the following areas:
- Develop frameworks, similar to those already in place for information governance and clinical safety, that set out guidance for mitigating health inequalities and can be adopted and embedded by design;
- Build ethical considerations into the clinical safety case of each tool, and thoroughly examine data used to inform or train algorithms for bias.