Search the hub
Showing results for tags 'AI'.
-
Content Article
In August 2019, the government announced a £250 million investment in Artificial Intelligence (AI) applications for health and care through the creation of the NHS AI Lab. The hope is that, over time, this investment will enable health and care providers to benefit from the very best data-driven technology and help us achieve our goals for technology use in the NHS and in the care system. This report from NHSX provides a cohesive overview of the current state of play of data-driven technologies within the health and care system. It makes clear where in the system AI technologies can be utilised, and the policy work that is being done, and will need to be done, to ensure this utilisation is safe, effective and ethically acceptable.
- Posted
- 3 comments
-
Content Article
Diversity in digital health ‘is a matter of patient safety’
Claire Cox posted an article in Motivating staff
Encouraging diversity in the NHS isn’t simply a matter of inclusion, it’s a matter of patient safety, delegates at the Healthcare Excellence Through Technology (HETT) conference have heard.
- Posted
Tagged with:
- Leadership style
- Team culture
(and 3 more)
-
Content Article
Risk scores are widely used in healthcare, but their development and implementation do not usually involve input from practitioners or from service users and carers (SU/C). This study from Dyson et al., published in BMJ Open, contributes to the development of the Computer-Aided Risk Score (CARS) by eliciting the views of staff and SU/C, who provided important, often complex, insights to support the development and implementation of CARS and help ensure its successful adoption in routine clinical practice.
-
Content Article
Jeroen Tas, Philips’ Chief Innovation & Strategy Officer, met with three young and inspiring data scientists to discuss technology opportunities in healthcare. At Philips, the journey towards a healthier and more sustainable world starts with listening to the younger generation and future decision-makers.
-
Content Article
Should we trust algorithms?
Patient Safety Learning posted an article in Data and insight
There is increasing use of algorithms in the healthcare and criminal justice systems, and correspondingly increased concern about their ethical use. But perhaps a more basic issue is whether we should believe what we hear about them and what an algorithm tells us. Large numbers of algorithms of varying complexity are being developed within healthcare and the criminal justice system, including, for example, the UK HART (Harm Assessment Risk Tool) system for assessing recidivism risk, which is based on a machine-learning technique known as a random forest. But the reliability and fairness of such algorithms for policing are being strongly contested: quite apart from the debate about facial recognition, one report on predictive policing algorithms says that “their use puts our rights at risk”.
- Posted
Tagged with:
- AI
- Risk assessment
(and 1 more)
-
Content Article
The COVID-19 pandemic is sweeping across the length and breadth of the UK. In response, NHS England has issued guidelines for the effective triaging of urgent cancer 'two-week wait' referrals, with the intention of minimising disruption to cancer services. To fully explain the implications of this manual triage approach, this article, Data-Driven Triage Automation – YouDiagnose’s fight against COVID-19, first describes the triage process under normal circumstances and then highlights the additional impacts of the coronavirus emergency. It finishes with a suggested solution (from YouDiagnose) to improve the efficiency of the triaging process and save lives during the pandemic.
-
Content Article
How AI health chatbots can help stem coronavirus pandemic chaos
Patient Safety Learning posted an article in Blogs
AI health chatbots around the world have been racing to add coronavirus detection into their algorithms, or to put up helpful information, to demonstrate they are part of the response to coronavirus (COVID-19). But to be honest, it’s pointless. A symptom checker can’t diagnose you with COVID-19. That can only be done through testing. The symptoms are too close to cold and flu.

However, Prof Dr Maureen Baker, Chief Medical Officer at Your.MD and former Chair of the UK’s Royal College of General Practitioners, has been involved at the highest level of pandemic preparedness planning in the UK for decades, and she is clear that AI chatbots, like Your.MD, can play a vital role in reducing the number of people who unnecessarily seek medical treatment, and in preventing deaths among individuals endangered by symptoms unrelated to COVID-19.

So, if AI health chatbots can’t reliably detect COVID-19 and should only advise you to stay at home, what else can they do? “They can work in tandem with governments and health services to stop the worried well not at risk from the virus from seeking treatment, and also support people to self-care where that is appropriate,” says Prof Baker. She thinks that, with collaboration, there is enormous potential for chatbots to act as reliable companions providing guidance and tracking symptoms.