Search the hub

Showing results for tags 'AI'.


Found 157 results
  1. Content Article
    In August 2019, the government announced a £250 million investment in Artificial Intelligence (AI) applications for health and care through the creation of the NHS AI Lab. The hope is that over time this investment will enable health and care providers to benefit from the very best data-driven technology and help us achieve our goals for technology use in the NHS and in the care system. This report from NHSX provides a cohesive overview of the current state of play of data-driven technologies within the health and care system. It makes clear where in the system AI technologies can be utilised, and the policy work that is being done, and will still need to be done, to ensure this utilisation happens in a safe, effective and ethically acceptable manner.
  2. Content Article
    Encouraging diversity in the NHS isn’t simply a matter of inclusion; it’s a matter of patient safety, delegates at the Healthcare Excellence Through Technology (HETT) conference have heard.
  3. Content Article
    Risk scores are widely used in healthcare, but their development and implementation do not usually involve input from practitioners or from service users and carers (SU/C). This study from Dyson et al., published in BMJ Open, contributes to the development of the Computer-Aided Risk Score (CARS) by eliciting the views of staff and SU/C, who provided important, often complex, insights to support the development and implementation of CARS and to help ensure its successful adoption in routine clinical practice.
  4. Content Article
    Jeroen Tas, Philips’ Chief Innovation & Strategy Officer, met with three young and inspiring data scientists to discuss technology opportunities in healthcare. At Philips, the journey towards a healthier and more sustainable world starts with listening to the younger generation and future decision-makers.
  5. Content Article
    There is increasing use of algorithms in the healthcare and criminal justice systems, and a corresponding increase in concern about their ethical use. But perhaps a more basic issue is whether we should believe what we hear about them and what an algorithm tells us. Large numbers of algorithms of varying complexity are being developed within healthcare and the criminal justice system; they include, for example, the UK HART (Harm Assessment Risk Tool) system for assessing recidivism risk, which is based on a machine-learning technique known as a random forest (a minimal sketch of this technique appears after these results). But the reliability and fairness of such algorithms for policing are being strongly contested: quite apart from the debate about facial recognition, a report on predictive policing algorithms says that "their use puts our rights at risk".
  6. Content Article
    The COVID-19 pandemic is sweeping across the length and breadth of the UK. As a result, NHS England has issued guidelines for effective triaging of urgent cancer 'two-week wait' referrals, with the intention of minimising the disruption to cancer services. To fully understand the implications of this manual triage approach, this article, Data-Driven Triage Automation – YouDiagnose’s fight against COVID-19, first explains the triage process during normal circumstances and then highlights the additional impacts of the coronavirus emergency. It finishes with a suggested solution (from YouDiagnose) to improve the efficiency of the triaging process and save lives during the pandemic (a hedged sketch of what automated referral triage might look like appears after these results).
  7. Content Article
    AI health chatbots around the world have been racing to add coronavirus detection to their algorithms, or to put up helpful information, to demonstrate that they are part of the response to coronavirus (COVID-19). But, to be honest, it’s pointless: a symptom checker can’t diagnose you with COVID-19. That can only be done through testing; the symptoms are too close to those of cold and flu. However, Prof Dr Maureen Baker, Chief Medical Officer at Your.MD and former Chair of the UK’s Royal College of General Practitioners, has been involved at the highest level of pandemic preparation planning in the UK for decades, and she is clear that AI chatbots, like Your.MD, can play a vital role in reducing the number of people who unnecessarily seek medical treatment, and in reducing deaths among individuals who are endangered by symptoms unrelated to COVID-19. So, if AI health chatbots can’t reliably detect COVID-19 and should only advise you to stay at home, what else can they do? “They can work in tandem with governments and health services to stop the worried well not at risk from the virus from seeking treatment, and also support people to self-care where that is appropriate,” says Prof Baker. She thinks that, with collaboration, there is enormous potential for chatbots to act as reliable companions, providing guidance and tracking symptoms (a minimal sketch of this kind of symptom-routing logic appears after these results).
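
For context on the technique behind tools such as HART (item 5 above): a random forest is an ensemble of decision trees, each trained on a random sample of the data, whose votes are averaged into a single risk prediction. The sketch below is a minimal, illustrative example using scikit-learn on synthetic data; the features, thresholds and outcome rule are invented for this sketch and bear no relation to the actual HART model.

```python
# Minimal, illustrative random-forest risk model (NOT the HART system).
# All data is synthetic and the features are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "cases": age, number of prior offences, months since last offence.
X = np.column_stack([
    rng.integers(18, 70, 1000),   # age
    rng.integers(0, 15, 1000),    # prior offences
    rng.integers(0, 120, 1000),   # months since last offence
])
# Synthetic outcome: re-offended within two years (invented rule plus noise).
y = (((X[:, 1] > 5) & (X[:, 2] < 24)) | (rng.random(1000) < 0.1)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble of decision trees; each tree sees a bootstrap sample of the data.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# The forest's averaged vote becomes a risk score between 0 and 1.
print("Predicted risk for a new case:", model.predict_proba([[25, 8, 6]])[0, 1])
print("Held-out accuracy:", model.score(X_test, y_test))
```

The point made in item 5 is precisely that a score like this can look authoritative while its reliability and fairness remain open questions.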
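As a rough illustration of the kind of data-driven triage automation discussed in item 6, the sketch below orders a queue of hypothetical two-week-wait referrals by a simple urgency score. The fields, weights and thresholds are assumptions made up for this example; they are not taken from NHS England guidance or from YouDiagnose.

```python
# Hypothetical referral triage sketch: rank two-week-wait referrals by a
# simple urgency score. Fields and weights are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Referral:
    patient_id: str
    red_flag_symptoms: int   # count of 'red flag' symptoms recorded
    days_waiting: int        # days since the referral was received
    suspected_site: str      # e.g. "lung", "colorectal"

# Invented weighting: red-flag symptoms dominate, waiting time breaks ties.
def urgency_score(r: Referral) -> float:
    return r.red_flag_symptoms * 10 + r.days_waiting * 0.5

def triage(referrals: list[Referral]) -> list[Referral]:
    """Return referrals ordered from most to least urgent."""
    return sorted(referrals, key=urgency_score, reverse=True)

queue = [
    Referral("A101", red_flag_symptoms=2, days_waiting=3, suspected_site="lung"),
    Referral("A102", red_flag_symptoms=0, days_waiting=12, suspected_site="skin"),
    Referral("A103", red_flag_symptoms=3, days_waiting=1, suspected_site="colorectal"),
]

for r in triage(queue):
    print(r.patient_id, round(urgency_score(r), 1))
```

In practice the scoring would come from clinically validated criteria rather than hand-picked weights; the sketch only shows where automation slots into the manual process described in the article.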
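Item 7 stresses that a symptom checker cannot diagnose COVID-19 and can only route people towards staying at home, self-care or urgent help. Under that assumption, the sketch below shows what such routing logic might look like in its simplest form; the symptom lists and rules are invented for illustration and are not Your.MD's algorithm.

```python
# Illustrative symptom-checker routing: not a diagnostic tool and not
# Your.MD's algorithm. Symptoms and rules are invented for this sketch.
def triage_advice(symptoms: set[str]) -> str:
    emergency = {"severe breathlessness", "chest pain", "blue lips"}
    covid_like = {"fever", "new continuous cough", "loss of smell"}

    if symptoms & emergency:
        return "Call emergency services now."
    if symptoms & covid_like:
        # The checker cannot confirm COVID-19; it can only advise isolation.
        return "Stay at home, self-isolate and monitor your symptoms."
    if symptoms:
        return "Self-care at home; contact your GP if symptoms worsen."
    return "No action needed; check again if symptoms develop."

print(triage_advice({"fever", "new continuous cough"}))
print(triage_advice({"headache"}))
```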