Code sets benchmarks for AI use in the NHS

20 Feb 2019

Businesses supplying artificial intelligence (AI) tools to the NHS in England will be expected to be open about the way their algorithms work, and to define key performance indicators (KPIs) for the technology, under a new code of conduct that has been developed.

Drawn up by the UK government and NHS England following consultation with industry, academics, regulators and patient representative organisations, the code is designed "to enable the development and adoption of safe, ethical and effective data-driven health and care technologies", the Department of Health & Social Care (DHSC) said.

The code comprises 10 principles, including one which encourages those developing, deploying and using data-driven technologies to "explain algorithms to those taking actions based on their outputs".

"When building an algorithm, be it a stand-alone product or integrated within a system, show it clearly and be transparent of the learning methodology (if any) that the algorithm is using," the code said. "Undertake ethical examination of data use specific to this use-case."

"Achieving transparency of algorithms that have a higher potential for harm or unintended decision-making, can ensure the rights of the data subject as set out in the Data Protection Act 2018 are met, to build trust in users and enable better adoption and uptake," it said.

The code also recommends that there is transparency over the context for the algorithm as well as for potential alternative contexts, and around the model on which the specifications are based.

"Show in a clear and transparent specification: the functionality of the algorithm; the strengths and limitations of the algorithm (as far as they are known); its learning methodology; whether it is ready for deployment or still in training; how the decision has been made on the acceptable use of the algorithm in the context it is being used (for example, is there a committee, evidence or equivalent that has contributed to this decision?); the potential resource implications," the code said. "This specification and transparency in development will build trust in incorporating machine-led decision-making into clinical care."

Guidance on how data analytics tools should be applied is also contained in the code.

"Algorithms should be trained to understand the levels of data quality first and then achieve their objective by using the variables given," the code said. "This two-stage approach should be built in so that high fluxes in data quality are handled appropriately."

Other principles contained in the code include those focused on ensuring new technology is secure, and that there is fairness, transparency and accountability around data use. According to the code, those developing or designing data-driven technology for use in the NHS should carry out a data protection impact assessment (DPIA).

The code also calls on data-driven technology developers and designers to present evidence of the effectiveness of their products or services and of their value for money, and said a "business case highlighting outputs, outcomes, benefits and key performance indicators (KPIs)" should also be set out.

"Clearly define KPIs and where the product will result in better provision and/or outcomes for people, in addition to outlining where and how cost savings or reductions are likely to be made," it said.

The evidence standards the technologists will need to present "will increase as the risk associated with the technology grows," the code said.

The government said the code will evolve over time.

"The code tackles a number of emerging ethical challenges associated with the use of data-driven technologies in the NHS and the wider health care system," it said. "We therefore expect to engage with the Centre for Data Ethics and Innovation on developing and monitoring the code to ensure it fits with the latest best practice. We will continue to engage with all members of the health and care community."

Dr Simon Eccles, chief clinical information officer for health and care in England, said: "Parts of the NHS have already shown the potential impact AI could have in the future of the NHS in reading scans, for example, to enable clinicians to focus on the most difficult cases. This new code sets the bar companies will need to meet to bring their products into the NHS so we can ensure patients can benefit from not just the best new technology, but also the safest and most secure."

The new code was publicised on the same day that the government announced the creation of 'NHSX', a new unit tasked with leading digital transformation in the NHS. NHSX will be made up of people from government, NHS bodies and industry.

The government said: "Currently, much NHS technology relies on systems designed for a pre-internet age. Patients are not getting the care they need because their data does not follow them round the system. Change has been slow because responsibility for digital, data and tech has been split across multiple agencies, teams and organisations. NHSX will change this by bringing together all the levers of policy, implementation and change for the first time."