We recently had a great talk by Chris Alderson of Hempsons Law firm on the importance of understanding information governance when it comes to health tech.
The NHS is a large economy; in England alone it has a budget of £124.7 billion. A major advantage of technology is that it can deliver real savings, not only in ‘back office’ functions but also by enhancing care and finding better ways to improve care pathways compared with the existing system.
There is currently a strong political drive to utilise Big Data and Artificial Intelligence, demonstrated by the announcement made by Theresa May on 21 May 2018. Big Data is a great resource for researchers, especially through the linkage of primary and secondary care data: there are 54.3 million individuals in England alone, with huge amounts of data already collected on all of them.
Rules that people should be aware of are as follows:
Article 6 of GDPR (General Data Protection Regulation): Processing is necessary for a task carried out in the public interest; or for the purposes of legitimate interests pursued by a data controller or third party, except where overridden by interests or fundamental rights and freedoms of data subjects which require protection of personal data.
Article 9 conditions also need to be met, which state processing is necessary for health or social care treatment; management of health or social care systems; public health (and undertaken by a health professional or someone owing the equivalent duty of confidentiality) or scientific research.
Processing data for research must follow Articles 5(1)(b) and 89.
Within the NHS there are various sources of ethical guidance on how data can be used, one of which is the Caldicott Principles.
Principle 1: Justify the purpose(s) for using confidential information
Principle 2: Don’t use personal confidential data unless it is absolutely necessary
Principle 3: Use the minimum necessary personal confidential data
Principle 4: Access to personal confidential data should be on a strict need-to-know basis.
Data should be appropriate and shared where necessary, in order to avoid mistakes: there is a duty to share as well as a duty to protect, and sometimes information has to be shared.
GDPR brings all these laws closer together. However, it is worth noting that GDPR is still less strict than NHS policy, which protects all personal information, including name, address and age. This information warrants the same security as health data; it is not enough to protect only parts of a record.
There is an NHS code of practice which should be used as a guide if you want to work in the NHS.
How do you develop an app?
One way to do this is the anonymisation of data, where there is never a need to identify the patient.
Pseudonymisation of data, where you may need to link back to patient identifiable data, will need Research Ethics Committee (REC) approval.
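To make the anonymisation/pseudonymisation distinction concrete, pseudonymisation can be sketched as replacing direct identifiers with a keyed hash: records stay linkable, but only someone holding the secret key could re-create the mapping back to the patient. This is a minimal illustrative sketch only; the field names and key handling are hypothetical, and any real project would still need the REC approval and governance steps described above.

```python
import hmac
import hashlib

def pseudonymise(nhs_number: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from an identifier using a keyed hash (HMAC).

    The same identifier always maps to the same pseudonym, so records can
    still be linked across datasets, but re-identification requires the
    secret key, which should be held separately by the data controller.
    """
    return hmac.new(secret_key, nhs_number.encode(), hashlib.sha256).hexdigest()

def strip_identifiers(record: dict, secret_key: bytes) -> dict:
    """Replace direct identifiers with a pseudonym, keeping clinical fields."""
    # Hypothetical field names, chosen for illustration only.
    direct_identifiers = ("nhs_number", "name", "address", "dob")
    cleaned = {k: v for k, v in record.items() if k not in direct_identifiers}
    cleaned["pseudonym"] = pseudonymise(record["nhs_number"], secret_key)
    return cleaned

key = b"held-separately-by-the-data-controller"
record = {"nhs_number": "9434765919", "name": "Jane Doe",
          "address": "1 High St", "dob": "1980-01-01", "hba1c": 48}
print(strip_identifiers(record, key))
```

Note that this is pseudonymisation, not anonymisation: because the key allows the mapping to be reversed, the output is still personal data under GDPR and must be protected accordingly.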
If you want to use NHS data, then you will need a partner organisation to supply the data, and their REC will need to provide approval. The NHS partner then defines what is done with that data, even if it is anonymised, and the NHS partner remains the owner of the database.
Ways around this include obtaining patient consent (their expressed informed consent, as in clinical trials). Alternatively, Section 251 of the NHS Act 2006 allows the use of personal confidential data to be authorised on a case-specific basis where there is no other way to progress the project.
Please note that every NHS organisation has to meet information governance standards, set out in the Data Security and Protection Toolkit, in order to be allowed access to NHS secure networks.
To finish, Chris reminded us of lessons that can be learned from the Google DeepMind and Royal Free case, as an example of what not to do with data.
Many thanks to Chris for sharing his time and expertise with us. Our next Lunch&Learn, ‘How to get investment ready’, is on Thursday 16 August, 12:30pm - 1:30pm, so put that in your diaries and do come and join us.