Not long ago your medical records consisted of a paper file stored in your family doctor’s office. There was little risk of the contents being stolen, altered, or made public, and it was rare for patients to see their own records. As the family-doctor model has given way to walk-in clinics, the need to make these records accessible has grown.
To facilitate this change, medical records are being transferred to electronic form, introducing new risks. Adding to this risk, people have become more involved in their own health care, creating a demand for personal devices that monitor and record health-related information. The ease of networking and cloud storage makes all of this information far more vulnerable, and the security of Personal Health Information (PHI) an increasingly pressing issue. The result has been the evolution of regulations to protect PHI:
- Canada – Personal Information Protection and Electronic Documents Act (PIPEDA)
- USA – Health Insurance Portability and Accountability Act (HIPAA)
- EU – Data Protection Directive (95/46/EC)
The rules differ in each jurisdiction, but in all cases the cost of documenting compliance is considerable. Health care providers are subject to these regulations, and any device providing this data to a clinic or hospital database is subject to a subset of these rules while storing and transmitting the data.
Where the device makes use of services such as commercial servers for online storage or transaction processing, business associate agreements with the service providers must be put in place to ensure compliance with the regulations.
As medical devices evolve and connect to online systems, they must not compromise the security of PHI. There are many items to consider, and multiple approaches are possible.
The first option to consider when dealing with PHI is to avoid the issue altogether through data de-identification. In many cases the goals and benefits of collecting this information can be achieved with anonymous data. The regulations include criteria for making information anonymous, and they involve more than just removing names: birthdates and locations can be used to infer identity, so this information must be removed or generalized as well.
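The de-identification step described above can be sketched in code. The record fields and the generalization rules here are illustrative assumptions only, not regulatory guidance; the actual criteria for anonymity come from the applicable regulation.

```python
from dataclasses import dataclass, replace

@dataclass
class HealthRecord:
    name: str          # direct identifier
    birthdate: str     # quasi-identifier, "YYYY-MM-DD"
    postal_code: str   # quasi-identifier
    readings: list     # the health data we actually want to keep

def deidentify(record: HealthRecord) -> HealthRecord:
    """Remove direct identifiers and coarsen quasi-identifiers."""
    return replace(
        record,
        name="",                             # drop the name entirely
        birthdate=record.birthdate[:4],      # keep only the birth year
        postal_code=record.postal_code[:3],  # keep only a coarse region prefix
    )
```

A real pipeline would also have to cover indirect identifiers (device serial numbers, rare diagnoses) and every copy of the data, including logs, caches, and backups.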
A few definitions help to clarify what is required:
Vulnerability – every point of data or information access to a component should be considered a vulnerability. This includes system logs, debug logs, application caches, system caches, exports, backups, and application access points. A good philosophy is to de-identify data wherever possible; by narrowing where identifiable data exists, we greatly reduce risk.
Threat – generally only human threats will apply; other categories, such as natural or environmental threats, should be noted for completeness.
Risk – for each vulnerability we need to consider “Unauthorized (malicious or accidental) disclosure, modification, or destruction of information” and “Unintentional errors and omissions”. We also need to consider how robust the system is to “IT disruptions” and “failure to maintain the system”.
Hazard – as defined in ISO 14971, a “source of potential harm”. Exposing PHI is not in itself necessarily a hazard, but if someone modified PHI in a way that could cause harm, it should be considered one. Loss of privacy can also be a hazard when the information falls into malicious hands.
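One way to keep these definitions actionable is a simple risk register that ties each vulnerability to the threats and effects considered for it. The structure below is a hypothetical sketch, assuming one entry per vulnerability/threat/effect combination; it is not a prescribed ISO 14971 format.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    access_point: str       # the vulnerability, e.g. "debug log", "backup export"
    threat: str             # usually a human threat (attacker, careless user)
    effect: str             # "disclosure", "modification", or "destruction"
    could_cause_harm: bool  # if True, treat as a hazard in the ISO 14971 sense

def hazards(register):
    """Filter the register down to entries that must be treated as hazards."""
    return [entry for entry in register if entry.could_cause_harm]
```

Walking every access point identified during the vulnerability analysis through this table forces each risk to be either accepted, mitigated, or escalated to the hazard analysis.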
In order to comply with the regulations, the device and the database and their method of use must be proven to present an acceptably low risk of disclosure. From a development perspective this involves taking a life cycle approach that includes everything from design and development to manufacture, use, and finally, disposal of devices containing patient information.
The scope of a PHI-related project must be carefully defined; otherwise the potential for scope creep is very high. With up-front analysis and architectural design, these issues can be addressed early in the cycle, and devices can deliver the benefits of providing patient data in the right context.