This past April, Apple launched ResearchKit - a framework used to develop apps that allow patients to participate anonymously in medical research studies. While these apps have the potential to advance medical research, there are concerns about privacy and security. In addition, the accuracy and integrity of the data provided by participants have been called into question.
ResearchKit is an open source software framework created specifically for medical research, with Apple's HealthKit framework providing the accompanying developer API for health data. Because ResearchKit works seamlessly with HealthKit, researchers can access additional relevant data for their studies, such as daily step counts, calorie consumption and heart rate.
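Access to that HealthKit data is opt-in on the participant's side. As a minimal sketch - assuming an iOS study app with the HealthKit entitlement, and using quantity types that simply mirror the examples above rather than any real study's protocol - the permission request might look like this:

```swift
import HealthKit

let healthStore = HKHealthStore()

// Read-only access to the kinds of metrics mentioned above; a real study
// would request only the types its protocol actually needs.
let readTypes: Set<HKObjectType> = [
    HKObjectType.quantityType(forIdentifier: .stepCount)!,
    HKObjectType.quantityType(forIdentifier: .dietaryEnergyConsumed)!,
    HKObjectType.quantityType(forIdentifier: .heartRate)!
]

// The participant sees Apple's permission sheet and can grant or deny
// each type individually before any data reaches the app.
healthStore.requestAuthorization(toShare: nil, read: readTypes) { _, error in
    if let error = error {
        print("Authorization request failed: \(error)")
    }
}
```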
An organization or individual that wishes to create an app on Apple's ResearchKit or HealthKit framework does not need to be HIPAA compliant, meaning that anyone can create an app, provided it receives IRB or ethics approval before it is made available for download.
The way in which healthcare apps collect data falls into something of a gray area, particularly in relation to HIPAA compliance. As it stands, the HIPAA rules governing the collection, use and protection of ePHI do not apply to data collected and shared by an app that a consumer chooses to use, because HIPAA applies only to protected health information. If the application allows the user to send information, the application itself is not subject to HIPAA, although the information does become subject to HIPAA once the organization conducting the research receives it.
The lack of compliance and regulation around these apps makes participants vulnerable. They are under the impression that they are providing medical information as part of a medical study to an authorized researcher, when in fact they could be sending data straight to a malicious developer who has created an app for the sole purpose of stealing personal medical information. With ePHI reportedly worth up to 10 times more than financial data on the black market, this method of harvesting data presents an easy opportunity for data thieves.
Apple states that the data collected from ResearchKit and HealthKit apps is not available to them – which, given the recent iCloud hacks, should reassure those providing healthcare data. Apple has met with the Federal Trade Commission to discuss its commitment to protecting the information and identities of its customers, namely by prohibiting the sharing of collected data with third parties. Despite this, there is no guarantee that collected data is safe while it is shared within an app, and what happens to the data once it is exported from the app raises further security questions.
The data collected by organizations already aware of, or regulated by, HIPAA is likely to be stored in a secure cloud. However, those unfamiliar with how such data should be transmitted and stored will undoubtedly store participants' sensitive healthcare data in ways that leave it vulnerable to compromise.
Any medical researcher or developer who creates an app on the ResearchKit framework will need to keep in mind that HIPAA compliance will be their sole responsibility once the data has been collected. To meet this standard, they need to ensure that the data is encrypted both in transit and at rest.
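What that looks like in practice will vary by app, but a minimal sketch for an iOS client - where the file name and upload endpoint are purely illustrative assumptions, not part of any real study - might be:

```swift
import Foundation

// At rest: write collected responses with complete file protection, so iOS
// keeps the file encrypted whenever the device is locked.
let studyData = Data("example survey response".utf8)
let fileURL = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("responses.json")
try? studyData.write(to: fileURL, options: .completeFileProtection)

// In transit: send only over HTTPS. App Transport Security requires TLS for
// https:// URLs by default, so the upload is encrypted on the wire.
var request = URLRequest(url: URL(string: "https://research.example.org/upload")!)
request.httpMethod = "POST"
request.httpBody = studyData
URLSession.shared.dataTask(with: request) { _, _, error in
    // Handle the server's response or any transport error here.
}.resume()
```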
That being said, there are benefits to collecting data remotely. Provided that no protected health information is gathered by the app, data can be sent directly to researchers without falling under HIPAA. Collecting data this way is less intrusive and has less impact on the quality of life of seriously ill patients who are willing to share medical information.
Another area in which ResearchKit has come under criticism, from both a scientific and an ethical standpoint, is the data being collected from participants. Firstly, to take part in a study the user has to belong to a demographic likely to be able to afford an iPhone. This biases the data collected toward more homogeneous populations and excludes people from other ethnic or socioeconomic backgrounds, including some who may be more susceptible to health problems.
Secondly, the only requirement for taking part in a study is the participant's consent; there is no way to verify whether a participant actually has the condition being researched, so it is easy to lie on the initial screening questions and fake eligibility. This leaves studies wide open to fraud, which further undermines the integrity of the data.
Despite these concerns, ResearchKit will change the way in which clinical research is conducted in the future, and it will be interesting to see how this develops as more medical research apps become available.
Gene Fry is the compliance officer and vice president of technology at Scrypt. He joined Scrypt in October 2001 and has 25 years of IT experience in industries including healthcare, working for companies based in the U.S. and Latin America. He is a Certified HIPAA Professional (CHP) through the Management and Strategy Institute. In addition, he is certified as a HIPAA Privacy and Security Compliance Officer by the Identity Management Institute, holds a Certified Electronic Health Record Specialist (CEHRS) certification through the National Healthcareer Association, and holds a Gramm-Leach-Bliley Act (GLBA) certification from BridgeFront and J. J. Keller.