With the IoMT market set to hit $158 billion by 2022, the sweeping digitisation of healthcare organisations the world over, and increased user interest in self-monitoring health stats, there has never been a better time to develop a healthcare app.
But product owners need to keep in mind that healthcare app development is fraught with risk. Applications that deal with such a serious matter as human health have no room for mistakes. Key to a successful healthcare app are the following:
In this article, we will go into the specifics of the last point and explain how healthcare apps comply with governmental regulations and guarantee the security of users' personal data.
The health data recorded and accumulated by mobile apps falls under the category of sensitive data. In the US, the term PHI (protected health information) designates individually identifiable information used by healthcare services and regulated by the Health Insurance Portability and Accountability Act (HIPAA). Under Australian law, health information is a subset of personal information that requires additional protection.
What is considered health information?
If the app doesn’t collect, use, disclose, or hold users’ personal information, you don’t need to comply with any legislation. But that is rarely the case with healthcare solutions. Since they aim to improve a person’s wellbeing or help identify health issues and optimal treatment plans, healthcare apps usually need to collect personal data. Mobile applications use all possible health information categories including daily activity levels, sleep, cardiac measurements and so on.
Leakage of sensitive data can lead to fraudulent activity and damage a company's reputation. The widely publicised cases of AMP and HealthEngine recently proved, once again, that protecting patient information should be top of the agenda for the healthcare industry and for any other business that uses health data.
Before we get into nation-specific regulatory details, there are two crucial conditions under which a mobile app may collect health data:
Health applications can store data in self-hosted databases (run, for instance, on EC2 instances), managed databases (operated by a provider such as MongoDB Atlas or AWS RDS), or cloud databases as a service (for example, Firebase or AWS DynamoDB).
The first approach allows for the use of any database but comes at greater cost and requires dedicated staff with the necessary skill set. In contrast, managed databases free product owners from the complexities of scaling, security, and monitoring, but the choice is limited to standard servers which can't be modified.
Using cloud databases is the most flexible option, as these offer virtual servers that can synchronise data between regional clusters. However, this approach leaves little opportunity to customise your data storage or isolate your data from that of other users.
What is important, regardless of the chosen approach, is that any database should support encryption of data at rest and in transit. Backups should be created automatically and also encrypted.
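To make the point concrete, here is a minimal sketch of application-level encryption applied before a record ever reaches the database or a backup, using the third-party Python cryptography package's Fernet recipe. The function names and inline key generation are our own illustration; in production the key would come from a managed key store (for example, a KMS) rather than being created next to the data.

```python
from cryptography.fernet import Fernet

def encrypt_record(key: bytes, record: bytes) -> bytes:
    """Encrypt a serialised patient record before writing it to storage."""
    return Fernet(key).encrypt(record)

def decrypt_record(key: bytes, token: bytes) -> bytes:
    """Decrypt a record read back from storage or from a backup."""
    return Fernet(key).decrypt(token)

# Illustration only: a real app loads the key from a key management service.
key = Fernet.generate_key()
token = encrypt_record(key, b'{"heart_rate": 72}')
assert decrypt_record(key, token) == b'{"heart_rate": 72}'
```

The same pattern covers encrypted backups: encrypt before upload, so the backup store never holds plaintext health data.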
For users who might need offline access to their data, there can be an option to create a local copy on their device. Data should remain encrypted both when exported to the device and when accessed from local storage. For instance, the HIPAA-compliant app Epocrates allows people to download its entire database of prescription drugs for use offline, which is helpful when there is no reliable internet connection.
Data fragmentation has been an enormous challenge for the healthcare industry and one of the major reasons digital solutions aren't adopted at a larger scale. A patient who receives a diagnosis and then goes through further analysis and treatment generates information in several separate EHR systems, one for each doctor involved. There are platforms aimed at making data exchange between medical specialists easier, but in general, data interoperability remains problematic.
With so many gigabytes of medical data stored and exchanged via different platforms, it's essential to keep it secure. The example of the My Health Record system, which stores millions of medical documents, shows the complexity of the security issue in healthcare. Startups that want to offer a new patient-doctor solution or an app for doctor communication must understand that unless they streamline the process, they won't bring any value to users. Creating yet another portal for medical data will simply cut into the time a doctor can spend with a patient.
Most developed countries have similar legislation regarding data security and privacy. Regulations may vary depending on the type of organisation (whether it operates in the public or private sector) and the size of the business. As regards mobile applications, those identified as medical devices face heavier regulation.
There are also individual regulations for cloud-based services. Apps that store patient information in the cloud must adopt the latest security practices, such as using HTTPS with TLS 1.3, two-factor authentication, protection from man-in-the-middle attacks, and encryption of data at rest and in transit.
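As a sketch of the transport-side requirement, Python's standard ssl module can be configured to refuse anything older than TLS 1.3 on the server side. The certificate paths in the comment are placeholders, not real files:

```python
import ssl

def make_tls_context() -> ssl.SSLContext:
    """Build a server-side TLS context that rejects protocols below TLS 1.3."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3
    # ctx.load_cert_chain("server.crt", "server.key")  # placeholder paths
    return ctx

assert make_tls_context().minimum_version == ssl.TLSVersion.TLSv1_3
```

Most web frameworks and load balancers expose an equivalent minimum-version setting; the point is to configure it explicitly rather than rely on library defaults.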
In the US, HIPAA sets the privacy and security rules that companies collecting patient data must follow. HIPAA compliance for mobile apps encompasses two major rules. The privacy rule, with standards established by the United States Department of Health and Human Services (HHS), requires ensuring that patients have access to their data and it’s stored securely. The security rule defines technical and nontechnical safeguards for the protection of health data kept and transferred electronically. HIPAA also sets requirements for healthcare app development services to execute audit trails for proper information handling and patient data exchange.
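HIPAA doesn't prescribe a particular audit-trail format, but a hash-chained log is one common way to make access records tamper-evident. In the sketch below the field names are our own illustration, not HIPAA-mandated: each entry embeds the hash of the previous one, so any after-the-fact edit breaks verification.

```python
import hashlib
import json
import time

def append_event(trail: list, actor: str, action: str, record_id: str) -> None:
    """Append a PHI access event, chained to the previous entry's hash."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    entry = {"ts": time.time(), "actor": actor, "action": action,
             "record_id": record_id, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        (prev_hash + json.dumps(entry, sort_keys=True)).encode()).hexdigest()
    trail.append(entry)

def verify(trail: list) -> bool:
    """Recompute the chain; any modified entry makes verification fail."""
    prev = "0" * 64
    for e in trail:
        body = {k: v for k, v in e.items() if k != "hash"}
        expected = hashlib.sha256(
            (prev + json.dumps(body, sort_keys=True)).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

trail = []
append_event(trail, "dr_smith", "view", "rec-42")
append_event(trail, "dr_jones", "edit", "rec-42")
assert verify(trail)
```

In a real deployment the trail would live in append-only storage with its own access controls; the chaining only detects tampering, it doesn't prevent it.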
Apart from HIPAA compliance, mobile apps can be regulated by the US Food and Drug Administration (FDA). General wellness solutions such as Fitbit are classified as apps that generate traditional customer data and don't fall under FDA requirements. Apps that aim to diagnose, prevent, or treat a specific disease are subject to such requirements. The data they collect must be stored in a HITRUST-compliant environment.
In the EU, the most important document regarding personal data is the 2016 EU General Data Protection Regulation (GDPR). It has some unique requirements, including data minimisation (the right to collect health data is granted only to those with a clearly defined purpose) and a patient's right to correct or erase their data.
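The data minimisation principle can be enforced mechanically: keep a registry of the fields that have a documented processing purpose and drop everything else before storage. The registry and field names below are purely illustrative.

```python
# Illustrative purpose registry: only fields listed here may be stored.
ALLOWED_FIELDS = {
    "heart_rate": "cardiac monitoring",
    "sleep_hours": "sleep coaching",
}

def minimise(payload: dict) -> dict:
    """Strip any field that has no documented processing purpose."""
    return {k: v for k, v in payload.items() if k in ALLOWED_FIELDS}

raw = {"heart_rate": 72, "sleep_hours": 7.5, "location": "Sydney"}
assert minimise(raw) == {"heart_rate": 72, "sleep_hours": 7.5}
```

Keeping the registry in one place also gives auditors a single artefact that documents why each piece of health data is collected.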
The Data Protection Act in the UK sets similar privacy and security standards. Essentially, developers must get consent to collect personal data and ensure the privacy and confidentiality of patient data.
In Australia, the Privacy Act sets health data regulations. Enforced by the Office of the Australian Information Commissioner (OAIC), such regulations relate to commercial entities with a turnover of over AUD$3 million. There are exceptions when sensitive information can be used without consent and these are related to so-called permitted situations when the use or disclosure of personal data is reasonably necessary. For more details, check this compliance guide.
Australians strongly support governmental control over digital health: in a Health Panel survey, the majority of respondents said that the government should review and rate health and wellness apps for accuracy and efficacy.
An app that must comply with Australian privacy principles is one that claims to do one or more of the following:
The promotional materials distributed by mobile app companies are also regulated. Australian consumer law prohibits any deceptive or misleading statements and requires clarity about the app's cost, functionality, and data usage.
Besides varied national regulations, there are globally acknowledged requirements pertaining to the security of payments. Since 2006, the Payment Card Industry Data Security Standard (PCI DSS) has been maintained by the Payment Card Industry Security Standards Council (PCI SSC). PCI DSS compliance involves maintaining secure systems, protecting cardholder data, and running regular tests on networks.
Reporting data breaches is a mandatory part of national data security regulations. In the last decade, over 29 million patient health records have been compromised in data breaches, and that's in the US alone.
HIPAA includes the Omnibus Rule and the Breach Notification Rule, which require healthcare app development companies to report a breach and notify patients about possible data disclosure. In Australia, companies are obliged to report such events to the OAIC within 30 calendar days. Fines for data breaches, as well as for non-compliance, reach up to AUD$10 million in Australia, and up to €20 million in Europe.
Besides following legal requirements, companies should take action to isolate affected systems and find a recovery point to return to normal operation. It’s important to identify what caused the problem and implement procedures to protect systems from potential attacks and breaches.
Despite the detailed regulations set by governments and hefty non-compliance fines, many existing health apps fail to provide an adequate level of security. A 2015 study of accredited health and wellness apps demonstrated that 73 out of 79 applications lacked encryption for stored data. Patient access to their personal details is also an issue. As the authors of a 2019 study of health apps note, people often lack access to the data they have contributed; either it's not accessible at all or access requires payment.
Non-specific applications offering guidance on maintaining a healthy lifestyle, as well as medical apps that only provide information without analysing symptoms or recommending treatment, are not classified as medical devices.
If an app provides some health recommendations based on integration with a sensor or other measuring device, it is considered part of a medical device. For example, an app that gathers and analyses data from a glucose meter is regulated as part of the glucose meter.
Applications that control medical devices through Bluetooth or WiFi are classified as software as a medical device (SaMD). As we’ve mentioned earlier, the FDA is responsible for regulating such programs in the US.
In Australia, the Therapeutic Goods Administration (TGA) regulates medical devices according to what level of risk the product represents.
In the UK, the Medicines and Healthcare products Regulatory Agency (MHRA) regulates apps classified as medical devices. Developers need to register with the MHRA and obtain CE marking certification.
In the EU, the 2017 Medical Devices Regulation sets the rules for such applications. Mobile solutions that collect data from the human body through sensors in order to provide a diagnosis or coach patients to follow treatment recommendations need to fulfil the regulation's requirements and pass the CE marking process. If such an app is developed for internal use only, it doesn't need CE marking.
User data, especially large volumes of it, can be used to improve healthcare in general. When patient-generated information is compared against clinical outcomes, it offers valuable insights about health conditions and the precautionary steps needed in particular situations. Given these details, healthcare providers can better identify health issues and deliver the optimal treatment in each individual case.
According to a 2017 health trends report, the majority of patients are willing to share their data. Of the respondents, 84 per cent were ready to contribute their vital statistics or basic lab tests and almost half were willing to share their health records.
We’ve already seen how apps with heart-rate sensor integration can save lives by identifying problems. If analysed, patient data can make the industry more efficient by eliminating unnecessary hospital visits and prompting critically important examinations.
Data analytics companies are the highest funded in digital health, and the mobile apps category is second when it comes to the number and amount of investments. The funding situation shows that the industry is geared towards information collection and usage, not least via applications.
In essence, users of health applications should understand why and how their data is being collected. There are four major aspects app owners must adhere to:
To make a trustworthy health application, make sure authorship, funding sources, and contact information are disclosed. When dealing with third-party partners, check how transparent their data collection policies are. For all in-app purchases, explain to customers in detail how they pay for the service and which exact features are included. If your service makes a health claim, it must be supported by clinical evidence.
In case you have any questions about developing a compliant healthcare app, reach out to us to find out how we can help.