The modern healthcare ecosystem is undeniably rewarding to everyone it touches. It dramatically improves the efficiency of healthcare services, optimizes healthcare workflows, and drives cutting-edge research that improves quality of life. But it is also immensely complex and inherently insecure, highly susceptible to security threats, especially from threat actors whose primary intention is to commit fraud, obtain non-prescribed drugs, or extract ransom.

A Privacy Nightmare?!

The complexity and low security maturity of the ecosystem primarily stem from the coexistence of diverse legacy and modern technologies that carry significant inherent vulnerabilities (see the OWASP IoT Top 10) and conflicting security requirements that muddy ongoing security efforts. Unhygienic security practices, such as casual data sharing in conversations, on social media, and in chat groups, exacerbate the situation. Other common issues that add to the complexity include:

  • Unethical Motives – selling PHI to advertising agencies
  • Acquisitions – consolidation of assets and practices is common in the healthcare industry, and security is only as strong as the weakest link
  • Inept and Garbled Privacy Policies – vague, convoluted policies encourage users to share data thoughtlessly
  • Disclosure Exceptions – the government is exempted by law from privacy rules in matters of national security. Healthcare providers therefore occasionally reveal sensitive information in good faith to uphold public safety and security.
  • Misrepresented Public Records – the FDA requires physicians and healthcare providers to report issues with devices, but in some cases reporting is voluntary and cumbersome. An issue may therefore be tagged as a subsidiary of another issue to save on paperwork. The result is an incomplete, inaccurate picture of a device's issues, so physicians who rely on these publicly accessible reports to assess device safety before prescribing may inadvertently prescribe an insecure product.
  • “Intended for medical purpose” – some products, such as wellness apps and wearables, are not conceived by their manufacturers explicitly as medical devices but are nonetheless “intended” for medical purposes. Under the law, manufacturers of such incidentally medical apps and devices do not have to build them to the same security standards required of conventional medical devices.
  • Unchecked and Uncontrolled
    • Data collection – one might not realize it, but PII and PHI are often added to mandated public databases without one's consent in the name of national interest
    • Data usage by third parties – in one instance, third-party analytics services were found to be capable of linking data from fitness and health applications to other applications containing identifying information about the user, enabling Big Brother-like surveillance

The Impact

In short, the ecosystem as a whole is an avenue for dire and far-reaching medical data privacy violations, and the impact is manifold. A breach can lead to substantial fines and reputational damage for both healthcare providers and manufacturers.

At an individual level, privacy violations can cause stigma, embarrassment, and discrimination, in turn resulting in unemployment, loss of health insurance and housing, and so forth. Patients may lose trust in their care provider, undermining communication between physician and patient.

At a societal level, loss of individual trust may lead to rejection of medical assistive technologies, thereby hampering the development and successful rollout of e-health technologies into society. Moreover, individuals are far more willing to participate in and support research when they believe their privacy is protected (the equation is simple: higher-quality data means higher-quality medical care).

At a national level, privacy violations can lead to espionage-like situations, as the Cambridge Analytica scandal made evident. A private, for-profit, or government organization may come to know and own a great deal of data about individuals, while those individuals know little about the company or government entity. Combined with unchecked use of PII/PHI by the data owners, this asymmetry can invite authoritarian intrusions into citizens' private lives and unfair scrutiny of citizens based on opaque computer algorithms.

In part two we will cover the privacy enigma and conclude with best practices to preserve privacy.

Vishruta Rudresh