Personalisation has become a critical component of modern marketing. Greater personalisation makes for a more enjoyable and relevant customer experience, while allowing businesses to make better use of their resources through targeted, high-impact interactions.
But increased personalisation comes at the cost of personal privacy. It requires greater knowledge of individuals’ commercial, social, financial and personal habits, goals, and concerns. Most people are willing to sacrifice a little privacy for a more personalised experience, but balancing the two pragmatically and ethically is a thorny issue, one that the tightly regulated space of pharma and healthcare continues to struggle with.
Privacy concerns are on the rise
The proliferation and increased sophistication of tracking software has not gone unnoticed by consumers.
According to a study by the Pew Research Center, two thirds of Americans believe it’s no longer possible to go through life without being tracked by companies and governments. A majority feel they have little or no control over how their data is used, and, importantly, more than 80% of individuals feel the potential risks outweigh the benefits. Finally, most individuals feel companies are unlikely to take responsibility (79%) or be held accountable by governments (75%) for data misuse.
As for healthcare, data privacy issues have always been a key concern. But wearables and fitness trackers, capable of collecting vast amounts of personally identifiable health information and run by app makers and commercial entities not traditionally bound by HIPAA or the various healthcare clauses of the GDPR, have added complexity to an already complicated regulatory space. So too has the multiplication of start-ups that straddle the border between commerce and healthcare.
Tech and privacy: Evolution in lockstep
As technology has evolved, so too have privacy concerns. The democratisation of sophisticated artificial intelligence (AI) applications, APIs and tools means that companies and governments can penetrate more deeply into people’s lives, at scale, and understand individual and collective human behaviour in astounding, even alarming new ways.
Conversely, evolving privacy policies are also having a profound effect on emerging technologies.
Third-party cookies, for example, are in the final stages of being phased out. A number of innovative, privacy-forward solutions have been proposed in their place, like Google’s FLoC and its successor, Topics. The former grouped users into cohorts with similar interests, preserving individual privacy while still letting advertisers target audiences. With Topics, the browser derives broad interest categories from the websites you visit and keeps that information on-device, sharing only a small set of coarse topics with advertisers.
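The cohort idea can be illustrated with a toy sketch. The `cohort_id` function below is hypothetical and is not Chrome’s actual algorithm (FLoC used locality-sensitive hashing so that *similar*, not just identical, histories cluster together); it simply shows the principle that an advertiser sees only a coarse group label shared by many users, never the underlying browsing history:

```python
import hashlib

def cohort_id(visited_domains, num_cohorts=1024):
    """Map a user's browsing history to one of a few broad cohorts.

    An advertiser would see only the cohort ID, which is shared by
    many users, rather than the list of sites the user visited.
    """
    # Sort and deduplicate so users with the same interests hash identically.
    fingerprint = ",".join(sorted(set(visited_domains)))
    digest = hashlib.sha256(fingerprint.encode()).hexdigest()
    return int(digest, 16) % num_cohorts
```

Two users with the same browsing habits fall into the same cohort, so no individual can be singled out from the cohort ID alone.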
The last two or so decades, however, have been marked by a kind of “arms race” between privacy and marketers’ attempts to personalise experiences, sometimes despite users’ specific requests that they not do so. Fingerprinting, for example, is a clever but ethically questionable practice, employed by hundreds of the world’s most popular websites, that uses the profile of a user’s computer, from screen resolution to operating system and even the presence of “Do Not Track” requests, to identify and track them anyway.
This kind of approach to privacy and personalisation is exactly why so many individuals have concerns over privacy, and why people are less and less willing to share their data with marketers.
The way forward for the pharma and healthcare marketer
The way forward, for pharma and healthcare marketers and beyond, is to establish a harmonious relationship between privacy and personalisation, one that is resolutely customer-centric and places the needs and interests of customers, voiced or not, at the heart of the customer experience.
How to get started
To get started, businesses should:
(1) Ensure privacy is understood and championed at all business levels through training, both on day-to-day best practices and high-level policies. The essential concepts of consent, sanitisation, accountability, minimisation, context and portability should be well understood by all those who collect, view, analyse, manage or share data.
(2) Ensure individual pieces of data stay in the right context and are appropriately siloed. Departments dealing with practitioner data should not have access to patient data unless absolutely necessary, and vice versa. Likewise, information from one region should not be made available to those operating in another, unrelated region.
(3) Make decisions on privacy from a customer perspective, rather than a business or commercial one. Personal information, identifiable or not, should be used for specific ends that benefit that customer directly. Only data that is both sufficient and necessary should be collected, stored and used, as reflected in the GDPR’s principles of data minimisation and purpose limitation.
(4) Recognise that privacy laws are ever-changing. The work of businesses vis-à-vis the privacy of their customers is never “done”, and there is no one-off “solution” to privacy.
(5) Be transparent about how data is collected, analysed, organised and used, and gain informed consent. Patients, practitioners and payers should have a clear, complete and simple explanation of how their information is used and how it benefits them directly.
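The minimisation and consent principles in steps (3) and (5) can be sketched in a few lines of code. The field names, purpose string and `ConsentRecord` shape below are hypothetical, purely for illustration; the point is that only an agreed whitelist of fields survives collection, and an auditable consent record is written at the same moment:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical whitelist: the only fields this campaign actually needs.
ALLOWED_FIELDS = {"specialty", "region", "preferred_channel"}

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str          # why the data was collected (purpose limitation)
    granted_at: str       # when consent was recorded
    fields_collected: list

def collect(raw_profile: dict, subject_id: str, purpose: str):
    """Keep only whitelisted fields and log what was collected and why."""
    minimised = {k: v for k, v in raw_profile.items() if k in ALLOWED_FIELDS}
    consent = ConsentRecord(
        subject_id=subject_id,
        purpose=purpose,
        granted_at=datetime.now(timezone.utc).isoformat(),
        fields_collected=sorted(minimised),
    )
    return minimised, consent
```

Anything outside the whitelist, such as a diagnosis field that the campaign has no stated purpose for, is dropped before it is ever stored.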
How to scale
For businesses that need to balance privacy and personalisation on a larger scale, the following steps are essential:
(1) Create roles and a budget for privacy officers. Depending on the size and complexity of the company, it may be necessary to have several privacy officers who understand the unique challenges and requirements of individual business units.
(2) Invest in appropriate privacy legal counsel and perform regular security and privacy audits. As mentioned above, privacy laws and customer expectations are constantly evolving. Large businesses will need to ensure that audits and gap analyses are performed at regular intervals and that action is taken on the findings.
(3) Build privacy directly into business processes, software and tools to reduce the need for human intervention and the opportunity for human error. Privacy shouldn’t be treated as an afterthought or a nuisance, but as an integral part of every business operation and process, on the same level as revenue and medical regulatory compliance.
(4) Ensure AI and machine learning (ML) tools are implemented and operated in full compliance with privacy laws. Because of the “black box” nature of many such tools, especially machine learning and deep learning, and because machines are not active, conscious “observers” like humans, there is a tendency to be more lax with privacy. This is a mistake, and it can have catastrophic consequences for businesses that fail to protect privacy even in the absence of human observers.
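One concrete way to build privacy into an ML pipeline, in the spirit of step (3), is to pseudonymise direct identifiers before any record reaches a model or analytics tool. The sketch below is a minimal illustration, assuming a hypothetical `patient_id` field and HMAC-based tokenisation (one common approach among several); the same patient always maps to the same token, so longitudinal analysis still works, but the mapping cannot be reversed without the key:

```python
import hashlib
import hmac

# In practice this key would come from a secrets manager and be rotated.
SECRET_KEY = b"rotate-me-regularly"

def pseudonymise(record: dict, id_field: str = "patient_id") -> dict:
    """Replace the direct identifier with a keyed hash before the record
    enters an analytics or ML pipeline, leaving other fields untouched."""
    out = dict(record)
    token = hmac.new(SECRET_KEY, record[id_field].encode(), hashlib.sha256)
    out[id_field] = token.hexdigest()[:16]
    return out
```

Because the transformation happens in the pipeline itself rather than relying on an analyst remembering to strip identifiers, the protection holds even when no human is watching, which is precisely the failure mode step (4) warns about.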
Conclusion
Businesses must walk a fine line between privacy and personalisation. Most customers, including practitioners and patients, prefer personalised services and respond better to messages and products that are relevant to them. But they are also wary of how their personal information is being collected and used. In the tightly regulated healthcare space, achieving this balance is paramount.
Found this article interesting?
To learn more about AI and privacy versus personalisation, contact Eularis today.
For more information, contact Dr Andree Bates abates@eularis.com.