“Big Data, Big Risk: Strategies to Mitigate Risks Associated with Data Monetization”
InsideCounsel 12/18/14 By Simone Colgan Dunlap
As stated in a 2014 report from the Executive Office of the President, “[w]e live in a world of near-ubiquitous data collection.” Companies are responsible for the lion’s share of data-gathering activity; however, large, fast (i.e., conducted at a speed that is increasingly approaching real-time) and diverse data — so-called “big data” — is not just a by-product of business, but a business unto itself, offering endless possibilities for monetization.
Interested? You should be. Big data is no longer a tool reserved for mega-companies and governments. Recent advances in technology have enabled businesses of all sizes to amass, share and analyze data in ways never before possible. In addition to internal monetization strategies like streamlining processes and enabling new products, the widespread accessibility of big data has resulted in a myriad of new opportunities for external data monetization.
Take, for example, the concept of healthcare providers as purchasers of consumer data. Skeptical? Well, as reported in a recent Bloomberg Businessweek article, it is already happening. Carolinas HealthCare System, one of the nation’s largest healthcare systems, has started buying data about patients (e.g., credit card purchases) and plugging it into algorithms designed to predict and prevent illness. The thought is that mundane data, like the types of food purchased or whether a patient has let a gym membership lapse, will allow providers to identify at-risk patients and initiate targeted interventions.
Further, monetizing big data is not just a theoretical way to generate additional revenue — it will be essential to remain competitive. According to Gartner, Inc., 30 percent of businesses will have begun directly or indirectly monetizing their information assets through bartering or direct sales by 2016, and 64 percent of organizations invested or planned to invest in big data in 2014. These percentages are expected to increase dramatically in pace with rapid innovations in data-related technology. The lesson: Use it or lose it.
Ready to dive in? Before you leap head-first, consider that big data monetization comes with potentially BIG risks.
One of the biggest risks associated with use of big data stems from regulatory issues. The regulation of data is complex and is shifting rapidly. Accordingly, a critical part of creating a successful data monetization strategy involves understanding regulatory constraints related to data acquisition, use and disclosure.
Unlike other jurisdictions, the U.S. does not have a single data protection law. Rather, there are numerous federal and state laws that address privacy on a sectoral basis — primarily by industry. Such sector-specific laws create protections applicable only to specific types of entities, data or activities. Prominent examples of industry-specific regulation include the Gramm-Leach-Bliley Act and the Health Insurance Portability and Accountability Act of 1996.
What’s the risk? Statutory penalties for violating privacy laws can be severe and include hefty fines and even criminal penalties. In addition, some laws permit affected individuals to sue for violations, or could arguably serve as grounds for such lawsuits. So, unless your organization’s executive team has embraced the mantra that orange is the new black, these laws should not be taken lightly.
Outside of the regulated industries context, the Federal Trade Commission (FTC) has broadly interpreted its authority under Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices in or affecting commerce,” to pursue numerous enforcement actions for a broad range of alleged violations against entities whose information practices are deemed “deceptive” or “unfair.” Section 5 allows the FTC to bring enforcement actions against alleged violators that usually come with their own special brand of penalty — consent decrees or orders that prohibit future misconduct and can and do require periodic audits for up to 20 years! Notably, the FTC is able to fine organizations that have violated a consent decree or order.
Chances are, if your organization is operating within a regulated industry, you are aware of applicable data privacy requirements. But mitigating risk requires organizations to go beyond awareness to developing what the FTC has dubbed “privacy by design” — building workable privacy protections into policies, procedures and products, and periodically re-evaluating them. This is no small task for a number of reasons. Privacy laws can be onerous. In addition, relevant laws, particularly at the state level, cannot keep pace with developments in big data, resulting in unintended consequences and regulatory gray areas. Moving data across borders? Matters become even more complicated because privacy laws differ by country and many are in flux.
In addition to risk arising from regulatory noncompliance, organizations should also evaluate contractual sources of risk related to data monetization. Many commercial agreements restrict the use and re-disclosure of data. This is especially true if you are an agent or contractor. Contractual restrictions are not likely to be handily labeled and may be buried in ownership or confidentiality provisions. Accordingly, a careful review of existing contracts should be conducted to determine the parameters of relevant restrictions and how they affect intended uses. Ideally, organizations should also determine how they want and need to use data, and ensure that future contracts are negotiated accordingly.
Another source of risk stems from relationships with consumers. Preserving trusting relationships is critical to the long-term health of an organization. Accordingly, legal requirements should provide a floor for the protection of data, but companies should also evaluate whether the potential economic gains from lawfully using data in new ways outweigh the potential breaches of trust that may result in reputational harm and financial losses. Keep in mind that angry consumers can do more than sheathe their credit cards and tweet scathing reviews. U.S. common law has handed consumers a serious weapon in the form of privacy torts — i.e., no caps on damages, potential for class actions, torts with a capital “T.”
Looking for guidance on whether a particular use passes the sniff test? In 2012, the Obama Administration published a “Consumer Bill of Rights” intended to tell consumers what they should expect from companies that handle their personal information, and to set expectations for companies that use data. Key tenets include providing transparency regarding privacy practices (usually in the form of a privacy notice); respecting consumers’ right to expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data; and providing consumers with the ability to exercise control over how their data is used. As with all relationships, clear communication will go a long way toward maintaining consumer trust.
Also, consider whether some degree of de-identifying or anonymizing data is an option for managing risk. Most laws related to privacy protection were enacted with the goal of protecting the individual who is the subject of the information. As such, it stands to reason that appropriately sanitizing data so that the person who is the subject of the data is no longer identifiable greatly reduces risk, both to the individual and to the organization that maintains the data. The trade-off? Reduced utility.
Cyber liability insurance is another tool available to manage risks associated with data. Most general liability policies specifically exclude losses arising from Internet-related activities. A good cyber liability policy can go a long way toward plugging this “gap” in coverage. Cyber liability insurance is still fairly new, so there is a lot of variation among policies, and a lot of room for negotiation. When evaluating cyber liability policies, pay particular attention to the triggers and limitations for coverage. For example, some policies will pay for the costs and expenses associated with identifying who must be notified of a breach and providing required notices, but stop short of providing funds for business interruption or for hiring a public relations firm to repair reputational damage in the wake of a breach.
Perhaps most importantly, in order to effectively assist your organization in navigating the risks associated with monetization, you need to have a firm handle on how it uses data now and how it plans to use data in the future. Below is a series of questions to jump-start a dialogue with relevant stakeholders:
How is data collected?
What type of data is collected?
What is the source of the data collected?
Is the data coming from outside the U.S.?
Are we a regulated entity (e.g., healthcare provider, financial institution)?
What does our Privacy Notice say?
Was consent obtained from individuals?
If de-identified data is being used, how is de-identification being accomplished, and is it in accordance with applicable law?
What do our contracts provide about data use and monetization?
How and where is the data stored?
For what purpose do we want to use or disclose the data?
Do we have cyber, privacy, and breach notification policies and procedures in place?
Are we periodically conducting risk assessments related to data?
Do we want to disclose data to a third party?
Will we receive any remuneration for the data?
Finally, remember: No risk, no reward.