Weaponizing HIPAA Privacy
How health insurers hide behind HIPAA in sharing any meaningful claims data with you about your employees while farming every imaginable fact about all plan members.
Special Healthcare Privacy Alert from Craig Gottwals via BenefitsPRO
I also sat for a half-hour interview with Armstrong and Getty on this article and more here.
Yes, this space is primarily for material other than healthcare, but this topic is too insidious and too important to all of us for me not to post it here.
You come to work one day and notice Susan is not there. Nobody knows what happened to her, and everyone appears oddly tight-lipped about her absence. Finally, you and your coworkers are told she has taken a leave of absence. No other details are given. Attorneys, corporate compliance officers, and human resources personnel have been properly coached on the myriad of stringent health privacy rules in the workplace, and everyone is rightfully paranoid.
I am reminded of an eccentric law professor I had who relished saying, “No good deed goes unpunished,” whenever discussing the inevitable unintended consequences of legislation or contract terms.
But after 22 years of HIPAA Privacy, I am not even sure the main impetus behind its passage was ever a good deed – at least not for those who have weaponized its use against employers.
The Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule was designed to protect individuals’ medical records and other personal health information. However, the latest practices by health insurance carriers raise serious concerns about how they circumvent these rules to maximize premiums.
Overly cautious here, galloping through loopholes there
If you have fewer than about 250 employees (the number is a little different for each carrier) on an insurance plan, you will receive only meager large-claims data at the end of the year. You will not get month-to-month claims-versus-premium information. The stated reason: carriers are concerned that if they give you too much specific detail about your employees’ health and claim activity, you may be able to discern who has what condition. This, the logic goes, violates HIPAA. The smaller your population, the greater this risk. So, most carriers will start to give out detailed claims information once you have at least 250 employees on a plan. That is 250 with Carrier A, not 250 split across Carriers A and B.
This argument is absurd, but it has been the reality in the industry for twenty years. HIPAA gave carriers political and social cover to appear as vehement stewards of a member’s private health information while keeping you from telling concerned friends and co-workers why Susan is no longer at work. Never mind that somebody in human resources knows precisely why an employee may be out, for example, because they administer the Family and Medical Leave Act (FMLA). And yes, while ideally you get a doctor’s note expressing the need for a leave without explaining the specifics of the medical condition, we all know that this ideal circumstance rarely pans out, and loose lips reveal exactly what malady befalls your employee or their dependent.
Real medical trend (claim inflation) is around 4%. Yet, again, your carrier hits you with the obligatory 8% to 12% renewal and provides no justification for that bloated number because you are too small to receive detailed claims information. To do so may violate HIPAA, of course.
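To put rough numbers on that spread, here is a minimal sketch, assuming a hypothetical $1 million starting premium, of how a 10% renewal compounds against a 4% trend over five years:

```python
# A hypothetical illustration (made-up dollar figures, not any carrier's
# actual pricing): how an "obligatory" 10% renewal compounds against a
# real medical trend of roughly 4% over five years.
billed = trend = 1_000_000  # hypothetical starting annual premium

for year in range(1, 6):
    billed *= 1.10  # what the carrier charges at each renewal
    trend *= 1.04   # what actual claim inflation would justify
    print(f"Year {year}: billed ${billed:,.0f} vs. trend-justified ${trend:,.0f} "
          f"(excess ${billed - trend:,.0f})")
```

Under those made-up figures, the billed premium outruns trend-justified costs by nearly $400,000 by year five.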
Yet, when that carrier came to bid on your group a few years ago, they sang an entirely different tune.
Until about five to ten years ago, brokers could get medical insurance proposals from carriers without names, Social Security numbers, or employee IDs of any kind. We would state “Employee 1” or use no identifier at all to go along with the required demographic information.
But then something changed. The very same carriers who were too scared even to give you de-identified claims information if you had less than 250 employees on a plan started refusing to provide proposals for your company without a list of the names of all employees. Now it is so bad that carriers essentially only offer a proposal for your group if they have a list of all employees and all dependents on the plan. If they do provide a proposal based on a census with no names, they regularly increase the premium so much that it might as well have been a refusal to bid.
Scouring your PHI from everyone, everywhere, all at once
Insurance companies now regularly access third-party databases to obtain health-related information on potential customers before providing insurance quotes. While the HIPAA Privacy Rule restricts the use and disclosure of protected health information (PHI) by health care providers, health plans, and health insurers acting in their capacity as your current carrier, there is a loophole when it comes to health insurance carriers acting as bidders.
As bidders, health insurance carriers are not considered “covered entities” under HIPAA, which means they are not subject to the same privacy standards. This allows them to access and use PHI in ways that other entities, such as health care providers, cannot.
This practice is legally permissible because it does not involve a direct relationship between the insurance company and the individual. These third-party databases are often populated with information from various sources, such as public records, consumer data brokers, and even social media. Since the information is not obtained directly from the individual or their health care provider, it is not subject to the same protections as information governed by HIPAA.
This is analogous to how the U.S. government circumvents constitutional protections prohibiting it from directly conducting mass surveillance on Americans. Instead of spying on its citizens itself, the U.S. can rely on friendly foreign intelligence partners, such as the United Kingdom, to collect the information and share it back, because the U.S. Constitution does not constrain a foreign government.
Ultimately, while the use of third-party databases by health insurance carriers to bloat their proposals may be legal under current regulations, the practice raises monumental ethical concerns and highlights the need for further discussion and reform of privacy laws.
Third-party databases use increasingly powerful artificial intelligence (AI) to amass and aggregate information from every imaginable source to create comprehensively weaponized profiles of individuals. Health care is the nation’s largest industry and private employer. The Herculean amount of time, money, and energy devoted to knowing exactly what kind of health risk you are cannot be overstated. Health insurance carriers and other businesses can use the information in these databases for various purposes, such as targeted marketing, risk assessment, and, most importantly, pricing. Some examples of sources that contribute to these databases include:
Public records: Public records consist of information made available by government agencies, such as birth records, marriage records, and court records. These records may contain health-related information, like details of a disability or a history of substance abuse, which insurance carriers will then use to assess an individual’s risk.
Consumer data brokers: Consumer data brokers are companies that collect, analyze, and sell personal information about individuals to other businesses. They gather data from warranty registrations, magazine subscriptions, and online surveys. Data brokers may also have access to health-related information, such as prescription drug purchase records or medical claims data, which can provide insights into an individual’s health conditions.
Social media: Social media platforms, such as Facebook, Twitter, and Instagram, can provide a wealth of personal information about individuals, including their health habits and conditions. In the modern world, people are seemingly compelled to share incredibly personal information publicly. This becomes a carrier’s treasure trove. For example, someone may share information about a recent surgery, a chronic illness, or their fitness goals on their social media profiles. Insurance carriers may then use this information to make assumptions about an individual’s health and potential risks.
Online forums and support groups: Many individuals with specific health conditions participate in online forums and support groups to share their experiences and learn from others. Insurance carriers, often through data brokers, mine these forums for information about individuals’ health conditions and treatment experiences to inform their underwriting decisions.
Since the information in these third-party databases is not obtained directly from the individual or their health care provider, it is not subject to the same protections as information governed by HIPAA. This means that health insurance carriers can access and use this data without violating HIPAA privacy rules, even though the information may be sensitive and personal.
There are several well-known data brokers reported to provide consumer information to various industries, including insurance. Some of these data brokers include:
LexisNexis Risk Solutions: LexisNexis is a leading provider of data and analytics solutions for various industries, including insurance. They offer risk assessment products that can help insurance carriers determine the potential risk associated with an individual based on data from public records, court documents, and other sources.
Acxiom: Acxiom is a data broker that collects and sells consumer information to various industries. They provide data on demographics, lifestyle, and purchasing habits, which insurance carriers can use to assess potential risks and tailor their products accordingly.
Experian, TransUnion, and Equifax: Primarily known as credit reporting agencies, these three also offer data and analytics solutions to businesses in various sectors, including insurance. They provide consumer data and risk assessment tools to help insurers with underwriting and pricing decisions.
CoreLogic: CoreLogic specializes in property and casualty insurance data, providing information on property characteristics, risk, and claims history. This information can be useful for insurance carriers in assessing the potential risk associated with insuring a specific property or individual.
Probably the most popular of these models has been DxCG, now known as Verisk Health. In a complex web of corporate gluttons tripping over smaller ravenous devourers to penetrate the infinitely lucrative world of analyzing and scoring your health, Veritas Capital portfolio company Verscend Technologies completed its acquisition of Cotiviti Holdings, a leading provider of “payment accuracy and analytics,” in 2018. You really have to appreciate the sanitized corporate buzzwords used to describe the seedy methods employed. The combined company now operates under the Cotiviti name.
It specializes in health care analytics solutions, particularly in risk adjustment and predictive modeling. One of their key offerings is the Diagnostic Cost Group (DCG) model, used by insurance carriers, health care providers, and other organizations to predict health care costs and assess the risk of individual patients or populations.
The DCG model is based on an individual’s diagnostic and demographic information, along with data on their health care utilization. By analyzing this information, the model can predict the future health care costs of an individual or a group. The DCG model is particularly useful for analyzing and scoring populations with chronic conditions or high health care utilization.
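Cotiviti’s actual methodology is proprietary, but a toy sketch of how an additive risk-scoring model of this general type works, using invented condition weights and a hypothetical age loading, might look something like this:

```python
# A toy additive risk-scoring sketch in the general spirit of diagnostic
# cost-group models. Every weight below is invented for illustration; the
# real DCG/Cotiviti methodology is proprietary and far more granular.
TOY_CONDITION_WEIGHTS = {
    "diabetes": 0.9,
    "chronic_kidney_disease": 2.4,
    "cancer_active_treatment": 5.8,
}
TOY_AGE_FACTOR = 0.02  # hypothetical per-year-of-age loading

def toy_risk_score(age: int, diagnoses: list[str]) -> float:
    """Return a relative expected-cost multiple (1.0 = average member)."""
    score = 1.0 + age * TOY_AGE_FACTOR
    score += sum(TOY_CONDITION_WEIGHTS.get(dx, 0.0) for dx in diagnoses)
    return score

# A 52-year-old with diabetes and chronic kidney disease scores well above
# average, flagging them as a likely high-cost claimant.
print(toy_risk_score(52, ["diabetes", "chronic_kidney_disease"]))  # ~5.34
```

A real model would draw on hundreds of diagnostic categories and utilization history, but the principle is the same: demographic and diagnostic inputs go in, an expected-cost score comes out.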
So, what is the point of HIPAA?
If one were conspiracy-minded, she might conclude that HIPAA’s primary goal has been to protect the gargantuan government-health care complex’s bottom line by obfuscating inflated premium renewals while doing nothing, in practice, to keep bidding carriers from extracting the very same PHI the law is supposed to be shielding. Was that the goal at inception? Not amongst most who championed it, but I suspect it was for some. And undoubtedly, things have progressed down that path with the expanding power of AI’s ability to vacuum up these kernels of health information from every nook and cranny of the interweb.
As previously discussed, the use of diagnostic and demographic information in risk adjustment models like the DCG model is not subject to the same restrictions as the PHI governed by HIPAA when used in the bidding process. But even if it were, a simple sleight of hand could easily make it usable: if the data were de-identified, meaning stripped of information that could be used to identify an individual, it could be shared and used for underwriting without violating HIPAA rules. This underscores just how difficult it would be to try to put this proverbial ooze of toothpaste back into the tube.
With the data de-identified in that way, a carrier could send over a roster of five hundred employees and all their dependents and ask that the DCG model report back any specific claimant concerns. The model would return a report explaining that eleven members are likely to exceed $100,000 in claims cost next year, with three over $500,000 and one over $1 million. The model could even report, person by person, without the use of names, the exact diagnosis and prognosis of each individual by labeling them as Members 1 through 11, and HIPAA would never be invoked because no individual would ever be unmasked to that person’s employer, carrier, or broker.
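A minimal sketch of that kind of de-identified report, assuming a hypothetical roster format and made-up cost predictions, is as simple as dropping names and re-labeling members before anything flows back to the carrier:

```python
# Sketch of the de-identified report described above: names never leave the
# scoring side; members are re-labeled 1..N. All fields, diagnoses, and
# figures are invented for illustration.
roster = [
    {"name": "A. Smith", "predicted_cost": 1_250_000, "diagnosis": "metastatic cancer"},
    {"name": "B. Jones", "predicted_cost": 640_000,   "diagnosis": "end-stage renal disease"},
    {"name": "C. Lee",   "predicted_cost": 4_800,     "diagnosis": "seasonal allergies"},
    # ...the remaining members of a 500-employee census
]

# Keep only the likely high-cost claimants and replace names with labels.
high_cost = sorted(
    (m for m in roster if m["predicted_cost"] > 100_000),
    key=lambda m: m["predicted_cost"],
    reverse=True,
)
report = [
    {"member": f"Member {i}", "predicted_cost": m["predicted_cost"], "diagnosis": m["diagnosis"]}
    for i, m in enumerate(high_cost, start=1)
]
print(report)  # no names or other direct identifiers ever reach the carrier
```

Because the report contains labels rather than names, the argument goes, HIPAA is never invoked, even though the carrier learns everything it needs to price the group.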
Additionally, certain organizations involved in health care data management, such as data aggregators, may not be considered covered entities under HIPAA at all, which means they are not subject to the same privacy standards anyway. These organizations may be able to share diagnostic and demographic information with insurance carriers for underwriting purposes as long as they comply with other applicable laws and regulations, such as the Fair Credit Reporting Act (FCRA) or state-specific privacy laws.
Insurance carriers’ use of third-party databases, risk adjustment models like the DCG, and other analytical tools to inflate their underwriting decisions exploits a nefarious loophole that violates the spirit of HIPAA. Although the practices may be legal under current inane regulations, they raise significant ethical concerns about privacy and potential discrimination.
Contrast this unsavory practice with how those same carriers hide behind HIPAA privacy to shield a client’s claim data while asking for the latest 12% increase. It boggles the mind that we have allowed this to happen. But I am again reminded that “no good deed goes unpunished.”
Meanwhile, concerned coworkers and friends still have no idea where Susan is and if she is okay.
Craig Gottwals is a healthcare attorney and senior vice president at McGriff Insurance Services.