Nov 15, 2022

Not surprisingly, inflation has been weighing heavily on many people's minds over the past few months.

Despite easing from a high of 9.1% in June to 8.3% in August, the rising cost of basics like food and energy is still hurting.

In fact, rising prices became such a concern that lawmakers were effectively pushed into crafting the Inflation Reduction Act, which President Joe Biden recently signed into law. The legislation is designed to provide financial relief to Americans, with provisions to reduce energy costs, make prescription medications cheaper for seniors, and lower crop prices through rural development programs and funding for farmers.

However, the jury is still out on how much of a difference this will make to ordinary employees.

Why? Well, in 2020 (based on data from the US Bureau of Labor Statistics), food, housing, and transportation accounted for about 71% of the average household's post-withholding expenditures. The remainder went toward healthcare, entertainment, apparel, services, education, and personal care, with healthcare alone accounting for 8.4%.

In the 12 months ending June 2022, however, rising inflation meant necessities like food, housing, and transportation accounted for nearly 80% of household expenditure (assuming total household spending held flat). In other words, households were left with barely any disposable income.

So how are Americans adjusting to this increase in household costs? Some will inevitably lean on savings accounts, credit cards, and other forms of debt to fund higher costs. But worryingly, there are indicators that consumers are going to start reducing or eliminating other expenditures instead. It's worrying because while it's not the end of the world if people don't buy new clothes for a while, other sacrifices do far more harm than good.

What happens, for instance, if or when Americans start to defer or forego healthcare as a means to offset inflation?

Skipping care often leads to adverse health impacts for Americans – and it carries unintended consequences for employers too:

The steep price of delayed care

A growing number of people are already delaying or deferring care because they don’t feel like they can handle the associated out-of-pocket costs. According to KFF polling, 43% of U.S. adults said they delayed or went without medical care in the past year because of cost concerns. This mirrors our own internal research, which found that 43% of our surveyed members would have skipped important healthcare without access to their Paytient Visa card.

But when people skimp on preventive care and early treatment, they frequently require more intensive, reactive care later on, at much higher cost. To illustrate these consequences, let's look at a few examples.

  • Type 2 diabetes: A 2012 study tracked 740,195 veterans with Type 2 diabetes to evaluate the cost of not taking medications as prescribed. While pharmacy and outpatient costs for non-adherent diabetics decreased by 37% and 7%, respectively, because they skipped prescribed care, their inpatient costs (which are far higher) increased by 41%. The study estimated that improving adherence for the non-adherent cohort would have saved $661 million to $1.16 billion.
  • Mental health: By February 2021, the percentage of adults reporting recent symptoms of anxiety or depressive disorder had climbed to 41.5%, yet only 2% of adults with any mental illness had received mental health services over the past year. Many anxiety and depressive disorders can be treated with therapy and a single affordable prescription (e.g., generic Lexapro is available for $5-$35 at many locations through GoodRx). But untreated mental health problems lead to worse outcomes that cascade into other areas of health.
  • Breast cancer: Cleveland Clinic reports that up to 50% of women will find a lump in their breast tissue at some point in their lives. Evaluating whether a lump is cancerous or benign involves diagnostic imaging and biopsies, which can cost thousands of dollars out of pocket when those services are subject to the patient's deductible and copays. Faced with those costs, patients often put these services off. While worsening health outcomes are the biggest concern with delayed care, the delays also drive up costs downstream. A 2016 study found that average treatment costs within two years of diagnosis were $71,909 for stage 0, $97,066 for stage I/II, $159,442 for stage III, and $182,655 for stage IV. Said differently, breast cancer costs 35% more to treat when caught at stage I/II versus stage 0, 122% more at stage III, and 154% more at stage IV.
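Those stage-by-stage percentage increases follow directly from the cost figures quoted from the 2016 study; here's a quick arithmetic check in Python (the cost values are the ones cited above, nothing else is assumed):

```python
# Average two-year treatment costs by stage at diagnosis,
# as quoted from the 2016 study cited above.
costs = {
    "stage I/II": 97_066,
    "stage III": 159_442,
    "stage IV": 182_655,
}
baseline = 71_909  # stage 0 cost

for stage, cost in costs.items():
    # Percentage increase relative to the stage 0 baseline.
    pct_more = (cost - baseline) / baseline * 100
    print(f"{stage}: {pct_more:.0f}% more than stage 0")
# → stage I/II: 35% more than stage 0
# → stage III: 122% more than stage 0
# → stage IV: 154% more than stage 0
```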

It’s also worth noting that up to 69% of hospital admissions are due to prescription non-adherence, which is estimated to cost $100 billion to $300 billion per year.

The underlying theme of the above examples applies to many other common conditions, including asthma, chronic back pain, high cholesterol, and hypertension.

But while increased care costs for employees are a significant (and growing) problem, the buck doesn’t stop there.

Employers also face some serious repercussions due to this lack of preventive care:

A cascade of consequences for employers

The first and most apparent unintended consequence for employers is that deferred and foregone medical care ultimately results in higher medical costs, which translate directly into higher health insurance costs.

On top of those direct costs, consider that the Bureau of Labor Statistics reports the average full-time worker has an absence rate of 3.2%, or approximately eight days annually. Employees with cost-related barriers to care have 70% more absences than those without such barriers, roughly a three-day difference in absenteeism between the two groups.
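Those two figures reconcile with a little arithmetic. A rough sketch, assuming about 250 scheduled workdays per year (that assumption is mine, not from the BLS figures) and treating the three-day gap as 70% of the no-barrier group's absences:

```python
# BLS reports an average full-time absence rate of 3.2% of work time.
workdays_per_year = 250  # assumed scheduled workdays (illustrative)
avg_absent_days = 0.032 * workdays_per_year
print(f"average absences: {avg_absent_days:.1f} days/year")  # → 8.0

# If employees with cost barriers miss 70% more days than those
# without, the gap between the groups is 0.7 * x, where x is the
# no-barrier group's absences. A ~3-day gap implies:
gap_days = 3
no_barrier_days = gap_days / 0.7
print(f"implied no-barrier baseline: {no_barrier_days:.1f} days/year")
```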

But it doesn’t stop at absenteeism. Employees can technically be in the office yet mentally or physically unable to work at full capacity. The Harvard Business Review suggests that this “presenteeism” carries an annual cost of about $150 billion in the U.S., which works out to roughly $1,000 per employee. Presenteeism is most commonly driven by mental health issues, asthma, allergies, and chronic conditions – all areas where care is frequently deferred.
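The per-employee figure is simply the aggregate estimate spread across the workforce. A minimal sketch, assuming a U.S. workforce of roughly 150 million (the workforce size is my illustrative assumption; the HBR estimate supplies only the $150 billion aggregate):

```python
# Rough per-employee cost of presenteeism.
total_cost = 150e9   # ~$150 billion annually (HBR estimate)
workforce = 150e6    # assumed U.S. workforce size (illustrative)

per_employee = total_cost / workforce
print(f"${per_employee:,.0f} per employee per year")  # → $1,000
```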

Empowering employees to get care

Fortunately, there are opportunities for employers to get ahead of the curve and help their employees be proactive and educated consumers of care.

Health plan design changes that provide low-cost access to primary care, specialists, diagnostics, and pharmacy benefits through reduced copays can stave off deferred and foregone care in an inflationary environment.

For employers offering HSA-eligible health plans with high deductibles, increasing employer health savings contributions can provide a means for employees to pay for earlier care.

Likewise, direct primary care subscriptions give employees the flexibility to seek critical medical care without worrying about out-of-pocket costs. This arrangement can also provide employees with a health advocate who can appropriately “quarterback” their care.

Admittedly, these options come with a significant price tag – but doing nothing also carries a cost.

As long as people are skipping care to make ends meet, they’re going to face adverse health impacts that eventually work their way back to their employers.

To reverse this cycle and make things better for everyone involved, employers must find ways to help their employees access and afford the care they need.