
The PA project accomplished its objective. HBOT utilization is now at a 10-year low, although that’s not entirely due to the completed prior authorization project. It is unfortunate CMS did not report how much the project cost to implement. We know it’s pointless to comment, but we can’t stop ourselves.

In their own words:

“The prior authorization model decreased HBO service use and expenditures; however, our quantitative analysis did not find a statistically significant decrease in total Medicare FFS expenditures. In fact, we found a statistically significant increase in Illinois. We did not find quantitative evidence of adverse impacts on quality of care or adverse outcomes for the full study population, although we did find a slight increase in emergency department utilization for lower extremity wounds among beneficiaries with diabetic lower extremity wounds and a slight increase in the rate of amputations in Michigan.”

“…From available claims data, we could not assess condition severity or rate of healing and could therefore not observe the characteristics that MACs used to judge whether HBO treatment is covered for a given beneficiary.”

Overall statements about HBO prior authorization:

  • Impacts were generally small and not statistically significant. (How much did this project COST?)

Findings about prior authorization (PA states compared to similar non-PA states):

  • Yes, HBOT utilization and the Medicare expenditures for HBOT went down:
    • PA significantly reduced the utilization of HBOT by around 15% overall
    • The probability that a patient with a DFU would get HBO declined 16% from a baseline mean of 2% (p < 0.001). So, if the goal was to decrease the use of HBOT for DFUs, it worked (see the back-of-envelope arithmetic after this list).
    • Expenditures decreased by over 30% relative to the comparison group for both the “full target population” and for beneficiaries with diabetic lower extremity wounds.
  • HBOT utilization was not high to begin with: HBO has a low overall utilization rate of about 5%.
  • Oops. It might have hurt some people:
    • Emergency department use for lower extremity wounds among beneficiaries with DFUs increased slightly (0.03 percentage points, a 5.7% relative increase).
    • The amputation rate increased 6.5% in Michigan, and this was statistically significant.
    • Medicare costs increased significantly in Illinois, by $498 per beneficiary (p < 0.05).
    • Stakeholders reported delays in beneficiaries receiving timely access to care.
    • “Thirty-nine percent of beneficiary-quarters included an emergency department visit and 31 percent experienced an unplanned hospitalization in the baseline period.”
  • Fine print: Claims data do not allow an analysis of condition severity or rate of healing, so we don’t know how bad the DFUs really were or what happened to them. The analysis also assumed that there were no “spillover effects” from the states with PA to the comparator states without it. We know this assumption is incorrect, since utilization of HBO declined in both the PA and comparison states.
  • Are you sure you are qualified to make these decisions? The MACs reported that providers questioned the clinical experience of reviewers, pointed to inconsistencies in the reviews, and perceived that MAC reviewers lacked the depth of clinical knowledge needed to make accurate medical necessity determinations for HBOT.
  • It was inconsistently implemented: Providers reported that medical necessity guidelines were applied inconsistently, that the MACs interpreted the coverage guidelines differently, and that there were differences between the MACs:
    • New Jersey experienced the greatest declines in HBOT utilization and number of treatments. That’s because it had twice the “non-affirmation” rate of the other two MACs, applying stricter local coverage determination rules.
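
As a quick back-of-envelope on the DFU utilization figure quoted earlier in this list (our arithmetic in Python, not the report’s, and assuming the reported 16% is a relative decline from the 2% baseline probability), the absolute change is tiny:

    baseline_prob = 0.02          # 2% of beneficiaries with a DFU received HBO at baseline
    relative_decline = 0.16       # reported 16% relative decline (p < 0.001)

    new_prob = baseline_prob * (1 - relative_decline)
    print(f"post-PA probability: {new_prob:.2%}")                           # ~1.68%
    print(f"absolute drop: {(baseline_prob - new_prob) * 100:.2f} points")  # ~0.32 percentage points

In other words, a treatment that roughly 2 in 100 DFU beneficiaries received to begin with now reaches about 1.7 in 100.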

What’s wrong with this picture?

The stated purpose of the model was to test whether prior authorization could lower Medicare expenditures by reducing the provision of non-covered outpatient HBOT therapy without adversely affecting access to or quality of care for beneficiaries. It appears that they showed the opposite.

Medicare expenditures for HBOT were reduced, quite significantly in fact: utilization was suppressed by 15 percent and HBOT spending by approximately 35 percent. However, they failed to demonstrate that those gains were achieved while maintaining access to and quality of care.

They evaluated access to care by assessing the difficulty of the paperwork. The MAC staff reported that it wasn’t hard to implement. All three MACs used different methods, different criteria, and different rates of denial. What is clear is that the prior authorization process was implemented differently by each MAC, with different effects in each state. The MACs reported that providers questioned the clinical experience of reviewers and expressed concerns about inconsistencies in the reviews. They also said that stakeholders (the MACs and providers) had different interpretations of the coverage guidelines. That’s a problem.

Now let’s discuss quality of care.

First, consider their admission that they used “all lower extremity wounds” as the comparator group when asking whether the cost of care for Wagner III DFUs receiving HBOT rose because the program limited care. A comparator that broad would absolutely drown out any issues caused by the program. That’s like saying silting from rivers can’t be affecting water levels in the Houston Ship Channel because worldwide ocean levels remained the same.
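
To make the dilution concrete, here is a toy Python calculation with made-up numbers (the subgroup share and effect size are ours, purely illustrative, and it assumes comparable baseline spending per wound): a real cost increase confined to Wagner III DFUs nearly vanishes when averaged over all lower extremity wounds.

    wagner_iii_share = 0.05        # assume Wagner III DFUs are 5% of all lower extremity wounds
    subgroup_cost_increase = 0.20  # assume a 20% cost increase within that subgroup
    other_wounds_change = 0.0      # assume every other wound is unaffected

    diluted = wagner_iii_share * subgroup_cost_increase + (1 - wagner_iii_share) * other_wounds_change
    print(f"apparent increase across all lower extremity wounds: {diluted:.1%}")  # 1.0%

A 1 percent blip across the whole comparator group is exactly the kind of signal that disappears into statistical noise: the Houston Ship Channel problem in miniature.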

Next, to evaluate whether quality of care was impacted, they chose two dubious surrogates. The first was whether or not a physician was in attendance during the treatment. The second was whether or not the patient experienced any adverse events from not receiving the treatment, because “Beneficiaries with diabetic lower extremity wounds are generally a group at very high risk for adverse events”.

For their first measure, they actually couldn’t tell who, if anyone, was supervising the therapy. So that measure of quality was useless.

Next, for the second measure, since this group is at high risk for adverse events, they measured increases in emergency department utilization, unplanned hospital admissions, amputations, or death. In their analysis, they in fact “found a statistically significant increase of 0.03 percentage points (5.7 percent) in the probability of an emergency department visit for treatment of a lower extremity wound.” However, they recast the question as being about the quantity of emergency department visits instead of the probability of a visit occurring in the first place! Accept that reframing, so that the measure is no longer whether you had to go to the emergency room for a diabetic foot ulcer once HBO treatment was unavailable, but only whether, having gone once, you had to go again, and you are left with a positive result.
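
For scale, the two figures the report gives imply a small baseline rate; a short Python check (our arithmetic, not the report’s):

    absolute_increase_pp = 0.03   # reported increase, in percentage points
    relative_increase = 0.057     # the same increase expressed as 5.7 percent

    implied_baseline_pp = absolute_increase_pp / relative_increase
    print(f"implied baseline ED-visit rate: ~{implied_baseline_pp:.2f} percentage points")  # ~0.53

Small in absolute terms, but the direction is an increase, and it was statistically significant, which is exactly the finding that got reframed.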

However, it turned out that when you removed the negatives, all of these measures actually dropped, which wouldn’t make any sense at all. Not to be swayed, the authors refactored the question again. This time, they decided that removing a limb-saving therapy could only reduce adverse events if those events were side effects of the HBO treatment itself, given that “We are not aware of side effects of HBO treatment that might result in higher emergency department use or hospitalization.”

“As a result, we believe it may be unlikely that reducing utilization of the treatment would reduce adverse outcomes such as emergency department visits and unplanned hospitalizations. We thus interpret the results as rejecting the hypothesis that quality decreased rather than asserting that there was evidence of an improvement.”

So, rather than admit that they no longer had an argument that quality of care was maintained, the authors dismissed the first measure by conceding that they had no meaningful way to assess it, and dismissed the second in a footnote because the answer didn’t align with their purpose, which was to reduce cost without impacting care.

Bottom line on the HBOT prior authorization:

  1. It didn’t save Medicare dollars: HBOT prior authorization reduced HBOT utilization but did not reduce total Medicare fee-for-service (FFS) costs, perhaps because the overall utilization of HBOT for diabetic foot ulcers is low.
  2. Medicare costs increased significantly in one state, and it may have hurt some patients: the amputation rate increased a statistically significant 6.5% in Michigan, and there were delays in accessing care (which might have changed the response to HBOT, but they can’t measure that).
  3. We don’t really know its impact on quality of care: they can’t actually show that quality of care was maintained, and the program wasn’t implemented consistently.

We don’t dispute that CMS needs a way to handle overuse and improper use. However, this doesn’t seem to be the right way, particularly if we can’t get a better method of evaluating the results, one that includes what happened to patients.