
A 5-Star Conversation

  • Written by Karen Pierce

The following is a conversation with David Larsen, former Director of Quality Improvement for SelectHealth, and Decision Point's Brian Heacox, Director of Product and Business Strategy.

Key Takeaways:

  1. Keep members informed of any key changes to the plan and do your best to avoid member abrasion.
  2. Identify the members that will likely have the greatest impact on CAHPS if surveyed and build an engagement strategy focused on them.
  3. Collaborate with providers for meaningful assistance with CAHPS and member experience improvement.

Links:

  1. 2022 CAHPS Ratings Release & Compare Tool
  2. Making CAHPS Transparent and Actionable
  3. The Glidepath to Establishing Greater Accountability for Member Experience with Provider Partners

1. Keep members informed of any key changes to the plan and do your best to avoid member abrasion.

Brian Heacox: I’m Brian Heacox, Director of Product & Business Strategy. I’m joined by David Larsen, a Medicare Advantage industry veteran. David was with SelectHealth for over 30 years, heading up Quality and Star initiatives and eventually helping lead the plan to a 5-Star rating and 5 Stars on every CAHPS measure. Welcome, David!

David Larsen: Hello. It’s great to be here!

Brian Heacox: How important is delivering a smooth member experience as it relates to a plan’s CAHPS rating?

David Larsen: As many of you may know, a 5-Star rating doesn’t come easily. It’s something you have to mature into as a plan, and we had a lot of lessons learned over time. One of those lessons was to avoid surprises. Seniors don’t like surprises, especially when they cause anxiety and higher cost. We were very cautious about making significant changes to the benefit plan or the network, and about removing anything that seniors tend to value.

At one point, we had instituted a new drug deductible and believed we had adequately communicated the information through the Annual Notice of Change. Once seniors went out to fill their first prescriptions of the year and realized they had to pay for the whole prescription up front, it caused a significant amount of anxiety. Our Rating of the Drug Plan dropped all the way down to two Stars that year. So, if you’re making any changes to your network of providers, primary or secondary care, removing any supplemental benefits that members have liked, or adding deductibles or making drug formulary changes, you must have a process in place to make an individual phone call to every member who will be impacted.

We also learned that avoiding abrasion is critical. We were doing a fair amount of interactive voice response (IVR) calls to our seniors, and we learned that these calls were doing more harm than good in terms of member satisfaction. Once we ceased the IVR outreach, our CAHPS ratings went up substantially the following year, which led us to a focused approach using limited IVR. SelectHealth still uses it for a few things, such as flu immunizations and mock HOS surveys. Avoid abrasion as much as possible, and do your best to make sure that you are communicating with the right members at the right time, through the channels in which they’ll be most receptive.

Brian Heacox: With many moving parts in play, how do you operationalize a CAHPS improvement strategy within a plan like SelectHealth?

2. Identify the members that will likely have the greatest impact on CAHPS if surveyed and build an engagement strategy focused on them.

David Larsen: Leveraging predictive analytics through a vendor partnership helped us drill down and identify the members that would potentially have the greatest impact on CAHPS. We chose Decision Point to help us with this. Each member is assigned a risk score relative to how they’ll respond to each CAHPS question. Decision Point also identified those who are most likely to respond to the CAHPS survey if targeted, along with those who would champion us and rate us highly if surveyed.

Our targeted outreach focused on those two populations. The 1,500 members identified as potentially negative responders to the CAHPS survey received a live call with a script focused on the value of the plan. The primary purpose of the call was to identify any pain points those members might be having and try to solve them while on the phone. If a member didn’t raise a pain point, we had some value-added questions prepared, along with information about available benefits they might not be using, for example an over-the-counter drug benefit or gym membership reimbursement.

The initial challenge was allocating the necessary resources to make those calls, so we divided the members into three buckets. Those perceived to have a pharmacy issue were assigned to the pharmacy team. New members who had been on the plan for less than six months were assigned to the sales team, as they owned the member for the first six months of onboarding. Our customer service team called the rest.

The 1,500-member target was calculated to be the magic number needed to improve our rating by one Star, and we were successful. In fact, the Rating of the Drug Plan jumped from two Stars to five Stars in one year. Additionally, we had an outreach program for those we call the base, as in a political campaign: you want to get your base out to vote. The goal was to motivate the members who were predicted to rate the plan positively to respond to the survey if targeted. We sent out a targeted communication right before the CAHPS surveys were deployed, asking for their support. This proved to be very successful.

Brian Heacox: Thanks, David. It sounds like the CAHPS call, if you will, is a little different from the HEDIS team’s outreach for closing quality gaps. Can you tell us what is different about that script?

David Larsen: As we moved forward, we asked ourselves how to continue allocating resources effectively to make these calls. It was quickly pointed out to us that 30% of our HEDIS gap-closure calls involved actionable CAHPS issues. We’re now ingraining that mentality throughout the year, focusing on whole-person care. When we have a member on the phone, the goal is to address issues across domains.

Brian Heacox: With member abrasion being such a hot topic lately, I think it’s important to cover how these calls were received by the members. I’ve talked with plans that have been apprehensive about setting these types of strategies in motion, given their fear that additional outreach would lead to added frustration. Can you tell us a little bit about how these calls went?

David Larsen: I can understand that. We were new to making calls of this nature and certainly a little apprehensive at first. We started out slowly and evaluated the impact we were seeing as we went. We were pleasantly surprised at the receptivity. These seniors were very happy to receive a call from their plan checking in, asking how they were doing and how their experience with the plan had been, and seeing how we could help. The script was designed around being of assistance rather than trying to get them to do something they might not be inclined to do. Overall, we had hardly any negative calls. We did limit it to three attempts to reach each member to avoid potential frustration; three calls seemed to be a happy medium for our members.

3. Collaborate with providers for meaningful assistance with CAHPS and member experience improvement.

Brian Heacox: One of the things we’ve been hearing in the market and have been focusing on internally is the best way to involve providers in a CAHPS improvement strategy. Naturally, if the provider is helping to improve member satisfaction, you need to make fewer phone calls. The provider gives that extra lift by really being a part of the CAHPS equation. Do you mind talking about how you approached this at SelectHealth?

David Larsen: We’re trying to approach CAHPS the same way we’ve historically approached the HEDIS measures. With a HEDIS measure, you have a member-directed campaign, and you partner with the doctors to try to close those gaps. So, on the CAHPS side, we take a similar strategy. The doctor can have a dramatic impact on CAHPS. Historically, we have fielded Clinician & Group CAHPS (CG-CAHPS) surveys after physician office visits or other healthcare provider visits. We try to get the member to participate in a CG-CAHPS survey. It has many of the same questions that are on the standard CAHPS survey.

We then feed those CG-CAHPS results back to the provider’s office, where they can see their performance and how they rate against the other physicians in their clinic and on the plan. This brings a competitive element into the equation, ranking providers by how well their patients are scoring them. Part of our pay-for-performance program now focuses on the Rating of the Doctor measure, and that’s built into their incentive bonus pay structure.

We then turn around, take that survey data, and publish it to our members in the provider directory so that members can see how well their doctor rates. If they’re choosing a new doctor, they can see how other members of our plan have rated that doctor on different aspects of the CG-CAHPS survey. Together, those components have empowered the doctors to partner with us and reach out to members to make a difference on CAHPS. They’re partnering with us to improve measure performance, just like they would on a Breast Cancer Screening measure.

Brian Heacox: How did the providers react to this additional measure set being added to their areas of focus?

David Larsen: Initially, the doctors were saying, you’re going to hold me accountable for what my patients say, and you’re going to make this all public? But soon after, their thought was, there are other areas in which I’m being rated and ranked publicly, so I’m going to do my best to achieve a great score and then refer my patients to your system so they can see how well I’m doing. They really took the approach and ran with it. As we move forward with providers taking on more risk and responsibility pertaining to Star bonuses, they clearly see the impact of CAHPS on the Star rating, which directly impacts their reimbursement. It’s an easy sell in today’s world to get the doctors to realize how much of an impact they can have on CAHPS.

Brian Heacox: How much of the pay-for-performance structure was designated for CAHPS?

David Larsen: It was about 20% of the bonus, so enough to get people’s attention, but not overly emphasized because we had a number of other things we wanted to influence. That said, CAHPS is now a large part of the Star equation given that it is such a highly weighted portion of the Stars program. I don’t think you can afford to exclude CAHPS improvement within your Star program going forward.

You must have your providers on board moving forward in a Stars environment where the goal is to have all patient experience measures at a rating of at least four. You can’t be successful with Stars without using every avenue available to achieve a higher CAHPS rating and an elevated member experience. One of the biggest tools you have out there is your providers. If the doctors are treating the patients well and complimenting the health plan on what a great partnership they have with the doctor, that goes a long way toward boosting member satisfaction scores.

Brian Heacox: There’s another question from the audience. As expected, getting providers involved is an interesting topic. One question asks: were the providers themselves also running their own CG-CAHPS surveys, and were you selecting a random sample of members, or was this done on a broad scale? It’s interesting because SelectHealth is owned by Intermountain Healthcare, an integrated delivery system, so there is some connectivity between SelectHealth as a health plan and Intermountain as a provider group. But were these providers also doing their own surveys, and did they ever compare what they got to what you got at SelectHealth?

David Larsen: There were a few. Some of the offices wanted to do their own survey, and some stopped surveying and started using our data instead of spending their own resources, since we were going to do it anyway. Others continued their own surveying. I don’t think we had much complaint about overlap. Our goal was to complete 30 surveys per provider over the year, so we would rate each provider’s performance on a minimum of 30 survey responses. Once we had reached that threshold, we backed off and stopped pushing surveys for that provider.

Concluding thoughts

Brian Heacox: We are just about at time, David. I’m wondering if you had any additional thoughts on shining a light, in the middle of the year, on the black box that is CAHPS. What have you found useful for measuring progress on CAHPS improvement as you move through the year? Any thoughts on leading indicators?

David Larsen: Well, that’s one of the areas where CAHPS is a black box, just like HOS: you don’t know how you’re doing as the year goes on. So, we introduced some leading indicators, which are pretty simple. At the end of our customer service calls, we do a one-question survey and alternate the questions throughout the year or throughout the quarter.

On one series, we ask a Net Promoter Score question: are they willing to recommend the plan to friends or family? We ask a Rating of the Drug Plan question if they’re calling the pharmacy department; if they’re calling customer service, we ask how they rate the plan. We also alternate the two questions from the CAHPS survey related to customer service: did we answer your questions adequately on this call, and did we treat you with courtesy and respect? This gives us a barometer throughout the year of how we are doing with these action plans. If issues arise, we move to address and resolve them before the CAHPS surveys are administered.

Brian Heacox: Thanks so much for your time, David.