Brian Zimmerman: Hi, everyone. This is Brian Zimmerman with Becker's Healthcare. Thank you so much for tuning in to the Becker's Healthcare podcast. Today we're going to talk about how AI is transforming payer strategy. Joining me for today's discussion, live in Chicago at the Becker's Payer Issues Roundtable, is Dr. Mira Atkins, chief medical officer at Lyric. Dr. Atkins, thank you so much for being here.

Dr. Mira Atkins: Thanks for having me.

Brian Zimmerman: To get us started, can you share a bit more about yourself and the work you're doing at Lyric, so listeners can appreciate where you're coming from as we move through today's conversation?

Dr. Mira Atkins: Sure. I'm a board-certified OB-GYN, and I practiced for several years before transitioning out of clinical practice. I've been in the payer space for the last ten years, and I recently joined Lyric as chief medical officer, where my focus is to scale clinically informed AI for payment accuracy. We're really moving toward prepayment intelligence that helps preserve provider trust.

Brian Zimmerman: Excellent, I appreciate that background. Here at the Becker's Payer Issues Roundtable, you've discussed with other leaders how AI and advanced analytics are helping payers move from reactive to proactive care and payment models. From your perspective, how is that shift showing up in areas like payment integrity and clinical validation, which you just flagged as central to your work? And in that context, could you unpack what the word "proactive" actually means? We hear that word a lot, but give folks out there a concrete definition.

Dr. Mira Atkins: Sure. For decades, claims review has been a reactive, adversarial exercise, built around post-payment reviews and audits, and it's been a blunt instrument.
I think the future is that we shift left, toward a model where clinical AI provides a lot more transparency, drives more appropriate payments, reduces provider friction, and focuses people on the outliers: the cases that really matter, where you're getting the high hit rates.

Brian Zimmerman: And on moving away from that adversarial nature, do you think AI is finally an opportunity for that to happen? Do you think this technology might be the solution to some of that administrative gridlock?

Dr. Mira Atkins: Yes, absolutely. And I think it takes true partnership and trust to make that happen. The technology is out there; the people have to have the trust, and the willingness, to do something a little bit different.

Brian Zimmerman: And to come to that trust, you really need clinical validation; it has to be clinically sound. So how do you do that? How can payer organizations integrate these tools in a way that is equitable and transparent? Drawing on your experience as a clinician and as someone who leads digital strategy, what guiding principles or safeguards are essential to achieving that balance, to making sure the AI is clinically sound so that you arrive at that place of trust together?

Dr. Mira Atkins: I think the first step is intentionality. You have to be really thoughtful and intentional to ensure we're embedding our human values of being patient-centric, member-centric. So at Lyric, our governance model is a clinical-governance-first model.
We have multidisciplinary reviews: clinicians, data scientists, legal, compliance, really the whole gamut of people giving input on what AI should be deployable, both before it's actually deployed and during deployment as well. Then we have continuous monitoring for equity and safety. We hold our AI to really rigorous safety standards, and we require subgroup performance checks and real-time governance to detect drift, because the models can shift based on the data being fed into them. And finally, there's transparency. You can't build trust without transparency. Automated decisions have to be explainable to clinicians and providers, to the people. You have to be able to back them up.

Brian Zimmerman: And that transparency is a continuous act, too. As you're tweaking the models, there's no "set it and it's done." It's constant work.

Dr. Mira Atkins: It is constant work, because things are constantly changing and constantly evolving. Look at medical evidence: when we talk about the clinical side, the medical evidence is constantly changing and evolving, so there is no "set it and forget it."

Brian Zimmerman: And you could argue that clinical and medical evidence is going to start evolving faster than anything we've seen in the past.

Dr. Mira Atkins: Absolutely. There's a lot of evidence out there on this; someone once quoted to me that in 1950 the doubling time of medical literature was fifty years, and that in 2024 it's something like sixty-two days. It's astonishing. So we have to be conscious of that and really intentional about our models.
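To make the subgroup performance checks and drift monitoring Dr. Atkins describes more concrete, here is a minimal sketch; the record fields, subgroup definitions, and alert tolerance are illustrative assumptions, not a description of Lyric's actual system.

```python
# Minimal sketch: per-subgroup hit-rate tracking with a simple drift alarm.
# Record fields and the 10-point tolerance are illustrative assumptions.
from collections import defaultdict

def subgroup_hit_rates(decisions):
    """Hit rate (findings upheld / claims flagged) per subgroup."""
    flagged = defaultdict(int)
    upheld = defaultdict(int)
    for d in decisions:  # d: {"subgroup": str, "flagged": bool, "upheld": bool}
        if d["flagged"]:
            flagged[d["subgroup"]] += 1
            upheld[d["subgroup"]] += d["upheld"]
    return {g: upheld[g] / flagged[g] for g in flagged}

def drift_alerts(baseline, current, tolerance=0.10):
    """Subgroups whose hit rate moved more than `tolerance` from baseline."""
    return {g: (baseline[g], r) for g, r in current.items()
            if g in baseline and abs(r - baseline[g]) > tolerance}

# Usage: recompute on a schedule and route any alerts to the
# multidisciplinary governance review rather than auto-adjusting the model.
week = [{"subgroup": "inpatient_drg", "flagged": True, "upheld": False}]
print(drift_alerts({"inpatient_drg": 0.85}, subgroup_hit_rates(week)))
```

Keeping the alert logic this simple is deliberate: the point of "real-time governance" in this framing is that a detected shift triggers human review, not an automated model change.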
Brian Zimmerman: That's a wild stat. I also heard it put to me once that technology innovation has never moved as fast as it's moving now, and that it'll never move this slowly again.

Dr. Mira Atkins: Right, I've heard the same thing: this is the worst it's ever going to be. So, I don't know, that could be a little scary.

Brian Zimmerman: Yeah, that's a lot.

Dr. Mira Atkins: It's a lot.

Brian Zimmerman: But let's come back to your expertise and what you're doing at Lyric. How does all of this fit into helping health plans accelerate their value-based care goals, which are crucial to reaching that payer-provider equanimity, where things are in harmony? Value-based care is going to have to flourish for that to happen, right?

Dr. Mira Atkins: Yeah. That partnership is so important. I think you can use AI from a predictive-modeling perspective to get better risk stratification, identifying at-risk patients early, so you can direct your clinical efforts to the members who will benefit the most and drive better outcomes. From an administrative perspective, it can really decrease the administrative burden, which ultimately means providers, doctors, clinicians spend more time with patients, and that's really what we want. Automating those activities frees clinicians up to work at the top of their license.
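As one illustration of the predictive risk stratification Dr. Atkins mentions, here is a minimal sketch on toy data; the member features, outcome label, model choice, and outreach cutoff are all assumptions made for the example, not Lyric's method.

```python
# Minimal sketch: risk stratification to surface at-risk members early.
# Features, labels, and the outreach cutoff are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy member features: [age, chronic_condition_count, er_visits_last_year]
X = rng.normal(loc=[55, 2, 1], scale=[15, 1.5, 1], size=(500, 3))
# Toy label: 1 = high-cost episode in the following year
y = (0.02 * X[:, 0] + 0.6 * X[:, 1] + 0.8 * X[:, 2]
     + rng.normal(size=500) > 4).astype(int)

model = LogisticRegression().fit(X, y)

# Score the population and route the highest-risk members to care management.
risk = model.predict_proba(X)[:, 1]
outreach_list = np.argsort(risk)[::-1][:25]  # top 25 members by predicted risk
print(f"Flagged {len(outreach_list)} members; max risk {risk.max():.2f}")
```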
Dr. Mira Atkins: And then there's shared measurement. There has always been that friction over the source of truth: who has the right source of truth? I think AI can really help align all the stakeholders, payers, providers, and patients, by integrating the data sources so you have one source of truth and a comprehensive view of outcomes. Data has always been really difficult in value-based care, but there's a real opportunity here for AI to make the data far more accessible than it has been in the past, and more transparent. And then trust, again; trust is at the center of all of it.

Brian Zimmerman: That's where my head went, too. It comes back to trust and transparency. You can have the best technology in the world, but if you don't have trust and transparency around it, it's not going to do what it's supposed to do. Right?

Dr. Mira Atkins: Right. Nobody's going to use it.

Brian Zimmerman: So for folks listening who want to accelerate their value-based care goals and put AI at the center of that strategy, what's one step they could take? What advice would you give?

Dr. Mira Atkins: You know, that's a tough one. But I would say: look at what's actually actionable, and run a pilot on it. Partner with someone on just one aspect. For example, here at Lyric we look at all sorts of different claim types; you might look at a high-cost inpatient DRG or coordination of benefits. Measure the hit rate, look at provider appeals, look at time to resolution. Gather some KPIs from a short pilot, see if it works, and then iterate on it. It's not going to work the first time; it's those multiple iterations, and learning from the iterations, and I think you'll find it really successful.

Brian Zimmerman: I think that's wonderful advice. You said it was a tough question, but I think you landed the plane pretty well there.
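For readers who want to act on that advice, here is a minimal sketch of scoring a short payment-integrity pilot; the claim fields and the exact KPI formulas (hit rate, appeal rate, time to resolution) follow the metrics named above but are otherwise assumptions for the example.

```python
# Minimal sketch: scoring a short payment-integrity pilot on a few KPIs.
# The claim fields and KPI definitions are illustrative assumptions.
from dataclasses import dataclass
from statistics import mean

@dataclass
class PilotClaim:
    flagged: bool          # did the model flag the claim prepayment?
    finding_upheld: bool   # did clinical review confirm the finding?
    appealed: bool         # did the provider appeal the decision?
    days_to_resolution: int

def pilot_kpis(claims: list[PilotClaim]) -> dict[str, float]:
    flagged = [c for c in claims if c.flagged]
    if not flagged:
        return {"hit_rate": 0.0, "appeal_rate": 0.0, "avg_days_to_resolution": 0.0}
    return {
        "hit_rate": mean(c.finding_upheld for c in flagged),
        "appeal_rate": mean(c.appealed for c in flagged),
        "avg_days_to_resolution": mean(c.days_to_resolution for c in flagged),
    }

# Usage: compare these KPIs across iterations of the pilot before scaling up.
demo = [PilotClaim(True, True, False, 4),
        PilotClaim(True, False, True, 12),
        PilotClaim(False, False, False, 2)]
print(pilot_kpis(demo))
```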
Brian Zimmerman: Mira, is there anything we didn't touch on, or any final thoughts you'd like to share? Perhaps something you want to re-emphasize for listeners. I'll leave it to you how you'd like to close out the podcast.

Dr. Mira Atkins: Yeah. I think that when we build AI with clinical intelligence, governance, and transparency, we move from a system with a lot of friction in it to much better alignment across the system. It can help clinicians be clinicians. So it's a tool we can use to improve outcomes for patients while really decreasing that administrative burden.

Brian Zimmerman: Wonderful. Mira, thank you so much for being on the podcast.

Dr. Mira Atkins: Thanks for having me.

Brian Zimmerman: We also want to thank our podcast sponsor, Lyric. Listeners, you can tune in to more podcasts from Becker's Healthcare by visiting our podcast page at beckerspodcast.com.