In an era of increasing oversight on government spending, the “One Big Beautiful Bill Act” (OBBBA) is ushering in a major shift in how state Medicaid agencies manage program integrity. A central component of this new legislation? Regular audits using the Social Security Administration’s Death Master File (DMF).
Starting in 2027 for beneficiaries and 2028 for providers, Medicaid programs in all 50 states will be required to conduct quarterly checks against the DMF to prevent improper payments to deceased individuals and so-called “ghost networks.” But with the DMF’s known limitations, like incomplete data and false positives, how can states prepare effectively?
This 30-minute webinar, available as a transcript and recording, breaks down:
- What the OBBBA mandates and when deadlines take effect
- Why relying on the DMF alone may not be enough
- The risks of building your own solution vs. partnering with experienced vendors
- Practical steps Medicaid agencies can take now to get ready
Whether you’re leading compliance, IT, or program operations, this session provides actionable insights to help your agency safeguard Medicaid dollars and meet federal requirements with confidence.
Webinar Transcript
Adam Wilson
We are going to discuss one of the possibly overlooked changes in the Medicaid space that came through the OBBBA earlier this summer. There’s been a lot of focus on other eligibility requirements, but today we will dig into this one and the impactful changes it could bring to Medicaid operations. So, to begin, I’m going to let Lara introduce herself.
Lara Koza
Thanks, Adam. My name is Lara Koza. I’ve been with The Berwyn Group now for about three years, leading strategy and product. I’ve spent the vast majority of my career in the healthcare sector, working for a number of different payers, and I’m excited to talk to you today about death audits and the developments in this space.
Adam Wilson
And as I said, my name is Adam Wilson. I started with The Berwyn Group earlier this spring. Prior to that, I spent about 17 years at UnitedHealth Group in a mixture of roles on the Medicare side, specifically Part D product and operations, as well as some other shared-service operations on the payer side at Optum, with a little bit of experience at HCFA before that. We’re here today to discuss some very nuanced but important changes that came through the OBBBA earlier this summer. There were pending changes across different versions of the House and Senate bills, and what was finalized is in front of us: there will be a quarterly death audit requirement for Medicaid beneficiary eligibility starting in January of 2027. There’s some detail here about what the states are obligated to do, and building on this the next year, in January of 2028, states will be obligated to run quarterly death audits for providers as well, on top of the beneficiary requirement. As I said, many changes to Medicaid came through this bill, and new processes and workflows are going to be required for state agencies and plans. These eligibility updates could be burdensome, and there’s going to be a heavy lift for teams to operationalize things like retroactive eligibility redeterminations for Medicaid expansion populations, as well as the work requirements. The eligibility redetermination for the expansion population alone moved from a 12-month cycle to six months in the bill, so that change by itself will essentially double the workload on state agencies, before you even account for the death audit requirement. This death audit requirement was buried several hundred pages into the bill, and the basic requirement is to check the eligibility and status of both the provider and beneficiary groups against what is called the Death Master File, or a suitable alternative.
And Lara is going to talk more about that. I’ve talked to several people in the Medicaid space since the summer, on the TPA side, plan operations leaders, and state agency staff, and it’s interesting to me that this death audit requirement seems to be a surprise to most when I mention it, I think because there’s been so much focus and attention on the other eligibility changes that came through the bill. Prior to the session today, I was reviewing a very good website with a great synopsis of the OBBBA and its changes to Medicaid; interestingly, across pages of content on that website, there was nothing mentioning the death audit requirements. So working ahead on this is what we’re going to talk about today. Implementing and operationalizing it is going to necessitate new processes and workflows for state agencies and related parties. The bill also doesn’t really describe how agencies and plans are supposed to validate their death audit results. It does say that if there is a mistake or error, they are required to reinstate that beneficiary back into the eligibility file, but it doesn’t say how they’re supposed to determine whether their results are valid. So again: a heavy lift, a lot of unknowns, and the January 2027 requirement will be here before we know it. With that, I’m going to turn it over to Lara, who is going to talk about the Death Master File and really explain why it has become such a focus area for improvements in healthcare.
Lara Koza
Yeah. So given that this regulation points to the Death Master File as the primary source for running these death audits, we thought it would be really valuable to dig into it and learn a little more about the underlying data, its reliability, and so on. First, we wanted to show on the graph here what we have seen, as a user of the DMF for decades, in terms of the comprehensiveness and accuracy of this data source. Following changes in privacy laws back in 2011, the Limited Access Death Master File has eroded in comprehensiveness. A number of states have pulled their data out of the file over the course of the last 15 years, which has considerably reduced the number of death records included in a file that is relied upon by many state agencies, as well as commercial entities, for purposes of death audit. So today, only about 16% of the deaths that occur in the US each year are represented in this file. On the surface, then, this file is deficient for purposes of performing comprehensive death audits. But it doesn’t end there. If you want to go to the next slide: what we wanted to highlight here are further developments in the last six months to a year that have called into question the integrity of the data, not just its comprehensiveness. We’ve seen a number of anomalies in the volume and quality of the data added to this file. In April 2025, more than 10 million records were added, significantly exceeding the number we would normally expect to see added on a monthly or weekly basis. And the vast majority of these records had a date of birth from before 1900.
Those individuals are clearly deceased, but they had never previously been included in this file, which calls into question the comprehensiveness of the file even prior to the decline we identified starting in 2011. The other aspect of this addition is that all of those dates of death were simply the date the record was added, not an actual date of death we could reasonably rely on. You can think of that as a placeholder date of death for those 10 million records. Then, in late April 2025, an additional 6,000 living individuals were added to the file, and there was even conversation about calling this the “ineligible file.” This was the administration’s way of trying to identify individuals it believed were ineligible for certain benefits. But what is not contemplated in that action is the fact that many people, many agencies, have an obligation to provide these individuals with benefits they are due. So taking action on that file and ceasing benefits for 6,000 or more living individuals who are due those benefits has real repercussions. A couple of months later we saw the reversal of that addition, with most of those 6,000 living individuals removed from the file, so there has been a lot of volatility in the data added to and removed from this file. The most recent change was an additional one million records added in September of 2025, and 96% of those listed dates of death prior to 2010, which is again before the decline we identified on the prior slide. So it’s not only the accuracy of the dates of death and of the living-or-deceased status that is in question; these additions also call into question the comprehensiveness of the file even when it was previously deemed the gold standard.
So relying solely on the Death Master File as your death audit solution really poses a lot of challenges, and I’ll send it back to Adam to talk a little bit about that.
Adam Wilson
Thanks, Lara. So, some considerations for operationalizing this requirement that we want to talk about today, now that we have the background from Lara on the decline of the DMF, the legislative requirement, and the spirit of what it’s trying to accomplish for Medicaid. First, the resourcing and process-flow design changes are going to be significant, and the QA for any kind of change that touches eligibility in healthcare is significant on its own. Compounding that is the staggered implementation: 2027 for beneficiaries, 2028 for providers. Although those two processes likely don’t have much overlap on the plan side or the administration side, there are probably going to be some back-end processes that touch each other, and there is also the validation piece. Whether it’s a provider or a member, we would strongly recommend looking at both of those at once rather than treating them as distinct projects. Second, as I stated earlier, although the legislation names the Death Master File, or another source, as the requirement, we would contend that simply bouncing eligibility up against the Death Master File and taking steps from there does not meet the spirit of this new legislation. State agencies are probably going to be looking at how they stand this up with their own resources and processes, and, as I said earlier, there are the validation steps required, all on top of the other changes that came through in the bill. We think this can be sped up by using a trusted partner, versus trying to create something ad hoc that could be time-consuming, very inconsistent, and frankly not scalable. One example that comes to mind from previous death audits we’ve seen in the Medicaid space is simply Googling to confirm deaths. That’s not scalable, and it doesn’t reach the same quality and validity as using a trusted vendor for death audits.
The last thing I wanted to say on this slide is that the cost-effective way to do this, with assurance of valid results, is to use that trusted partner, because frankly the state agencies should want to avoid any rework for false positives if possible, given the heavy lift of adding those members, or providers, back to files and then going back to do retroactive true-ups. The use of a trusted vendor can avoid all of that and make this more efficient for everyone, with a high degree of quality, and frankly without disrupting either of these populations under this new death audit requirement. With that, I’m going to turn it back over to Lara, who will talk about our approach to death audits, and you’ll get a sneak peek behind the scenes of how we do this process.
Lara Koza
It’s helpful to talk a little bit about how we’ve built our death audit solution, to give some insight into the whys behind it and what’s potentially difficult about building something internally, inside an agency, to meet these requirements. So, if you want to go to the next slide: the first thing we wanted to talk about is, if the Death Master File is not a comprehensive or reliable source for death data, what can you do to supplement it? This is where we started and what we’ve really built our business around: gathering information from disparate sources as the foundation, and then building expertise on top of it. We leverage not only the Death Master File but also records from states that are willing to share, and we largely fill the remaining gap with other sources, such as obituaries. Now, even if you have all of this data in one spot, it still poses other challenges, right? There’s no singular source, and so we’ve brought in a number of different sources to get that answer for our clients. And obituary data is really complex: there’s no editorial standard, and it’s really inconsistent in its content, as well as in when obituaries might be posted to a website or in a newspaper. That’s similar to state records and the Limited Access Death Master File records, where there can be significant time lags in some situations before you get the record for an individual who is deceased. So having additional death sources that can corroborate that information, not only for accuracy but also to supplement for time lags, is really valuable, because we know that timing makes a difference and accuracy makes a difference. That’s why we’re coming at it from a more holistic perspective.
It also requires a really strong validation and review process, which we’ll talk a little more about on the next slide, if you want to go there. There’s a technological component and a validation component to the solution we have built, to make sure this is a scalable yet still highly accurate solution. Again, these are the components that are really needed to make sure you’re doing the best by your beneficiaries and by your taxpayers, right? The data we have covers 96.5% of all potential deaths utilizing the sources we spoke about on the prior slide: the Death Master File is 16%, state records get us approximately another 17%, and those obituaries fill in the vast gap that exists. So data is the starting point, and we’re proud to say that we’re data independent: we gather all of this data ourselves, and we don’t have a singular source that presents risk to the reliability or viability of the solution. The second piece of our solution is the technology we use, which is necessary from a scalability standpoint, to Adam’s point about Googling around trying to find corroborating evidence of a death. We’ve been able to build that technology based on decades of experience in this market and in this industry, creating proprietary scoring and matching algorithms that help us determine the viability of a potential match between a client’s record and a death record. And the last, really critical piece, which differentiates us from other solutions in the market today and which I think is the most critical to ensuring that comprehensiveness and accuracy, is our expert operations review team. This is what they do day in and day out: review those potential matches to make sure that the death record we’ve identified is truly the same person as in the client’s file and population. We know what can happen if you act on a false positive: people have their benefits cut off erroneously when they should be eligible for them. And we know the value of a missed death as well, right? If you’re still submitting payments, whether reimbursing claims that are coming in or paying out the capitation rates to the MCOs, we know there’s a financial burden with that, as well as the operational burden to reverse any of those mistakes. So it’s critical to get these things right the first time, and these are the components that really help build a comprehensive, accurate solution that gets you the right answer the first time, without any additional effort to validate beyond it. So how does this work? You can go to the next slide, and we’ll talk a bit more about how this works on a step-by-step basis. First, we talked about that data, right? We leverage more than 40,000 data sources today, and we’re constantly adding additional sources to make sure we’re giving the most comprehensive solution as well as the most accurate, constantly corroborating those deaths we’re identifying. Then we apply that proprietary scoring and matching algorithmic approach, which means we’re able to take our client’s records, or participant list, and put them up against all of our death sources to say which of these is a likely match. Through that, there’s a lot of noise, right? There are a lot of potential matches, depending on how expansive your matching algorithms are.
And that’s where we’ve been able to really train our algorithms: we know, based on experience, which things indicate a higher possible match than others. Through this step we’re able to reach a number of validations, as well as a number of rejections of potential matches where we are confident they are not the right person. Then, to get up to the comprehensiveness we hold ourselves to as a standard, we take any records we can’t confidently validate, but think might be possible matches, and send them to our operational expert review team. That gets us the last 20% or more of validations in some cases, and that’s also how we’re achieving 99% accuracy in the deaths we validate. We then send back to our clients a list of validated deaths, with that 99% accuracy, in a comprehensive report, so that they can take their next best action based on that data. These are the components that are really necessary as you think about whether to consider external vendors or build something internally, if you want that comprehensiveness and accuracy. And the last leg of the stool is really timeliness: how quickly are you able to identify and validate a death after the death occurs? We’re able to do that for the vast majority of our validations within five days, so that’s where there’s additional differentiation, and a time-is-money consideration to take stock of. OK, so if we want to go to the next one, I think, Adam, you can wrap it up with the additional potential use cases.
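[Editor's note] The triage flow described above, auto-validating strong matches, auto-rejecting clear non-matches, and routing the ambiguous middle to expert human review, can be sketched in a few lines of Python. This is purely an illustration: the vendor's actual algorithms are proprietary, and every field name, weight, and threshold below is a hypothetical example, not their real logic.

```python
# Illustrative sketch of score-based death-audit triage.
# All fields, weights, and thresholds are hypothetical examples.

def match_score(member: dict, death_record: dict) -> float:
    """Weighted similarity between a member record and a death record."""
    score = 0.0
    if member["ssn"] == death_record.get("ssn"):
        score += 0.6
    if member["dob"] == death_record.get("dob"):
        score += 0.25
    if member["last_name"].lower() == death_record.get("last_name", "").lower():
        score += 0.15
    return score

def triage(member: dict, death_record: dict,
           validate_at: float = 0.9, reject_below: float = 0.4) -> str:
    """Auto-validate strong matches, auto-reject weak ones,
    and queue everything in between for expert human review."""
    s = match_score(member, death_record)
    if s >= validate_at:
        return "validated"
    if s < reject_below:
        return "rejected"
    return "human_review"

member = {"ssn": "123-45-6789", "dob": "1950-02-01", "last_name": "Smith"}
record = {"ssn": "123-45-6789", "dob": "1950-02-01", "last_name": "Smith"}
print(triage(member, record))  # all fields agree -> "validated"
```

The middle band is the point of the design: rather than forcing a binary call, ambiguous matches go to a review queue, which is how a high-accuracy claim can coexist with broad, noisy source data.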
Adam Wilson
Yes. Before we go to questions: today’s webinar was obviously about the Death Master File and the death audit requirements, but as a takeaway for everyone listening to and watching this webinar, now that you’ve been made aware of the Death Master File, which you may never have heard of before, and of the gaps in its data, there are other areas in healthcare where you may want to consider with your teams how this data could help you. Payment integrity, for one: making sure, on both the prepayment and post-payment audit side, that you’re looking at claims for providers or beneficiaries who could be deceased. Your healthcare quality programs, particularly Medicare Stars: making sure you don’t have deaths in the denominator. Fulfillment, on the payer operations side: making sure you’re not sending things in the mail and generating print and postage for deceased beneficiaries. And there is a myriad of other use cases in clinical workflows, pharmacy claims, and life sciences modeling to think about. So with that, I think we have about five or six minutes left. We can see if there are any questions.
Lara Koza
OK, we got one here: “With so many changes to the DMF recently, are there steps agencies should be taking to safeguard themselves the best they can, outside of working with or considering a vendor?” I think I can take that and build off a little of what we were talking about earlier. In addition to considering all the components of not only having the right data but also the right process to manage that data and match it with your population, the other thing we would take from the deficiencies of the DMF we’ve seen, especially in recent history, is finding alternative sources to corroborate that data, right, to validate that the data is accurate. That’s something we’ve been able to do as these changes hit the DMF in the last few months: take the new records and compare them against some of our other sources to ask, are we seeing these records elsewhere? Are we seeing some of these things hitting in obituaries? Are we finding sources that say this person is in fact dead? We’ve been able to validate a number of those, to essentially determine whether they’re accurate or should be taken with a grain of salt, with a proceed-with-caution type of approach. So that’s something I would add to the considerations here: come with a healthy dose of skepticism, and make sure you’re not just taking data at face value but really trying to validate that it’s accurate, because, again, that accuracy and comprehensiveness make a big difference when you’re talking about critical benefits that people are receiving. And I know this audience knows that very well. I think we got another one here, too: “For agencies debating whether to build out their own auditing tools or work with a vendor, what factors should they take into consideration?”
So, I think with this one there’s a lot of internal consideration among your agencies, right, with respect to budget and what budget might allow for here. But what we’ve heard in our conversations with a number of state agencies is that staff is limited. So, considering all of the changes you’re going to have to implement under the broader bill, the question is where it makes sense, from an efficiency standpoint, both in terms of dollars and in terms of people and time, to meet each of these regulatory requirements internally versus externally, and what bar you want to set from a quality, comprehensiveness, and accuracy perspective. I guess that’s how I would think about the business case as you assess each of these things. And then, if it’s an internal build, I think it’s a matter of ensuring you’re getting that comprehensive data set and training people to understand the common signals that identify false positives or potentially missed deaths. So I think the key performance indicators for a process here are, yes, comprehensiveness, accuracy, and timeliness. That might be my answer to that question. Do you have anything else?
Adam Wilson
The only thing I would add to what Lara talked about is this: we’re in October right now, and this requirement for beneficiaries is coming in January of 2027, call it 15 months. These are big rosters, a lot of eligibles, and then there’s the provider piece. It behooves the state agencies, the plans, the TPAs, anyone touching this data, not to wait until the last minute to implement something like this, even when using an outside vendor. There’s just too much at stake, a lot of data here and a lot of things to figure out, and 15 months may sound like a lot to some people, but those of us with experience in healthcare operations know that’s a quick turnaround. So with that, I think we are complete. Thank you, everyone, for your time.