
Keeping Our Network Library Up to Date (Part II): Isolating a Network

Derek Pedersen

Published

6/7/2024

In our previous post on our monthly data ingestion process, we discussed what it takes to keep all of the Table of Contents (ToC) files up to date across the various payers. But now that we have them, what exactly do we do with them? How do we go from a ToC file to defining what constitutes a PPO network and locating its associated files?

Payer Catalog

We start by creating a catalog of payers and their related details, such as their name and where their ToC files can be found. When a new payer enters the market or changes the location of their ToC files, one of our analysts just needs to add them to our system or update their details, and our processors will automatically pick up that information and process their ToC files.
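To make that concrete, a catalog entry boils down to something like the sketch below. The field names and the sample payer are purely illustrative, not our actual schema:

```python
from dataclasses import dataclass

@dataclass
class PayerCatalogEntry:
    """One payer in the catalog (illustrative fields, not our real schema)."""
    payer_name: str       # e.g. "Aetna"
    toc_index_url: str    # where the payer publishes its ToC file(s)
    toc_layout: str       # "single" for one big ToC, "per_state" for split ToCs
    active: bool = True   # analysts can deactivate a payer without deleting it

# When a new payer enters the market, an analyst adds a record like this and the
# processors pick it up on the next monthly run.
example_payer = PayerCatalogEntry(
    payer_name="Example Health Plan",
    toc_index_url="https://transparency.example-payer.com/toc/index.json",
    toc_layout="single",
)
```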

Index ETL

Once we have the payers and their ToCs cataloged, the next step in our process follows a standard Extract, Transform, Load (ETL) pattern: we extract the contents of the ToC files, transform them so that they are normalized, and finally load them into a database. Once this data is loaded, we can query it through all of the CMS-defined fields. And because we parse these files every month, we can also track how payers change the content and use of these fields within their ToC files over time.
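In rough terms, that pipeline looks like the sketch below. It assumes the standard CMS Table of Contents fields (`reporting_structure`, `reporting_plans`, `in_network_files`) and stubs out the database load:

```python
import json
from urllib.request import urlopen

def extract_toc(url: str) -> dict:
    """Extract: download and parse a payer's Table of Contents JSON."""
    with urlopen(url) as resp:
        return json.load(resp)

def transform_toc(toc: dict) -> list[dict]:
    """Transform: flatten the nested CMS structure into normalized index rows."""
    rows = []
    for structure in toc.get("reporting_structure", []):
        for plan in structure.get("reporting_plans", []):
            for in_network in structure.get("in_network_files", []):
                rows.append({
                    "reporting_plan_name": plan.get("plan_name"),
                    "plan_id": plan.get("plan_id"),
                    "file_description": in_network.get("description"),
                    "file_location": in_network.get("location"),
                })
    return rows

def load_rows(rows: list[dict]) -> None:
    """Load: write the normalized rows to the index database (stubbed here)."""
    ...  # e.g. a bulk insert keyed by payer and reporting month
```

Storing these rows month over month is what lets us query on any CMS-defined field and watch how payers' use of those fields drifts over time.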

Some payers like Anthem have one big ToC that contains all of their networks.

Others, like Aetna, have multiple ToCs, each with its own network definitions, in this case broken out by state.

Network Templating

Now that we have the ToC files parsed, how do we isolate what constitutes a network? Just as with the ToC files, each payer has a different way of locating the relevant network files. For some payers, like Aetna, this is straightforward: they use the reporting plan name field to clearly differentiate their file sets, so to locate the Aetna CVS Silver files, you simply search against that field.
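In practice that search is just a match against the plan name column of our parsed index. The sketch below is illustrative; the sample rows and URLs are made up:

```python
def find_files_by_plan_name(rows: list[dict], pattern: str) -> list[dict]:
    """Return index rows whose reporting plan name contains the given text."""
    needle = pattern.lower()
    return [r for r in rows if needle in (r.get("reporting_plan_name") or "").lower()]

# Sample index rows in the shape produced by the ETL sketch above.
index_rows = [
    {"reporting_plan_name": "Aetna CVS Silver",
     "file_location": "https://example.com/aetna/in-network-01.json.gz"},
    {"reporting_plan_name": "Aetna Open Choice PPO",
     "file_location": "https://example.com/aetna/in-network-02.json.gz"},
]

silver_files = find_files_by_plan_name(index_rows, "CVS Silver")
```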

But for others, such as Blue Cross and Blue Shield of Kansas, this is not the case. They use the reporting plan name field to store the names of their customers.

Since customer and business names are not guaranteed to be consistent month over month, and the network a customer is linked to may change, we needed a different method of isolating the specific network files. For these payers, we filter on the file entries instead.

As you can see, BCBS of Kansas links a single file to each of its networks, so we can consistently use those file designators month over month instead of the reporting plan name.
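A rough sketch of that file-entry match is below. The sample rows and the pattern are purely illustrative, not the payer's actual naming scheme:

```python
import re

def find_files_by_entry(rows: list[dict], location_pattern: str) -> list[dict]:
    """Return index rows whose file location matches a stable, payer-specific pattern."""
    regex = re.compile(location_pattern, re.IGNORECASE)
    return [r for r in rows if regex.search(r.get("file_location") or "")]

# Here the plan name column holds customer names, so we key off the file name instead.
index_rows = [
    {"reporting_plan_name": "Some Employer Group",
     "file_location": "https://example.com/bcbsks/BCBSKS_NetworkA_in-network.json.gz"},
    {"reporting_plan_name": "Another Employer",
     "file_location": "https://example.com/bcbsks/BCBSKS_NetworkB_in-network.json.gz"},
]

network_a_files = find_files_by_entry(index_rows, r"BCBSKS_NetworkA")
```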

Once we have the search pattern established, we create a network template, which defines how we locate these files and which index templates they are linked to. As soon as those index templates have been processed, the related network templates are attempted and a network instance is created. Throughout the month we monitor these processes to make sure everything is flowing smoothly and adjust the network templates as needed to ensure we have the most up-to-date pricing files.
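Conceptually, a network template is just a named search pattern tied to the index templates it depends on. The sketch below uses hypothetical field names:

```python
from dataclasses import dataclass

@dataclass
class NetworkTemplate:
    """Defines how a network's files are located each month (illustrative fields)."""
    network_name: str            # e.g. "Aetna CVS Silver"
    index_templates: list[str]   # parsed ToC indexes this network is linked to
    match_field: str             # "reporting_plan_name" or "file_location"
    match_pattern: str           # search pattern applied to that field

# Once the linked index templates finish processing for the month, the template is
# attempted and, if matching files are found, a network instance is created.
template = NetworkTemplate(
    network_name="Aetna CVS Silver",
    index_templates=["aetna-state-tocs"],
    match_field="reporting_plan_name",
    match_pattern="CVS Silver",
)
```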

When a network template has completed, the network instance is moved into a “Needs Review” state.

We have several quality gates that we use to assess each network. For example, the June instance of the Maine Aetna HMO failed our "Availability of Common Codes" criterion.
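To give a feel for how a gate like that could work, here is a sketch; the code set and threshold are made up and are not our actual criteria:

```python
# Hypothetical set of high-volume billing codes a network should be expected to price.
COMMON_CODES = {"99213", "99214", "80053", "85025", "71046"}

def availability_of_common_codes(priced_codes: set[str], threshold: float = 0.95) -> bool:
    """Pass only if enough of the common codes have negotiated rates in the instance."""
    coverage = len(COMMON_CODES & priced_codes) / len(COMMON_CODES)
    return coverage >= threshold

# A network instance that only prices a couple of these codes fails the gate and
# stays in the "Needs Review" state.
print(availability_of_common_codes({"99213", "85025"}))  # False
```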

When a network is approved, it gets loaded into our system so that it is available for data extraction. 

Okay, but how?

Okay, now that we've explained the logical search patterns we use to isolate a network definition, just how do we do it? That is a large amount of data coming from payers that has to be processed in a timely manner, and it was no easy feat to engineer a system to support it. But that story will have to wait until next time. Stay tuned for Part III!