By: Mike Brennan, Lead Consultant, Impact Makers and Kevin Cox, AWS SSA, CCSK, Lead Consultant, Impact Makers
This is the third post in a series about sharing healthcare data according to new CMS guidelines. The previous posts cover New Solutions for Sharing Healthcare Data and APIs for Sharing Healthcare Data.
A Data Quality Refresher
In our last post, we covered APIs in FHIR solutions and touched on data consistency. As a reminder, the data in an EHR or combination of back-end systems should be normalized and standardized so that the API delivers accurate data. Understanding the consistency of your data is critical to operating successfully in the healthcare space. In the FHIR standard, data components are represented as elements, and a compliant data store is a key component of a successful FHIR API transaction.
Data consistency and manageability are often the biggest portion of setting up the mandatory FHIR API functionality. At Impact Makers, we have successfully helped clients overcome many data problems.
Data Quality
For many business consumers and procurers of data, Data Quality is an intuitive concept that on the surface seems straightforward and understandable. However, many have trouble grasping the impact of the decisions they make and the effect those decisions have on the actual quality of their data. It is imperative for everyone involved in ensuring the quality of data (clinical staff, support staff, business users, stewards, analytical and operational systems) to develop this understanding as quickly as possible. Experience has shown that many FHIR and healthcare projects get into trouble due to a lack of clarity on the entire data quality process. In a healthcare environment leveraging FHIR interoperability and APIs, this data consistency is critical for exchanging data with patients, providers, and payers.
Caution Signs
What are some of the warning signs that your company might not be managing its data at a high level of maturity?
- Does everyone have the same definition for payer, provider, and patient?
- Does everyone have the same definition of good?
- Is there a place where these definitions are documented?
- Can you confidently identify payers, providers, and patients whose records contain consistent information?
- Is time spent in meetings arguing about whose numbers are right, instead of the actions that need to be taken?
- If your organization is not meeting its benchmarks, are you confident that the data supplied is accurate?
- Is your company experiencing high customer, provider, member churn?
- Do you know what your company’s Data Strategy is?
- Do you know how you will "forget" a customer, as the California Consumer Privacy Act (CCPA) and EU General Data Protection Regulation (GDPR) require?
A Scenario
Consider a simple element like the telephone number for a provider. One provider may have a telephone number for her business office in the medical center. She might also have a health system cell phone number. Additionally, if this provider is a PCP, she may practice a day or two a week at different off-site clinics, each with its own phone number. In this rather simple example, what is the physician's phone number? The correct answer depends on the event and the context around it. If one is calling about an appointment at a certain clinic, then the number associated with that particular clinic is the physician's phone number. If it is an emergency, the call would probably be routed to the health system cell phone. A general question will most likely go to the business office number. For each of these contexts, is the correct telephone number the accurate reference in all the systems, and how is that consistency enabled? Consider the outcome if one system gets updated with a new number: does it automatically update all systems with the new data? How will the updated data be delivered to an API?
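The FHIR standard addresses this ambiguity with the ContactPoint datatype: each telecom entry on a Practitioner resource carries a `use` code (work, mobile, and so on) so a consumer can select the right number for the context. The sketch below shows the idea in Python; the resource content and names are illustrative, not real data:

```python
# Illustrative FHIR Practitioner resource (as a Python dict) with multiple
# telecom entries, each distinguished by the ContactPoint "use" code.
practitioner = {
    "resourceType": "Practitioner",
    "name": [{"family": "Nguyen", "given": ["Dana"]}],
    "telecom": [
        {"system": "phone", "value": "804-555-0100", "use": "work"},    # business office
        {"system": "phone", "value": "804-555-0199", "use": "mobile"},  # health-system cell
    ],
}

def phone_for(resource, use):
    """Return the first phone number matching the requested use code, or None."""
    for entry in resource.get("telecom", []):
        if entry.get("system") == "phone" and entry.get("use") == use:
            return entry.get("value")
    return None
```

Clinic-specific numbers would typically live on separate PractitionerRole resources, one per practice location, which is exactly why "the physician's phone number" has no single answer without context.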
How do you know if your data has the needed quality for interoperability? You can ask these questions:
- Do I know where the source of truth is for each element that is used in an API?
- Do I trust my data accuracy in back-end systems for sharing via API or other method?
- Do I know where the data, such as address formatting or telephone number formatting, is consistent or inconsistent?
- Are there differences in data definition and use between siloed operational systems and back-end systems?
- Does my data easily map to the CMS API requirements?
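One quick way to answer the formatting question above is to profile the distinct patterns present in a field before exposing it through an API. A minimal sketch, with hypothetical sample values:

```python
import re
from collections import Counter

def phone_pattern(value):
    """Reduce a phone number to its shape: every digit becomes 'd'."""
    return re.sub(r"\d", "d", value.strip())

# Hypothetical phone values pulled from a back-end system.
phones = ["804-555-0100", "(804) 555-0101", "8045550102", "804-555-0103"]

profile = Counter(phone_pattern(p) for p in phones)
# One dominant pattern suggests consistent data; a long tail of patterns
# flags normalization work needed before the field is shared via API.
```

The same shape-counting approach works for addresses, identifiers, and dates, and needs no additional tooling beyond what the database or language already provides.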
An Approach
At Impact Makers, we have a mature approach to data quality. We will share an overview of the success we have driven and delivered to our clients.
Our approach uses these steps:
- Discovery: Identify potential data quality issues.
- Profile Data: Review sample data and existing data creation and usage processes to provide context for business rules discussions with Data Owners and Business Data Stewards.
- Develop Business Rules: Work with Data Owners and Business Data Stewards to review and document business rules and to capture and refine undocumented rules.
- Define Metrics: Define objective metrics and acceptable thresholds of compliance against which to measure agreed upon levels of quality.
- Evaluate Data with Metrics: Execute business rules against production data and evaluate results. Utilize acceptable thresholds set by the Data Quality team to evaluate the data.
- Findings Review: Review the Findings with the Data Owners and Business Data Stewards.
- Remediate Anomalies: Implement and execute a remediation process to resolve data quality issues.
- Monitor Health: Define and implement a continuous monitoring/remediation plan to prevent and/or fix data quality problems in the future.
Figure 1 illustrates the entire process and provides a high-level overview of each component of the Data Quality cycle. The process begins with Discovery and continues through Remediation and Monitoring, with an emphasis on implementing an ongoing Data Quality monitoring program.
For the data quality process above, Impact Makers has developed many accelerators to get the facts about the data quickly. For data profiling, we used database functions in the client's choice of technology and avoided the procurement of additional tools. After identifying the data, we reviewed it with the appropriate data stewards and the team. Next, we worked with the client to develop tailored business rules and define metrics. Based on the findings, we helped the client remediate problem data areas and move forward with consistent, synchronized data.
These approaches are aligned for Healthcare Interoperability and meeting CMS requirements for FHIR and API usage. A solid foundation of consistent data is needed for the successful use of FHIR APIs.
The Impact Makers Solution
Impact Makers is an AWS Advanced Consulting Partner with a specialty in data. We leverage comprehensive and mature data practices to enable customers to take advantage of their data.
Every project has unique elements that must be incorporated into a comprehensive strategy, in addition to the identification and execution of technical work. As an AWS Advanced Consulting Partner, we recognize the importance of secure, reliable, and flexible data sharing. Our comprehensive framework includes the AWS Well-Architected Framework and industry best practices, in addition to elements like compliance, asset and metadata management, business strategy alignment, service portfolio management, support model definition, service design and deployment, CloudOps, and much more. We work with our customers to deliver and enable strategic business advantage with cloud services.
To learn more, contact us.