eBook
Achieving Data Integrity: A Guide for Insurers
Technology-driven insights and capabilities depend on trusted data
Data integrity powers insurance innovation
Technology offers insurance carriers new opportunities to innovate. Process automation streamlines claims processing. More accurate risk assessment improves underwriting, sharpens pricing, and reveals new opportunities. Personalization drives customer satisfaction and enhances omnichannel marketing performance. AI and machine learning spot fraud. Analytics reveal new business insights.
Central to realizing value from all these advances is data integrity.
To make the most of technology for innovation, insurers need trusted data that is accurate, consistent, and contextual. What can you do to ensure your data will support the innovation needed to compete in the digital age, so you can disrupt rather than be disrupted?
Continuously improving the integrity of data is a journey. The destination is a better understanding of your business and, ultimately, better service for your customers. The first step can be any unique and specific business initiative that depends on data, such as:
- Digitizing processes across your business
- Advancing modernization initiatives
- Increasing data literacy for more precise business outcomes
- Streamlining underwriting for efficiency
- Identifying opportunities and targeting sales and marketing efforts
Whether you are looking to connect and leverage data in legacy systems, improve visibility and consistency between core business applications, reduce risk, or find new opportunities, you need data you can trust. You need data accuracy, consistency, and context to fuel your business initiatives. In this eBook, we will explore some of the key capabilities that will help you begin your data integrity journey.
of insurance business executives state that they do not entirely trust the data they receive.
– Capgemini Research, 2022
Integrate data across silos
For most insurance companies today, data exists in many formats and within silos across the business, including decades-old legacy applications running on mainframes or IBM i systems. This state of affairs stands squarely in the way of innovation as well as trust in data.
Connecting data across silos and platforms, from legacy systems to modern cloud-based data platforms, unlocks significant value for insurers by enabling data-driven decisions. By creating a single, integrated data pipeline, you can:
- Extend the value of mission-critical systems while making legacy business data available for data quality initiatives
- Gain a holistic and accurate view of the entire business by including complex datasets from legacy systems in your modern BI dashboards
- Streamline and automate business processes that cross data silos
- Leverage modern cloud-based platforms to improve performance and reduce costs, while running workloads where your data lives: on-premises, public cloud, private cloud, or any hybrid environment
With data integration, you’re ready to innovate. Power the next generation of customer service with mobile apps that reflect real-time mainframe transactions, for example, and enable virtual claims filing and settlement. Incorporate streaming data from car telematics for usage-based pricing or on-demand coverage. Reach digital-native prospects you couldn’t reach before.
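To make the pipeline idea concrete, here is a minimal Python sketch of a streaming bridge that consumes change events from a legacy system and lands them in a cloud platform. The topic name, broker address, and the load_to_warehouse helper are illustrative assumptions, not a specific Precisely API.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

def load_to_warehouse(record: dict) -> None:
    """Stand-in for a write to a cloud data platform such as Snowflake or Databricks."""
    print(f"Upserting policy {record.get('policy_id')} into the warehouse")

# Hypothetical topic fed by change-data-capture on a mainframe policy system
consumer = KafkaConsumer(
    "policy-updates",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    load_to_warehouse(message.value)  # each legacy change reaches the cloud in near real time
```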
Govern data for innovation and compliance
Carriers today are investing heavily in data governance. Why? Because leveraging critical internal data is crucial to innovating and gaining competitive advantage. But doing so requires keeping a close eye on compliance.
Smart data governance strategies help you discover and understand your data’s meaning, lineage, and relationships for more advanced business insights and analytics, with processes in place to ensure compliance. Industry-leading data governance strategies help to:
- Effectively democratize data. Organizations trying to democratize data across data warehouses, data lakes, or legacy applications typically need to improve data governance first. If data is not properly cataloged, metadata quickly becomes disorganized and difficult to locate and understand. A strong data governance framework enables metadata to be dynamically captured and curated, keeping that information easily searchable and up to date (a sketch of a simple catalog record follows this list).
- Ensure data is easily understood and trusted. Users typically spend too much time searching for the right information and questioning whether data can be trusted. Data governance provides business and technical asset definitions, ownership, and data lineage to give context to the assets being leveraged. Additionally, visibility into the data quality rules and scores of data assets within the governance solution increases confidence and trust in the data.
- Ensure accessibility with compliance. Discovering and understanding the right assets is critical, but you must also control access to ensure that sensitive data conforms to internal and external policies and regulations. Best-in-class governance tools include an auditable workflow process that documents requests for access and edits through approved owners and processes.
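As a concrete illustration of the cataloging idea above, here is a minimal Python sketch of the kind of metadata a governance catalog captures for one asset. The fields are generic assumptions, not the schema of any particular governance product.

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    name: str                                   # business-friendly asset name
    definition: str                             # agreed business definition
    owner: str                                  # accountable data owner
    source_system: str                          # where the data originates
    upstream_assets: list[str] = field(default_factory=list)  # simple lineage
    quality_score: float | None = None          # populated by data quality checks
    sensitive: bool = False                     # drives access-control policy

policy_master = DataAsset(
    name="Policy Master",
    definition="One record per in-force policy, refreshed nightly",
    owner="underwriting-data@insurer.example",
    source_system="Mainframe policy administration",
    upstream_assets=["Policy admin DB2 tables"],
    sensitive=True,
)
```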
Data quality and your business
Data quality takes on heightened importance as insurance companies strive to become more customer-centric across internal and customer-facing business processes. Inaccurate, incomplete, and unavailable data diminishes the quality of your customers’ experiences, hinders operational efficiency, and risks regulatory compliance issues. Moreover, data quality becomes even more important, and more challenging, when you’re working with advanced analytics and AI across an increasing volume of data.
Building a solid framework and process is only valuable if you can track and monitor permissions, changes, and requests. Collaboration is key, and a no-code workflow keeps it simple and streamlined.
The goal of any organization is to grow, and you need a solution that’s flexible and scalable enough to keep up without disrupting the business. Think of all the financial and policyholder data that’s continuously received from third parties: data quality checks, validation against known datasets, and complex reconciliations with internal and external data ensure that no data is lost or transformed incorrectly.
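As a simple illustration, the sketch below runs count-and-sum reconciliation checks between a third-party feed and the internal copy after loading. File and column names are assumptions for the example.

```python
import pandas as pd

feed = pd.read_csv("third_party_premiums.csv")    # hypothetical inbound file
internal = pd.read_csv("internal_premiums.csv")   # the same data after loading

checks = {
    "row_count_matches": len(feed) == len(internal),
    "premium_total_matches": abs(feed["premium"].sum() - internal["premium"].sum()) < 0.01,
    "no_missing_policy_ids": internal["policy_id"].notna().all(),
}

for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```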
Managing data quality across your business leads to increased accuracy of policies, claims, and other critical business data – protecting your reputation and building customer trust. Examples of data accuracy impacting these areas include:
- Underwriting: As artificial intelligence, machine learning, and other digital advances increasingly automate the underwriting function, the accuracy of the data used in machine learning risk models becomes even more critical.
- Marketing and customer experience: It is critical to have a single view of your customer throughout your organization to ensure that every touchpoint is used to delight your current and prospective customers. Customer data that is duplicated can cause headaches for your customers and compliance issues for you. Ensuring all communication is consolidated with a single customer record enables a clear picture of all interactions with that customer. It can also avoid embarrassing interactions such as sending duplicate welcome correspondence to the same member.
- Policy data management: It is imperative that your policy data is complete, accurate, and properly linked to the enrolled policyholders. Clearly linking policy data with policyholders prevents sending a policyholder information about a policy other than the one they enrolled in.
- Third-party business data: When data that is critical to your business comes from an outside source, the best practice is to reconcile it with your internal data before it is sent to downstream systems. For example, if a third party is authorized to enroll new policyholders for a specific line of business, the data for these new policyholders should be reconciled with internal data to evaluate any overlap with existing policyholders. This avoids duplicating policyholder records later in the process (a simple overlap check is sketched after this list).
- Claims processing: Claims processing often involves many systems and processes throughout the organization. As a transaction moves through this array of systems, issues such as lost transactions or incorrectly transformed data can occur. It is important to include data quality validations wherever data moves through the organization so that these issues do not go undetected.
- Investment portfolio reconciliation: As investment decisions and trades occur, it is critical that they are reconciled with existing portfolio accounts as quickly as possible. This trade data often arrives as streaming data, and when transactions are lost or errors occur, the accuracy of the portfolio value and subsequent investment decisions is immediately affected. It is critical that the quality of this data be addressed as it moves between entities to prevent inaccuracies.
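To illustrate the third-party reconciliation above, here is a simplified overlap check between incoming enrollments and existing policyholders. Matching on a normalized name-plus-address key is an assumption made for brevity; production entity resolution uses far more robust matching.

```python
import pandas as pd

def normalize(s: pd.Series) -> pd.Series:
    """Uppercase, trim, and collapse whitespace so near-identical values compare equal."""
    return s.str.upper().str.strip().str.replace(r"\s+", " ", regex=True)

existing = pd.read_csv("policyholders.csv")             # internal records
incoming = pd.read_csv("third_party_enrollments.csv")   # new enrollments from a partner

for df in (existing, incoming):
    df["match_key"] = normalize(df["name"]) + "|" + normalize(df["address"])

overlap = incoming.merge(existing[["match_key"]], on="match_key", how="inner")
print(f"{len(overlap)} incoming enrollments appear to match existing policyholders")
```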
Add critical context with location insights and data enrichment
Location intelligence and data enrichment are key components of data integrity that many carriers overlook. Together, they add context to your data, increasing its completeness, boosting its value, and providing a significant advantage for insurance companies.
Accurate, consistent, and contextualized data enables faster, more confident decisions when it comes to underwriting, claims processing, and risk assessments. Plus, enriching customer data opens the way to personalization that can reveal new insights, direct product development, and deliver a game-changing customer experience.
Location intelligence starts with geo addressing
It’s in the very nature of the insurance business to work with addresses. They belong to your policyholders, your prospects, and the properties you cover, and they exist in your policy, claims, billing, and CRM systems. Getting those addresses right across the multiple datasets where they appear is a critical component of data integrity, but it’s harder to do than it may appear.
While a wide range of products on the market can verify addresses, they are not all created equal. Having the right algorithms and the most complete and accurate address reference dataset makes a big difference.
Accuracy and consistency are essential to operationalizing addresses in your organization. They enable addresses to serve as the foundation for establishing that elusive “golden” customer record as well as achieving efficiency in claims processing, making data-driven decisions in underwriting, and upselling and cross-selling new products to customers.
Geo addressing also attaches latitude and longitude coordinates to each address, using a process called geocoding; reverse geocoding translates coordinates back into addresses. Hyper-accurate geocoding is vital to insurance companies. You need to know exactly where a property is located in relation to other properties and other businesses, for example, to evaluate the risks of co-tenancy.
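As a minimal illustration, the sketch below uses the open-source geopy library with the public Nominatim service as a stand-in for a commercial geocoder; the address and coordinates are examples only.

```python
from geopy.geocoders import Nominatim  # pip install geopy

geolocator = Nominatim(user_agent="insurer-data-integrity-demo")

# Forward geocoding: address -> coordinates
location = geolocator.geocode("1600 Pennsylvania Ave NW, Washington, DC")
if location:
    print(location.latitude, location.longitude)

# Reverse geocoding: coordinates -> nearest address
place = geolocator.reverse((38.8977, -77.0365))
if place:
    print(place.address)
```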
Unlock key insights with spatial analytics
Location is prevalent in our everyday lives. Whether it is our mobile phones, mapping applications, or fitness trackers, we use location today without even thinking about it. The need to accurately understand “where” is fundamental to many critical business areas in insurance.
However, while using location may be expected, working with geospatial data is tricky. It requires a unique set of spatial analytical capabilities that may be hard to come by. In a recent Forrester study, 52% of business leaders surveyed said they don’t have the right technical skills or knowledge required to use location intelligence more effectively.
While it can be hard to access, interpret, and deliver spatial analysis in the right place, that doesn’t mean you should miss out on key insights that propel your business forward. You can enhance underwriting decisions, improve time-to-close during claims, and share meaningful insights by giving non-technical users access to spatial insights. For example, there’s just no replacement for visualizing data on a map when you want to:
- Visualize a property’s relationship to potential risk factors such as lakes and rivers, coastlines, and wildfire borders (a simple proximity check is sketched after this list)
- Discover a universe of new customers that look just like your “best customer”
- Understand where healthcare providers are located in relation to your current and prospective plan members to optimize your network
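The first item above can be reduced to a simple spatial question. Here is a toy proximity check using the open-source shapely library: is a property inside a flood zone, and if not, how far away is it? The coordinates are illustrative; real analysis would use projected coordinates and authoritative risk boundaries.

```python
from shapely.geometry import Point, Polygon  # pip install shapely

flood_zone = Polygon([(0, 0), (0, 10), (10, 10), (10, 0)])  # simplified risk boundary
property_location = Point(12, 5)

if flood_zone.contains(property_location):
    print("Property lies inside the flood zone")
else:
    distance = property_location.distance(flood_zone)  # same units as the coordinates
    print(f"Property is {distance:.1f} units from the flood zone")
```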
Add context to decision-making and analytics with data enrichment
Accurate addressing and rooftop-level geocoding resolution are the first steps toward enriching data with an array of location-based variables. That, in turn, provides additional value by driving better decisions based on a substantially better-informed view of reality.
- Underwriting: It’s never been more important for insurers to have as much context as possible to accurately assess risk and price policies. Risk datasets related to wildfires, floods, earthquakes, and weather events reveal the history and propensity of hazards in certain areas. For commercial insurers, datasets that provide insight into co-tenant and adjacent risks are essential.
- Catastrophic risk management: The ability to combine your business data with location data like risk datasets and weather data is essential to managing catastrophic risk. That takes the right combination of technology, including spatial analytics, and expertise to use these datasets to model risk, assess your current portfolio, and increase pricing accuracy.
- Claims processing: That same technology and enrichment data can help meet customer demands for speedy claims processing. Standard and dynamic datasets and in-depth risk and fraud analytics can bring efficiencies and automation to the process. When a claim is flagged for review, easy-to-access mapping tools and data provide a quick visual confirmation that a property was affected by an event.
- Marketing and customer experience: Entirely different enrichment datasets are available to enhance the completeness of your customer and prospect data. To better target and personalize offers and messaging, for example, customer analytics combined with demographic datasets can reveal detailed information about consumers, their lifestyles, and preferences.
Ensure that data is fit for purpose
Insurers are regularly bringing in additional datasets to add more context to business decisions. However, ensuring that this data is “fit for purpose” is difficult. It requires clear metrics for measuring and tracking whether the data is meeting your business needs. Here are five criteria you can use as a framework, whether you are evaluating your own data or data from a third-party provider:
- Coverage. Examine the attributes of each dataset. Is each data record as detailed as necessary to meet business goals? If your goal is to meet policyholder expectations for personalization, what level of detail does a dataset provide? Will you need to acquire outside data to meet business requirements for coverage?
- Completeness. Each dataset contains many fields. Consider the fill rate of the entire dataset: the more fields left blank or containing null values, the less valuable the data. Determine the completeness threshold your internal datasets must meet, and examine your data capture processes to find the underlying causes of missing values (a fill-rate calculation is sketched after this list).
- Accuracy. How accurate is the data? Statistical sampling of large datasets, yours or a third party’s, enables you to cross-check sampled data against authoritative information. This can help you determine the error rate of each dataset.
- Currency. Determine how frequently a dataset is updated and whether that schedule meets your business needs. For third-party data, ask how long it takes for datasets to reflect real-world changes. For example, how long after a major storm are aerial photos of affected areas available? If a developer builds 20 townhomes on previously empty land, how long does it take for that updated information to reach your dataset?
- Consistency. Making your data input, storage, extraction, and analytics processes as consistent as possible is key to ensuring that your data itself also remains consistent. Consistent procedures are based on clearly documented steps that everyone follows. Creating and enforcing procedural rules for handling data will do much to help avoid common data quality problems.
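Two of these criteria lend themselves to simple calculations. The sketch below computes a per-field fill rate (completeness) and estimates an error rate from a random sample cross-checked against hand-verified reference data (accuracy). File and column names are assumptions for the example.

```python
import pandas as pd

df = pd.read_csv("property_attributes.csv")

# Completeness: share of non-null values per column
fill_rate = df.notna().mean().sort_values()
print(fill_rate)

# Accuracy: draw a random sample and compare against verified values
sample = df.sample(n=200, random_state=42)
verified = pd.read_csv("verified_sample.csv")  # hand-checked authoritative records
merged = sample.merge(verified, on="property_id", suffixes=("", "_verified"))
error_rate = (merged["year_built"] != merged["year_built_verified"]).mean()
print(f"Estimated year_built error rate: {error_rate:.1%}")
```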
Case Study
Quality data streamlines insurer’s decision-making, improves competitiveness
This large insurer works through local independent insurance agents to offer a broad range of property and casualty products to policyholders across most of the U.S. To streamline and improve underwriting and pricing decisions, the personal lines group rolled out an array of Precisely geo addressing, spatial analytics, and data enrichment solutions. The result is a straight-through web-based process that, in many cases, enables policies to be written without input from a human underwriter. For policies that require human intervention, the Precisely solution streamlines data access and decision-making. Both approaches save time for agents and for the insurer’s internal staff. They also lead to better underwriting and pricing decisions.
Why Precisely?
Precisely is the global leader in data integrity, ensuring accuracy and consistency in data for more than 300 insurance carriers, brokers, and reinsurance organizations. More than 1 million underwriters rely on Precisely to deliver data that is accurate and consistent across the business, with context needed to power better business decisions for better outcomes.
Build trust in data with the Precisely Data Integrity Suite
The Precisely Data Integrity Suite is a set of seven interoperable modules that enable your business to build trust in your data. Data with integrity has maximum accuracy, consistency, and context — empowering fast, confident decisions that help you add, grow, and retain customers, move quickly in a rapidly changing insurance marketplace, and empower users to get the answers they need, when they need them.
- Data Integration. Build resilient, high-performance data pipelines that connect your critical systems and data to modern data platforms — your key to continued innovation and greater competitive advantage. Easily create streaming data pipelines that integrate complex data from traditional business data sources like mainframe, IBM i, or relational databases, with modern cloud-based data platforms like Kafka, Snowflake, and Databricks.
- Data Observability. Proactively monitor and manage the health of your data. Machine learning intelligence continuously monitors the patterns in your data and immediately alerts you to anomalies, so you can avoid costly downstream issues and unexpected business disruptions later. By proactively monitoring and analyzing your data for adverse events, and alerting those who need to resolve issues, you can be assured of healthier data pipelines, more productive teams, and happier customers.
- Data Governance. A strong data governance framework ensures that you can easily find, understand, trust, and leverage critical data across your organization and produce more accurate, informed decisions and reporting. With the Precisely Data Integrity Suite’s Data Governance module, achieve the confidence you need in the meaning, quality, value, and trustworthiness of your data. Enterprise metadata management capabilities enable you to automate governance and stewardship tasks and answer essential questions about your data usage, impact, and lineage.
- Data Quality. Delivering data that’s accurate, consistent, and fit for purpose across your policy, claims, customer, and risk management systems is simplified and streamlined with the Data Quality module. With the ability to execute natively in cloud environments, this solution provides enterprise-level scalability, a visual user environment, and intelligent guidance.
- Geo Addressing. Precisely geo addressing combines address matching with geocoding to provide a clean, accurate address along with building- or unit-level latitude and longitude coordinates for every property in our global reference database. Geo addressing also assigns a Precisely ID to each location. The Precisely ID helps make data enrichment easier by providing a unique and persistent identifier attached to a property’s address.
- Spatial Analytics. Spatial analytics can be integrated into any workflow and customized to drive more efficient and enlightened decisions. Leveraging open data standards, interoperability, and scalability, Precisely solutions make it easy to activate geospatial data for your unique needs and reveal actionable insights, driving superior outcomes.
- Data Enrichment. The Precisely ID enables fast and easy data enrichment from Precisely’s catalog of more than 9,000 attributes in over 400 datasets, providing deeper insights for more informed decisions. These datasets include postal code boundaries, census information, world boundaries, world points of interest, building attributes, geodemographics, weather data, and boundaries for flood, fire, and other risks. A simplified enrichment join keyed on such an identifier is sketched below.
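Here is that simplified enrichment join. The location_id column stands in for a persistent identifier such as the Precisely ID, and the flood attribute columns are illustrative.

```python
import pandas as pd

policies = pd.read_csv("policies.csv")                  # one location_id per insured property
flood_risk = pd.read_csv("flood_risk_attributes.csv")   # enrichment dataset keyed on location_id

enriched = policies.merge(flood_risk, on="location_id", how="left")
print(enriched[["policy_id", "location_id", "flood_zone", "flood_risk_score"]].head())
```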
Not sure where to begin your data integrity journey?
Precisely Strategic Services provides a broad range of consultative services tailored to helping you identify data challenges, prioritize business needs, and implement initiatives, multiplying the value of your data assets. We can help you:
- Define a business initiative related to data integrity
- Design a data program that aligns with your KPIs and business goals
- Provide strategic guidance and operational support to drive your project over the finish line
As organizations build and refine enterprise-wide data management programs, they can derive significant benefit from expert evaluation and advice. Our data principals have deep insurance industry expertise as well as domain expertise to help you maximize your data investment.
Learn more
Innovation- and data-driven insurers are always evolving. Data integrity can help you keep pace with the fast-changing world. Start improving your data today.