The practical implementation of AI-augmented underwriting

With more than 25 years of experience in Specialty, Marine and other lines of the insurance industry, we understand insurance and the London Market well. Passionate about learning the most up-to-date trends and ideas, we bring them to you!

Read our most recent article with Paul Butler, Chief Technology Officer at Hiscox London Market, below to find out his views on how AI is affecting the insurance sector and what is being done to utilise this technology.


Key takeaways on augmented underwriting

  • Hiscox London Market has integrated AI into its underwriting processes, significantly reducing data processing times and increasing accuracy.
  • Collaboration with Google Cloud has enabled further advancements, such as using large language models (LLMs) to enhance underwriting speed and efficiency.
  • While AI offers substantial benefits, Hiscox maintains a focus on augmenting, rather than replacing, human underwriters due to the complexity of specialised insurance areas.
  • Challenges include managing AI model reliability and developing effective prompt engineering techniques.
  • The future of AI in underwriting at Hiscox involves continued innovation while ensuring robust governance and oversight.

Paul Butler has been pioneering the use of artificial intelligence (AI) to transform underwriting processes. With remarkable advances in machine learning, Butler is well versed in applying large language models both to insurance as a whole and, specifically, to the underwriting process.

We’ve been working on AI for some time at London Market, probably about four years [and that’s when] we created a brand new data science team focused on developing neural networks using a lot of back data to train models to handle submission emails and their attachments efficiently.

In a recent interview featured on our IT Insights InsurTalk series, Butler shared his insights on the implementation of AI-augmented underwriting, the challenges faced, and the lessons learned at Hiscox. Sound interesting?

Read on to learn about how AI is reshaping underwriting in a major Lloyd’s of London carrier.


The current state of AI in underwriting

Hiscox London Market has been at the forefront of integrating AI into underwriting, with four years of development already put into testing and implementing AI into their underwriting processes.

The initial focus was on using AI to automate the processing of submission emails and attachments.

Before AI, Butler explained, “(…) we had third-party operational people in India, about 20 people. And it would take them about two or three days to cleanse that data, to transform and geocode that data and get it back to us. The models are doing this in seconds”.

Key developments at Hiscox include:

  • Neural Networks for Data Processing: Hiscox’s AI models handle submission data, transforming and geocoding it with a 98-99% accuracy rate, an improvement over the 95% accuracy previously achieved by human teams. This model “strips that data out, transforms it, geocodes it, and makes it available for the underwriters to start using“, according to Butler.
  • Collaboration with Google Cloud: Hiscox developed a lead algorithmic underwriting proof of concept using Google’s large language models (LLMs), demonstrating the potential to improve their underwriting processes. “We’ve had some great results in using those as well,” noted Butler.


The continued need for human expertise in underwriting

Despite these advancements, Hiscox remains cautious about fully automating underwriting decisions.

“We’re calling it augmented underwriting for a reason,” Butler emphasised. “We’re not looking to replace the underwriters,” he said, particularly in specialist areas like sabotage and terrorism insurance.

Their focus remains on equipping underwriters with tools that enhance their ability to make quick, informed decisions, saving precious time in the process.

“[What] we’re trying to do is minimise that time, put all that information that we’ve extracted and packaged together in front of the underwriter for them to make the decision”.

Butler stressed: “I don’t think we’re ever probably going to get to a point where we are going to trust large language models”. Hiscox is therefore assessing the risks of using these models; where the risks are minimal and the benefits appear high, that is the ideal point to introduce LLM technologies into the underwriting process.

Hiscox’s collaboration with Google to minimise the risks and decision-making time in the underwriting process has so far yielded impressive results, with Paul explaining that “(…) we’ve got a three-day process down to a matter of minutes” by accumulating the data, modelling and pricing it, and understanding their exposure in order to make good, accurate final decisions.

MVP for streamlined claims management process

We collaborated with Hiscox on an MVP that integrates data from multiple sources into one aggregated view. Our joint team built an event-driven architecture, enhancing the system’s flexibility and responsiveness.

We aligned the tool with claims handlers’ specific needs through continuous UX consultancy, and the MVP has received positive feedback from them.


The evolution required to make underwriting decisions

Hiscox has learned that while AI can significantly aid the underwriting process, there are limitations to its use.

Large language models “do hallucinate, they do come up with incorrect answers sometimes and they’re very confident that they’ve got the correct answers,” Butler warned.

To address these risks, Butler shared a key lesson learned at Hiscox:

We’ve discovered that prompt engineering really is an art form. (…) You have to change the way you go about prompting them to get the results you want and get the accuracy levels that you need.

To achieve high-quality, reliable results, it’s necessary to experiment not only with the prompts, but also with the “personas” given to the model, the temperature, and the feedback provided.
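The experimentation Butler describes can be pictured as a small grid search over prompt settings. The sketch below is purely illustrative: the `call_llm` stub, persona strings and exact-match scoring are invented for the example and are not Hiscox’s or Google’s actual tooling.

```python
from itertools import product

def call_llm(prompt: str, persona: str, temperature: float) -> str:
    """Stub standing in for a real LLM call. A real implementation would
    send `persona` as a system message and `temperature` as a sampling
    parameter to the model's API."""
    return "GBP 2,400,000"  # canned answer for the sketch

def accuracy(answer: str, expected: str) -> float:
    """Crude exact-match scoring; real pipelines use richer metrics."""
    return 1.0 if answer.strip() == expected.strip() else 0.0

def best_variant(prompt, expected, personas, temperatures):
    # Score every persona/temperature combination against a known-good
    # answer; the highest-scoring variant wins.
    results = []
    for persona, temp in product(personas, temperatures):
        answer = call_llm(prompt, persona, temp)
        results.append((accuracy(answer, expected), persona, temp))
    return max(results)

score, persona, temp = best_variant(
    "Extract the total insured value from this submission: ...",
    expected="GBP 2,400,000",
    personas=["marine underwriting assistant", "data extraction clerk"],
    temperatures=[0.0, 0.3],
)
print(score, persona, temp)
```

In practice the reference answers would come from a labelled evaluation set, and the loop would be re-run whenever the underlying model changes.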

Hiscox has developed robust governance frameworks and controls to ensure model accuracy and reliability. “[What] we’ve spent a lot of time focusing on more recently is the governance, frameworks, the controls we’re going to need to have in place, [thinking about] how are we going to sort of monitor these models, make sure that they’re not drifting, make sure that they are accurate and they’re not hallucinating or coming up with the wrong answers”.

However, Butler remains sceptical about AI fully replacing human judgement in complex decision-making scenarios in the future. “They’re just too generic to replace highly skilled people,” he explained, adding that while there may be some highly tuned specific models that prove useful, “I don’t think those types of models are ever going to replace human decision making”.


The competitive advantages of Generative AI in augmented underwriting

As Paul Butler explained in detail, AI integration at Hiscox has yielded several competitive advantages:

  • Speed to Market: The “actual real value” to Hiscox’s business has come from “getting the turnaround from a submission coming in from the broker [and] getting the quote back to the broker from three days to a matter of minutes“, according to Butler.
  • Applications Beyond Underwriting: AI is also being explored for use in other areas, such as claims processing. One pertinent example Butler described: “we could use satellite imagery to quickly assess if the damage to a property is genuine, see the ‘before’ and ‘after’ shots, and then pay the claim quickly”, saving both time and money, and improving customer satisfaction.


Lessons learnt from implementing AI in underwriting

Butler shared several lessons from Hiscox’s journey with AI:

  • Prompt Engineering is Key: Effective prompt engineering is essential to getting accurate results from AI models, but as Butler pointed out, “getting the prompt engineering right is complicated“.
  • Continuous Model Updates are Necessary: AI models evolve quickly; “a year ago we were building with Google PaLM as a large language model. That’s been deprecated now. We then built it with Gemini Version 1. And now we’re retuning and reprompting for Gemini 1.5,” explained Butler, emphasising the need to keep up with the constant pace of LLM development with a highly skilled and dedicated team.


“Accuracy scoring” framework for augmented underwriting

To ensure the reliability of AI models, Hiscox has developed an “accuracy score” framework. “For our neural network models, [accuracy scoring] is straightforward,” according to Butler, but for LLMs, Hiscox relies on feedback from underwriters to refine accuracy.

To do this, they use a data dashboard displaying the LLM’s outputs; underwriters provide continuous feedback, highlighting any inaccuracies in the information extracted by the AI, and the data is scored on its reliability.

Having this continuous feedback loop helps in fine-tuning the models and prompts, ensuring consistent performance.
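The feedback loop above can be sketched as a simple per-field scorer. Everything below is hypothetical (the class, field names and threshold are invented for illustration, not Hiscox’s actual “accuracy score” framework): underwriter feedback accumulates per extracted field, and any field falling below a threshold is flagged as a candidate for re-prompting or retuning.

```python
from collections import defaultdict

class AccuracyScorer:
    def __init__(self, threshold: float = 0.95):
        self.threshold = threshold
        # field name -> [correct count, total count]
        self.counts = defaultdict(lambda: [0, 0])

    def record(self, field: str, correct: bool) -> None:
        """One piece of underwriter feedback on one extracted field."""
        self.counts[field][0] += int(correct)
        self.counts[field][1] += 1

    def score(self, field: str) -> float:
        correct, total = self.counts[field]
        return correct / total if total else 0.0

    def drifting(self):
        """Fields whose accuracy has fallen below the threshold,
        i.e. candidates for re-prompting or model retuning."""
        return [f for f in self.counts if self.score(f) < self.threshold]

scorer = AccuracyScorer(threshold=0.95)
for ok in [True, True, True, True]:
    scorer.record("inception_date", ok)
for ok in [True, False, True, True]:
    scorer.record("insured_value", ok)

print(scorer.score("insured_value"))  # 0.75
print(scorer.drifting())              # ['insured_value']
```

A dashboard would then chart these per-field scores over time, making drift visible as soon as a model or prompt change degrades a field.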


Preparing for AI integration and augmentation

Successfully integrating AI requires more than technological expertise – it demands a supportive culture that “(…) is willing to experiment and invest in that experimentation”, and is prepared to “accept failure” before achieving success.

At Hiscox, fostering such a culture has been key to their AI initiatives, with “stepping outside their comfort zone” a key requirement to achieving their end goals.

Lots of little bits of failure that teach you, okay, that doesn’t work, that doesn’t work, this does work, next thing doesn’t work. So you’ve got to have a culture that supports failure. You’re not going to innovate if you don’t have that.


Picturing the future of AI-augmented underwriting

Looking ahead, Hiscox plans to build on the foundations already in place.

With a solid grounding in a range of skills such as data science, machine learning and data engineering, Hiscox aims to bring people together based on their capabilities, building a flexible structure for efficiently solving specialised tasks and streamlining the AI-assisted underwriting process: “(…) for us as a London Market carrier, underwriting is our biggest value stream,” Paul reported, so building excellent underwriting tools is a high priority.

While these changes are promising, Hiscox are not seeking to radically shake up their current underwriting structure, with Butler definitively stating “the way we’re organised, the way we’re structured, it does work. I can’t see that changing for the next few years even with all the AI stuff going on”.


Summary

By leveraging advanced technologies like neural networks and large language models, Hiscox has proved itself an industry leader in dramatically improving the efficiency and accuracy of the underwriting process, reducing manual workloads and speeding up decision-making.

However, the importance of human expertise is not to be overlooked, as AI and LLMs are unlikely to replace human skills anytime soon. Rather, they can be utilised to augment current talents, speeding up and improving the underwriting process in areas that were previously time intensive.

As AI models and organisational structures continue to be refined, the focus must remain on balancing innovation with careful governance, ensuring that the benefits of AI are fully realised while minimising risks.

If you would like to watch our full interview with Hiscox London Market CTO, Paul Butler, follow this link to head over to our IT Insights Hub and enjoy our conversation on the practical implementation of AI augmentation in full.

With over two decades of expertise in data harmonisation, digital solutions, consulting, AI, and cloud services, Future Processing is the ideal partner to help you bring your digital initiatives to life.

We collaborate closely with you to understand and achieve your goals, ensuring your success in the market. If you’re interested in discovering how our industry-leading digital services can benefit your business, contact us today, and we’ll work with you to find the best solution for your unique needs.

Revolutionise your claims operations with futureClaims™

futureClaims™ is a flexible modernisation programme for the commercial insurance market, designed to address key challenges in the claims value chain.

Built as a modular set of components, it allows you to select only the features you need – without having to replace your entire claims technology stack.

Implementing new data standards

Within the insurance sector there is a great deal of change happening, and with it the expectation that long-held processes and approaches will evolve, most notably through the adoption of standardised data practices, which seek to foster more cohesive and effective collaboration between the market’s many participants.

A unified cross-market approach to data standards is essential for achieving these goals, ensuring consistency, improving accuracy and increasing performance across the industry.

In the latest episode of our IT Insights InsurTalk by Future Processing, we met with the Executive Director of the London and International Insurance Brokers’ Association (LIIBA), James Livett, to discuss all about implementing new data standards in the insurance sector.

With nearly four decades of experience in the industry, Livett shared his extensive knowledge of the current state of data standards in the London Market, the challenges faced, and the significant benefits that a unified approach to data standards offers brokers in the modern world. He emphasised the importance of collaboration between brokers and underwriters, and technology’s pivotal role in facilitating the transition to standardised data practices.

In this article, we explore the key takeaways from this engaging conversation and uncover how the insurance industry can successfully navigate the path toward data standardisation.


The current state of data standards in the London Market

Data standardisation has been around for years and is not a new concept in insurance. Examples include LPAN and LCCF, which are ‘data standards’ by definition, but not strictly digital.

A&S and ECF are digitised market processes that are used with Velonetic to submit premiums and claims, and these have been available for more than 20 years using the ACORD DRI standard. Insurance companies have been using standardised data for a long time, so the state of the market on standards for central service interaction is currently very positive.

The challenge is that very few organisations use a common set of data standards within their businesses themselves – they are used across the wider industry, but less so actually within individual companies.

The goal should be to use data standards not only across companies in the insurance industry as a whole, but also within the various departments of individual companies, an approach that has proved successful in projects such as the Ruschlikon Initiative. However, it isn’t currently as widely adopted as it could be.

Another example of a current initiative to standardise data and analysis is Blueprint Two, an initiative by Lloyd’s of London which seeks to bridge the gap between data systems and operations and to provide a single, common data platform and approach for brokers, insurance companies and Syndicates.


The insurance ‘lingua franca’

Blueprint Two, using the ACORD data standard, could well turn out to be the insurance industry’s ‘lingua franca’ of the future. Key players in the world of insurance are busy working on the platform to standardise and centralise databases and data analysis, so that insurance brokers and companies can collate and analyse insurance data on one common platform.

As James Livett of LIIBA discusses, standardised data allows for a single entry of information, such as something as common and as simple as a contract’s inception date, rather than multiple entries by brokers, underwriters and claims teams.

This significantly reduces the chances of errors and streamlines processes, saving valuable time across tens of thousands of policies. In addition, having data in a consistent format enables more effective analysis and review of business trends and exposures, as disparate databases currently hinder comprehensive analysis.
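The “enter once, use everywhere” idea can be sketched as a single shared record read by every party instead of re-keyed by each. The class and field names below are invented for illustration and are not an ACORD schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class PlacementRecord:
    """One canonical record: the inception date is captured exactly once."""
    contract_ref: str
    inception_date: date
    insured_name: str

def broker_view(rec: PlacementRecord) -> str:
    # The broker reads the same field everyone else does
    return f"{rec.contract_ref}: incepts {rec.inception_date.isoformat()}"

def claims_view(rec: PlacementRecord) -> str:
    # No second keying of the date: a claim simply references the record
    return f"Claim against {rec.contract_ref} (cover from {rec.inception_date})"

rec = PlacementRecord("UMR-B0999-X1", date(2024, 1, 1), "Example Marine Ltd")
print(broker_view(rec))
print(claims_view(rec))
```

Because the record is frozen and shared, a transcription error in one team’s copy of the inception date simply cannot arise, which is the error-reduction Livett points to.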

Livett emphasises that data standards must be seamlessly integrated into operations, much like the interoperability between different mobile phone systems. Just as Android and iPhone communicate using common standards despite different software languages, insurance data standards should function invisibly to enhance and support industry practices. This integration is crucial for achieving the operational efficiencies and analytical capabilities that modernisation promises.


Driving the adoption of data standards in the London Market

The responsibility for driving the adoption and definition of data standards in the London Market doesn’t fall solely on brokers or underwriters but involves a collaborative effort across the entire ecosystem, including vendors, platforms and operations people, who all play crucial roles.

The London market’s unique advantage over other insurance markets lies in its centralised bodies such as the London Bureau – Velonetic – which handle a significant portion of London transactions (over 80%, according to James Livett).

Therefore, if these major players in the London market are encouraged to adopt standardised data and platforms, conforming to a common set of data standards and practices, brokers and underwriters on both sides will begin to come together and work in the same direction to centralise and unify their data practices.

Encouraging the key players in the insurance industry (like the London Market) to onboard and adopt these new processes and data standards will encourage others to follow suit. It is the responsibility of those committed to progressing digital solutions to let the wider community know about the benefits of standardised data and platforms to encourage a more widespread adoption for the good of the industry as a whole.

Governance and committee collaboration are key, with passionate industry participants leading the charge to ensure effective implementation and modernisation.


The role of the MRC and CDR committees

The Market Reform Contract (MRC) and Central Data Repository (CDR) committees play a pivotal role in standardising data practices within the London Market.

The MRC committee, which is owned and overseen by the London Market Group, has been actively addressing a backlog of issues and queries since its formation.

Meeting monthly, the committee tackles a variety of topics relating to insurance data standards and handling, from minor details to significant structural questions, ensuring that client-facing documents are precise and comprehensive. This process involves extensive collaboration and discussion, reflecting the complex nature of these standards.

Simultaneously, the CDR committee focuses on the broader aspects of data standards, particularly those affecting premium and claims transactions.

Despite their different functions, both committees are crucial in shaping a unified approach to data management in the market, and encouraging market participation through platforms that allow stakeholders to submit questions and feedback, resulting in a more transparent standardisation process.

The committees regularly interact to resolve overlapping issues, aiming to streamline processes and enhance operational efficiency by more closely meeting the market needs and dealing with practical challenges effectively.


Data standards implementation – success stories from outside the London Market

One of the most notable examples of successful data standard implementation outside the London Market is the ACORD Ruschlikon Initiative, which has been in operation for over two decades. It has garnered significant traction among major reinsurance brokers and underwriters across Europe.

Industry experts like Simon Squires at AXA XL, Emma Ford at Liberty, Tim Pledger at Swiss Re, Richard Brame at WTW, and Terry Calthorpe at Gallagher Re have all attested to the initiative’s success, highlighting the substantial operational benefits and efficiency gains it provides. The Ruschlikon model showcases how effective data standardisation can streamline processes and improve overall market functionality.

In contrast, the London Market’s earlier attempt at similar standardisation through the eAccounts system did not achieve the same success. Although it was adopted by a few brokers who reported impressive results – such as a 90-95% cash match rate and significantly faster processes – the initiative ultimately foundered. Those early results nevertheless demonstrated that while new systems may seem daunting, their potential benefits in terms of operational efficiency and error reduction make them well worth the effort.

It is clear that while there was much room for improvement from the London Market’s data standardisation initiative, by looking outside of the London Market to other success stories we can begin to produce a roadmap for implementing data standards that will be effective and long-lasting.


The risks of not implementing effective data standards

The risks of not implementing data standards are significant and multifaceted. Firms that fail to adopt these standards will find themselves stuck in inefficient manual processes, producing PDF documents rather than leveraging automated systems.

This not only increases the likelihood of errors but also results in higher operational costs. Smaller firms, which may be reluctant to invest in new technology, might inadvertently adopt these standards through indirect usage. However, they will still face challenges if they cling to outdated, manual methods.


Mitigating the risks

To mitigate these risks, the London Market is likely to enforce a more uniform adoption of data standards. The presence of central platforms and the bureau will drive this change, reducing the chance of non-compliance.

Within the next five years, it’s anticipated that the market will phase out manual processes entirely, making it mandatory for practitioners to interact with central services using standardised data formats. This transition will be facilitated by ensuring that even portal interfaces conform to these standards, thereby easing the adoption process and minimising disruption.


Summary

The implementation of new data standards, both within the London Market and in the wider insurance industry as a whole, is not just an aspiration but a necessity. The benefits are clear:

  • enhanced operational efficiency,
  • reduced errors, and
  • the ability to conduct comprehensive data analysis.

The collaborative efforts of brokers, underwriters, vendors and central bodies are pivotal in driving this transformation. Learning from successful models like Ruschlikon, the London Market can avoid the pitfalls of the past and embrace a future where data standards streamline processes and foster innovation.

The journey towards full adoption may have its challenges, but the end goal is a more efficient, accurate and competitive market. As the industry progresses, these standards will become an integral part of daily operations, benefiting all stakeholders involved. Embracing these changes now ensures that the London Market remains at the forefront of the global insurance industry, prepared to meet the demands of an increasingly digital world.

At Future Processing, we have over two decades of experience in data harmonisation, implementing digital solutions, consulting, AI and cloud services. We are your ideal partner to bring to market your digital initiatives, as we work closely with you to identify and bring your goals to life. If you are interested in learning more about Future Processing’s industry-leading digital services, contact us today and we will help find the best solution for your unique business.

London Market data standardisation – essential insights by Cassandra Vukorep

Future Processing is working with market participants to deepen knowledge of data transformation in the London Market and translate the standards and requirements into practical implementation steps.

With that objective in mind, we were privileged to host Cassandra Vukorep, the Chief Data Officer for Lloyd’s of London, at our recent InsurTalk event – InsurTalk: Navigating Data Standards at the London Market.

At Future Processing, we serve as a trusted technology consultancy and delivery partner, drawing from 25 years of experience in Specialty, Marine and other lines of the insurance industry, as well as profound comprehension of the London Market.

“(…) the London Market (…) is an ecosystem, where everybody speaks different languages, with different dialects in the way that they exchange data,” said Cassandra.

In her speech, Cassandra, who also chairs the Core Data Record (CDR) committee of the London Market Group’s (LMG) Data Council, addressed the critical need for establishing data standards in the industry, sharing real-life examples and success stories.

Interested in learning more? Dive into our article to find out about Cassandra’s crucial views on the London Market data challenges and the opportunities coming from creating a common framework for data management and communication.


Key takeaways

  • The London Market data challenges create inefficiencies and inaccuracies that hinder the market’s overall growth. Cassandra described the most prominent ones: legacy data systems, fragmented data, manual data capture, and compliance issues.
  • The benefits of the CDR implementation include using richer and wider, consistent sets of data, delineating the data, maintaining its integrity, and enabling efficient data management across the market.
  • Proper foundations and data literacy are essential for the CDR and Market Reform Contract (MRC) to encourage progress – they won’t solve all your data issues at once, but they will move your business forward, improving data accuracy by establishing a standard within the market.
  • Examples of real-life success stories from the Ruschlikon initiative include PartnerRe improving claims turnaround time by 30% thanks to the ACORD GRLC data standards and AXA XL reducing information transfer times by 60 days by adopting the EBOT Trading Model.
  • Future Processing’s Data Harmonisation Framework is designed to facilitate data standardisation in accordance with the latest data governance principles. Our robust solution enables Specialty Insurance to take advantage of all the benefits discussed during the InsurTalk event while minimising transformational risks.


What are the data challenges in the London Market?

Cassandra began her engaging and insightful presentation by identifying four primary challenges in the current data landscape:

  • Legacy data systems: the existing systems are outdated and often patched together with makeshift solutions, leading to inconsistencies and errors.
  • Fragmented data: data is scattered and unstructured, making it difficult to consolidate and analyse effectively.
  • Manual data capture: reliance on manual data entry introduces human errors and slows down processes.
  • Compliance challenges: differing interpretations and standards which can lead to compliance issues and regulatory risks if not addressed.


InsurTalk – The London Market data challenge. Copyright: Cassandra Vukorep


These challenges create inefficiencies and inaccuracies that hinder the market’s ability to innovate and grow.


Data landscape – what does the future hold?

Cassandra moved on to discuss the future data landscape and the establishment of data standards.

This standardisation is crucial for achieving operational efficiency and scalability on the London Market, because “we have to all interact, and if we’re not interacting seamlessly, all it does is create speed bumps along the way. It slows down the process.”

She emphasised that implementing the standards would ensure data completeness, include the required granularity, enable interoperability, and improve efficiency.

It would also streamline data flows for brokers who:

“just want to go out and sell. They want to do the best for their clients. They want to bring analytics to their clients. When they place business, and when there is a claim, they want the claim to be paid out, and quickly. It’s a whole bunch of parts that work together. The standard needs to be there and improve efficiency of data”, concluded Cassandra.


Real-life stories – the Ruschlikon initiative’s success

To illustrate the advantages of standardisation and the collaboration with ACORD, Cassandra presented “(…) really great case studies with Ruschlikon, which prove that the data standards really work and bring many benefits to the table”.

The Ruschlikon initiative has successfully implemented the EBOT and ECOT ACORD Standard specifications:

“with AXA, (…) Swiss Re, and they know how to actually get these benefits – those proof points are there. This is a standard. This isn’t about technology, but about the language that you use within your company. It’s through the mapping. If you know what your natural language is, you can map it to an ACORD standard.”

The case studies mentioned demonstrate significant improvements in efficiency, data integrity, cost reductions, and overall effectiveness.
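The mapping Cassandra describes, from each firm’s “natural language” to a shared standard, can be sketched as a translation table per firm. The canonical names below are invented for the example; they are not actual ACORD GRLC element names.

```python
# Shared vocabulary every participant maps into (illustrative names only)
CANONICAL = {"premium_amount", "currency", "period_start"}

# Each firm keeps its own internal field names and publishes a mapping
BROKER_A_MAP = {"prem": "premium_amount", "ccy": "currency",
                "from": "period_start"}
CARRIER_B_MAP = {"GrossPremium": "premium_amount", "Cur": "currency",
                 "InceptionDate": "period_start"}

def to_canonical(record: dict, mapping: dict) -> dict:
    """Translate a firm's internal record into the shared vocabulary,
    failing loudly if any required canonical field is missing."""
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    missing = CANONICAL - out.keys()
    if missing:
        raise ValueError(f"record incomplete, missing: {sorted(missing)}")
    return out

a = to_canonical({"prem": 1000, "ccy": "GBP", "from": "2024-01-01"},
                 BROKER_A_MAP)
b = to_canonical({"GrossPremium": 1000, "Cur": "GBP",
                  "InceptionDate": "2024-01-01"}, CARRIER_B_MAP)
print(a == b)  # True: both firms now "speak" the same language
```

The point of the sketch is that neither firm changes its internal systems; agreement on the target vocabulary plus a per-firm mapping is enough for their data to become directly comparable.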


InsurTalk – Case studies – the Ruschlikon initiative. Copyright: Cassandra Vukorep


The Core Data Record (CDR) – why is it vital?

Cassandra Vukorep highlighted the Core Data Record (CDR) as a critical component of Lloyd’s data strategy.

Aligned with the ACORD Global Reinsurance & Large Commercial (GRLC) Standards, the CDR serves multiple purposes, including:

  • Lloyd’s tax validation and calculation,
  • Lloyd’s regulatory reporting,
  • claims matching,
  • accounting and settlement.

“What are the benefits of the CDR?” asked Cassandra.

“Richer, wider set of data, consistent structure. It is transactional data, but it’s transactional data that you can use”, she explained. These standards also facilitate better data governance and lineage, reducing errors and increasing the accuracy of market operations.


InsurTalk – Benefits of CDR and data standards. Copyright: Cassandra Vukorep


Cassandra emphasised the importance of data literacy within organisations, advocating for robust data ownership and stewardship programmes. These programmes help ensure that data is understood, trusted, and used effectively across all levels of an organisation.

“(…) it is [also] really great for the broker because (…) brokerages are moving to be more like consultants. You’re a consultant, you’re an advisor for your clients, you’re telling them the right way to go. Clean, good, accurate data gives you more analytics to add more value to your clients (…), to be able to sell more, to cut out your competition, to really drive value for that client and give them a good experience, is having good, solid data”, highlighted Cassandra.


Debunking the CDR-related myths

Approaching the end of the presentation, Cassandra addressed several myths surrounding the CDR. She clarified that:

“the CDR will not solve all of your data issues. It is not a singing, dancing holy grail. It is part of a piece of a puzzle to move people forward and to actually think about adopting something for the ecosystem. But as discussed before, you need proper foundations. (…) you need data literacy.”

The CDR also does not imply straight-through processing but helps automate processes and improve data accuracy by establishing a standard within the market.


InsurTalk – CDR myths. Copyright: Cassandra Vukorep


“The CDR will give us completeness. With completeness, you have analytics, and with analytics, you have great growth”, concluded Cassandra.


Assisting the insurance sector in digital transformation

Future Processing’s InsurTalk event was an afternoon full of insightful discussions and presentations from London Market thought leaders. As Blueprint Two’s Phase 1 cutover draws near, navigating data standards is more important than ever.

Having worked with Lloyd’s of London’s syndicates, insurance companies, and global insurers, we possess a deep understanding of the London Market. Our Data Harmonisation Service ensures data reconciliation by implementing quality assurance standards and constant monitoring throughout the project lifecycle.

As your trusted partners, we align with your business objectives, focusing on strategic agility and stakeholder-oriented design. We also excel in integrating large amounts of data, enabling its fast processing in real time.

If you need support in achieving your digital transformation while standardising data, contact us – we will choose the best options for your unique organisation to elevate your business.

]]>
https://www.future-processing.com/blog/london-market-data-standardisation-by-cassandra-vukorep/feed/ 0
Transforming data into success: the harmonisation advantage – insights from the panel discussion https://www.future-processing.com/blog/transforming-data-into-success-data-harmonisation/ https://www.future-processing.com/blog/transforming-data-into-success-data-harmonisation/#respond Thu, 11 Jul 2024 08:09:41 +0000 https://stage-fp.webenv.pl/blog/?p=30296 Future Processing, the specialist IT firm in the London Market, organised a panel of industry practitioners to translate requirements and standards into practical implementation steps that firms might want to consider when embarking on a data project. 

The session, “Transforming Data into Success: The Harmonisation Advantage,” featured industry experts Sharon Stanley, Managing Director at GPM Development Limited, Lewis Gibbons, Operations Director – Business Transformation at Howden Insurance Brokers Limited, and Clarissa Montecillo, Head of Global Reinsurance and Large Commercial at ACORD, with Piotr Piękoś, Head of Insurance at Future Processing, serving as the moderator. In addition to responding to questions from the moderator, the experts also referred to the results of a survey that event participants completed during the session. 

With over two decades of experience in the Specialty, Marine, and other lines of the insurance industry, Future Processing effectively addresses complex data challenges.

Our successful data harmonisation implementations have accelerated Go-To-Market (GTM) initiatives, facilitated early revenue collection and, in one client example, boosted revenue from £1M to £5M while maintaining a strong divisional Combined Ratio below 90%.

This extensive expertise in consulting, data migration, AI, and cloud services makes us the ideal choice to convene such panels.


Key takeaways

  • Successful data harmonisation starts with a comprehensive understanding and governance of data. Engaging business stakeholders ensures the project aligns with business goals, while clear communication between IT and business teams helps avoid misunderstandings.
  • Seamless integration requires clear data definitions and formats. Focusing on data quality, standardisation, accessibility, and interoperability will reduce manual work, ensure data is well-organised, and enable informed decision-making.
  • Before digitising, it is crucial to understand your current data and processes. Clearly defining business needs and involving stakeholders throughout the process is essential. Ongoing governance and management are key for lasting success.
  • Companies that do not adopt data harmonisation will struggle with competitiveness and attracting talent. Regulatory pressures will push towards data-led modernisation. Proactive companies will gain a significant competitive advantage.

Data harmonisation is a process of optimising your company’s data structures, in line with business and external requirements, to deliver tangible business value.


Understanding the challenges of data harmonisation

The discussion kicked off with a focus on the importance of understanding data and its flow within organisations. Clarissa Montecillo emphasised the need for comprehensive discovery and data governance.

She pointed out that many digital transformation programmes fail because they do not spend enough time understanding their data and its lifecycle within the organisation. “Not enough time is spent in the discovery phase,” she said, “before jumping into target models and objectives. It’s crucial to understand your current data estate to build an effective roadmap for transformation.”

IT Insights InsurTalk – What are the key success factors during the data harmonisation process

Lewis Gibbons echoed this sentiment, stressing the importance of business-focused data initiatives. “When we look at data standardisation and harmonisation, it can be quite siloed. Bringing that business-centric focus into the initiatives is crucial,” he noted.

Gibbons shared an experience where lack of business engagement led to poor quality outputs in a data sanitisation project, underscoring the importance of involving business stakeholders throughout the process. “The people who were working with the data didn’t really understand the business case. They weren’t invested in that piece of work. Bringing business and technical stakeholders together is essential to ensure that data initiatives are aligned with business needs and objectives,” he stated.

Sharon Stanley added that effective communication and a common vocabulary are vital for successful data standardisation.

She noted that there is often a lack of integration between IT and business functions, which hinders the harmonisation process. “Understanding the language different stakeholders use and ensuring they see the value in their input is key to bridging this gap,” she said.


Envisioning the ideal state of data structures

When asked about the ideal state of data structures, the panellists provided a clear vision. Clarissa Montecillo emphasised the need for a cohesive understanding of data definitions, formats, and structures across platforms to enable smooth integration and business benefits. “An ideal state is where there’s a common understanding of data and its structures, allowing seamless integration across systems,” she explained.

Sharon Stanley envisioned a scenario where data flows smoothly through the value chain with minimal manual intervention. “Utopia is a single entry of data that moves through the cycle, augmented as needed, and passes through a gateway that translates it into a common standard,” she said. She recognised the difficulties posed by legacy data but stressed the importance of starting with new, well-organised data.

Lewis Gibbons outlined four key areas for an ideal data state: quality, standardisation, accessibility, and interoperability. He highlighted the importance of data quality and standardisation in driving reliable insights and operational efficiency. “Quality data is crucial for trusted outputs that can drive business insights,” he noted. “Standardisation ensures that data is consistently understood and used across the organisation.”


First steps towards data harmonisation

The panellists also shared practical advice for organisations that are just starting out on their data harmonisation journey. Sharon Stanley recommended focusing on understanding the data and processes before digitising them.

“Understand the data, understand the process, and know whether you want to improve that process before just repeating and digitising a process that’s not useful,” she advised. She also emphasised the importance of communicating the value of data projects to all stakeholders to ensure buy-in and support.

Clarissa Montecillo reiterated the need to spend time understanding the current data estate and the processes surrounding it. She was surprised by the survey results that prioritised cost reduction, arguing that the true value of standard data lies in its ability to drive informed decisions.

IT Insights InsurTalk – How does your organisation benefit from harmonising data

Lewis Gibbons highlighted the importance of clearly defining business requirements and keeping stakeholders engaged throughout the process. “The first step is understanding from the business what their requirements are and being very clear on that,” he advised.

“Keeping that stakeholder engaged throughout the process and supporting them is essential.” He also emphasised the role of ongoing governance and management in ensuring the sustained success of data harmonisation initiatives.


Future implications for market players

The panel concluded with a discussion on the future implications for market players who do not catch up with data harmonisation. The panellists agreed that organisations that fail to embrace data harmonisation will face increasing difficulties in remaining competitive.

Sharon Stanley noted that such organisations would struggle to attract talent and retain their market position. “They will become less attractive places to work and may lose out competitively,” she warned.

Clarissa Montecillo and Lewis Gibbons agreed, adding that while these organisations might not disappear, they will find it harder to perform their jobs efficiently. Gibbons pointed out that regulatory pressures, such as those from the Financial Conduct Authority (FCA), will push organisations toward data-led supervision and modernisation.

He also pointed out the significant opportunities available for proactive organisations to leverage data for competitive advantage. “There’s a huge opportunity for us to do it without having to be forced. We can leverage this demand proactively,” he concluded.


Conclusion: the road ahead

The insights shared during the InsurTalk discussion panel highlight the critical role of data harmonisation in the insurance industry. It is not just about operational efficiency but also about enabling better decision-making and strategic insights.

The experts stressed the need for a collaborative approach, involving both business and technical stakeholders, to achieve successful data harmonisation. By understanding and leveraging their data effectively, organisations can drive significant benefits and maintain their competitive edge in the evolving insurance landscape.


Future Processing is your ideal partner

For those in the insurance industry, the message is clear: the harmonisation of data is not just a technical requirement but a strategic imperative for future success. Future Processing is your ideal partner for achieving these goals. Our expertise can help you navigate the complexities of data harmonisation and ensure your organisation remains competitive and efficient.

Want to learn more about how data harmonisation can benefit your organisation? Visit our website or contact our experts to find out how we can transform your business.

]]>
https://www.future-processing.com/blog/transforming-data-into-success-data-harmonisation/feed/ 0
Navigating data standards in the London Market: essential insights by Mark Bennett https://www.future-processing.com/blog/navigating-data-standards-in-the-london-market/ https://www.future-processing.com/blog/navigating-data-standards-in-the-london-market/#respond Thu, 04 Jul 2024 06:51:32 +0000 https://stage-fp.webenv.pl/blog/?p=30228 “Everyone through the lifecycle of an insurance transaction values from getting data in a consistent fashion rather than the wild, wild west of different formats (…) ultimately needing transformations all the way through.”, said Mark Bennett, Senior Vice President of Global Business Development at ACORD and ACORD Solutions Group.

In his speech during Future Processing’s InsurTalk event, centered on the theme of “Operationalising Standards and the Changing Patterns of Data Exchange”, Mark shed light on the importance and implementation of data standards in the insurance industry.

It is surprising to note that over 50 years ago, data standards in the insurance industry took the form of printed documents – including 800 forms for certificates of insurance and proposal forms – which have since evolved into modern JSON digital standards that facilitate today’s data exchange.


Key takeaways

  • Standards are crucial for consistent data handling across the entire insurance transaction lifecycle, reducing the need for constant transformations.
  • Adopting standardised data practices leads to substantial improvements in efficiency, data quality and operational processes.
  • Embracing standards is essential for leveraging new technologies like AI, which transform data exchange processes across the insurance industry.
  • Future Processing’s Data Harmonisation Framework is designed to facilitate data standardisation in accordance with the latest data governance principles. This robust solution enables Specialty Insurance to take advantage of all the benefits discussed during the InsurTalk event while minimising transformational risks.


The crucial role of data standards in driving efficiency and innovation in the insurance sector

In the insurance industry, data standards are invaluable to every participant along the value chain, from the originating producer to the final stages of reinsurance, permeating through various organisational functions such as marketing, underwriting, policy administration, claims, customer management, and enterprise management.

Without standards, there is a wide variation in operations and numerous transformations required, which can lead to inefficiencies and inconsistencies.

InsurTalk – why do standards matter to the insurance market? Copyright: Mark Bennett

Mark pointed out: “We’re all seeing an exponential growth in data, and for underwriters (…), the challenge is making the most out of this data and bringing it together.”

Standards help streamline this process, reducing the need for extensive data augmentation. Moreover, as the industry evolves with advancements like generative AI, the way data is exchanged continues to transform.

Mark pointed out that ACORD’s role in managing data standards across the insurance value chain becomes increasingly pivotal, ensuring that the industry can adapt to and leverage these new technologies effectively.

He highlighted several key advantages, like:

  • up to 50% improvement in turnaround time by adopting digital messaging standards for organisations,
  • a nearly 70% improvement in cash flow by accurately allocating incoming funds,
  • over 80% increase in data quality by eliminating manual data entry,
  • query times reduced by more than 70%.

These tangible benefits were demonstrated by the examples of companies such as SCOR, Swiss Re, Aon, and WTW, showcasing the practical advantages of adopting data standards.

InsurTalk – benefits of data standards in insurance. Copyright: Mark Bennett

This further proves how crucial implementation of data standards and data harmonisation are and highlights the importance of finding a strategic tech partner to achieve that goal. At Future Processing, we serve as your trusted technology and consultancy partner, and digital transformation of the insurance sector is our shared objective.


Overcoming challenges in adopting data standards

Mark addressed some common challenges faced by organisations in adopting data standards, particularly system integration and budget constraints. He explained how ACORD and its partners are tackling these issues through initiatives like ADEPT, “a messaging gateway […], a portal and an API enabling sender and receiver patterns on the ACORD standards,” and Ruschlikon hubs, which have been implemented in regions including Italy, Spain, Bermuda, the Middle East, and Singapore.

“These initiatives help organisations start with ACORD messaging, supported by system vendors already pre-configured for these standards,” he added.

The Ruschlikon initiative was also mentioned by Cassandra Vukorep, the Chief Data Officer for Lloyd’s of London and a keynote speaker at the event. She mentioned that the initiative has successfully implemented the EBOT and ECOT ACORD Standard specifications “with AXA, (…) Swiss Re, and they know how to actually get these benefits – those proof points are there. This is a standard. This isn’t about technology, but about the language that you use within your company. It’s through the mapping. If you know what your natural language is, you can map it to an ACORD standard.”


Looking into the future

With upcoming advancements in mind, Mark explored the integration of emerging technologies such as AI with data standards. He highlighted the importance of preparing the industry for these technological changes and the potential benefits of leveraging AI for more efficient data exchange processes.


Practical steps to start your data standardisation efforts

For organisations looking to start their journey towards data standardisation, Mark offered actionable advice. He recommended “mapping out current trading partners, assessing their readiness for ACORD standards, and prioritising implementation efforts.” He also stressed the importance of learning from peers who have already adopted these standards to avoid common pitfalls.

Mark encouraged organisations to engage with their software vendors, many of whom have already integrated ACORD standards into their systems. As an ACORD member, Future Processing helps clients succeed in their digital transformation efforts. Our Data Harmonisation Framework optimises organisations’ data structures, aligning them with business and external requirements to deliver tangible business value.

InsurTalk – steps to consider before applying data standards. Copyright: Mark Bennett


Conclusion

All things considered, adopting data standards is not just a technological upgrade but a strategic move that can transform the insurance industry, ensuring consistency, accuracy, and efficiency across the entire value chain.

Are you interested in learning more about data standards for your company? Future Processing’s Data Harmonisation Framework is a robust solution designed to facilitate data standardisation in accordance with your stringent data governance policies. Feel free to contact us to discuss the possibilities for your unique business.

]]>
https://www.future-processing.com/blog/navigating-data-standards-in-the-london-market/feed/ 0
Digital underwriting through the lens of MGAs https://www.future-processing.com/blog/digital-underwriting-mgas/ https://www.future-processing.com/blog/digital-underwriting-mgas/#respond Tue, 07 May 2024 10:18:18 +0000 https://stage-fp.webenv.pl/blog/?p=29225 The transformation, fuelled by advancements in artificial intelligence (AI), machine learning (ML) and data analytics, promises to redefine the traditional paradigms of risk assessment and policy pricing.

The shift towards digital underwriting is not merely a trend but a strategic imperative, aiming to:

  • drive efficiency,
  • enhance decision-making accuracy
  • and elevate the customer experience to new heights.

The latest episode of the IT Insights InsurTalk series by Future Processing sheds light on this transformative journey, featuring an in-depth conversation with Michael Keating, CEO of the Managing General Agents’ Association. With a career spanning over four decades in the insurance industry, Keating brings an experienced perspective on the digitalisation of underwriting processes, having worked in a variety of Managing Director roles at multiple insurance companies.

In this article, we delve into some of the most important themes discussed in our conversation, such as the growing importance of the MGA market sector, the evolution of underwriting processes and the vital role of human interaction in the age of automation.

Through Michael’s lens, we explore the challenges and opportunities presented by digital transformation in insurance underwriting, providing a comprehensive overview of the current landscape and envisioning the potential that lies ahead.


The importance of the MGA market sector within the London Market

The MGA market sector has emerged as the fastest-growing segment within the UK insurance market, largely driven by innovation, entrepreneurial spirit, the industry’s underwriting expertise and insurers’ subsequent confidence in allocating capital to specialised underwriters.

Within the London Market, all of these attributes are firmly embedded, with international MGAs at their heart. There are also UK-domiciled MGAs within the London Market; Lloyd’s alone boasts over 600 coverholders. Not all of these are MGAs, but a large proportion of them are.

The London Market’s blend of international and domestic entities underscores the sector’s integral role in the insurance ecosystem and makes up its very fabric. Thanks to this ever-evolving underwriting expertise and these specialisms, the London Market enjoys a position of diversified and specialised insurance underwriting that is simply not mirrored anywhere else on the globe.


The evolution of underwriting processes for MGAs

The journey from traditional, paper-based underwriting to digital processes marks a significant evolution in the MGA sector.

Long gone are the ‘quill and ink pen’ days of underwriting, since replaced with technology as the cornerstone of the insurance industry. This technology:

  • helps to streamline transactions,
  • enhances communication,
  • and ultimately leads to improved customer satisfaction.

An important aspect of this technological transition in MGAs’ underwriting processes is the move towards electronic bordereaux (reports from an insurance company to its reinsurer, listing either the assets covered or the actual claims that are paid).

While electronic bordereaux have not yet been adopted by the entire insurance sector, with some aspects of the business still being very much paper-based, they hold a significant presence in the market, which is only likely to increase.
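
To make the contrast with paper concrete, an electronic bordereau is simply structured, machine-readable tabular data. A minimal sketch with invented column names (real bordereaux follow the format agreed between the coverholder and its insurer or reinsurer):

```python
import csv
import io

# Invented columns and values, purely for illustration of the idea.
raw = """claim_ref,policy_ref,paid_amount,currency
CLM-001,POL-123,15000,GBP
CLM-002,POL-456,2750,GBP
"""

# Unlike a paper report, each row can be parsed and aggregated directly.
rows = list(csv.DictReader(io.StringIO(raw)))
total_paid = sum(float(r["paid_amount"]) for r in rows)
print(f"{len(rows)} claims, total paid {total_paid:,.0f} GBP")
# 2 claims, total paid 17,750 GBP
```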

Technology has significantly impacted the insurance underwriting world in two key ways:

  1. it has removed frictional costs (the total direct and indirect costs associated with the execution of a financial transaction)
  2. and increased communication channels between all stakeholders (provided they all have access to the same technology).

Technology ultimately enables solutions that increase customer satisfaction, most importantly when it comes to the speed and overall experience of going through the claims process, both inside and outside of the London Market.


The crucial role of ‘human touch’ in claims processing

While the technological advancements we’ve enjoyed in the insurance industry have undoubtedly brought with them numerous benefits, the human element remains crucial in claims processing.

Michael Keating mentioned that through his own deep dive into consumer intelligence, despite the advancements and widespread adoption of technologies such as AI, interactive ‘chat boxes’ and even ChatGPT, 8 out of 10 customers prefer to speak to a human when it comes to getting through triage successfully.

The message customers are delivering is clear – the industry’s push towards automation and AI tools must not overshadow the importance of human interaction.

Despite the undoubted need for the insurance sector to explore and adopt digital solutions, a balanced approach is required, one that incorporates technology without losing the personal touch.


AI’s role in improving the underwriting process

While not yet fully developed in the insurance underwriting world, AI is nonetheless poised to revolutionise the underwriting process.

It offers several advantages, including the ability to increase operational productivity and to reduce frictional costs. AI can triage broker presentations and strip out information that is not relevant or required to price and assess risks, saving huge amounts of time, increasing productivity and cutting costs.

However, adopting AI and bolstering its capabilities requires insurance companies to step back and have a clear business plan for how they intend to use it. Adopting AI for its own sake is not optimal – it must be based on a clear and well-thought-out strategy. AI must be a tool used to achieve strategic goals, not the end goal in and of itself. AI doesn’t, and shouldn’t, replace the human element in insurance underwriting, where a human touch will always (hopefully!) be required.


Digital underwriting transformation from the MGA perspective

Digital transformation in underwriting goes beyond technology adoption; it requires a synergy between MGAs and their capacity providers.

This partnership should be founded on shared data and goals for all parties, ensuring seamless communication and a unified approach to risk assessment, pricing, claims management and every other sub-section of the insurance industry.

When adopting digital processes and technologies, insurance companies must gather all stakeholders ‘around the table’, so to speak, to review, discuss and share their vision and to ensure that they are all on the same page.

A recurring theme in the digitalisation of insurance underwriting is that of achieving a ‘single set of data’.

Insurance companies tend to use different systems, technologies and communication portals, meaning that data is stored in various locations and is hard to access and analyse. To fully embrace a widespread digital solution, collating all data into one common set is the key to digitising effectively and efficiently.

Poor data, or a lack of it, can be catastrophic for insurance companies, so the richness of data, and the alignment between how that data is imported and how it matches premiums, is really important for analysing it and arriving at efficient solutions.
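
The “single set of data” idea can be sketched as consolidating records from separate systems into one common structure before analysis. The systems, field names and values below are invented for illustration:

```python
# Invented example: two hypothetical source systems hold the same policy
# under different field names; consolidation brings them into one set.
policy_admin = [{"ref": "POL-123", "premium_gbp": 12000}]
claims_system = [{"policy": "POL-123", "claims_paid_gbp": 4500}]

combined = {}
for row in policy_admin:
    combined.setdefault(row["ref"], {})["premium"] = row["premium_gbp"]
for row in claims_system:
    combined.setdefault(row["policy"], {})["claims_paid"] = row["claims_paid_gbp"]

# With one common set, analysis such as a simple loss ratio becomes trivial.
for ref, r in combined.items():
    print(ref, r["claims_paid"] / r["premium"])  # POL-123 0.375
```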

Communication is key – communication between companies and their stakeholders, between the London Market and between the wider insurance industry as a whole.


The art of possible: the utopia of digital underwriting?

The ideal state of digital underwriting marries technological efficiency with human insight. While digital tools and solutions will (and already are) undoubtedly change the landscape of insurance forever, it is important not to run before we can walk. A human touch is most certainly required, now more than ever, but this shouldn’t be cause for pessimism when it comes to adopting new digital technologies and solutions.

This ‘utopia’ will likely centre around a central electronic data exchange where risk appetites, risk assessments, pricing, operational efficiency and data analysis take place openly and seamlessly.

Through this, insurance companies could expect to enjoy profitable growth, groundbreaking developments through innovation, and exponentially improved efficiency and cost savings.

Technology, AI and machines should be used as tools to bolster insurance underwriters’ risk and data analysis tasks, and not be the end goal in and of themselves. This vision emphasises the importance of a holistic approach where technology enhances human capabilities, ensuring that the insurance industry not only adapts to the digital age but thrives in it.


Summary

As the insurance industry continues to evolve, the balance between technological innovation and the human touch will define its future.

By embracing digitalisation while preserving the essence of personal interaction, the industry can navigate the challenges ahead, offering enhanced services that meet the evolving needs of consumers.

The insights shared by Michael Keating offer a roadmap for this journey, highlighting the potential for growth, efficiency, and customer satisfaction in the era of digital insurance.

If you would like to watch our full interview with MGAA CEO Michael Keating, follow this link to head over to our IT Insights Hub where you can enjoy our gripping conversation in its entirety.

]]>
https://www.future-processing.com/blog/digital-underwriting-mgas/feed/ 0
The digitisation of underwriting https://www.future-processing.com/blog/the-digitisation-of-underwriting/ https://www.future-processing.com/blog/the-digitisation-of-underwriting/#respond Thu, 28 Mar 2024 09:40:11 +0000 https://stage-fp.webenv.pl/blog/?p=28968 Underwriting in insurance is a critical process that assesses risk and determines the premiums for insurable interests.

It’s a foundational pillar of the insurance industry, balancing the scales between profitability and risk management. With the digital age firmly upon us, the need for digitisation in underwriting has never been more pronounced.

Recent developments in artificial intelligence (AI), machine learning (ML) and big data analytics are paving the way for more streamlined and automated underwriting processes.

This is supported by predictions of strong growth in the sector, with Deloitte’s 2024 Global Insurance Outlook forecasting that insurance underwriting’s 2022 market size of $81.5 billion will increase to $130.1 billion by 2027 – a compound annual growth rate of more than 9.6%.
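
The growth figure above can be sanity-checked with the standard compound annual growth rate formula; a quick sketch using the Deloitte numbers cited:

```python
# CAGR = (end / start) ** (1 / years) - 1, with values in $bn
start, end, years = 81.5, 130.1, 5  # 2022 -> 2027

cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # roughly 9.8%, consistent with "more than 9.6%"
```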

This transformation is not just a matter of improving speed but also about enhancing the quality of risk assessment and decision-making.

In Future Processing’s latest instalment of our IT Insights InsurTalk series, we met with Dave Connors, founder and CEO of distriBind, a digital data exchange from London, UK, that is part of the Lloyd’s Lab and delivers automated back-office processing solutions for every stage of the insurance lifecycle.

Dave began his insurance career by exploring a range of different departments, including working as an insurance technician and taking up roles in claims and underwriting. Dave has spent a lot of time working in delegated authority in his career, experiencing the ins and outs of risks, claims augmentations and reinsurance.

Having gained a full overview of both the successes and shortfalls of other insurance companies, Dave founded distriBind in 2018 and has navigated it to great success ever since.

In this article, we explore the insights gained in our discussion, including the significance of delegated authority and the critical role of data in the modern insurance market, as well as offering insights into the current state and future possibilities of underwriting digitisation.


The importance of delegated authority within the London Market

Delegated authority plays a pivotal role in the insurance market, particularly in London where it accounts for over 40% of the overall market gross premium.

This is not only true of London, with the global delegated authority premium reaching an incredible $100 billion in 2020. This delegated authority arrangement allows insurers to outsource underwriting authority to third parties, enabling a more diversified and widespread risk-taking capacity.

The global scale of delegated authority underscores its importance, with the market for programme business in the US alone now approaching $100 billion in 2024, according to Dave Connors.

This model’s effectiveness in distributing risk and capital demonstrates the critical role delegated authority plays in the global insurance landscape.

Dave also offered a useful counterpoint: a particular data standard cannot always be externalised beyond this market, which is why this specific area of delegated authority still needs to evolve, both in the London and wider global markets.


The insurance underwriting process – historic overview

Traditionally, the underwriting process, especially in the context of delegated authority, heavily relied on manual, spreadsheet-based methods.

Data was exchanged either monthly or quarterly, leading to delays and inefficiencies in risk assessment and policy issuance.

Typically, the MGA would sell insurance policies throughout the month and then spend a week or two of the following month compiling information about all the policies sold that month for the brokers, who would then pass it on to the insurers.

In addition, the claims would also be included, further complicating this highly manual system.

This approach presented significant challenges in data accuracy, visibility and timeliness, and made it particularly difficult for insurers to draw accurate connections between policies.

Nowadays, Dave explained, while there have been noteworthy advances, the process remains laborious and largely manual.

This slow pace of development further underscores the need for digitisation and automation in modern underwriting practices.


The importance of data standards in delegated authority

According to distriBind CEO Dave Connors, the push towards establishing common data standards in the insurance industry has been a double-edged sword.

Dave explained that while consistency within an organisation is beneficial, the external imposition of standards on bodies such as Lloyd’s of London can stifle innovation and flexibility, holding them to regulations and practices that may not always be optimal for their operations.

Quite simply, sometimes ‘a square peg doesn’t go in a round hole’, as the old adage goes. It is important for both insurance underwriting and delegated authority that the standards and practices applied to them are suitable for both the market and the individual company, and not simply applied because it worked well elsewhere.

The diversity of systems, processes and technological sophistication across the insurance value chain necessitates a more adaptable approach to data exchange. Recognising the need for flexibility rather than rigidity in data standards is crucial for fostering innovation and efficiency in underwriting processes.


The challenges in data ETL specific to delegated authority

The greatest challenge of unifying data standards is doing so in a manner that suits each company comprehensively.

Operations and requirements vary greatly between organisations, and whether they are smaller domestic brokers or larger brokers (such as those in the London market), the degree of manual, ‘paper-based’ operations still in place presents a significant barrier in a digitising world.

Insurance companies have now realised the importance of turning to digital solutions, but the issue they increasingly face is how to apply technology to create a suitable, effective product.

Data comes in so many forms and there is not currently an efficient way to collate all of this data into a simple and functional system. In an ideal world, brokers worldwide would have access to a type of real-time API solution that could compile the data in an easy-to-read interface, noting trends, capabilities and opportunities for insurance.

However, for this to happen, a series of groundbreaking solutions would be needed, not least a central digital platform that collates all the data in a single environment with an easy-to-analyse interface.

The wide range of technological capabilities among MGAs, brokers and insurers requires a flexible approach to data integration and processing. Addressing these challenges is essential for improving data quality, visibility and operational efficiency in the underwriting process.


Delegated authority in the context of Blueprint Two

Delegated authority is integral to the vision outlined in Blueprint Two, Lloyd’s of London’s ambitious strategy to deliver profound change in the Lloyd’s market through digitalisation.

Blueprint Two seeks to be the ultimate digital solution in the London insurance market through the creation of a one-stop digital platform for insurance brokers all over the world.

Through a number of key phases, the first beginning on 1st July 2024, it aims to become a comprehensive digital insurance platform that fully supports the digital placement of risk through digitisation and automation.

However, Dave Connors is much more cautious when it comes to celebrating the success of Blueprint Two ahead of time.

He mentions that in order to succeed, Blueprint Two will require a departure from traditional approaches to data standardisation and exchange.

To be ultimately successful, Dave believes that Lloyd’s will need to adopt a more open-minded and flexible strategy that accommodates the diverse ecosystem of the insurance market, as this is essential for realising the benefits of digitisation in underwriting and across the insurance value chain.


Looking towards the future and “the art of possible”

The future of underwriting in the insurance industry will likely centre around digital and real-time data exchange via APIs, albeit at a varied pace across different market participants.

Whether or not Lloyd’s of London’s Blueprint Two will be the focal point of this solution remains to be seen, but it is likely to play a pivotal role.

The transition to an all-encompassing digital solution will necessitate tools and processes that can accommodate a wide range of technological capabilities, ensuring that quality business is not turned away due to rigid tech standards.

The evolution towards a more digital, efficient, and flexible underwriting process is inevitable, driven by the need to better manage risk and meet the changing expectations of consumers and businesses alike.


Conclusion

The digitisation of underwriting is a journey that transcends mere technological adoption. It’s about reimagining the underwriting process to be more agile, accurate and aligned with the needs of a rapidly evolving market.

The insights gained from distriBind’s Dave Connors not only highlight the need for effective digital solutions in the insurance underwriting space but also underscore the challenges and opportunities that lie ahead.

As the insurance sector embraces digital transformation, the focus must remain on flexibility, innovation and the strategic use of data.

The path forward has its challenges, not least the question of whether a single or multi-faceted digital solution is the answer. But one aspect is clear: a fully digital solution is required and this is the direction in which the insurance industry must, and is, moving; the potential rewards for insurers, policyholders and the broader economy are substantial.

The digitisation of underwriting promises to redefine the insurance landscape, making it more responsive, efficient and capable of addressing the complexities of modern risk.

If you would like to watch our full interview with Dave Connors, CEO of distriBind, where we discuss the digitisation of underwriting and his thoughts on the strengths and challenges of making the digital transformation in the insurance underwriting space, please follow this link to visit our IT Insights Hub.

]]>
https://www.future-processing.com/blog/the-digitisation-of-underwriting/feed/ 0
How to save the underwriter from extinction? https://www.future-processing.com/blog/how-to-save-the-underwriter-from-extinction/ https://www.future-processing.com/blog/how-to-save-the-underwriter-from-extinction/#respond Thu, 15 Feb 2024 09:54:26 +0000 https://stage-fp.webenv.pl/blog/?p=28515 As part of our IT Insights InsurTalk series, Future Processing sat down with Artur Niemczewski, Non-Executive Director at the Chartered Insurance Institute, to discuss how AI is both benefitting and causing a few headaches for insurance leaders across the world and looked in detail at how the insurance landscape is being affected by this new technology.

In this article, we take a deep dive into the current state of insurance underwriting, look in detail at what might be achieved through the successful adoption and integration of modern AI and machine learning tools, and consider how they will very likely benefit the entire industry.


The current state of AI tools in insurance underwriting


2017 predictions

Back in 2017, the insurance press was swamped with doomsday predictions about the possible impact of AI replacing human workers in the near future. At the time, publications such as Insurance Business were predicting that as much as 98.9% of insurance underwriting jobs were at risk of being replaced by AI, leading to some understandably concerned sentiments in the world of insurance.


Present day

Fast forward to the present day and it is clear that these 2017 predictions of the obsolescence of the insurance underwriter role have yet to come to fruition, owing to a slower adoption of new technologies than many had foreseen.

While AI did make some inroads, the fundamental underwriting tasks remained heavily reliant on manual processes, buried within a labyrinth of spreadsheets.

The current sentiment towards AI technologies has shifted somewhat, with the World Economic Forum now suggesting that insurance underwriters are actually among those with the most to gain from AI augmentation.

While at first glance this is very promising, there is still a way to go before we fully understand how AI augmentation and other AI tools can be fully utilised to realise their potential in the world of insurance.

AI uptake has been a hot topic in insurance in recent years. Even before the public release of generative AI tools such as ChatGPT in late 2022, insurance companies had long been experimenting with their own AI software in an attempt to solve a number of key issues in data handling and analysis, as well as data augmentation in the field of underwriting and risk analysis.


The current issues faced in insurance underwriting


Ethics

Ethics have always been a key issue, with many companies having developed policies to guide them in how they would use the huge datasets that AI models require in order to provide fair, accurate and professional insurance services and offers.

A key question for businesses has always been centred around the usage of AI digital tools in the decision-making process, something that is hotly debated.

It’s crucial to understand how these tools might work, what challenges they might present, what their outcomes might look like and how to balance this efficiency with effectiveness and fairness to the customer.


Judgement

Insurance companies are concerned with the concept of ‘risk’ while insurance customers are concerned with ‘trust’.

Regardless of which side of the fence you sit on, there is a common denominator: judgement. Consumers and insurance companies alike want to know which brokers to trust and what the risk is of choosing one over another.

Until now, there have been numerous methods of making this judgement, which is confusing at best. Before AI, companies would create long, complicated spreadsheets filled with data and other information, held together with complex code.

However, businesses have found that when, for example, the member of staff who wrote that particular piece of code leaves, they are left high and dry: unable to adapt or evolve the code, they must leave it alone, making any meaningful updates impossible and largely rendering the old code useless.


A lack of integration

As more and more generations of such code emerged, they succeeded in creating limited solutions to individual problems but failed to solve the problem as a whole.

The COVID-19 lockdowns pushed the whole industry towards far more electronic data solutions, but this has largely not helped or sped up the underwriting process.

Why? Because these solutions, used in isolation, do not function well together; they do not integrate.

The problem is not a lack of time and financial investment on the part of insurance companies when it comes to creating new digital solutions; the problem is that there are ‘too many’ solutions on the market, each solving a particular problem but none solving all of the problems that need to be solved.

These ‘in-house’ solutions are not the answer, as Artur Niemczewski explains.

Companies tend to build solutions in isolation: ‘building a workbench over here, an underwriting workbench over there, a data augmentation tool that will pull data from social media and third-party databases over here’, and so on.

But, as Artur explains, multiple data streams coming from all directions don’t get the underwriter any closer to tying them all together and making a good, well-informed decision.

Multiple separate tools don’t currently integrate, which means too much time is taken up simply working out how each tool works and what the data in front of the underwriter even means.

Companies are making inroads towards a solution, but it is not quite there yet. Artur explained that during his time at Pro Global, they built a tool based on an earlier version of AI that sought to achieve automatic data cleansing, but at the time it wasn’t possible, so the solution unfortunately didn’t materialise.


The future of insurance underwriting with AI tools

As we acknowledge the current shortfalls in insurance underwriting when using AI tools and practices, it’s important to have a clear vision for the future.

Currently, there is an ‘arms race’ between around five industry-leading companies, all working out how to successfully integrate AI data-augmentation tools into their operations in order to revolutionise the industry.

To the winner will go the spoils, but no one is quite there yet. So what could this ideal future look like?

Artur Niemczewski paints a clear picture.

Instead of numerous tools and streams of information which the underwriter is expected to seamlessly process and understand, he believes that a single source of data is key.

One tool, one data stream, one solution. The key here is ‘integration’ – all tools should integrate perfectly together, talk together, understand and process their data as one, finally producing a single data stream of information from which the underwriter can make their final decision.

The whole point of integrating AI into insurance underwriting would be to assist the underwriter in assessing the associated risks of each individual case by crunching those hard numbers and offering simple and easy-to-understand results.

AI should do all the ‘heavy lifting’ by effortlessly analysing the data at hand and offering a very clear result.

This could be as simple as a traffic light system – ‘green’ could indicate the low-hanging fruit that offers low risk and a high chance of successful policies. ‘Red’ could indicate high-risk situations that the underwriter should steer clear of.

However, the ‘amber’ situation could be the most interesting…

Here, the AI data augmentation analysis could flag to the underwriter risks that are filled with nuance.

It is in these middle-ground decisions that human underwriters could involve themselves by studying the data presented in order to arrive at a decision on whether the risks are worth investing in or not.

Armed with all of the necessary information, they could study the case and make a good, informed decision.

The AI has done all of the heavy lifting in filtering out the green ‘low-hanging fruit’ and warning against the red ‘no-go’ areas, leaving the amber analysis clear and comprehensive for a human to make a final decision on.

All datasets and systems would have been crunched and analysed by the AI, allowing the human underwriter to spend their valuable time judging the nuanced situations where the high-value decisions lie. The AI would have already taken care of all of the areas which were previously difficult for humans to navigate due to the complex errors brought up by non-integrated systems, leaving them worry-free and focused on real value-driven tasks.
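The traffic-light triage described above can be illustrated with a minimal Python sketch. It assumes an upstream model emits a risk score between 0 and 1; the thresholds, function name and case data are purely illustrative, not taken from the interview:

```python
def triage(risk_score: float) -> str:
    """Map a model's risk score (0.0 = safest, 1.0 = riskiest) to a
    traffic-light band. Thresholds here are illustrative only."""
    if risk_score < 0.3:
        return "green"   # low risk: low-hanging fruit, fast-track acceptance
    if risk_score > 0.7:
        return "red"     # high risk: decline or escalate
    return "amber"       # nuanced middle ground: route to a human underwriter

# Only the amber cases reach the human underwriter's review queue.
cases = {"policy_a": 0.12, "policy_b": 0.55, "policy_c": 0.91}
review_queue = [p for p, s in cases.items() if triage(s) == "amber"]
```

In this sketch the human's time is spent only on `review_queue`; the green and red bands are handled automatically.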


Summary

The problem is not investment, but rather integration.

Many companies are heavily invested in the pursuit of this type of solution, but the answer will lie in how well each individual system integrates with the next, and how well they can harness the power of AI to arrive at a single source of data from which to make a well-informed final decision.

With this ideal solution on the horizon, the key question remains: who will get there first?

If you would like to watch our full interview with Artur Niemczewski where we discuss the current state of AI in insurance underwriting and his vision for the perfect future, please follow this link to visit our IT Insights Hub.


About our guest

Artur Niemczewski has a huge wealth of experience in the insurance industry gained over a 25-year career at the highest levels.

Starting out, Artur qualified with a PhD in Nuclear Engineering and a Master’s in Public Policy of Technology from the Massachusetts Institute of Technology (MIT) in the USA. From there, he transitioned into insurance as a broker and enjoyed many varied positions in the industry such as Head of the London Market Operations for Willis and Chief Executive at Marsh, a branch of the London Market Speciality Division.

Now working as the Non-Executive Director at the Chartered Insurance Institute, Artur concerns himself with the current state of insurance in this ever-changing world and works towards developing and implementing modern AI solutions to help drive the industry forward in an attempt to revolutionise the use of technology across the whole insurance world.

]]>
https://www.future-processing.com/blog/how-to-save-the-underwriter-from-extinction/feed/ 0
Managing claims inflation: innovation & predicting the future of claims https://www.future-processing.com/blog/managing-claims-inflation-innovation-predicting-the-future-of-claims/ https://www.future-processing.com/blog/managing-claims-inflation-innovation-predicting-the-future-of-claims/#respond Thu, 25 Jan 2024 09:21:15 +0000 https://stage-fp.webenv.pl/blog/?p=27912 In part 1 of this article, we explored a number of insights gained from our recent IT Insights InsurTalk interview with Manjit Rana of Clearspeed where we discussed the challenges insurance leaders face when it comes to managing claims inflation.

We looked at the rising cost of claims inflation and how this affects not only the insurance industry but, in fact, all areas of business. COVID-19 and the war in Ukraine are but two of the world events that have driven up inflation in recent times, leaving a severe labour shortage, increased costs of fuel and parts, and issues with transportation and logistics.

Insurance leaders are facing unprecedented pressure to innovate in order to come up with solutions to these issues and to ensure that their business models are aligned with the changes that the companies they work with are implementing.

By taking the appropriate steps, insurers seek to future-proof their business insurance models to make sure that the products and services they offer will remain current and relevant to a rapidly changing landscape.

In part 2 of this article on managing claims inflation, we look at how insurance leaders might use technology to help innovate and contemplate what the future of claims might look like.


Fostering a culture of innovation using technology


Bucking the trend

It can be easy for insurers to get caught up in the latest ‘current’ ideas instead of focusing on those that could provide the best solutions.

One size does not fit all, so looking at the same solution that the rest of the industry is working on might not be the best solution for an individual company’s needs.

In order to do this successfully, companies must forge their own paths, spurred on by the issues that their insurance customers are facing and come up with direct innovative solutions to help both the business and the consumer move forward.


Problems with innovation


Innovation done in isolation

Quite often, innovation scouting is done in isolation from the main business challenges. This results in a disconnection between the real problems that insurance companies are facing and useful, innovative solutions that could help solve these issues.

In order to try to fix this, company heads, managers and team leaders must be able to understand and convey the main issues that they are experiencing, to which the innovation team can apply themselves in search of answers.

Setting priorities is essential as it creates a clear focus, which then reduces disconnection and allows the issues to be analysed and (hopefully) solved.


Innovation should be treated as a service department

Innovation must be implemented as a service department that reaches throughout a company in its entirety – it shouldn’t simply be a standalone department tucked away in a corner office somewhere.

The innovation team must have access to the entire organisation and be able to service other departments with solutions to help solve their problems. Their supporting role plays a crucial part in the development of new ideas on how to implement and revolutionise processes and solutions, which should be a key part of all areas of an insurance business.


Harnessing knowledge from other industries

Traditionally, insurance companies have not been the best at looking outside their own industry and applying lessons learnt elsewhere to how they use technology to innovate. However, problems can’t always be solved from the ‘inside’, so delving into how other industries innovate and create new solutions is extremely important.

Insurance companies inherently work alongside all types of industries – from automotive to e-commerce, there’s a huge range of businesses out there that require insurance, so bringing in people with a wealth of experience and know-how from those industries in order to help offer creative solutions is a must.

A great solution could be to invite speakers into the insurance world to discuss the problems they are facing and listen to the steps they are taking to solve these problems. This will surely result in insurance companies gaining a better understanding of how they might innovate their products and processes for the better.

They will surely come up with ideas and solutions that the insurance company hasn’t thought of, offering true ‘out-of-the-box’ thinking.

Coming at issues faced in the insurance industry from a different angle and with a fresh perspective will likely result in ideas that wouldn’t otherwise have even been considered.

One great example of this is at Clearspeed – they invited guests into their company who had lots of experience in the military technology sphere, and as such, were able to apply the lessons learnt to their own insurance products successfully.

Solving problems and fostering innovation requires starting from the right place, understanding the challenges posed and then gathering as many different viewpoints and perspectives as possible in order to come up with effective, real-world solutions.


Predicting the future of claims inflation

While it’s impossible to predict the changes that may occur to businesses and insurance in the future, one thing we can be sure of is that change will occur.

As these changes play out in other industries, insurers will need to evolve in order to ensure that their products are effective and relevant in the modern world.

A useful example of this evolution can be found in the automotive industry. Car manufacturers are moving ever closer to a subscription-type model when it comes to consumer purchase preferences, but where does that leave insurers?

Where in the past customers would purchase a car outright and then visit a price comparison site to buy insurance, subscription purchase models are unlikely to work in the same way.

Consumers are moving towards flexibility in their purchasing habits. After all, the type of car people need is likely to change depending on whether they have a young family or teenagers, or whether the car is mainly used to commute and spends most of the day parked at a train station.

By paying for their vehicles using a subscription model, customers will enjoy the benefits of this flexibility by being able to swap their car in for the latest model much more frequently, which will drastically reduce the number of annual insurance plans required.

How can insurers stay on top of automotive manufacturers’ flexible subscription plans? Unless they think carefully about this question now, insurance companies risk seeing vehicle cover bundled into the monthly car subscription payment, which could put all of the choice over insurance plans into the manufacturers’ hands.

Insurance companies will surely not be thrilled by the idea of their products sitting underneath car manufacturers’ monthly subscription proposition as it takes the power of choice away from the consumer and puts it squarely in the hands of the manufacturers.

Being fully reliant on the decision-making of another company is certainly far from ideal, so this should be a major wake-up call for insurers to consider these changes now and innovate to find solutions while it is still possible.

A useful example of innovation that insurers could explore in this situation could be to offer customers ownership of their insurance ecosystem and provide a model that can be applied to any vehicle manufacturer. This would preserve customers’ choices when it comes to purchasing insurance and help companies maintain a balanced playing field.


A final word

Insurance companies must look at the issues the industry is facing and how to create innovative solutions for them through a different lens. Change is here, so they must adapt and overcome it through collaboration, innovation and out-of-the-box thinking in order to succeed.

Change often comes from the ‘outside’, so working with leaders of other industries is a must when it comes to approaching problems differently and imagining how the future might look and how products may change.

A great example of an area of insurance that desperately requires innovative solutions is claims mitigation. Insurers are often so busy worrying about how to best settle claims that very few resources have ever been allocated to innovation surrounding how to prevent the claims from occurring in the first place.

With some time and resources spent in this area, insurers might be able to come up with new and exciting methods of claims mitigation that could drive up profits and reduce claims across the board.

Innovation is not always about being right at the leading edge of change: insurers don’t have to be at the pinnacle of innovation in their industry, but they do need to be innovating in some way or another. As long as they remain somewhere close to this ever-moving forefront, they will be heading in the right direction.

With half a step backwards from this bleeding edge, companies can reduce their risk and offset costs slightly while still enjoying the benefits that creative endeavours bring.

If you would like to see our full interview with Manjit Rana of Clearspeed where we discuss the issue of mitigating claims inflation, follow this link to visit our IT Insights hub.

Revolutionise your claims operations with futureClaims™

futureClaims™ is an advanced platform designed to meet the demanding requirements of complex commercial and specialty claims, including the London Market.

]]>
https://www.future-processing.com/blog/managing-claims-inflation-innovation-predicting-the-future-of-claims/feed/ 0
Managing claims inflation: challenges insurance leaders face https://www.future-processing.com/blog/managing-claims-inflation-challenges-insurance-leaders-face/ https://www.future-processing.com/blog/managing-claims-inflation-challenges-insurance-leaders-face/#respond Thu, 11 Jan 2024 09:25:06 +0000 https://stage-fp.webenv.pl/blog/?p=27772 Recent world events, such as the war in Ukraine and COVID-19, have been major contributors to the rise in inflation across the globe. The knock-on effects of these issues have resulted in supply-chain bottlenecks, higher energy and transportation costs, as well as a mass labour shortage across many industries.

In September 2023, year-on-year inflation stood at 3.7% in the US, 6.7% in the UK and 4.3% in the European Union. While these numbers are slowly declining, inflation is still a serious issue, and with the world’s geopolitical stability balancing on a knife edge, another serious event could tip the scales once again towards rapid inflation.

Almost all industries across the world are feeling the effects of inflation, not least those in the insurance space. Awards for general damages have increased, wage rises have strongly impacted the cost of settling claims, and providing long-term, substantial compensation for the care of seriously injured individuals in catastrophic personal injury claims has become significantly more challenging.

To keep up with the rapid changes driving inflation, insurance companies must take drastic and innovative steps to ensure the long-term profitability and sustainability of their business models.

While innovation is very much needed in the insurance industry at this time, it doesn’t come easy, with many companies struggling to find their way in a rapidly changing landscape.

Future Processing met with Manjit Rana as part of our IT Insights InsurTalk series to discuss these issues and get his views on what could be done. With many years of experience at the forefront of the innovation industry, Manjit Rana is an Insurance Innovation Thought Leader, Innovation Expert, Venture Builder and GM U.K., EMEA & APAC at Clearspeed. Manjit is highly experienced when it comes to innovation in insurance and we discussed many of the current issues facing the insurance sphere and how they might be approached.

In this article, we take a deep dive into the insights gained from our IT Insights InsurTalk conversation with Manjit: how innovation plays an essential role for insurers in managing claims inflation, and how best to foster it in order to keep pace with the rapidly changing business models of companies in other industries.


The biggest current challenges for the insurance industry

Claims inflation is a huge issue for a number of reasons. In addition to the labour shortage, cost of living rise and fuel costs, the general availability of materials across various industries is low, which is also driving up the costs to insurers.

One pertinent example of this is in the automotive industry. Cars are becoming ever more sophisticated, so much so that it’s increasingly difficult to take new, complex vehicles to local garages to get them fixed. Instead, people are forced to go to main dealerships, which can be much more expensive.

Managing this as an insurance company can be difficult, as these extra costs were not necessarily factored into the original premiums. Paying out extra funds for claims has pushed some insurance companies’ Combined Operating Ratio (COR) as high as 110%, according to Manjit. This means that for every £100 the company collects in premiums, it spends £110. Obviously, this is not sustainable.

The impact of these rising costs is felt not only by the insurance companies themselves, but also by all third-party vendors. Insurance companies do not work in isolation; they have a symbiotic relationship with the industries they provide coverage for, so any financial impact on the insurer is also felt strongly by other businesses. The cost of parts is rising, as is the cost of transporting those parts from one place to the next. Some lines of business are suffering more than others, but there’s no doubt that all are feeling the impact one way or another.


Predicting the rising costs of claims inflation

The rapid rise in costs could not have been foreseen. The knock-on effects of the war in Ukraine and COVID-19 have brought a drastic increase in fuel prices and the emergence of the working-from-home phenomenon respectively, both of which have had a huge impact not only on costs, but also on customers’ expectations.

Responding quickly to insurance claims while working from home has been especially difficult for insurers. Fully digitising systems and processes while still providing fast and effective claims support has left many companies struggling to keep up. At best, they are treading water, doing just enough to stay in touch with the latest twists and turns in the industry, but everyone is struggling to get ahead of these issues.

Working from home has had a big impact on consumer behaviour as well. As people are often home all day, products such as contents insurance tend to be seen as less important, as customers feel that being home reduces their odds of being burgled, for instance. On the other hand, being home all day can result in a greater number of accidental damage claims for incidents that occur in the home. This is just one example of a change in consumer needs brought on in recent times.

For insurance companies, trying to predict the knock-on effects that an issue will have on other areas of their business is extremely difficult. Nothing happens in isolation, so when one area or industry is impacted, there is a trickle-down effect that impacts many other areas, too.


The role of technology in mitigating claims inflation

Technology plays a pivotal role for insurers in mitigating claims inflation. With the labour shortage leaving companies short-staffed, businesses must look inwards and redistribute their most valuable resource, people, to the most important tasks, using technology to cover the everyday repetitive and mundane jobs.

When there are fewer people working, it doesn’t make sense to assign valuable staff to tasks such as checking receipts, filling out forms or confirming the validity of supporting documents for claims. These processes can all be handled by automation software, freeing staff up to cover the tasks that drive the most business value, such as speaking directly with customers.

Technology can cover tasks that don’t need a human touch and that involve a lot of data, leaving staff free to interact with customers, listen to their concerns and best help them through the claims process. After all, customers pay a premium for their insurance products, and while they hope to never be in a situation where they need to make use of their insurance plans, incidents do happen, so having a human being on the other end of the line who can empathise and help them through their difficult situation is extremely valuable. The rest can be automated and left in the hands of technology.

Read part 2 of our IT Insights InsurTalk article on managing claims inflation, where we look at how insurance leaders can foster a culture of innovation, what the future of claims might look like, and how companies might make effective use of experts from other industries in order to innovate and revolutionise the insurance landscape.

Revolutionise your claims operations with futureClaims™

futureClaims™ is an advanced platform designed to meet the demanding requirements of complex commercial and specialty claims, including the London Market.
