The Fight Against Legacy Systems
Aug 18, 2017 · Legacy systems · Article · Insurance

Grzegorz Podleśny, Senior Manager, Head of Insurance UK & PL

Rafał Kępa, Senior Analyst, Insurance UK Team

Legacy systems in the insurance industry today, with all their quirks and drawbacks, require significant effort to maintain. The boardrooms of large insurance companies often appear to apply the old proverb "If it ain't broke, don't fix it" indiscriminately to their IT systems. Over time, however, old IT architecture can become unwieldy and unmanageable. As additional configurations and resources are added to address changing business needs, code and systems accumulate. After several years of this natural process, the IT architecture leaves the company in a precarious state, exposing it to multiple risks. This sort of legacy affects many parts of the IT landscape, including claims management, billing, policy administration systems, and document and content storage.

One of the most visible risks is escalating maintenance costs. In one recent case, the authors had the privilege of working on moving a content management system, along with its operational systems, from an old mainframe platform to a modern solution. This allowed for 10M in savings over a five-year period. Storage on mainframes is expensive, can require costly integrations, and is hard to maintain or expand. Keeping data in a modern data center or taking advantage of cloud storage is much more cost effective. Not only will costs climb if one stays with the legacy solution, but after some time it becomes difficult even to staff enough individuals who can maintain the archaic systems. The IT sector moves forward in leaps and bounds; young professionals will lack the expertise and knowledge to work with technology that is decades old, as they will have been trained only in the latest processes.

Furthermore, as more systems are added, exposure to cyber risk grows as the potential for security gaps increases. As insurers add new front-end applications and components to improve client service, neglect of the back-end leads to higher operational costs: integrations become more difficult to manage, and the newer front-end requires human intervention to connect with the back-end systems. Additionally, one of the biggest issues modern insurers need to tackle is how to handle the vast quantities of personal data they collect from their customers every day. Leaving all of that data vulnerable is a misstep every prudent insurer wishes to avoid. This is particularly important given the new General Data Protection Regulation (GDPR), which comes into force next year. Based on the authors' experience, an audit of systems to ensure compliance can require 30 to 50 man-days of effort, depending on the complexity of the architecture.

Three potential strategies exist to tackle legacy code and rein in ballooning costs. The first is to prevent systems from ever becoming aged by perpetually upgrading and introducing the newest IT systems. However, this is a rather idealistic approach that is cost- and resource-prohibitive. The second strategy is to implement focused, narrow replacements in key areas in order to maximize the return on investment. This approach is preferable to the first, as both the changes required and the value delivered can be clearly defined. The last approach is to push through large-scale transformations that address business needs with process changes, change management, and the replacement of many key IT components. While this method creates the most value, it also carries the highest risk, as so many changes are pushed through simultaneously. Moreover, choosing this third strategy of business evolution also implies the insurer may have waited too long to implement a variety of changes, placing it behind competitors that innovate more frequently.

What's more, such a conglomeration of aging systems also lacks the flexibility to quickly address new business requirements. PZU, the largest insurer in Central Europe, understood this very well before the start of Project Everest, which was launched to bring PZU into the twenty-first century. The old systems were not adequate to expand into new, fast-growing sales channels such as mobile. Prior to the revolution that was Everest, introducing a new product took PZU approximately 12 months. After the replacement of the aging systems with Guidewire's PolicyCenter, BillingCenter, and other components, that time was cut to under three months. The new platform also enabled constant, real-time tracking as well as rapid changes in tariffs: new tariffs had previously taken three months to implement, and after the upgrade the same process took a few days.

Legacy systems also create data deficits, as older systems often do not store enough data or do not support a single customer view. In the era of big data, failing to harness data correctly and make it easily viewable means forgoing a key advantage that can enable insurers to become market leaders. Among the other benefits of Project Everest for PZU, data was of better quality and much more accessible after the changes. Prior to the implementation, there were dozens of different databases in silos, causing duplicates to exist when clients moved between regions. With the help of Sollers Consulting, many of these problems were resolved. Thanks to the close collaboration of the parties involved, each major release was completed on time; in such a mammoth undertaking, successfully achieving the business requirements at each stage is a marvel in the IT world. The redundancies and the fragmentation of data were eliminated with the new, centralized systems. The old systems had also been designed around paper-based policy records; after the new systems were introduced, the paper-based sales process gave way to an online, electronic one.

Another hidden risk that many fail to recognize is that the longer an insurer puts off upgrading and centralizing its IT architecture, the more time-consuming and costly the project becomes. Large IT projects often fail to meet deadlines, and estimating their costs can be difficult. The longer a company waits to proactively handle the situation, the harder it will be to accurately gauge the effort needed to tackle the mounting problems.

The London market, to maintain its place as the world center for insurance and innovation, has started on its own quest to handle such a monumental systems-building project. The London Market Target Operating Model (LM TOM) is a five-pronged solution, with a cost of £270m for the first phase, whose purpose is to centralize and streamline many archaic processes in the market. The electronic Placing Platform (PPL) will enable brokers and insurers to quote and bind business digitally. The Central Services Refresh Programme (CSRP) will enable brokers to submit claims and premiums. Delegated Authority (DA) will automate much of the auditing of coverholders and standardize requirements to lower the cost of doing business. It is an initiative the London Market Group has identified as critical to remaining competitive.

Companies within the market itself are also taking independent strides forward to maintain a competitive advantage. The insurers are undertaking a multitude of projects to centralize and upgrade systems. Some have had great success creating an end-to-end routing of ACORD market messages with additional improvements underway to be able to utilize all the benefits of the LM TOM changes. Other market players are also looking to follow suit, with upgrades and replacements of their own platforms.

An example of the more targeted approach to replacing legacy systems can be seen in Talanx's acquisition of Warta in 2012. After the migration to one IT architecture, the focus shifted to a claims transformation project to optimize the claims handling process. The claims optimization project took only seven months to complete, and in that short time Warta saw a 60% increase in claims processing productivity. The new implementation also allowed the insurer to automate some of the processing, with the electronic claim file becoming more accessible as well. It was also a cornerstone for further digitization of claims processes, including mobile FNOL, real-time communication between customer and claims handler, and remote claim reviews. London Market players also benefit from such a targeted approach.

It is interesting to note that the management of large insurance conglomerates can use mergers as an opportunity to restructure and tackle their legacy concerns, maximizing the value created by their greater economies of scale. It is worth watching how the management of these market leaders will approach such challenges. XL Group acquired Catlin Group Limited in 2015; consolidating at such a moment makes the most sense, as it can reduce overhead even further. Another example of such a merger is ACE Limited's acquisition of Chubb in 2016. Rather than allowing redundant systems to persist, transforming the pre-existing IT systems can make the new synergies even more pronounced.

So, when is the best time to replace old systems and undertake such changes? There is no simple answer, but there are guidelines that can assist in reaching a conclusion. The decision should always be business-driven, with a road map established around clear objectives. The most common factor motivating such change is the necessity to adapt to market needs. Issues can arise because business decision-makers are typically disconnected from their IT. They fail to recognize the benefits presented by innovating their systems, and see IT more as a cost of doing business than as a window of opportunity. This needs to be addressed if insurers want to remain agile in such a dynamic market.

Originally published in Insurance Day, UK
7th August 2017