Martin Earley looks at how companies with legacy systems can achieve cost savings and efficiency improvements without going for the ‘rip and replace’ option
For many years now the challenge of legacy systems and what to do with them has had IT directors, finance departments – and a whole host of other decision-makers in the insurance industry – scratching their heads. Not surprisingly, either. Legacy systems, typically inherited from a period of M&A activity, act as a drag on growth and efficiency.
They make it very difficult to achieve a transparent view of the customer – especially when that customer has a number of different policies with the same provider. Legacy systems can often mean that introducing new technologies becomes a burden rather than an opportunity, and indeed, have also often been a factor in the growth of outsourcing in the insurance industry.
In the cautious business of insurance, a systematic approach to replacing legacy systems is usually adopted. There are some companies that will benefit from a ‘rip and replace’ policy.
But most insurance companies seem to prefer to confront each area of the business separately, with a view to balancing the initial costs against the likely long-term impact on revenue, efficiency and customer satisfaction.
Failed IT initiatives
The general insurance market carries the burden of legacy on its shoulders perhaps more heavily than its counterparts in other areas of financial services. We can partly attribute this to the residual sentiment that insurance remains primarily a face-to-face business. But it is also a by-product of the number of failed IT initiatives that have characterised the global insurance market in recent times – for example Kinnect in the Lloyd's market.
And many legacy systems still work well. They can be some of the most reliable technology elements of an insurer's business. What's required, then, is a considered assessment of what would best be retained in-house, what systems need upgrading and what could potentially be outsourced.
Insurance companies are continually under pressure to deliver up-to-the-minute business data, while still keeping an eye on costs. Executives demand increasingly sophisticated reports regarding sales targets and other key business and performance metrics. Customers and partners want self-service, web-based tools that provide up-to-date answers to questions.
Managing directors want to know how the company's information systems can speed up time to market, differentiate from other players in the market – especially new entrants – and increase profitability, while seeing a return on investment.
These relentless business imperatives are placing new demands on legacy systems. The term legacy itself has taken on a negative, ‘past its prime' connotation. And with the added demands to adopt the latest technologies, organisations in the insurance field are debating over whether or not to ‘rip and replace' legacy systems in order to modernise IT infrastructure and save money.
However, the ‘rip and replace’ approach is not always the right answer. For many companies, legacy applications are business and mission-critical. They run the systems that form the backbone of the business. They house the data and business processes that differentiate a company from its competitors and represent years of valuable intellectual property. For many, ripping out legacy systems and replacing them with newer ones, when less drastic alternatives exist, makes little financial or tactical sense.
While in many cases legacy applications continue to meet business needs, they often do have some key limitations. Legacy systems are often disconnected from the enterprise; they store pockets of data that are difficult to integrate with other areas of the IT infrastructure and they are sometimes difficult to support. But these difficulties do not exist in every organisation that operates legacy systems.
So, on balance, is there a reasonable alternative to ‘rip and replace' that mitigates the downside of legacy systems and builds on the inherent positives?
Companies should focus on the applications that are most valuable to the business and determine what impact each application's lack of availability would have on day-to-day operations. They should also consider the application's overall quality and complexity, the team's knowledge of the actual code, and how much time and money can sensibly be invested.
This approach highlights a new trend in the process of upgrading. Instead of replacing a complete system, the task can be undertaken in simple stages. Each addition or change can be tested and validated without bringing the rest of the system to a halt, adapting or adding only to those processes that are required to meet business needs.
Financial services providers have been at the forefront of outsourcing and offshoring movements in recent years, although general insurers have been more cautious than most. But the road to outsourcing enlightenment has not always run smoothly, and we have recently seen a trend for insourcing, where companies reintegrate a business function that had previously been outsourced.
While outsourcing clearly offers some strong benefits in terms of cost and efficiency, some companies have sought to offload a problematic business function rather than first assessing and improving the business unit in-house, then delivering it ready-wrapped for others to manage and maintain.
Firms looking at offshore options also need to contemplate such issues as physical infrastructure, cultural issues, legislative protection (notably the new TUPE regulations) and sales relationship management.
Technology is a significant driver of such processes across the industry. Harlosh's experience in the general insurance market shows that businesses should consider best practice application of technology – either alongside legacy systems or through outsourcing – to deliver customer service improvements and strip out costs in the process.
Each company needs to take stock of what aspects of operations can be outsourced effectively, while using technology, legacy systems and new, integrated systems, to deliver process improvements in order to sustain competitive advantage in other areas of the business.
The danger lies in focusing purely on cost without considering all the factors that could have an impact on the business over the long term. We are now seeing a shift towards ‘right-sourcing’, where each area of a company is considered individually to assess whether its legacy system still provides real value, whether new technology can be introduced without compromising the current, useful legacy structure, and which functions would yield the greatest business advantage if sourced elsewhere.
Companies will continue to outsource, but there is likely to be evidence of a more blended approach to outsourcing, with technology at the forefront of decisions made.
Surely, given the confluence of factors conspiring against the early adopters of the outsourced model – customers, EU regulators and brokers – it makes sense for the GI business, and particularly those players at the small to mid end of the market, to at least make sure they have fully evaluated all of the options available to them before selecting outsourcing over technology investment.
The application of leading edge technology to ensure an insurer is equipped to compete in this market going forward can be just as effective – if not more effective – than deciding to outsource.
The most sensible option may not be a case of ‘either or’ but more about blending different ways of achieving cost savings and efficiency while also delivering great customer service – using outsourcing, technology investment, or a combination of the two.
Martin Earley is commercial director of Harlosh, which provides technology solutions to the insurance sector.