24 June 2014

Risk.net: Firms review data governance ahead of Solvency II




Thanks to onerous Solvency II reporting requirements, insurers are being forced to analyse their data requirements and reorganise. This article reports on the approaches firms are taking to meet the challenge.


It is not just the amount of reporting that is of concern. Regulators will also demand full look-through into how these reports are compiled and transparency on the underlying datasets. In practice, this means a regulator should be able to select any data cell in an insurer’s submission and trace its lineage through the reporting process all the way back to its origins in the firm’s core system. Add on top of this the need for insurers to submit these reports in a strictly defined reporting language and ensure they conform to the European Insurance and Occupational Pensions Authority’s (Eiopa) host of validation rules, and it is easy to see why the spectre of Pillar III is looming large in the minds of insurance professionals across the continent.
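To make the look-through requirement concrete, below is a minimal sketch of one way a firm might record lineage so that a figure in a submission can be traced back through each transformation to its source system. The record structure, source identifiers and cell label are illustrative assumptions, not a prescribed Eiopa format or any vendor's API.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class LineageNode:
    """One step in a reported figure's history: a value plus where it came from."""
    value: float
    source: str                            # e.g. "claims_system.motor.gross_reserves"
    transformation: Optional[str] = None   # e.g. "sum of gross reserves"
    inputs: List["LineageNode"] = field(default_factory=list)

def trace(node: LineageNode, depth: int = 0) -> None:
    """Walk from a reported cell back to its origins, printing each step."""
    step = f"{'  ' * depth}{node.source}: {node.value}"
    if node.transformation:
        step += f"  ({node.transformation})"
    print(step)
    for parent in node.inputs:
        trace(parent, depth + 1)

# Hypothetical example: one reported cell built from two source-system figures.
raw_motor = LineageNode(1_200_000.0, "claims_system.motor.gross_reserves")
raw_property = LineageNode(800_000.0, "claims_system.property.gross_reserves")
cell = LineageNode(2_000_000.0, "report_template.gross_reserves_total",
                   transformation="sum of gross reserves",
                   inputs=[raw_motor, raw_property])
trace(cell)
```

Running the sketch prints the reported cell followed by each contributing source figure, which is the kind of cell-to-origin trail a regulator selecting an arbitrary data cell would expect to see.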

The first challenge insurers must overcome is also arguably the most disruptive and complex: getting their data management systems organised. Traditionally, insurers have stored their data in legacy systems – large databases provided by the heavyweights of the IT world, such as Oracle, SAP and IBM – and/or in a profusion of end-user computing (EUC) technology assets, typically made up of Microsoft Excel spreadsheets and Microsoft Access databases.

The first decision insurers need to make – if they haven’t already – is whether to transfer their data to a central data warehouse or optimise their current infrastructure to cope with the rigours of the new regime. There are pros and cons to either approach and, not surprisingly, the nature and size of the firm in question will dictate the route it takes.

Article 121 of Solvency II states that the data used for an internal model must be “accurate, complete and appropriate”. Eiopa guidelines released on June 2 expand on this, explaining that proper collection, processing and analytical procedures should be used to achieve this level of data quality. Data relating to different time periods must also be used consistently, emphasising the importance of effective controls to prevent discrepancies arising. In addition, the actuarial function must implement “a sufficiently comprehensive series of checks… to allow detection of any relevant shortcomings”.

Such checks can be carried out by comparing present data with data used in different calculations, monitoring data values to ensure they stay within reasonable limits, and matching random samples of “cleansed” data against their raw-data originals. This is all much easier to achieve when firms have a solid handle on their data in the first place.
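A minimal sketch of how these three kinds of check might be automated is shown below. The tolerance, value range and field names are assumptions made for illustration, not figures taken from the Eiopa guidelines.

```python
import random

# Illustrative thresholds -- each firm would set its own tolerances.
REASONABLE_RANGE = (0.0, 1e9)     # acceptable bounds for a reserve value
RECONCILIATION_TOLERANCE = 0.01   # 1% tolerance when comparing calculation runs

def within_limits(values, lo=REASONABLE_RANGE[0], hi=REASONABLE_RANGE[1]):
    """Flag any value falling outside the agreed reasonable range."""
    return [v for v in values if not (lo <= v <= hi)]

def compare_runs(current, previous, tolerance=RECONCILIATION_TOLERANCE):
    """Compare figures used in the current calculation against a prior run."""
    breaches = {}
    for key, cur in current.items():
        prev = previous.get(key)
        if prev and abs(cur - prev) / prev > tolerance:
            breaches[key] = (prev, cur)
    return breaches

def sample_reconciliation(cleansed, raw, sample_size=5):
    """Match a random sample of cleansed records back to their raw originals."""
    keys = random.sample(list(cleansed), min(sample_size, len(cleansed)))
    return {k: (raw.get(k), cleansed[k]) for k in keys if raw.get(k) != cleansed[k]}

# Example usage with made-up figures.
current_run = {"motor_reserves": 1_250_000.0, "property_reserves": 805_000.0}
previous_run = {"motor_reserves": 1_200_000.0, "property_reserves": 800_000.0}
print(within_limits(current_run.values()))   # values outside the reasonable range
print(compare_runs(current_run, previous_run))  # movements beyond tolerance
```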

In 2011, the Financial Services Authority (now the Prudential Regulation Authority) investigated the use of spreadsheets in firms’ Solvency II internal models and made a number of recommendations on how they could be more tightly controlled, including forming an enterprise-wide spreadsheet management policy, creating an inventory of critical spreadsheets and installing controls on data security, change management and disaster recovery.
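By way of illustration, the sketch below shows one way a firm might begin building such an inventory by scanning a shared drive for EUC files. The directory path, file extensions and metadata fields are assumptions for the example, not the regulator’s specification.

```python
import os
import csv
from datetime import datetime

# Hypothetical location of end-user computing assets.
EUC_ROOT = "/shared/actuarial"
EUC_EXTENSIONS = (".xls", ".xlsx", ".xlsm", ".accdb", ".mdb")

def build_inventory(root=EUC_ROOT, out_path="euc_inventory.csv"):
    """Record each spreadsheet or Access file with basic control metadata."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["path", "size_bytes", "last_modified", "owner", "critical"])
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                if name.lower().endswith(EUC_EXTENSIONS):
                    path = os.path.join(dirpath, name)
                    stat = os.stat(path)
                    writer.writerow([
                        path,
                        stat.st_size,
                        datetime.fromtimestamp(stat.st_mtime).isoformat(),
                        "",   # owner to be assigned by the business
                        "",   # criticality to be assessed against the policy
                    ])

if __name__ == "__main__":
    build_inventory()
```

The owner and criticality columns are left blank deliberately: the point of the 2011 recommendations was that these judgements sit with the business, with the inventory providing the starting point for controls on security, change management and disaster recovery.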

When approached for an update on its position, a spokesperson said: “Many firms’ spreadsheets are a key area of risk because they are not typically owned by IT and are owned instead by other business or control areas. As a result, spreadsheets are often not subject to the same general controls as firms’ formal IT systems. As the relatively open nature of spreadsheets sometimes offers more flexibility with regard to controls than ‘traditional’ IT systems, this is something that needs particular attention by firms.”

While this message has been received and understood by many of the larger firms, that is at least partly because these insurers have been the main focus of the regulator’s attention. Smaller companies are generally at an earlier stage of the data management process.

Full article (Risk.net subscription required)



© Risk.net

