Time to dive deep into digital transformation

Data is valuable, we know this, but what are the challenges and opportunities? A new white paper, The Digital Transformation Deep-dive, focuses on digital transformation in regulatory technology.

The paper presents seven insights:

Firstly, engagement with regulators is important. As Bob Fuller, Chief Administration Officer at Fixnetix, explained, this can “give a number of advantages, including early indications of any business implications as well as increases in budget that may be required.” But there is a limitation. As Paul Charmatz from Encompass warned: “Regulators are not always equipped to deal with feedback.”

Secondly, for certain activities, the burden of proof often lies with the bank, which has to demonstrate to clients that it provided the best service. Bob Fuller pointed out that this means being able to prove that they provided “the right data, at the right time and right place.” He mentioned PPI to illustrate this: “[banks] have no option but to pay out claims, because they can’t prove that the products were sold fairly.”

Thirdly, thanks to MiFID II, links within or between banks have to migrate from an informal model to a straight-through processing (STP) model; this “infrastructure is only likely to increase in volume and complexity,” the paper suggests.

Fourthly, post-Brexit, banks may need to split themselves into UK and EU entities. Some banks are already considering it, suggested Bob Fuller. “When we leave the EU,” he said, “it is highly likely that we will be considered ‘non-GDPR’.” He added that “America is unlikely to get third-party equivalence due to recent political developments. . . It’s going to be very busy in the coming months and years.”

Fifthly, control over communication channels, combined with new venues such as OTFs, will drive an increase in the number of data sources. Tom Fairbairn from Solace explained: “Looking back to the LIBOR scandal, traders colluded over Bloomberg messenger. This and other developments have led to the idea that all forms of communication, all the data that relates to a trade, are important when it comes to provable execution. So, firms have begun locking down access to these communication mechanisms. This – combined with the increase of venues such as Organised Trading Facilities (OTFs) – means that there will be a huge new array of data sources.”

Related to this point is the issue of moving data, and of legacy versus new technology fixes. Bob Fuller said: “There is no limit to using legacy technology, but the cost of continuing along that path will likely become too high to be supported by the business. Indeed, the vast increase in data being processed and stored will need to be addressed. The use of well-established vendors like Fixnetix should aid this transition.”

Sixthly, don’t chase the regulators’ tail, but aim to be ahead of the game. As Tom Fairbairn said: “Solutions being deployed now need to be flexible, scalable and advanced enough to deal with these future challenges that we don’t yet know about.”

Finally, there is the utility model. “It’s inevitable that banks will share information – the regulators will force them, either directly through utilities or via initiatives like the Open Banking API,” said Tom Fairbairn. But such utilities have been adopted slowly. The problem, explained Paul Charmatz, “is that banks’ proprietary data and processes are still considered an advantage.”

To download the white paper, click here.