
Even as technology continues to disrupt industry after industry, there are still parts of global infrastructure that rely heavily on outdated technologies and legacy systems from years, and even decades, past. Many might be surprised to hear that the global financial system is an example. While fintech, digital trading and online banking have transformed the end-user experience of many aspects of financial life, the back ends of those systems are often far less modernized than one would imagine.

Legacy systems plague Wall Street's top banks and slow down trades. (Image: Saphyre)

In fact, trades at the highest levels of global finance are often navigated using systems held together by an array of disparate spreadsheets, emails and even faxes. The costs to banks and financial firms, in both money and efficiency, are enormous, which is what led twin brothers Stephen and Gabino Roche to co-found Saphyre, an AI-driven platform that gives banks a way to finally streamline and automate those disparate processes. Saphyre has since been adopted by some of the world's largest financial institutions, and we asked Stephen to share his insight into the technology challenges facing global financial firms.

What are the primary challenges driving many top institutions to the Saphyre platform?

Stephen Roche: The original driver was cost-efficiency, but now, with a recession and inflation arriving together, many companies are being forced to look at headcount reduction, and operational transformation is certainly a must.

Especially since many of these pre- and post-trade issues are being handled manually by literal armies of people. You're talking about disparate databases that aren't synced between functional operational teams, faxes coming in from clients, and information being manually rekeyed from emails or spreadsheets into internal or third-party systems, providing ample opportunity for errors to occur and persist through the lifecycle of a fund.

What are the impacts and opportunity costs that legacy systems and manual input processes are having on the financial industry?

SR: Just imagine one example: someone rekeys an account number from, say, a fax and inputs a 6 where the digit should have been an 8. That error will persist all the way through post-trade.

Many of these middle- and back-office tools or vendors do not provide a cloud solution and run on-prem databases. Think of them as Polaroid snapshots of data at a point in time. The information may not be the latest; it could be outdated, or it could have been entered incorrectly and contradict another database from another team within your organization.

The compounding effect gets exponentially worse as the confusion spreads, which is why the industry needs T+2 to settle and sort all of this out post-trade. Now that regulators are asking for everything to happen within a T+1 window, the same work is simply being crunched into less time, and if innovation or technology is not there to assist, it will be impossible to achieve.

Then there are inflation, recession and time-to-market pressures: firms need to make money right away when they launch a new fund. Lag time hurts, and the longer it takes, the more costly it is.

And firms end up paying those costs at least twice, possibly many times over, in post-trade because of the research that must be completed.

There is now a push within the industry to shift from a T+2 framework to T+1, essentially a 24-hour window to settle transactions. As someone who has worked in financial institutions my whole career, I can tell you all that's going to happen is firms trying to crunch the same amount of work into 24 hours instead of 48, with no innovation. Mistakes will happen, and somehow, on paper, it will appear that everything has been solved, but in reality costly mistakes will be masked. That means institutions and clients will face repercussions from errors that may not be detected until later.
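To make Roche's rekeying example concrete, here is a minimal, hypothetical sketch in Python (not Saphyre's implementation) showing how a check-digit test such as the standard Luhn mod-10 check can reject a single mistyped digit, like a 6 entered where an 8 belongs, at the moment of entry instead of letting it persist into post-trade.

```python
def luhn_valid(account_number: str) -> bool:
    """Return True if the digits pass a Luhn (mod-10) check-digit test."""
    digits = [int(ch) for ch in account_number if ch.isdigit()]
    total = 0
    # Walk the digits from right to left, doubling every second digit
    # and subtracting 9 whenever the doubled value exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

correct = "79927398713"                  # a number with a valid check digit
mistyped = correct.replace("8", "6", 1)  # one rekeyed digit: an 8 entered as a 6

print(luhn_valid(correct))    # True  -- accepted at entry
print(luhn_valid(mistyped))   # False -- rejected before it can propagate
```

Any single-digit substitution fails the mod-10 check, which is exactly the class of fat-finger error that slips through manual rekeying.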

Does this also create risks in the financial system that technology now helps mitigate?

SR: The risks are tremendously high because of the manual nature of entering sensitive information. Fraud can easily occur if someone inputs an account number so that credit goes somewhere other than the intended client account. The discovery could happen far too late, and recouping the funds can be another challenge in itself.

Explicit permissioning, with a memory of who is allowed to see and provide this information securely, delivered through intelligent cloud technology, can mitigate this.

Those risks are now surfacing in the form of regulatory fines driven by manual processes. Firms are violating trade regulations by trading when they aren't compliant, because they don't have transparency. The manual nature not only creates fat-finger errors but also opportunities for fraud and theft. Digitizing provides secure, authenticated access so that only those designated can see sensitive information.
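As a rough illustration of the explicit permissioning Roche describes, the sketch below assumes a hypothetical role-to-field permission table; it is not Saphyre's model, only an example of how digitized access control can ensure that only designated roles see or edit sensitive fields.

```python
from dataclasses import dataclass

# Hypothetical permission table: which roles may view or edit which sensitive fields.
FIELD_PERMISSIONS = {
    "settlement_account": {"view": {"operations", "compliance"}, "edit": {"operations"}},
    "tax_id":             {"view": {"compliance"},               "edit": {"compliance"}},
}

@dataclass
class User:
    name: str
    role: str

def authorize(user: User, action: str, field: str) -> bool:
    """Permit an action only if the user's role is explicitly listed for that field."""
    allowed_roles = FIELD_PERMISSIONS.get(field, {}).get(action, set())
    return user.role in allowed_roles

trader = User("Ana", "front_office")
ops = User("Ben", "operations")

print(authorize(trader, "edit", "settlement_account"))  # False: no explicit grant
print(authorize(ops, "edit", "settlement_account"))     # True: explicitly permitted
```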

How can the financial industry better embrace new technology?

SR: Internal tech builds will always take longer and cost more because firms lack the expertise: they only know how the business works for themselves, not for the whole ecosystem. With that domain expertise missing, when you try to build it yourself rather than embracing technologies that already exist, the cost, the margin for error and the time it takes all increase significantly. It's like assuming that because you live in a house, you can easily build one yourself.

If you create memory with data mapping, you enable interoperability between digital platforms, and then the question of adopting standards is no longer important.
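A simple way to picture that last point, using made-up field names rather than any real platform's schema: once each system's fields are mapped to one shared internal model, the same remembered record can be translated to whichever platform needs it, without everyone first agreeing on a single standard.

```python
# Hypothetical field maps from two platforms' schemas to one shared internal model.
CUSTODIAN_A_MAP = {"acct_no": "account_number", "ssi": "settlement_instructions"}
BROKER_B_MAP = {"AccountNum": "account_number", "StandingInstr": "settlement_instructions"}

def to_canonical(record: dict, field_map: dict) -> dict:
    """Translate a platform-specific record into the shared internal model."""
    return {canonical: record[source]
            for source, canonical in field_map.items() if source in record}

def from_canonical(canonical_record: dict, field_map: dict) -> dict:
    """Translate the shared model back out into a platform's own field names."""
    return {source: canonical_record[canonical]
            for source, canonical in field_map.items() if canonical in canonical_record}

# A record received in Custodian A's format is remembered once, then served to Broker B.
incoming = {"acct_no": "79927398713", "ssi": "DTC 0123 / FFC Fund X"}
shared = to_canonical(incoming, CUSTODIAN_A_MAP)
print(from_canonical(shared, BROKER_B_MAP))
# {'AccountNum': '79927398713', 'StandingInstr': 'DTC 0123 / FFC Fund X'}
```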