Collateral in 2019: Part I


Collateral management has come a long way in the past two decades. Just 20 years ago, couriers were racing around the City with paper share certificates to transfer title, collateral allocation and selection were done on spreadsheets, and optimisation didn’t exist.

The uses of and need for collateral were also vastly different from today. The stock lending business was the exclusive preserve of a handful of major international players, variation margin was hardly exchanged against over-the-counter positions and the need to mobilise collateral was rarely urgent.

Much has changed. Post-crisis regulatory reform has dramatically increased the requirement to manage collateral at a time when cost pressures have forced financial institutions to seek out efficiency.

Providers of collateral services and technology have responded with new tools to automate processes and reduce costs. New concepts have been introduced. Financial institutions today are well down the path towards granular calculations of collateral costs.

Algorithms dominate selection decisions and new technology is reducing the inefficiency in interactions between market participants. Processes around collateral schedules and allocations are increasingly digitised and collateral has become a centralised function for many firms.

But in other respects, little has changed. Many of the processes and methods of collateral management and mobility today, while much more efficient and automated, would be recognisable to the executive of 20 years ago.

However, new technologies in the form of machine learning, big data processing and distributed ledgers look set to revolutionise the collateral industry, massively reducing inefficiencies and moving towards a global, frictionless and even more secure environment for the pricing, transfer and settlement of collateral.

This study is based on interviews with numerous experts in the collateral industry, some of whom are quoted throughout the report. It is split into three parts: the first looks at best practice today; the second at the potential impact of disruptive technology; and the third asks what a perfect collateral environment looks like and whether it is viable.

This study focuses on banks and broker-dealers, who have been driven by necessity to work with established and new providers to drive innovation. But the lessons are applicable to the buyside, which is likely to follow in the banks’ footsteps and adopt many of the processes that have been developed.

Part 1: The drivers of change: best practice in collateral efficiency today

Collateral management is like learning a foreign language. Most people know a few buzzwords, everyone understands how much they would benefit from advanced knowledge, some have used technology to satisfy basic needs, but few have taken major steps to fully master the subject.

The financial crisis has been followed by regulatory reform, increased capital costs, heightened awareness of counterparty risk and a squeeze on profit margins. This has forced banks and broker-dealers to lead the charge towards a more sophisticated understanding of collateral and greater efficiency across the management process.

This drive has brought with it innovations in collateral optimisation algorithms, moves towards holistic enterprise collateral management that breaks down the silos within organisations, and renewed consideration of the potential gains from outsourcing parts of the collateral process, whether to a third-party technology vendor, a collateral manager or a triparty agent.

But complexity in collateral is escalating almost as fast as, if not faster than, the sophistication of collateral processing. Capital charges and leverage ratios for banks are adding new calculations to optimal allocation methodologies. New clearing rules mean more collateral is required for certain exposures, and new regulations are increasing transparency and reporting obligations.

All this adds to the cost of day-to-day business operations, which further drives the need for efficiency across portfolios, whether by increasing visibility and centralising pools of collateral to ensure the most cost-effective asset is always posted, by automating collateral selection and transfer to reduce costs and the potential for costly human error, or simply by ensuring that all assets that can be mobilised are mobilised, rather than sitting dormant on balance sheets.

From the back office to the boardroom: collateral takes centre stage

Collateral optimisation has been a buzzword in the industry for over a decade. The term historically has referred primarily to the centralisation of collateral pools across desks and silos within financial institutions and the ability to match assets against liabilities based on detailed calculations of uses, eligibility and the cost of deployment.

The cheapest-to-deliver concept has dominated, with advances in technology mainly relating to the deployment of more sophisticated algorithms that include more metrics, and a move from linear to non-linear calculations that bring in more external data points to calculate the true cost of an asset.
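As a point of reference, the sketch below shows the kind of linear, per-exposure, cheapest-to-deliver logic that these newer non-linear engines improve upon. It is a minimal illustration only: the asset fields, ratings ladder and cost figures are hypothetical, not any provider’s actual model.

```python
# Minimal sketch of classic "linear" cheapest-to-deliver allocation.
# All fields, ratings and costs are hypothetical illustrations.
from dataclasses import dataclass

RATING_ORDER = ["AAA", "AA", "A", "BBB"]  # best first

@dataclass
class Asset:
    isin: str
    asset_class: str         # e.g. "govt_bond", "equity"
    rating: str              # e.g. "AAA"
    market_value: float
    posting_cost_bps: float  # opportunity cost of tying this asset up

@dataclass
class Exposure:
    amount: float
    eligible_classes: set[str]  # a (very) simplified collateral schedule
    min_rating: str

def eligible(a: Asset, e: Exposure) -> bool:
    """An asset qualifies if its class is on the schedule and its rating is good enough."""
    return (a.asset_class in e.eligible_classes
            and RATING_ORDER.index(a.rating) <= RATING_ORDER.index(e.min_rating))

def allocate(inventory: list[Asset], e: Exposure) -> list[Asset]:
    """Greedily post the cheapest eligible assets until the exposure is covered."""
    picked, covered = [], 0.0
    for a in sorted((x for x in inventory if eligible(x, e)),
                    key=lambda x: x.posting_cost_bps):
        if covered >= e.amount:
            break
        picked.append(a)
        covered += a.market_value
    if covered < e.amount:
        raise ValueError("insufficient eligible collateral")
    return picked
```

A real optimiser works globally across many exposures at once and weighs interdependent constraints; this greedy, one-exposure-at-a-time fill is precisely the simpler model the industry has been moving beyond.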

Ben Challice, global head of agency financing and collateral management at J.P. Morgan, says that collateral management has changed significantly in importance in the industry: “Collateral management today is core to how people run their businesses. It has moved from a back-office, cheapest to deliver process to being centrally managed and driving pre-trade decisions across an organisation.”

Post-crisis regulatory reform has led to monumental change in collateral management.

Some 20 years ago, all that mattered to a trader pre-trade was whether that trade was going to make money. At that point, optimisation meant asking if there was a more profitable trade that could be done at that time.

Today, traders have to take into account capital, risk-weighted assets and balance sheet implications, as well as credit exposure from both a counterparty and a geographical perspective, and myriad other factors before a trade can be executed.

“There are a whole raft of variables that a trader must be cognisant of when executing today that just weren’t considerations in the past and these are often complex and interdependent,” says Phil Morgan, chief commercial officer at Pirum Systems.

New ways of thinking about the cost of collateral

This additional complexity has led institutions to evolve their thinking about collateral across the organisation. Much work has been done by financial institutions to centralise collateral pools and to break down the silos between asset classes, trading desks, functions (repo, securities lending, derivatives and so on) and geographies, creating a single view of assets held across the organisation so that collateral allocation can be optimised.

Triparty agents (TPAs), outsourced collateral management providers and software vendors have been able to offer additional sophistication by running optimisation and allocation tools over a wider pool of assets, resulting in more efficient allocation.

Within a triparty structure, firms have the capacity to set rulesets, schedules and baskets of collateral. This enables them to automate collateral selection and to outsource optimisation, substitution, settlement and the management of daily margin calls.

Using triparty, firms can deliver a basket of securities and optimise a deep, broad book of globally sourced assets. Thanks to continued investment, TPAs have developed sophisticated optimisation tools that are not limited to posting the cheapest-to-deliver asset against any exposure, and they have done the majority of the work around optimisation.

But, over the past five years, financial institutions have also faced increasing liquidity constraints resulting from new capital and liquidity rules.

As a result, TPAs can only go so far in the optimisation process. Their algorithms can only be run against assets under their control, and only against objective metrics and measures for each asset. Increasingly this is not enough to provide true optimisation and crucially to satisfy internal treasury departments.

Of all the new rules and regulations, the Liquidity Coverage Ratio (LCR), introduced in stages since 2011, has had the biggest impact. Among other requirements, it obliges banks to hold a stock of high-quality liquid assets at least equal to 100% of their projected net cash outflows over a 30-day stress period.
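Expressed in simplified form, the Basel III ratio is:

$$
\text{LCR} = \frac{\text{stock of high-quality liquid assets (HQLA)}}{\text{total net cash outflows over the next 30 calendar days}} \geq 100\%
$$

Posting or receiving collateral changes the numerator and the outflow profile, which is why collateral choices now feed directly into a bank’s liquidity position.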

As a result of the LCR, the funding cost of an asset within a bank is now specific not just to the objective, external metrics traditionally calculated by optimisation tools, but also to the broader, unique liquidity profile of that specific bank on that specific day.

“In today’s market, a bank needs to understand not just the cheapest way to deliver and to allocate the lowest grade eligible assets first, it also requires a deep understanding of liquidity conditions and needs to embed those calculations in liquidity frameworks across the firm,” says Jamie Purnell, head of equity finance EMEA at Nomura.

Sources vs uses

The LCR has also created the need to calculate the funding cost of an asset based on where it came from, because the same or similar securities are subject to different treatment under the LCR depending on their source.

“One bluechip share coming from a hedge fund needs to be treated differently from one coming from a trading book which needs to be treated differently from one coming from a reverse repo to cover a short,” says Purnell.
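To make the point concrete, a toy sketch of source-dependent funding treatment might look like the following; the source categories and basis-point figures are purely illustrative assumptions, not actual LCR treatments.

```python
# Illustrative only: the same security carries a different funding cost
# depending on where it came from. Categories and bps figures are
# hypothetical assumptions, not actual LCR treatments.
FUNDING_COST_BPS_BY_SOURCE = {
    "client_rehypothecation": 25,   # e.g. received from a hedge fund
    "house_trading_book": 40,       # firm's own inventory
    "reverse_repo_cover": 55,       # borrowed to cover a short
}

def funding_cost_bps(isin: str, source: str) -> int:
    """Same ISIN, different cost: treatment follows the asset's source."""
    return FUNDING_COST_BPS_BY_SOURCE[source]

# One bluechip share, three different costs depending on its source:
for src in FUNDING_COST_BPS_BY_SOURCE:
    print("BLUECHIP-ISIN", src, funding_cost_bps("BLUECHIP-ISIN", src), "bps")
```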

Graham Gooden, EMEA and APAC head of Agency Collateral Management at J.P. Morgan, says that this differentiation between sources and uses is forcing change among the firm’s client base.

“More sophisticated clients are looking deeper into the sources and uses of collateral,” he says. “They understand that two different lines of stock can have a different value depending on whether it is a house or a client asset and that moving an asset from one particular trade or counterparty to another might be more cost effective.”

Understanding the liquidity constraints and funding costs of an asset is not just a question of pooling data sets and running broader optimisation calculations. Banks are also starting to think about how to break down account structures and change internal operational processes to direct each asset to the right place.

“It is much more complex than just creating an algorithm and sending a file. If the inputs, such as asset reference data, pricing information, eligibility schedules or account structures, differ, deploying an algorithm’s output at a granular level becomes ineffective. The building blocks need to be put in place first,” says Gooden.

Purnell adds: “There are so many nuances with each asset. The sellside doesn’t have the ability to tag assets, but we need to be able to recognise each asset’s individual facets even though it looks the same as 10 other assets you might hold from similar sources.”

Changing operational processes

To segment assets and keep within new capital buffers, some banks and broker-dealers are moving away from the traditional omnibus structure, under which all assets are directed to a single pool at the custodian or collateral agent.

Increasingly, they are creating multiple accounts with separate boxes defined for specific purposes whether that be house accounts, client accounts, overnight accounts or accounts earmarked specifically for different capital treatment.

This enables trading desks at banks to demonstrate to treasury departments that they have held the right amount of liquidity against a liability globally, and to ensure that they are operating within limits and capital buffers.
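A stylised picture of this account segregation is sketched below; the box names and routing keys are hypothetical, intended only to show the shape of the rules banks are hard-coding.

```python
# Hypothetical routing of assets away from a single omnibus pool into
# purpose-specific custody accounts ("boxes"). All names are illustrative.
ACCOUNT_ROUTING = {
    ("house", "term_funded"): "BOX-HOUSE-TERM",
    ("house", "overnight"):   "BOX-HOUSE-ON",
    ("client", "rehypo"):     "BOX-CLIENT-REHYPO",
    ("client", "segregated"): "BOX-CLIENT-SEG",
}

def route(ownership: str, funding_treatment: str) -> str:
    """Direct an asset to the box matching its ownership and funding treatment."""
    try:
        return ACCOUNT_ROUTING[(ownership, funding_treatment)]
    except KeyError:
        raise ValueError(f"no box defined for {ownership}/{funding_treatment}")
```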

“In simplest terms, in today’s world where outright positions require a minimum of 30 days funding, the secured funding desk at the bank has to take that specific asset and put it against a specific term financing trade where that specific asset must get allocated as collateral with the correlated account,” says Todd Crowther, head of business development and client innovation at Pirum Systems.

“To currently achieve this, many banks implement a hard-coded system to segregate assets using multiple dealer boxes and correspondingly are required to maintain excess collateral buffers in different locations.

“Furthermore, many utilise tri-party allocation drivers to try to achieve an optimal allocation of collateral. However, given the complexity of the allocation drivers, their variability over time and the volatility of many asset books, many challenges still remain.”

It also creates significant operational costs and friction. One bank participant in this study said it had taken more than nine months to go through the process of breaking down the internal flows and directing them to the right custody account. It is also now required to isolate assets with specific inflow caps and route them to the relevant account.

This increases the cost and complexity of doing business and was described by one bank as “a blunt Neanderthal approach to custody infrastructure in the absence of a better solution on the market today”.

Crowther agrees with the sentiment: “The industry is working together to try to get away from this sledgehammer approach and is looking for ways to be able to better pool assets, mobilise them, and direct them to the correct location. This requires tight coordination between the client and their counterpart and service providers from a pre-trade, trade and post-trade perspective.”

Defined allocation

This move towards a greater understanding of the funding cost of an asset at firm level is resulting in a growing trend towards defined allocation of securities, in which the client directs its collateral provider to allocate a specific asset against an exposure.

In some ways, this is a return to how the model worked in the past, says Gooden. “Ten to 15 years ago, clients manually instructed each asset to move from account to account. Then, as optimisation sophistication at triparty agents grew, the agents took on more responsibility for allocation.

“Today, with the increased complexity of the funding costs of assets (which tend to be bespoke per client) and advances in optimisation tools, clients of triparty are again taking more responsibility for allocation decisions. However, the process is now far more automated than it was historically.”

Rather than relying on simple orderings, whether cheapest-to-deliver or posting higher-quality assets first and lower-quality collateral last, each client now has its own specific, bespoke binding constraint when it comes to what is optimal.
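In practice, a defined allocation reduces to an explicit instruction from the client to its agent. A minimal sketch, with entirely hypothetical field names:

```python
# Minimal sketch of a directed ("defined") allocation instruction from a
# dealer to its triparty agent. All field names and values are hypothetical.
instruction = {
    "account": "BOX-HOUSE-TERM",         # source box at the agent
    "isin": "XS0000000000",              # dummy identifier
    "quantity": 5_000_000,
    "allocate_against": {
        "counterparty": "BANK-B",
        "exposure_ref": "REPO-2019-00123",
    },
    "reason": "LCR term-funding match",  # driven by the dealer's own optimisation
}
```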

“As a triparty provider, we only have partial visibility of our clients’ positions and requirements, for example assets held with other triparty agents. In the absence of an exhaustive view of credit exposure, balance sheet requirements, liquidity ratios and so on, it is increasingly difficult to optimise collateral on a client’s behalf. We have built a new service that fulfils the growing demand from several dealers to optimise themselves, telling us which assets to use in which specific transaction,” says Olivier Grimonpont, CEO of GlobalCollateral Ltd, a joint venture between Euroclear and DTCC.

“Their optimal allocation has multi-variable dimensions, many of which are very proprietary to those dealers, sometimes focusing on risk-weighted assets (RWA), sometimes on LCR or NSFR, credit and so on. So it is down to the dealers to tell us their best optimisation, while we continue to fulfil all our obligations vis-à-vis the collateral takers, such as, amongst others, bilaterally agreed eligibility sets, concentration limits, valuation, reporting and settlement.”

The direction of travel, then, has been and will continue to be that clients want to be much more granular and directive about what their end-of-day allocation looks like.

This means relying less on triparty agents to calculate the optimal allocation and more on them to deploy the client’s own view effectively. To do this, however, firms need the internal capability to optimise across all of their operations.

Crowther says: “The focus used to be just about covering exposures with assets on a post-trade basis and as a result there was a lot of over-hedging and inefficiencies. Dealers are now looking at whether they are positioned in the right way based on multiple levers of efficiency – their asset/liability profile, their financial resource usage and their risk-return appetite.

“They want to know whether to reduce a particular trade because they don’t have the right assets to fit in it or increase certain trades due to a pending funding or collateral requirement. It is the ability to monitor efficiency and make dynamic decisions on how to structure their financing books which they are looking for.

“Correspondingly, the point is that they need to do both pre- and post-trade optimisation and hence firms are working with vendors and service providers to further improve and tailor their solutions.”

Purnell says this is resulting in a shift within banks regarding where collateral and external relationships are managed.

“It is an interesting dynamic as the operations collateral teams have always made the decisions of where assets should go and managed the triparty relationships but now it is becoming more of a trading relationship and that relationship internally within the sell-side is undoubtedly heading towards a more centralised structure,” he says.

Digitising collateral schedules

To enable clients to have more control and flexibility over allocations and collateral selection, and to be able to respond more quickly to changing constraints, TPAs have been digitising collateral schedules and automating more processes around the selection and substitution of assets.

“Basic allocation theory is relatively simple in terms of understanding what is most optimal,” says Gooden. “Where we get involved more today is in increasing the efficiency of deploying the optimal allocation.

“It was one thing to provide the clients with the tools to define the allocation but we needed to provide digitised formats of eligibility and schedules.”
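What a digitised schedule might look like in machine-readable form is sketched below. The structure, haircuts and limits are illustrative assumptions rather than any agent’s actual format.

```python
# Hypothetical, simplified machine-readable collateral schedule.
# Field names, haircuts and concentration limits are illustrative only.
SCHEDULE = {
    "eligible_asset_classes": ["govt_bond", "corporate_bond", "equity"],
    "min_rating": {"govt_bond": "AA", "corporate_bond": "A"},
    "haircuts_pct": {"govt_bond": 2.0, "corporate_bond": 8.0, "equity": 15.0},
    "concentration_limits_pct": {   # max share of the collateral basket
        "equity": 20.0,
        "single_issuer": 10.0,
    },
    "excluded_issuers": ["COUNTERPARTY_GROUP"],  # no own-name collateral
}

def collateral_value(market_value: float, asset_class: str) -> float:
    """Post-haircut value this schedule assigns to an asset."""
    return market_value * (1 - SCHEDULE["haircuts_pct"][asset_class] / 100)
```

Once the schedule is data rather than paper, eligibility checks, searches and hypothetical simulations of the kind described below become straightforward queries.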

Because of the friction and costs inherent in moving collateral, there can be diminishing returns to full optimisation in which the cost of moving the assets eventually becomes greater than the benefit received from doing so. Reducing that friction increases the scope for optimisation.
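The break-even logic is simple to state. Using illustrative notation, a substitution is only worth making when the carry saved over the expected holding period exceeds the all-in cost of the move:

$$
\Delta c \times MV \times \frac{t}{360} > C_{\text{move}}
$$

where $\Delta c$ is the funding-cost saving (in rate terms) from the substitution, $MV$ the market value moved, $t$ the expected number of days the new allocation remains in place and $C_{\text{move}}$ the all-in settlement and operational cost of the movement. Past the point where no remaining swap satisfies this inequality, further optimisation destroys value.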

“Every client we have is at a different stage in terms of where they are with the process of optimisation and have different approaches, which is only natural. Some are very granular requesting a specific security from a specific account because they have done their own analysis and know exactly what they want,” says Gooden.

“Others will be more directional using what we have described as overlay, which is based more on delivering a specific type of asset such as fixed income into a specific account but they don’t want to get involved in the granularity of specific instruments.”

BNY Mellon and other agents are developing tools to facilitate links with their clients. The bank is working towards digitising collateral schedules and giving clients tools to manage collateral more efficiently.

“Digitisation of documentation and schedules is the future,” says Mark Higgins, a managing director at the firm.

“The industry has employed the same methods for 20 years so change is long overdue.”

J.P. Morgan’s Gooden adds: “We are continuing to work on digitising schedules, which clients can now download, and providing more interactive search functionality so clients can identify what collateral different counterparties will accept. Another very effective tool is the ability to run hypothetical simulations for new transactions or to identify eligible collateral held away from the TPA.

“The next deliverable is online approvals, removing the paper schedules that are currently used. It is all about making the process to agree schedules and make changes easier and quicker.

“In addition to the operational efficiency that brings, it can make clients more comfortable with moving down the risk curve as they can adapt and react to market moves more quickly. Waiting for people to sign documentation and get that implemented is an impediment to an efficient market.”

Optimisation of the infrastructure

Efforts are also underway to harmonise processes across the European infrastructure. The European Central Bank is leading a task force to identify inefficiencies and propose industry-wide solutions to what it sees as an inherent inefficiency in the processing and transfer of collateral across the Eurozone.

There have been major advances in recent years in this respect, with the TARGET2-Securities (T2S) settlement platform going a long way towards greater mobilisation of collateral across custody networks and creating some elements of interoperability between triparty platforms.

However, there remains a lot more work to be done and the real benefits of T2S are only beginning to become apparent. “The landscape has normalised but the changes have been glacial,” says one industry expert.

Adoption of the international ISO 20022 messaging standard for collateral management is growing among financial institutions. It provides an opportunity to move towards more harmonised workflows and business processes, as well as a common set of messaging protocols across interoperable market infrastructures.

Other initiatives are also underway. DTCC-Euroclear GlobalCollateral has launched the Margin Transit Utility, which is designed to aggregate a firm’s holdings across all custodians.

A drive towards standardisation

The ECB identifies standardisation of messaging and the “language” of collateral as key to greater efficiency. It envisages a world of interoperability between triparty agents and collateral venues across the market.

But with standardisation comes commoditisation and there are concerns that the drive to standardisation will decrease the room for competitive innovation.

In reality, the industry is a long way from standardisation. Every triparty agent or custodian has a different dialect of SWIFT messaging; that is a very simple example of the need for standardisation. But further down the chain, the argument for standardisation weakens.

“Clients all say they want interoperability and standardisation but they like the fact we can offer bespoke schedules and eligibility sets, so as you broaden the discussion of what can be standardised there are clearly limitations and points where the client doesn’t benefit,” said one executive.

Another issue is who defines the standard. The industry needs to coalesce around a single standard, but if one triparty agent comes up with it, that agent will have an edge as it is adopted.

“If you talk about standardising how everyone interacts that punishes people who have invested to overcome the complexity and put in the hard work so what is the driver for that standardisation?” said the head of collateral at a bank.

Pirum’s Todd Crowther says that the focus should be on standardising the processes, which would increase efficiency in the market without diminishing the opportunity to develop competitive edges. “We are helping clients automate the processes of agreeing, reconciling, calculating and posting the collateral.

“The faster and more efficient that is done the greater the ability to optimise from a trading point of view and that is where the opportunities lie today.

“The processes are the beta: managing things correctly and efficiently to your parameters. The alpha is the trading and the optimisation of the collateral trading book.”

Phil Morgan agrees: “Spend your money wisely on things that are going to differentiate and create alpha for you and your clients. Processes that are operational or designed to meet a regulatory mandate don’t provide a competitive edge, so why would firms look to build them? We can build it once and socialise the cost.”

On the road towards a better environment

Over the past five years, numerous barriers have been broken down within firms and across jurisdictions to create a more harmonious environment for collateral efficiency.

This has been driven by client needs as well as regulatory change, says J.P. Morgan’s Meredith, as clients look to maximise their inventories and enhance liquidity by moving their assets around globally.

Financial institutions have a far more granular understanding of the costs of collateral and have developed new principles and methodologies for optimisation and allocation.

New technology solutions and processes have been launched that have gone a long way towards reducing friction between parties across the collateral spectrum and facilitating automation.

But there is still a long way to go to create a frictionless environment, and many barriers to a perfect market remain.

In the next part of this series, we take a look at emerging technologies and concepts that will continue the evolution of the collateral environment and define the next decade of innovation across collateral management.

