Solutions: Risk & Analytics

vice president, risk solutions, SunGard's capital markets business


A LONG TIME AGO, IN A GALAXY VaR, VaR AWAY…

This blog post originally appeared on FOW, and also appears on TabbFORUM, Securities Finance Monitor, and in the John Lothian Newsletter.

In recent months, there has been an interesting twist in the story of VaR, the central measure of market risk, the basis of capital requirements, and one of the targets of criticism in the aftermath of the credit crisis. Little comment has been made on it, but VaR has been adopted as a key plank in the central clearing of derivatives, precisely because of its ability to produce a meaningful single number. This is ironic given that it was this single number, and its attendant lack of insight into the deeper tail risks, that led to much debate about VaR’s suitability as the key metric, and about the role it played in risk management leading up to the crisis. And coincidentally, the story of VaR is paralleled well by the journey of young Anakin Skywalker in George Lucas’s prescient epic Star Wars.

To fully appreciate the turnaround, it’s worth looking briefly at the origin of Value-at-Risk.

Obviously, complex metrics have their roots in many areas, and tend to be cross-fertilized evolutions rather than single-point revolutions; VaR is no different. What can be said, though, is that the emergence of complex derivatives, and the growth of those instruments in the late 80s and early 90s, posed a real problem for institutions looking to capitalize adequately against adverse market movements. This is not unlike the issue faced by the Jedi Council as they searched for the “one to bring harmony to the Force” at a time of growing tension within the Galactic Republic.

The financial industry’s risk problem was twofold. The increasing complexity of the derivative books created risks that were hard to quantify under traditional small movement measures. Simultaneously, the siloed nature of financial firms led to “natural hedges” (or risk offsets) between trading positions being lost in the risk aggregation. The result was an interesting problem where the risks were either being (vastly) understated or (vastly) overstated.

The answer, VaR, appeared a perfect solution to this twin-horned dilemma. By determining the sensitivities of all of a firm’s traded instruments to the most commonly observed drivers (interest rates, volatility, FX rates), the problem could be reduced to applying those sensitivities across a range, or distribution, of scenarios (itself determined by a correlation process across the observable market data) and finding the worst results at the required percentile (e.g. the 99th). This was uncannily similar to the discovery of the gifted pod racer, Anakin, whose natural powers would be molded by his master-to-be, Obi-Wan Kenobi, so he could become the future Jedi hero.

Determining this “parametric VaR” involved many approximations and a great deal of smoothing, but it was a huge step forward in the search for a number that best expressed the risk position of an organization.
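The parametric calculation just described can be sketched in a few lines. This is a minimal illustration only: the dollar sensitivities, the covariance figures and the 2.326 normal quantile are assumptions invented for the example, not any firm’s actual parameters.

```python
import numpy as np

# Minimal delta-normal (parametric) VaR sketch. All figures are invented
# for illustration: dollar sensitivities to three risk drivers, and a
# covariance matrix of the drivers' daily moves.
sensitivities = np.array([1_000_000.0, -250_000.0, 400_000.0])  # $ per unit move
cov = np.array([
    [0.0004, 0.0001, 0.0000],
    [0.0001, 0.0009, 0.0002],
    [0.0000, 0.0002, 0.0001],
])

# Standard deviation of portfolio P&L under the linear approximation.
portfolio_sigma = float(np.sqrt(sensitivities @ cov @ sensitivities))

# 99% one-day VaR under a normality assumption (2.326 = 99th normal quantile).
var_99 = 2.326 * portfolio_sigma
print(f"99% 1-day parametric VaR: ${var_99:,.0f}")
```

The smoothing and approximation the post mentions live in exactly these inputs: the sensitivities are first-order only, and the covariance matrix assumes jointly normal driver moves.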

Improvements in the underlying mathematics continued, as second-order movements in the underlying risk drivers were added, but it was the exponential increase in computing power that paved the way for the biggest advances.

The drawbacks in the parametric approach became obvious, and are well documented, so the next step was to move to full revaluation of the underlying trades under the distribution of scenarios. Two main avenues were explored and adopted: Historic VaR and Monte Carlo VaR. Historic VaR uses actual histories to directly create the scenario distribution, while Monte Carlo uses statistical analysis and correlation data to generate a range of possible future scenarios. Portfolios are then valued under each of the scenarios and the worst cases at specific percentiles are again found and reported.
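The mechanics of the historic approach are simple enough to show directly. In the sketch below, simulated numbers stand in for real scenario P&Ls after full revaluation, so it illustrates only the percentile step, not a real portfolio.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for 500 days of portfolio P&L, one value per historical
# scenario after full revaluation (simulated here for illustration).
scenario_pnl = rng.normal(loc=0.0, scale=100_000.0, size=500)

# Historic VaR: the 99th-percentile loss is the 1st percentile of P&L,
# read directly off the empirical distribution, with no model assumptions
# about the shape of that distribution.
var_99 = -np.percentile(scenario_pnl, 1)
print(f"99% 1-day historic VaR: ${var_99:,.0f}")
```

Monte Carlo VaR ends with the same percentile read-off; the difference is that the scenario set is generated from fitted statistical and correlation models rather than taken from actual history.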

It is at this high point, VaR’s very own “Battle of Coruscant,” that things start to go awry.

The aim of VaR was always to express the minimum loss that should occur at the frequency of the selected percentile. This means that the VaR number, when working as intended, is never a maximum loss; it is the minimum loss the portfolio should incur at a frequency determined by the selected percentile (once every 100 days for the 99th percentile). It was never intended to indicate the losses that could occur less frequently but be far more devastating.

The very neatness, scalability and simplicity of VaR – its ability to compress so much complex information into a single number – led regulators to pick it up and embed it as the risk measure, rather than a key risk measure. Once it was linked so completely with the capitalization demanded by regulators in most jurisdictions, with the only choice left being which of the three VaR methods to use, its centrality in any subsequent crisis was all but assured. It was truly on the path to the “dark side.”

Once the crisis hit, and pre-crisis risk management was scrutinized with the crystal clear lens of hindsight, VaR was found to be imperfect. Many pointed out that it did not deal well with extreme events, deep tail risks, emergent risks and specifically regime change risks. This was not news to those who had conceived and/or used VaR for a long time, echoing master Yoda’s initial and ongoing misgivings about Obi-Wan’s gifted protégé. It was simply restating certain limitations that had always been caveats of the metric. The interesting thing is that even given the storm around the calculation, no suitable replacement could be found, a situation that remains to date.

Of course, the recent crisis was rooted in credit and liquidity. One of the driving changes across the world has been to take individual institutions out of the systemic default loop via the mechanism of central clearing. This most importantly covers swaps and credit default swaps.

Central clearing relies on daily margin calls by the central clearing agencies, so it’s natural that some kind of measure would be required to cope with complexity while delivering a single number that could be used for this margin. After analysis and experimentation, the most appropriate number was found to be…Historic VaR.

Value-at-Risk, the same measure implicated as a contributor to the credit crisis, was called forth to be the central tenet of the solution to that crisis. This is possibly the greatest twist of fate since Lord Vader rose from his knees to cast the emperor into the deep chasm running through the second Death Star.

Different clearing houses use different driving parameters, but the method remains the same. To experienced risk folks, none of the above is new, and is in fact a ridiculously abbreviated history of such an important metric, but it does show the unintentional irony of industry genuflections and trends.

May the Force be with you…



WHEN RISK MANAGEMENT HITS CLOSER TO HOME

This blog post also appears on Risk Management Monitor.

A couple of weeks ago, there was a house fire at my home. It is important to state at the start that nobody was hurt, and the house is now in a restoration stage.

Afterward, though, it occurred to me that I write, speak, and consult exclusively on the subject of risk management, and this raises an interesting set of questions. How well do I internalize the risk management mindset, and do I apply the principles I espouse in the most important environment I know: my own home?

With this in mind, for this blog I decided to move away from strict financial firm risk management and instead apply the same kinds of tests to myself. In a risk strategy assessment, I would normally look at a range of indicators, so I decided to assess myself with the same criteria:

  • Early warning of impending crisis.
  • Contingency tactics for immediate reaction to the crisis.
  • Post crisis, effect mitigation.
  • Buy in across the team to the crisis management strategy.

Early warning of impending crisis

In a typical financial institution, an “early warning system” would involve the risk management team understanding the level of risk deemed acceptable, and understanding what factors feed into that risk metric. This enables ‘tail’ analysis, to understand what negative effects are hiding in the extremes of possibility, as well as limit monitoring against immediate spikes in the risk factors as they are observed. If the limits are set in accordance with the risk policy, then while the firm is taking active risks, these should stay within the boundaries of management’s risk tolerance.

In a home and family situation, it is not dissimilar. Understanding the potential sources of risk, such as wood burning stoves, electrical wiring, etc., and establishing the accepted level of risk is critical. A home needs to be heated in the winter, and the fire risk this poses has to be weighed against the need to maintain a reasonable house temperature. That said, the risk of fire has to be taken into account, and mitigated to the extent that it can be, by regular maintenance of the chimneys and stoves. It is also vital to have a warning system in place – in this case, most likely a smoke alarm system.

I would grade my home situation as follows in these areas:

  • Risk appreciation and tail mitigation (regular maintenance) – the chimneys are swept annually, and wood is stacked away from the stoves, but wood burning stoves are the principal heating source for the house. GRADE: B
  • Early warning system (smoke alarms) – battery operated rather than mains-powered, and not wired to the fire department. GRADE: C

Contingency tactics for immediate reaction to the crisis

This is the second most important aspect of risk management. Once the crisis/emergency is underway, the situation and losses need to be held under as much control as can be expected.

In banking terms, this could be seen as liquidity reserves. How long can we survive as an institution under stressed conditions, and how do we make the most of the liquidity that we have? It is here that liquid assets, collateral and re-hypothecation of that collateral come under scrutiny.

In the home fire situation, it is more a matter of evacuation. Does each room have at least two viable exits? Do all members of the family know the exit strategy, meeting points, etc.? It is important to understand that a fire is most unlikely on a sunny afternoon, with everyone wide awake. It is far more likely that smoke will be filling the exit corridors while everyone has been sleeping soundly until the moment of crisis. Finally, calling 911 in a clear, calm manner is crucial.

In many ways, this is the same kind of problem faced by risk managers, who report VaR numbers based on normal market conditions, only to be faced with a collapsing market (with correlations, in the fall, moving close to one) and generalized confusion and panic across the market. Indeed, it is the stressed vs. normal assumptions that have drawn much of the criticism of VaR-based risk reporting.

So, how do I grade my house situation?

  • Exit strategy – there was only one exit route per room, which is too reliant upon the main exit route being clear. GRADE: D
  • Exit awareness – we had had a conversation with our children about this before, but as with most families, we had never actually practiced the drill. However, the actual exit was orderly. GRADE: B
  • 911 call – remaining calm during the call was instilled into the children and this was accomplished (and the fire department was at the house in around 8 minutes!). GRADE: A

Post crisis, effect mitigation

This stage is really covering the failure contingency, or hedging effects. In banking terms, this typically takes the form of credit default swaps, diversification and market hedges. Stressing these relationships and running disaster scenarios should be a routine job of a risk department.

In the home situation, it comes down to insurance, and protection of key documents needed to activate that insurance.

How did we fare?

  • Insurance – we had excellent insurance coverage that has pretty much guaranteed the complete restoration of the home and provided for a living situation for the 9-12 months it will take to restore the home. GRADE: A
  • Document protection – all insurance and related documents were preserved, but were not in a fireproof safe. GRADE: B

Buy-in across the team to the crisis management strategy

I regularly speak and blog about risk culture and how the true risk managers in a bank are the traders and portfolio managers. The role of the risk department itself, in my view, is to facilitate communication of the risk appetite and the risk position between the senior management (who create the appetite) and the risk takers (who consume it).

In the home situation, the same thing applies. A fire evacuation plan is only good if it is understood by all who may be affected. Smoky 4:00 a.m. darkness is not an environment to start communicating about what needs to be done to prevent or survive a fire. The family has to recognize the smoke alarms, know to call 911, understand the exit options – including how to select the best one, and then know where to meet safely outside.

So, how did we do?

  • Recognizing alarms – this was achieved without panic. GRADE: A
  • Call to 911 – this was achieved without panic. GRADE: A
  • Calm exit to safety – this was achieved without panic. GRADE: A

Overall, I would rate my own risk management in this situation a B-.

Ultimately, there are risk management trade-offs to be made in order to achieve levels of reward or comfort. This is as true at home as it is within a Wall Street firm. I would rather not have tested my own ‘micro’ risk culture in this way, but since it was tested, I now believe it can be made better.

It is also only right to express my deepest thanks to the fire departments of Bedford Hills, Katonah and Somers, the loss adjustment team at ACE insurance and the large network of friends who have been so helpful in the aftermath of this event.


WHAT IF THE RISK DEPARTMENT REALLY LOST AN HOUR?

This blog post originally appeared on FTF News.

With spring in the air, the U.S. clocks have gone forward, stealing an hour from us.

Obviously it is an illusory lost hour, but it does provoke an interesting thought experiment: if you were to actually lose an hour of the working day, where could you improve efficiency to make up for the lost time? In the risk management department, it would be a tough call. Let’s take a walk in a risk manager’s shoes for a day…

The average day starts with checking the overnight results. Here, time is at a premium as any obvious data errors need to be identified, and if the error is large enough, it can prompt a complete rerun of the risk numbers. This is, in part, due to the fact that risk reports are run against all levels of the hierarchy, with each new level being treated as a new entity.

If the base risk numbers could be run at the trade or position level, and the logic required to transform these into hierarchically aggregated risk reports done at the point of enquiry, it would dramatically reduce the time taken to correct errors.  If only the trades or positions affected by the error needed to be rerun, as soon as the correction was in the system the right numbers would be running through the logic, creating correct risk reports.
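One way to picture this is to store P&L vectors per trade and aggregate only at the point of enquiry, so that a single corrected trade never forces a full hierarchy rerun. The trade IDs, book paths and numbers below are purely hypothetical:

```python
from collections import defaultdict

# Hypothetical trade-level risk results: one scenario P&L vector per trade,
# tagged with its place in the book hierarchy.
trade_results = {
    "T1": {"book": "rates/swaps", "pnl": [-120.0, 35.0, -60.0]},
    "T2": {"book": "rates/swaps", "pnl": [40.0, -10.0, 25.0]},
    "T3": {"book": "credit/cds",  "pnl": [-5.0, -80.0, 15.0]},
}

def aggregate(node: str) -> list[float]:
    """Sum trade-level P&L vectors for all trades under a hierarchy node."""
    totals = defaultdict(float)
    for trade in trade_results.values():
        if trade["book"].startswith(node):
            for i, p in enumerate(trade["pnl"]):
                totals[i] += p
    return [totals[i] for i in sorted(totals)]

# A data error is found in T1: correct just that trade, then aggregate
# on demand; every level of the hierarchy reflects the fix immediately.
trade_results["T1"]["pnl"] = [-20.0, 35.0, -60.0]
desk_vector = aggregate("rates")  # desk level
firm_vector = aggregate("")       # firm level
```

The point of the design is visible in the last four lines: only the bad trade is touched, and the desk-level and firm-level reports are both correct the moment the fix lands, because aggregation happens at query time rather than in the overnight batch.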

Once the runs have been checked, there is the process of distribution. This will vary from firm to firm, but generally involves transplanting risk results into spreadsheets or reporting databases, and emailing or releasing the numbers once this is complete. The time “criticality” of this process is largely determined by a few factors:

  1. How much do the stakeholders use the risk numbers? If risk numbers are used as a strategic tool, then they need to be available before trading starts. If they are simply a reporting base, this is not as urgent.
  2. How much do the stakeholders trust the risk numbers? If the models for the risk or the pricing within the risk are not fully trusted, then it is likely that even with the best intentions, the risk reports ultimately will not be used by the stakeholders.
  3. How well do the stakeholders understand the risk numbers? Are the numbers created to reflect the risk takers’ view, senior management’s view, or is it the risk manager’s job to translate between the two?

If the risk calculation engine could send results straight to the reporting database, where they could be checked (as described above) using the same distribution tools and dashboards used by the stakeholders, then this entire step could be removed. As corrections occurred, the results would be dynamically corrected as well, the operational errors inherent in manual report distribution would disappear, and a simple traffic-light system could show when a report was considered good.

Whilst we are here, we could throw in online risk model validation, like a dynamic back-testing report, so that as well as reducing the time to get the reports into the stakeholders’ hands, the doubts around the validity of the models could also be mitigated. By combining dashboards across desk levels, stress testing and enterprise-level risk numbers, it would be possible to illustrate how these metrics impact each other and work together, significantly increasing the understanding of risk from multiple points of view.
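The back-testing idea is simple to demonstrate: count the days on which the realized loss exceeded the previously reported VaR, and compare against the count the percentile implies. The figures below are simulated stand-ins, not real desk data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical back-test over one trading year: a reported 99% 1-day VaR,
# and a simulated series of realized daily P&L (illustration only).
days = 250
reported_var_99 = 233_000.0
realized_pnl = rng.normal(loc=0.0, scale=100_000.0, size=days)

# An "exception" is a day whose loss exceeds the reported VaR; at the
# 99th percentile we expect roughly 1% of days, i.e. ~2.5 per year.
exceptions = int(np.sum(realized_pnl < -reported_var_99))
print(f"{exceptions} exceptions vs ~{days * 0.01:.1f} expected")
```

A dashboard that recomputes this exception count daily, per desk, gives stakeholders a running answer to the "do we trust the model?" question raised above.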

Once the reports are out, the stakeholders themselves may find errors and email the risk managers, who then check those errors and make adjustments if necessary. This may be time critical, depending on the importance of the numbers to the stakeholders. A bigger source of efficiency loss is crossed emails, with the obvious operational risk impact they bring.

If the system’s dashboards could be used to communicate the acceptability (or otherwise) of the numbers, with a full audit trail, this would remove a significant headache and potential extra time loss.

It would seem that by streamlining the calculation process and increasing efficiency in the production control and result distribution process, there would be a good chance of increasing the cultural adoption of risk. This involves putting new eyes on the system and its outputs, and improving the communication between those who set the risk appetite and those who consume it. Something for the risk department to think about as they enjoy the lighter spring evenings.


Q&A: HOW WILL RISK MANAGEMENT CHANGE TO SUPPORT THE NEW OTC DERIVATIVES MODEL?

Last week I participated in a webinar and a Twitterview with DerivSource, covering the changing landscape of risk management in the new OTC derivatives workflow model.  It goes without saying that risk is at the center of regulatory reform; the new world of risk management must develop to meet the new requirements and challenges facing the OTC derivatives markets.

During the Twitterview in particular, DerivSource boiled down several big questions about drivers of change with regard to risk management today. It’s amazing how much you can actually discuss in a few simple tweets. This Twitterview, under the #derivrisk hashtag, touches on drivers and changes to risk management in the OTC derivatives markets, credit valuation adjustment (CVA), risk with relation to CCPs and potential strains on liquidity, the role of “real time” in risk, and more.

If you missed this risk management Twitterview with DerivSource, search #derivrisk or follow the full Q&A below. And as always, if you have your own questions about risk management in the changing OTC derivatives landscape, leave me a comment or ask me on Twitter to continue the conversation.

QUESTION: What is the biggest driver for risk mgmt improvements – transparency requirements or new risk in the new OTC model? #derivrisk

MARCUSCREERISK: New risks in the OTC clearing space are a far bigger driver for risk mgmt in my opinion…The FCM world is facing longer tenors, and more complexity than it has hitherto known #derivrisk

QUESTION: What is the biggest change impacting pre-trade risk mgmt for firms participating in the new OTC derivatives market? #derivrisk

MARCUSCREERISK: Understanding the longer term impact on the portfolio, and on the margin requirements, of any new trade… A deteriorating position in a longer dated trade could cause serious liquidity issues #derivrisk

QUESTION: How are firms improving market risk mgmt to support the new OTC derivatives model? #derivrisk

MARCUSCREERISK: We are seeing a great deal of proactive improvements w/ firms looking to… Leverage the risk experiences of the bilateral world and to tightly manage VaR driven margin requirements #derivrisk

QUESTION: What challenges do firms face in improving credit risk mgmt to support the new OTC model? #derivrisk

MARCUSCREERISK: Credit risk is less of an issue w/ central clearing. It’s replaced by potentially more dangerous liquidity risk… Credit risk remains central focus for bilateral trading, w/ firms bolstering collateral management & CVA measurement #derivrisk

QUESTION: What about credit valuation adjustment activities (CVA)? What is the driver behind improvements to #CVA activities? #derivrisk

MARCUSCREERISK: Controlling #CVA on bilateral trading enables front office to directly control its credit risk profile… It’s also interesting how different models are emerging from advisory to CVA-specific trading desks #derivrisk

QUESTION: Clearing via CCPs for some swaps introduces new liquidity strains. How will risk mgmt change to mitigate new liquidity risks? #derivrisk

MARCUSCREERISK: Since margins are determined, in part, by VaR-based models, firms can approximate & stress expected margins… This is the foundation of a working #riskculture and the best defense against liquidity risks #derivrisk

QUESTION: Is real-time risk management necessary to support trading and clearing OTC derivatives going forward? #derivrisk

MARCUSCREERISK: Real time is an interesting topic. The risk that cleared OTC trades carry is liquidity/margin based… Margin is driven by VaR models, using clean end-of-day data. As new trades enter portfolios, they need… to be reflected as they happen, but real time pricing has a far smaller impact #derivrisk

QUESTION: Enterprise-wide risk mgmt – a buzz term or necessity to support newly regulated OTC market? #derivrisk

MARCUSCREERISK: This was, is and should continue to be the focus and aim of all financial firms… Risk taking is what firms do. Understanding risk at all levels is vital 4 controlling biz & underlying risk takers within biz #derivrisk

QUESTION: What advice would you give a CRO struggling to make sense of how risk mgmt will change in the new OTC derivatives market? #derivrisk

MARCUSCREERISK: Call @SunGard! Seriously, the key is “risk” has not really changed at all. Best practice has become regulation… and to a certain extent, liquidity risk has been increased at the cost of decreasing credit risk… The need for a strong risk culture underpinning the firm has always been there, now it’s just more obvious #derivrisk

 


MF GLOBAL: DEFINING RISK MANAGEMENT FAILURE

The failure at MF Global has sparked much debate within the financial industry, not least around the apparent shortfall of risk management at the institution. This raises an interesting question about what a good risk management program should be expected to achieve.

When thinking about risk management, it is important to understand the objectives and parameters of what the term “risk management” can mean. Some important definitions are:

  • Risk – Risk sits at the very center of any financial organization. Financial services as an industry could be defined as the pricing of risk and the investment in activities where the expected returns are judged to compensate for accepting that risk.
  • Risk Appetite – The level of risk actively sought in search of higher returns; this deliberate pursuit of risk is one of the differentiating characteristics of financial firms.
  • Risk Tolerance – Unlike risk appetite, the risk tolerance is a measure of how far beyond the sought risk level an institution is willing to be, as situations and strategies play out.
  • Risk Management (Process) – Risk management itself, at its best, is an internal process that allows all knowable risks to be determined, from both a current and strategic viewpoint, to be priced correctly, and then to be compared to the risk appetite of the organization. Further, the position needs to be stress tested against a variety of possible scenarios to determine the extent to which the risk tolerance is breached.
  • Risk Management (Function) – The role of the risk management function is to facilitate risk management process, and to further aid the communication between strategic decision makers and those responsible for the risk appetites and risk tolerances. It is critical that risk taken into the firm is understood throughout that firm and in line with the corporate levels of risk accepted by the board and shareholders.

The key with the MF Global story is that failure of a financial firm is not the same as failure of risk management within that firm. A firm exists to take risks and profit from them (hopefully) and/or suffer from them (possibly).

It is clear, though, that the risk appetite and tolerance for MF Global was insufficiently communicated, creating a situation where losses occurred to investors and clients who believed the risk appetite to be far lower than it was. This is a management failure, rather than a risk management one.  A true risk management failure would be the failure to account for the liquidity effect of the losses, rather than the losses themselves, and the resultant lack of a contingency plan for that event.

It is also pertinent to compare the MF Global failure with the AIG bailout, as both have been labeled examples of risk management failure. AIG presents an interesting comparison: the firm moved from a top-down, focused risk management approach to a less coordinated one, which meant that gaps in knowledge of the risks being taken, particularly CDO-based risks, and a lack of communication about the known risks, nearly led to a global systemic disaster. In this case, it is far easier to conclude that the lack of communication between the risk takers and the risk appetite setters was a risk management failure.

So, what are the lessons to be learned?

  • Risk management is not risk mitigation. Financial organizations exist to exploit under-priced risk, and some will suffer as a result, while others grow and prosper.
  • The risk management function must enable communication and full understanding throughout the organization. This function should be providing a backdrop to discussion on strategic risks, and empowering risk takers to work within the appetites and tolerances set by the board and senior management.

[NOTE: This commentary is based on reported details and “facts,” and therefore on incomplete information. In the case of MF Global, I have looked at what has been reported, and in the case of AIG, I use as reference the book Fatal Risk: A Cautionary Tale of AIG’s Corporate Suicide, an extremely detailed, but very readable, description of what happened leading up to the bailout.]


WHAT DO RISK MANAGEMENT AND THE SUPER BOWL HAVE IN COMMON?

This blog post originally appeared on TabbFORUM.

“The best defense is a good offense” may be one of the most oft-repeated sayings in sports, and it is a generally useful outlook in many situations. There are times, though, when the opposite is true. When a Super Bowl is won by a mere 4 points in a game that saw a defensive score in the first quarter and a key interception later in the second half, there are a lot of New Yorkers who would say that a strong, reliable defense is a prerequisite for an effective offense.

Coincidentally, not too far from the newly crowned New York Giants’ stadium is Wall Street, where a solid defense may be even more crucial to success.

A solid defense in capital markets means having a risk system that can act as an early warning against external and internal events whose impact could be catastrophic. Most importantly, though, and again like football, a good risk system has to do two things:

1) Reflect the culture and attitude of the firm. Protecting capital and understanding what risks are being taken, and the risk/reward relationship, has never been more important than it is right now.

2) Provide a platform from which the offensive, risk taking part of the firm can operate most effectively.

Both points require the system to be accurate and capable of intraday corrections to ensure that accuracy. They also need the output of the risk calculators to be available throughout the organization.

Now it is interesting to wonder, amongst all the post-Super Bowl talk, how many capital markets risk stakeholders will cast an eye toward MetLife Stadium (the New York Giants’ home field), then look back at their risk reports and systems, and contemplate how much stronger they could be with a more robust defense in place.


RED-BLOODED RISK TAKING IN A TIMID NEW WORLD

An excellent new book on risk — Red-Blooded Risk: The Secret History of Wall Street, by Aaron Brown — explains, in a way I have not previously encountered, risk management in financial institutions. It also raises specific questions that should be asked of any risk system, before its inception, during its development, and continually throughout its active use.

To understand risk management it is critical to first understand risk itself, and perhaps most importantly, to understand how the institution views risk at all levels. Perceptions of what risk actually is are likely to vary for a number of reasons, including, but not limited to:

  • Misinterpreting VaR as a worst case scenario, which it is not, or a directly predictive number, which it is not. This is common and represents, in the worst case, a potentially catastrophic communication failure.
  • Holding differing attitudes to potential loss as a consequence of risk taking. This occurs when the risk takers are acting in silos and are therefore blind to how they fit into the larger risk picture. It is also a consequence of different views of the risk/potential loss/potential gain relationship, which in turn speaks to incentives, attitudes and market knowledge.

It is a fact that trading, particularly in the derivatives arena, has become a highly quantitative activity, with complex structures being traded using mathematical models within the front office, often by highly trained mathematicians or financial engineers. It is also a fact that traders by nature are comfortable working with uncertainty, and their role can be seen as maximizing exposure to risks, which are seen to be appropriately (or under) priced.

In an environment where risk taking itself may be demonized, with senior management at financial institutions expected to produce “statements of risk appetite” and provide evidence of that appetite being managed internally via limits, with exposures monitored throughout the day, it is easy to see how miscommunication can lead to the perception of control failure and accusations of excessive risk taking.

I have blogged previously on the subject of risk culture, as has my colleague Mat Newman. With the above in mind, it is interesting to re-imagine the role of the risk manager as a facilitator who enables the risk takers to perform their roles in the context of the overall risk position, and as an advisor to senior management on the risks being taken and how they fit into the statement of risk appetite. This is particularly the case when dealing with low probability, high impact risks that sit in the tail beyond VaR, often measured as “Expected Tail Loss” or “Expected Shortfall.”

The viability and usefulness of any risk system has to be judged by its ability to assist the risk management team in those aims. This may always have been true, as an abstract ideal, but the current levels of market nervousness and fear have now made it an imperative.

deputy head of strategy, SunGard’s capital markets business

TRENDS IN OTC DERIVATIVES: TWITTERVIEW WITH DERIVSOURCE, TABB GROUP AND SUNGARD

Posted by

With trends in OTC derivatives in mind, we recently held a Twitterview with DerivSource and Kevin McPartland of TABB Group to delve into the topic a bit more. We discussed the rising cost of participation in the OTC space, the data management imperative, and balancing IT budgets given the continued uncertainty around new rules.

I’ve included the full panel-style Twitter discussion below. If you have your own questions, leave a comment or send a tweet to me, Kevin McPartland or DerivSource.

– – – – –

DERIVSOURCE:
Cost of participation in OTC markets will rise; how will firms evaluate if they should exit this space due to higher cost? #TENfs

KMCPARTLAND: For most it’s simple ROI analysis – will upfront costs yield substantial profits? For others it’s cost of staying in the game at all #TENfs

TONYSCIANNA: Firms will need to do cost & risk analysis to determine whether or not it’s still cost-effective to be in markets they’re in #TENfs

DERIVSOURCE:
With electronic trading landscape still in flux, what challenges does industry face to reduce costs and improve returns? #TENfs

KMCPARTLAND: Many on the buy side will rely on their brokers to do the legwork – technology, compliance, etc. … For the dealers, it’s about designing the right business model based on what we know now #TENfs

TONYSCIANNA: It’s about creating transparency and liquidity. If you can get these things right, you will automatically reduce costs #TENfs

DERIVSOURCE:
How will firms cope w/ margin & liquidity squeeze clearing will introduce? What should firms change now to prepare? #TENfs

KMCPARTLAND: Margin financing, collateral optimization and other similar services are in high demand, and their use will grow. This will help reduce buy-side margin needs and help dealers make money around clearing #TENfs

TONYSCIANNA: Firms should work on collateral optimization for clearinghouses with major banks to provide their clients with optimized CM #TENfs

DERIVSOURCE:
Why should firms focus on enterprise data management now? How can firms improve agg of data across silos in cost efficient way? #TENfs

TONYSCIANNA: It’s a must. Firms will need to capture, standardize & have access to data across enterprise in real time http://ow.ly/6wt4q #TENfs

KMCPARTLAND: @tabbgroup sees transaction volume could grow 20 fold – related data even more. Waiting to deal with this is not an option #TENfs

DERIVSOURCE:
How will firms balance tight IT budget w/ onslaught of new reg requirements & uncertain timeframe for implementation? #TENfs

TONYSCIANNA: A lot of firms will seek point solutions as reqs are issued. We see larger firms taking a more enterprise-wide approach #TENfs

KMCPARTLAND: Some firms are deciding not to offer client clearing. The payback was not seen as justifying the cost… Others in the 2nd tier will take a wait-and-see approach to limit unnecessary work #TENfs

DERIVSOURCE:
Is there a silver lining to the transformation taking place in the OTC space for firms & the industry at large? #TENfs

KMCPARTLAND: OTC #derivatives reform will ultimately be good for the industry. The rules overreach in some parts, but … more automation and open access will ultimately improve liquidity and pricing #TENfs

TONYSCIANNA: Clearly the intent of transparency & reduction of systemic risk will benefit the industry, though will take awhile to get there… Industry coming together to create standards & reduce systemic risk is ultimately a good thing #TENfs

global head of strategy, SunGard's capital markets business

10 TRENDS IN OTC DERIVATIVES

Posted by

It is clear that regulatory changes are transforming the OTC derivatives space, from execution to settlement, and there are many challenges at play. As we head into 2012, market participants will need to manage large volumes of data in order to clear and process trades, and we will see new pressure on the cost of capital and on using it more effectively. In response to this industry transformation, my team and I have identified 10 key trends shaping OTC derivatives today.

I have posted the full list below. How is your firm approaching these 10 trends in OTC derivatives?

  1. Regulations such as Basel III, Dodd-Frank, EMIR and MiFID II are spurring financial services firms to improve their return on capital rather than simply focus on top line revenues.
  2. Shrinking profit margins may drive existing players to exit certain asset classes, such as structured equity, rates or credit markets.
  3. Competition will increase as greater transparency into OTC derivatives pricing and lower barriers to entry attract new players to the market.
  4. Firms will leverage new electronic trading capabilities for OTC derivatives to help reduce running costs and improve returns, particularly in their flow trading and market-making businesses.
  5. The cost of participating in OTC derivatives trading will rise, with the introduction of central counterparties altering the risk profile and margin requirements of OTC derivatives portfolios.
  6. Clearinghouses and market participants will require a consolidated view of collateral assets and margin movements to manage new pressures on margin and liquidity as well as new regulatory requirements for collateral.
  7. The need to optimize collateral and leverage every margin offset opportunity will become more pressing as the new capital charges take hold.
  8. Real-time risk analytics will become a necessity, with market best practice moving towards the incorporation of Credit Value Adjustment on a pre-deal basis.
  9. Firms will need to aggregate data from across asset classes and business silos as regulatory agencies shift the burden of reporting position limits and large trades from exchanges or clearing houses to firms.
  10. Firms will demand agility and adaptability from their technology given the uncertainty about the exact details and timelines for the new rules.
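Trend 8’s pre-deal Credit Value Adjustment deserves a brief illustration. The sketch below (my own, not from the post) implements the standard unilateral CVA approximation: the loss-given-default fraction times the sum, over time buckets, of discounted expected exposure weighted by the marginal default probability in each bucket. All inputs are hypothetical, and a production system would use full exposure simulation and market-implied credit curves rather than a flat hazard rate.

```python
import math

def unilateral_cva(expected_exposure, times, hazard_rate, recovery, discount_rate):
    """Approximate CVA = (1 - R) * sum over buckets of
    discounted expected exposure * marginal default probability."""
    cva = 0.0
    prev_t = 0.0
    for ee, t in zip(expected_exposure, times):
        surv_prev = math.exp(-hazard_rate * prev_t)
        surv_t = math.exp(-hazard_rate * t)
        marginal_pd = surv_prev - surv_t        # P(default in (prev_t, t])
        df = math.exp(-discount_rate * t)       # risk-free discount factor
        cva += (1.0 - recovery) * df * ee * marginal_pd
        prev_t = t
    return cva

# Hypothetical 5-year exposure profile ($m) on annual buckets
ee_profile = [2.0, 3.5, 4.0, 3.0, 1.5]
grid = [1.0, 2.0, 3.0, 4.0, 5.0]
charge = unilateral_cva(ee_profile, grid, hazard_rate=0.02,
                        recovery=0.4, discount_rate=0.03)
print(f"CVA charge: {charge:.4f}")
```

Because the calculation is a simple weighted sum once the exposure profile exists, recomputing it for a candidate trade’s incremental exposure is what makes pre-deal CVA feasible in real time; the expensive part is the exposure simulation itself.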

president, SunGard’s capital markets business

COLLATERAL IS KING: A VIDEO INTERVIEW WITH TABBFORUM

Posted by

I was recently interviewed by TabbFORUM’s Greg Crawford on the increasingly important and complex topic of collateral management.

As we discuss in the video below, new regulations and their impact on financial services firms’ business models are shining a light on collateral, which in the past hasn’t generated much discussion.

The first step is to think about three key questions:

  • Why has collateral management become so important?
  • Why is it so hard to manage?
  • How are firms dealing with this challenge today?

Brian Traquair on collateral management with TabbFORUM

What do you think? You can share your reaction to our discussion in the comments section below.