I’ve sold three different financial risk management systems in the last three years. I’ve had the opportunity to speak with a good number of participants in the marketplace and have learned some very interesting things about the industry in the process. I wanted to share a few vignettes from customer engagements I’ve personally encountered that I think are particularly telling.

For those not interested in my preamble, feel free to jump to the takeaways.

But… what’s the default?

While at a small niche risk management shop in New York City, I was selling a few licenses to a very large hedge fund so that they could price their synthetic CDOs to validate the quotes they were getting from their bankers/dealers. A CDO pricing model is complex precisely because of its many ‘switches and knobs’, so to speak, which can be used to tune its settings and calibration. Due to a number of factors, no truly standardized modelling approach had ever been reached by the marketplace.
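
To make “switches and knobs” concrete, here is a purely illustrative sketch of the kinds of settings a synthetic CDO tranche pricer typically exposes. Every field name and default below is invented for this example rather than drawn from any particular product:

```python
# Illustrative only: a hypothetical settings object showing the kind of
# "switches and knobs" a synthetic CDO tranche pricer tends to expose.
from dataclasses import dataclass

@dataclass
class TranchePricerSettings:
    copula: str = "gaussian"            # or "student_t", "double_t", ...
    correlation_method: str = "base"    # base vs. compound correlation
    skew_interpolation: str = "linear"  # how to interpolate the correlation skew
    recovery_rate: float = 0.40         # flat recovery assumption per name
    recovery_model: str = "fixed"       # fixed vs. stochastic recovery
    integration_points: int = 64        # quadrature nodes for the common factor

# Every field above is a judgment call with real P&L impact, which is
# exactly why the question in this vignette was such a surprise.
settings = TranchePricerSettings()  # ...and this one line is the whole answer.
```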

Given all of this, you’d expect the participants in this market to be pretty technically savvy. I went into the meeting having spent the morning reviewing all of the different ‘switches and knobs’ and all of the different pricing techniques and nuances used in the models we offered, so that I could confidently identify the right combination to meet whatever expectations the client raised. What I didn’t expect was the question I got when I put the solution up on the screen and launched into my explanation of the different possible settings: “Yeah, okay, that’s all good. But… what’s the default?”

The client wasn’t interested in fine-tuning their risk models; they just wanted to use them out of the box. It slowly became apparent that this was not an isolated occurrence. A year later, this time selling risk software offered by a large data company to a division of a multinational private equity firm, the question came up again. The head trader inquired about the default data to be used when pricing nearly-bespoke term loans. After a multiple-week engagement detailing the different approaches that could be taken to obtain some form of “loan spreads”, the client’s response to our in-depth research was, “Just give me whatever the default is.”

It took me a full year and hearing it a few more times before I fully grasped the ramifications of this question: risk is often viewed as a cost center. The objective in today’s environment may not always be to choose the “best” risk system, but to choose the cheapest, as long as there is street consensus on the model(s) being used [read: the regulators are satisfied with the math!].

tl;dr: A growing set of clients no longer want to have to decide how to configure risk models. They would prefer to be handed a standard approach as long as it is accepted by the regulators and the marketplace.

The fight for desktop real estate

In late 2013, a salesman and I were following around a base correlation book – a trading book so massive and so risky that only a few shops would even be willing to sniff at it, and so convoluted that it required highly specialized software just to begin valuing it. We’d sell a firm an annual software license, help them sort out their valuation, and when they eventually passed on the trade, we’d inevitably get a call from another firm looking to us to help them accomplish the same goal.

While working alongside the head synthetic credit trader at one of these extremely large hedge funds, I asked if they would continue to use the software after this limited engagement was over. The question received a snort of derision, which I initially took as a slight against our product. I put on my product management hat and asked how the software might be improved so that he’d want to integrate it into his workflow. He dismissed my question and assured me that the product was fine. His explanation has stuck with me ever since:

“Look, I’ve got three monitors. I’ve got my chat and data feeds on the left one because that’s how I talk to my dealers. I’ve got my spreadsheets in the middle because… well… everything is spreadsheets in my world, and I’ve got my email open on the right monitor. If you seriously want me to consider using a new piece of software, you have to demonstrate that it adds more value than one of those three screens.”

All of those UX design meetings where we had taken sides and passionately debated the best workflows for our tools seemed trivial in retrospect! The client didn’t want to use the interface at all if he didn’t have to! Again, we wrote it off as an isolated occurrence, but sure enough, corroboration came through future engagements. Even when I joined a firm whose risk management product had a cutting-edge new interface – the UX was one of the product’s core strengths – we still received the following questions from the marketplace:

  • “Is there a WebAPI we can use instead of loading the GUI?”
  • “Can I just export this data out to CSV or XML?”
  • “Is there a way to automate this behind the scenes?”
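
Taken together, these questions amount to a request for programmatic access rather than another screen. A minimal sketch of what that looks like from the client’s side, with a hypothetical endpoint, token, and field names standing in for any real product API:

```python
# Sketch of "WebAPI instead of the GUI": pull risk output straight into
# the client's own workflow. The URL, token, and fields are placeholders.
import csv
import io
import requests

BASE_URL = "https://risk.example.com/api/v1"  # hypothetical endpoint

resp = requests.get(
    f"{BASE_URL}/portfolios/PORT123/exposures",
    params={"format": "csv"},                      # "export to CSV"
    headers={"Authorization": "Bearer <token>"},
    timeout=30,
)
resp.raise_for_status()

# The caller never opens our GUI: rows land directly in their pipeline.
for row in csv.DictReader(io.StringIO(resp.text)):
    print(row["instrument_id"], row["exposure"])
```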

The underlying sentiment here is similar to the one highlighted in the first vignette. Risk management isn’t always the final step in the user’s workflow, at least not for the users who are driving the requirements to purchase systems. While the functions these systems perform are integral to the client’s end goals, users still have to switch software to finish a task. Consider the large asset manager that needs to generate reports for its clients. While risk systems produce data that goes into the end report, much of the analysis is often supplemented by another, likely front-office, suite of products. Theoretically, either system can be used as the primary report-generation tool, receiving supplemental information from the other. When clients make that choice, do you think they’ll choose a middle-office system or one that already sits on one of the front-office user’s three monitors?

Our vision

We developed our platform from requirements gathered by listening to the marketplace and by recognizing the convergence of a few key movements in technology. This is why my colleagues and I came back home to IBM Algorithmics: we saw an unprecedented opportunity to disrupt the status quo within the financial technology industry and a real chance to make a difference. Here’s how we plan on doing that:

Standardization of models: Simplify the user experience

  1. Remove the Data Configuration Question: Flip the implementation conversation from “What data would you like to use?” to “We suggest this setup. Do you agree?” (A sketch of this default-first pattern follows the list.)
  2. Solve the Data Integrity Problem: Centralize the management of market data for all clients instead of making it implementation-specific. Cleansing and enhancing data is one of the biggest time-sinks for our clients.
  3. Abstract Away the Financial Model: Combine the data with our financial models, and the client can get to the heart of what they’re trying to accomplish more quickly.
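
Here is a minimal sketch of the default-first pattern from the first point, with invented setting names standing in for any real schema: start from consensus defaults, and apply only the overrides a client explicitly asks for.

```python
# Sketch of "we suggest this setup, do you agree?": defaults first,
# overrides second. All names here are illustrative, not a real schema.
STREET_CONSENSUS_DEFAULTS = {
    "curve_source": "vendor_composite",    # centrally cleansed market data
    "loan_spread_model": "sector_matrix",  # hypothetical default methodology
    "copula": "gaussian",
}

def build_config(client_overrides=None):
    """Start from the consensus defaults; apply only explicit overrides."""
    config = dict(STREET_CONSENSUS_DEFAULTS)
    config.update(client_overrides or {})
    return config

# Most of the clients in the vignettes above would simply call:
config = build_config()  # "Just give me whatever the default is."
```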

Standardizing models dramatically reduces the number of parameters the user has to specify. This makes products significantly more pleasant to work with and opens them up to users who were previously daunted by their complexity.

Delivery mechanisms: The democratization of risk

We want to break open our applications and liberate the components from their monolithic containers. This approach is known as open architecture: the trend today is to decompose functionality into a series of “microservices” that can be accessed by anyone, through any platform.
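
As a toy illustration of the idea, and nothing more, here is what a single model exposed as a microservice could look like. The route, payload shape, and stubbed valuation are all invented for this sketch; Flask is used only to keep it short:

```python
# Toy microservice: one model, one endpoint, callable from any platform.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/models/tranche-pv", methods=["POST"])
def tranche_pv():
    trade = request.get_json()
    # A real service would invoke the pricing library here; we stub it.
    pv = 0.0  # placeholder valuation
    return jsonify({"trade_id": trade.get("trade_id"), "pv": pv})

if __name__ == "__main__":
    app.run(port=8080)
```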

This allows new market participants to access risk management functionality. In our base correlation trader’s case, he could access our models through his Excel spreadsheets! Large institutions with legacy systems can fit us into the gaps, without the years-long sales process involved in replacing an entire technology stack. It also allows us to bring financial analytics to the masses, democratizing access for the small and medium-sized firms that might want our models but previously couldn’t afford them.
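
To make the trader’s spreadsheet scenario concrete: a hypothetical user-defined function (here using the xlwings library) could bridge an Excel cell to the toy service sketched above. Again, the endpoint and field names are placeholders, not a real product API:

```python
# Hypothetical Excel bridge: a UDF that calls the sketched service, so the
# trader's middle monitor never has to change.
import requests
import xlwings as xw

@xw.func
def TRANCHE_PV(trade_id, attachment, detachment):
    """Usable as =TRANCHE_PV("T1", 0.03, 0.07) straight from an Excel cell."""
    resp = requests.post(
        "http://localhost:8080/models/tranche-pv",  # placeholder endpoint
        json={"trade_id": trade_id,
              "attachment": attachment,
              "detachment": detachment},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["pv"]
```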

Innovation: The Cognitive Frontier

What excites me most is the integration of cognitive methods with our risk management functionality. We have already identified several key areas in which we think cognitive computing, combined with data and analytics, will have a major impact on the investment management industry. Take a look at this demo, Investment Insights with Watson, to see where we’re heading.

In short, it’s time for a major change. We’ve modernized our products to align with the current technological expectations of the marketplace. We’ve simplified their use through the standardization of models and data, the democratization of access, and the adoption of innovative new technologies. In the end, the burden on our clients will be dramatically reduced, the time it takes to accomplish tasks will trend towards real time, and, with any luck, we can define a new paradigm for financial analytics entirely.
