Lionpoint on building better, faster forecasts

GPs are hungry for swifter, smarter forecasts to cope with today’s extreme volatility, say Jonathan Balkin and Bill McMahon of tech consultancy Lionpoint.

This article is sponsored by Lionpoint

2020 is bound to leave many investment managers reluctant to predict what will happen next month, let alone over the life of the fund they just closed. That doesn’t mean they’ve given up trying to see around the next corner, but covid-19, and the resulting economic fallout, have encouraged triple-checking before pulling the trigger on new investments, acquisitions or funds.

There is no such thing as a true crystal ball, for all the potential of cutting-edge technologies. Lionpoint founder and executive director Jonathan Balkin and director Bill McMahon believe that to get the most out of forecasting tools, GPs need to invest in cleaning up and consolidating their current data.

It’s hard to imagine a GP who wouldn’t want faster, more accurate forecasts right now. What are the key hurdles to getting them?

Bill McMahon: It’s important to step back and examine the firm’s entire approach to data management, because without question, what slows, complicates and even jeopardizes the accuracy of a forecast is the availability and standardization of data. What I mean by that is that most firms have bespoke Excel models for micro-level forecasting. These are often owned by a single individual, with that one person responsible for the inputs, the outputs and, really, the entire infrastructure.

But that isolates those findings and makes it very hard to overlay additional assumptions and blend them all together for a more comprehensive view.

Jonathan Balkin: I’d add, if you look at all the technologies used by alternative asset managers today, they are siloed by department. Most of our clients have Salesforce or DealCloud for IR, fundraising, and their deal pipeline, and plenty will have a system like iLevel, Chronograph or Cobalt for portfolio monitoring, or rely on Excel. Then they’ll have Investran, or eFront for fund accounting, or use a fund administrator.

Outsourced administration might be the most difficult scenario here, since GPs won’t have direct access to data beyond an integration or an Excel sheet sent through a secure portal once a month or once a quarter. These programs are all owned by different business units, and they seldom have closely aligned data, with different teams using different definitions. A lot of the core systems and solutions I mentioned are great at capturing the facts of the organization: what happened today and what happened in the past.

But this approach means Excel is the only medium people have to connect different data sources into a single platform that allows executives to play with inputs and see the potential impact of, say, LIBOR rising by 50 basis points, management fee compression, or exiting an investment now as opposed to three years later. People can do these forecasts, but in a highly siloed, manual and resource-intensive environment.
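To make that kind of what-if concrete, here is a minimal sketch (not Lionpoint’s platform; every name and figure is hypothetical) of how a single assumption, a 50-basis-point rise in LIBOR, flows through floating-rate debt cost into a simple cash projection:

```python
# Illustrative sketch only: one assumption (LIBOR) flowing through
# floating-rate interest cost into net cash. All figures are hypothetical.

def annual_net_cash(ebitda: float, debt: float, libor: float, spread: float = 0.03) -> float:
    """Net cash after floating-rate interest (LIBOR + spread) on the debt balance."""
    return ebitda - debt * (libor + spread)

base_libor = 0.02                    # 2.00% base assumption
shocked_libor = base_libor + 0.005   # +50 basis points

base = annual_net_cash(ebitda=10.0, debt=60.0, libor=base_libor)
shock = annual_net_cash(ebitda=10.0, debt=60.0, libor=shocked_libor)

print(f"Base net cash:    {base:.2f}")          # 7.00
print(f"Shocked net cash: {shock:.2f}")         # 6.70
print(f"Impact of +50bps: {shock - base:.2f}")  # -0.30
```

In a connected platform, the same single change would also re-price the waterfall and carry models downstream; in siloed spreadsheets, each model has to be updated by hand.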

What’s the danger of those silos, or even simply relying on Excel, as so many in the industry have done for such a long time?

JB: These silos create enormous potential for human error, as they’re built from scratch and incredibly labor-intensive, often with people inputting data by hand. Then there are the human capital questions. What if the one person responsible for that model leaves the firm, or, in the covid era, has to take a leave of absence to care for a loved one or themselves? We’ve already seen a lot of that this year. These models are built by very smart folks, but if only one person has the Rosetta Stone to use them, how resilient is that system?

BM: This slower, labor-intensive process can also erode a firm’s competitive advantage. GPs should look at where their staff are spending their time. Are they devoting hours to transcribing data from portfolio companies? Are they busy just building the basic quarterly reports? There are tools that can automate this kind of data collection and management, leaving more time for higher-level analysis. Even if the current process is working well, a peer may not be wasting time and resources on this kind of administrative effort. That can leave a firm at a disadvantage, as rivals get their forecasts faster and can act on that information sooner.

That’s why we’ve been developing our forecasting platform, using Anaplan, to build valuation models that connect to the waterfall model and the carried interest allocation model, and then go up to the GP, so a firm can move from an underlying assumption all the way to the top of the house, in real time, without the labor of building an Excel model from scratch over and over again.

We’re able to overlay the implications of a particular assumption at a macro and micro level, creating a framework to understand the different potential outcomes for investors and the overall return for the firm. Being able to quickly generate multiple forecasts employing varying sensitivities creates a more comprehensive and accurate view of the variables and what they mean for the potential upside and downside, and the window to achieve the best return. That means better risk mitigation and portfolio construction as well as a clear understanding of fund and firm economics.
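The idea of generating multiple forecasts with varying sensitivities can be sketched as a simple scenario sweep. This is a hypothetical toy, not a model of any real fund: the growth rate, fee rates and exit years are made up purely to show the shape of a sensitivity grid.

```python
# Hedged sketch: sweeping two assumptions (exit year, management-fee rate)
# to produce a grid of outcomes. Growth and fee figures are hypothetical.

def net_multiple(exit_year: int, fee_rate: float, growth: float = 0.15) -> float:
    """Toy net multiple: compound gross growth less a simple cumulative fee drag."""
    gross = (1 + growth) ** exit_year   # gross multiple on invested capital
    fees = fee_rate * exit_year         # simplistic fee drag, for illustration only
    return gross - fees

# Sweep the grid: each cell is one forecast under one sensitivity combination.
for exit_year in (3, 5, 7):
    for fee_rate in (0.015, 0.02):
        m = net_multiple(exit_year, fee_rate)
        print(f"exit year {exit_year}, fee {fee_rate:.1%}: net multiple {m:.2f}x")
```

Running every combination in one pass, rather than rebuilding a spreadsheet per scenario, is what gives the comprehensive upside/downside view described above.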

If a key hurdle to better forecasting is data standardization and availability, must GPs first invest in consolidating their data to benefit from this kind of platform?

JB: Yes, and we appreciate that a lot of the cutting-edge tech out there is an investment. However, we advocate that CFOs and COOs find ways to standardize systems so that, say, finance and deal teams use the same terms across systems, and let the various disciplines inform how a CRM or accounting system is designed and employed. For example, compliance might have useful input for, or needs from, a CRM. This allows for closer collaboration and added rigor that will yield faster, more accurate results.

BM: It’s important to understand that even the best technology really only addresses quantitative tasks, not qualitative ones. The real promise of AI, machine learning and big data is to free up staff time so they can use their creativity to generate new insights. Micro-forecasts have their place, but allowing members of the team to zoom in and out of different assumptions and results speeds their ability to make smarter decisions. And that requires housing data where everyone can access it and everyone can vet their assumptions.

JB: We strongly urge that these initiatives be driven from the top. The firm’s leadership needs to advocate for making the most of technology and instill a sense of accountability in those who lag in adoption. For example, during the Monday morning deal pipeline meeting, firms could use DealCloud to ensure all of the latest deal updates are discussed. There’s tremendous potential for better forecasts using today’s tech, but the only thing we can predict with real certainty is that the alternative asset industry will become more competitive, and GPs will need systems that move at the speed of business.