The primary goal of any private equity group is to increase the value of each company in its portfolio – a goal most effectively achieved by reducing total costs. Therefore, a common challenge for the leaders of any GP is figuring out “What immediate action can we take to help a portfolio company lower its costs?”
Cost reduction is often at the top of any company leader’s (long) list of goals, but sometimes that leader is missing what he or she needs most – relevant and reliable information on the company’s spending for goods and services. Without accurate spend data, driving any successful savings program is like trying to fight with one hand tied behind your back.
In many cases, portfolio companies are unable to capture relevant data and turn it into one set of actionable information because the data resides at different locations, on different systems, and/or in different formats. In these instances, they are missing a tremendous opportunity to consolidate supplier spend and drive bottom-line savings while also improving supplier relationships and service delivery.
Imagine if this problem could be solved for not just one portfolio company, but across all of your portfolio companies. Consolidating spend on goods and services across your entire portfolio can reduce total purchasing spend by well over 15 percent.
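The consolidation itself is conceptually simple once the data is clean: roll up spend by supplier across every company. A minimal sketch of that roll-up, using hypothetical company and supplier names for illustration:

```python
from collections import defaultdict

# Illustrative line items from three hypothetical portfolio companies;
# supplier names are assumed to be already normalized across systems.
line_items = [
    {"company": "Alpha Co",  "supplier": "Acme Office",   "amount": 120_000.0},
    {"company": "Beta Inc",  "supplier": "Acme Office",   "amount": 95_000.0},
    {"company": "Gamma LLC", "supplier": "Acme Office",   "amount": 60_000.0},
    {"company": "Alpha Co",  "supplier": "Rapid Freight", "amount": 40_000.0},
    {"company": "Beta Inc",  "supplier": "Rapid Freight", "amount": 55_000.0},
]

def consolidated_spend(items):
    """Total spend per supplier, rolled up across every portfolio company."""
    totals = defaultdict(float)
    for item in items:
        totals[item["supplier"]] += item["amount"]
    return dict(totals)
```

In this toy example, the GP can see that three companies are independently buying from the same supplier – exactly the leverage point for a portfolio-wide negotiation. The hard part, as the steps below show, is getting the data to this clean, comparable state in the first place.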
Very few private equity firms can independently access portfolio company data. When detailed data is needed, they must request individual data extractions from each portfolio company and are left to decipher true spend from isolated, non-standardized data points. With a streamlined, coordinated approach and defined data submission process, portfolio companies can easily produce the required monthly data to facilitate informed decision-making and cost reduction.
Private equity firms hoping to reduce spend should follow these key steps to improve data management:
1. Evaluate, select and implement the right tools
Begin by developing requirements for the types of reporting needed. Both static and analytical reporting are common, useful models. These requirements will drive the types of tools needed by the business. Once the requirements are developed, select the appropriate tools and implement them. Note that you’re not done yet: these new tools need data on which to report.
2. Define data needs and design the extract format
With data housed in multiple disparate systems, companies face the challenge of turning it into reportable, analytical information. To leverage the tools selected and implemented in the previous step, create a standardized data format in a proprietary data warehouse. The warehouse normalizes ‘local’ systems data, using industry-standard extraction and transformation tools to ensure that the data is loaded in a consistent, reportable manner.
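The core of this normalization step is mapping each local system's field names and data types onto one standard layout. A minimal sketch, assuming two hypothetical ERP exports (the system names and column mappings here are illustrative, not from any specific product):

```python
import datetime

# Hypothetical column mappings from two local-system exports onto the
# warehouse's standard field names.
COLUMN_MAPS = {
    "erp_a": {"vendor_name": "supplier", "amt": "amount", "doc_date": "invoice_date"},
    "erp_b": {"Supplier": "supplier", "Total": "amount", "Date": "invoice_date"},
}

def to_standard(row, source_system):
    """Transform one local-system record into the warehouse's standard layout."""
    mapping = COLUMN_MAPS[source_system]
    std = {mapping[key]: value for key, value in row.items() if key in mapping}
    std["amount"] = float(std["amount"])  # uniform numeric type
    std["invoice_date"] = datetime.date.fromisoformat(std["invoice_date"])
    return std
```

In practice this mapping layer lives inside a commercial extraction/transformation tool rather than hand-written code, but the design question is the same: every source column must be declared, converted, and landed in one agreed format.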
3. Receive and verify data
After the data extraction program is put to use, hundreds of tests and validations need to be performed to confirm that the ‘source’ data (the information extracted from the local systems) arrives in the correct format and remains accurate as it passes into the warehouse. Proprietary software can be written to validate this data load process. If tests fail, the generated error reports need to be reviewed, and resolutions must be developed and retested until users are comfortable that the warehouse information is accurate. User “acceptance” is required to move to the next step.
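Each validation in that suite follows the same shape: check a rule against each incoming row, accept the rows that pass, and route the failures into an error report for review. A minimal sketch with two illustrative rules (real load validation would cover many more checks, such as duplicate detection and referential integrity):

```python
def validate_load(rows):
    """Run basic load validations; return accepted rows and an error report."""
    accepted, errors = [], []
    for index, row in enumerate(rows):
        problems = []
        if not row.get("supplier"):
            problems.append("missing supplier")
        amount = row.get("amount")
        if not isinstance(amount, (int, float)):
            problems.append("non-numeric amount")
        elif amount < 0:
            problems.append("negative amount")
        if problems:
            errors.append({"row": index, "problems": problems})
        else:
            accepted.append(row)
    return accepted, errors
```

The error report, not the accepted data, is what drives the review-resolve-retest loop the article describes: users work the report down to zero (or to explained exceptions) before signing acceptance.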
4. Create and test static reports
As the warehouse is being created and tested, develop requirements for static reports by working with your key operational users. These reports are built using the tools selected and implemented in the first step, along with the data structures applied in the warehouse. After the data has been confirmed in the warehouse, these static reports are tested through rigorous scripts created by the user community.
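A static report is simply a fixed query and layout run against the warehouse on a schedule. As a minimal sketch, here is an illustrative "top suppliers by spend" report – one of the most common requests from operational users in a sourcing program:

```python
def top_supplier_report(rows, top_n=3):
    """Fixed-layout static report: top suppliers ranked by total spend."""
    totals = {}
    for row in rows:
        totals[row["supplier"]] = totals.get(row["supplier"], 0.0) + row["amount"]
    ranked = sorted(totals.items(), key=lambda item: item[1], reverse=True)
    lines = ["SUPPLIER              TOTAL SPEND"]
    for name, total in ranked[:top_n]:
        lines.append(f"{name:<20}{total:>13,.2f}")
    return "\n".join(lines)
```

The user-written test scripts mentioned above would assert exactly this kind of property: that known input data produces the expected totals, ranking, and layout.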
5. Load and test analytical tool data
As the warehouse nears completion, the analytical tool chosen in the first step should be configured to present the data in user-defined ‘dimensions.’ Users then load these dimensions with test data from the warehouse to confirm that multidimensional analysis can be completed and that spend categories (and other critical information) are presented accurately to the operational management team.
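Conceptually, a dimension is just an attribute the tool can aggregate along – category, company, supplier, time period – and multidimensional analysis means totaling spend over any combination of them. A minimal sketch of that idea, with illustrative dimension names (a commercial analytical tool does this at scale with pre-built cubes):

```python
from collections import defaultdict

def spend_cube(rows, dimensions):
    """Aggregate spend along user-defined dimensions (a minimal cube slice)."""
    cube = defaultdict(float)
    for row in rows:
        key = tuple(row[dim] for dim in dimensions)
        cube[key] += row["amount"]
    return dict(cube)

# Hypothetical warehouse rows carrying category and company dimensions.
rows = [
    {"company": "Alpha Co", "category": "Office Supplies", "amount": 120_000.0},
    {"company": "Beta Inc", "category": "Office Supplies", "amount": 95_000.0},
    {"company": "Alpha Co", "category": "Freight",         "amount": 40_000.0},
]
```

Testing with known warehouse data, as this step prescribes, means verifying that slicing by `("category",)` and by `("category", "company")` both return the totals users expect.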
6. Train your users
Information is only valuable if it can be accessed by those who need it to make intelligent, sound decisions. That’s why user training is critical. Conduct multiple training sessions so that users understand how the data is organized in the warehouse and how to use the new reporting and analytical tools. Reporting projects that skip this crucial preparation step are the most susceptible to failure.
7. The result
The right data management program can alleviate the burden of the disconnected data ‘molehills’ that private equity firms experience across many of their portfolios today. By implementing the right spend analysis tools and leveraging a data warehouse, GPs can turn those molehills into a well-tended field of information, ripe with opportunities for strategic sourcing – and savings.
Wally Powers and John Stiffler are Chicago-based directors at management and technology consulting firm West Monroe Partners.