Crowe Horwath: Digging for value in big data

The famed mathematician John Allen Paulos once quipped: “Data, data everywhere, but not a thought to think.”

That quote has grown only more relevant as the age of big data begins.

The multitude of systems that companies employ today may be a great resource in gathering, storing, and managing data, but they remain only as valuable as the data inputted, and the skill with which that data is interpreted. In short, businesses need more than a few thoughts about how to drive tangible results from the data their systems produce.

One GP says that the pace of today’s dealflow requires being able to swiftly understand elements such as the working capital situation and how to improve the arithmetic there. And without accurate data available at the push of a button, it becomes incredibly difficult to pull the trigger on a transaction.

But the need for comprehensive data continues well after the company is a part of the portfolio. And even if the data is available, it doesn’t mean it can be easily deployed.

It’s a challenge to manage that wealth of information for effective value creation initiatives. The process requires taking a step back to understand the key drivers of the business and reconfiguring data stored across a multitude of systems, spreadsheets, and whiteboards to track those value drivers. With those unique categories defined and tracked, management can devise the right strategies to improve operations, often with immediate results.

The human factor may be required to maximize organizational potential, but it’s also responsible for most of any system’s failings. Improperly inputted data or a mismanaged implementation process can turn costly investments into little more than a digital version of a dusty lock box overflowing with a pile of receipts, invoices, and inventory lists. Faulty data produce faulty decisions, leading to all kinds of strategic errors.

More often than not, slow delivery of a diligence questionnaire or contradictory statistics can call the quality of data into question. When that happens, numbers may need to be verified and re-run until the flawed data point is discovered.

One GP admitted to looking at the same three or four stats every day at one portfolio company until they made sense. For example, the gross profit margins by customer contradicted the overall 2 percent gross profit margin being reported. Eventually, the overall gross profit margin turned out to be 32 percent.
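To illustrate the kind of reconciliation involved (using hypothetical customers and figures, not the portfolio company’s actual data), a few lines of Python can show how an overall gross margin should tie back to the revenue-weighted blend of customer-level margins:

```python
# Illustrative sketch: reconciling per-customer gross margins with the
# company-wide figure. Customer names and numbers are hypothetical.

customers = {
    # customer: (revenue, cost_of_goods_sold)
    "Customer A": (1_200_000, 780_000),
    "Customer B": (850_000, 595_000),
    "Customer C": (450_000, 320_000),
}

total_revenue = sum(rev for rev, _ in customers.values())
total_cogs = sum(cogs for _, cogs in customers.values())

# The overall gross margin must equal the revenue-weighted blend of
# customer-level margins; a reported figure far outside the customer
# range (e.g. 2 percent vs. roughly 30 percent) signals a data problem.
for name, (rev, cogs) in customers.items():
    print(f"{name}: {(rev - cogs) / rev:.1%} gross margin")

overall = (total_revenue - total_cogs) / total_revenue
print(f"Overall: {overall:.1%} gross margin")
```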

“Without good data, they don’t have a handle on inventory or understand how much raw material or extra parts they have,” says Cory Eaves, an operating partner with General Atlantic, who’s previously been the CTO at two enterprise resource planning (ERP) companies. “It leads to sub-optimized decisions across the business.”

According to a client analysis report by a large consulting firm, a multi-billion-dollar auto supplier was unable to cope with a substantial upturn in volume due to a number of issues, not least of which was an out-of-control backlog of past-due forged and machined parts. And a construction supply business wrestled with drastic seasonal swings in volume, without a formalized inventory segmentation and replenishment strategy.

Fixing the system

Given how much returns are driven by operational improvements, GPs shouldn’t discount the importance of properly translating the immense amount of data generated.

Underperformance may well point to systems and data issues, but how does anyone go about improving them? Is it always a matter of scrapping the existing system and starting from scratch?

“Most ERP systems are proven, scalable and provide the same functions,” says Eaves. “The actual package is one of the least important decisions to make in improving your data process.”

Bart Kelly, a principal at Crowe Horwath, says that failing to optimize data and value drivers frequently involves simple human error and mismanagement. “It’s bad training or lackluster upkeep, or it’s underutilized infrastructure.”

Several market participants warn that most flaws in data transparency and value drivers can be traced back to mistakes made during implementation. “Companies should begin by identifying what they want from their systems, in terms of the benefits and outcomes, and worry less about which vendor to choose,” says Eaves. “It’ll better inform all the decisions along the way.”

And in identifying those outcomes, management should use current, verified performance data to indicate flaws, vulnerabilities, and lingering questions the current system seems unable to answer. Experts warn against trusting a gut feeling, otherwise known as “inherent business knowledge,” to decide what the systems should do. It’s a catch-22 of sorts: the only way to improve one’s understanding of the data is to understand it well enough to know what’s wrong.

One of the most common mistakes cited by experts in implementing a data-driven improvement project is treating it as an IT matter. “They’ve got to view it as a driver of business performance, rather than some data cleanup exercise,” says Kelly. And that involves placing the project in the hands of someone outside of the IT division, which often can be siloed away from core business operations.

“It’s absolutely critical that the owner of the project be someone on the business side, rather than IT,” says Eaves. “For middle-tier companies that person may sit on the finance side, but for larger operations, a team of five or six people may be dedicated to the task.”

Given the importance and potential workload, does reforming an existing platform always involve hiring new staff?

The consensus seems to be that extra staffing might be helpful, but what is essential is designating someone to oversee the improvements, from implementation onward. “Even if that person is a current employee, they need to designate a person to handle the project on a full-time basis,” says Eaves. But freeing up that person could mean someone else has to take over their prior responsibilities, so some hiring may still be involved.

Another consideration involves working with an outside consultancy to navigate the process. “People tend to overweight the selection of the software and underweight the selection of a partner in implementation,” says Eaves. “A good partner will have gone through this hundreds of times, while – hopefully – you will only do it once.” It isn’t just a matter of technical expertise, but also the change management and insight into key business and value drivers needed to ensure the best possible outcome.

Once that partner is tapped, the key person is appointed, and the priorities are established, there’s still a matter of ensuring the quality of the data, and that means rigorously vetting the stats as they are entered. For example, safety stock levels should be verified by objective measures, like number of days on hand or aging. What are the actual supplier lead times, as defined by their behavior in the past? “Take an impartial look at overall performance in key categories like high, medium, and low runners,” says Kelly.
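As a rough sketch of those objective checks, the snippet below computes days on hand from recent usage and actual supplier lead times from past order history; the function names, dates, and quantities are hypothetical, not drawn from any particular system:

```python
# Illustrative sketch of the kind of objective checks described above,
# assuming simple lists of historical usage and receipt records.
from datetime import date
from statistics import mean

def days_on_hand(on_hand_qty: float, daily_usage: list[float]) -> float:
    """Days of cover implied by current stock and recent average usage."""
    avg_daily = mean(daily_usage) if daily_usage else 0.0
    return float("inf") if avg_daily == 0 else on_hand_qty / avg_daily

def observed_lead_times(orders: list[tuple[date, date]]) -> list[int]:
    """Actual supplier lead times in days, from order date to receipt date."""
    return [(received - ordered).days for ordered, received in orders]

# Example: the stated lead time is 14 days, but past behavior says otherwise.
history = [(date(2017, 3, 1), date(2017, 3, 24)),
           (date(2017, 5, 2), date(2017, 5, 30)),
           (date(2017, 7, 10), date(2017, 8, 2))]
print(days_on_hand(480, [12, 15, 9, 14, 11]))   # ~39 days of cover
print(mean(observed_lead_times(history)))        # ~24.7 days, not 14
```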

Often, this involves establishing procedures for inputting individual data points and defining the underlying source for a given number. If this sounds like a taxing amount of work, it may prove worthwhile in the end. “When we see companies adopt more robust systems, it forces them to focus on data ownership and management and who owns the process,” says Eaves. “And in doing that, it can go a long way in improving the quality of data being employed to make business decisions.”

However, even if the system is up and running smoothly, companies may not be tapping the full potential of the data at their fingertips. The best way to employ an ERP may be to go beyond downloading just payables, receivables, and inventories.

“The system collects all these data points and has a penchant for certain things, but there’s so much more to harness,” says Kelly. “All too often ERPs are treated as black boxes of information, when they can provide genuine insights if you combine them with the right questions and dig deep enough.”

As with any tool, its effectiveness comes down to how it’s used. For most business platforms, the difference between being a glorified ledger and an oracle of business performance is the quality of the questions asked. For example, can the company see which inventory items are negatively affecting it at the SKU level? Does it have visibility into which customers or vendors offer the most opportunity for improvement? Which SKUs or customers are most at risk? Then there’s the hunt through the data to answer those inquiries. But no vanilla, out-of-the-box system is going to automatically offer up the best way to slice and dice the numbers a company has on hand.
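As a minimal illustration of turning such questions into SKU-level queries (the records and thresholds below are hypothetical), a short script can flag items that lose money or barely turn:

```python
# Hypothetical SKU records: (sku, annual_revenue, annual_cost, avg_inventory_value)
skus = [
    ("SKU-001", 90_000, 81_000, 40_000),
    ("SKU-002", 250_000, 150_000, 20_000),
    ("SKU-003", 15_000, 19_000, 35_000),   # sells below cost, heavy stock
]

for sku, revenue, cost, inventory in skus:
    margin = revenue - cost
    turns = cost / inventory if inventory else float("inf")
    # Items that lose money or barely turn are the ones "negatively
    # affecting" the business at the SKU level.
    if margin < 0 or turns < 1.0:
        print(f"{sku}: margin {margin:+,}, inventory turns {turns:.1f}")
```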

Every business has its own vulnerabilities, performance issues, and market conditions to contend with, so it’s a matter of identifying the key drivers of the enterprise. “If you sell electronics, it’s about turnover, and if you’re in food retail, it’s a matter of the short shelf life,” says Kelly. “So every company has a set of drivers that should inform what questions it asks of an ERP.”

That’s where the art of using these systems comes into play. Some GPs tap a mix of outside consultants, internal expertise, and portfolio company management to figure out what those drivers might be and to sift through the existing data sources for answers. One of the biggest assets of any data strategy is the ability to quickly analyze the data in any number of ways to find the pivot points that could improve performance.

Smaller buckets, better questions

More often than not, segmenting data points into smaller and smaller categories can highlight opportunities and risks. “We use segmentation methodologies to pinpoint areas of potential upside,” says Kelly. “That means dividing inventories, receivables, and payables into transparent categories that allow us to truly see performance at a detailed level.”

This process involves no small amount of trial and error as some product categories or customer types might appear to be crucial in theory, only to prove otherwise when the numbers are crunched. But that’s a necessary effort to discern where the real hot spots are for operational improvement.

Experts stress that some of the best methodologies to apply are the classics, like the Pareto principle, which holds true across most applications of inventory, receivables, and payables. By combining these with key business drivers that have been identified, such as volatility, age, gross margin, risk factor, fill rates, and the like, firms can discern the unique challenges facing a portfolio company and begin to address them.
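A simple sketch of that approach, assuming hypothetical SKU values and demand histories, combines a classic Pareto (ABC) cut with a volatility driver measured as the coefficient of variation of demand:

```python
# Illustrative only: data, cutoffs, and field names are hypothetical.
from statistics import mean, pstdev

def abc_class(cumulative_share: float) -> str:
    """Classic Pareto buckets: A = top ~80% of value, B the next 15%, C the tail."""
    if cumulative_share <= 0.80:
        return "A"
    return "B" if cumulative_share <= 0.95 else "C"

def volatility(demand: list[float]) -> float:
    """Coefficient of variation of demand: higher means less predictable."""
    avg = mean(demand)
    return pstdev(demand) / avg if avg else float("inf")

items = {  # sku: (annual_value, monthly_demand_history)
    "SKU-A": (450_000, [100, 105, 98, 102]),
    "SKU-B": (150_000, [40, 5, 70, 10]),
    "SKU-C": (60_000, [8, 9, 7, 8]),
}

total = sum(value for value, _ in items.values())
running = 0.0
# Rank by value, accumulate the share, and report the ABC class alongside
# the volatility driver for each SKU.
for sku, (value, demand) in sorted(items.items(), key=lambda kv: -kv[1][0]):
    running += value
    print(sku, abc_class(running / total), f"volatility={volatility(demand):.2f}")
```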

This process can also streamline communication to senior management by narrowing the focus to a few critical data points, so management can make an informed decision as to how to proceed. Private equity ownership often involves an aggressive agenda of change, and this helps prioritize the decisions that will make the biggest difference to the company’s fortunes.

When data analytics are creatively deployed, the results can be impressive. Kelly recalls collaborating with one private equity firm that had just acquired an appliance and HVAC supplier, where optimal inventory levels were hard to calculate. The company couldn’t rid itself of dated parts since many appliances were still in use, and yet it still had to maintain inventory for a growing list of parts for new devices.

By dividing the data by SKU into smaller and smaller categories, Crowe Horwath was able to help devise a formal approach for when to replenish a given product. “We transformed the data to determine which categories were volatile and which were consistent, and they weren’t always what we expected,” says Kelly. Within the first year of private equity ownership, that effort alone created a 20 percent inventory reduction along with a fill rate increase from 70 percent to 90 percent.

Similarly, that construction supply business that couldn’t cope with seasonal swings in activity was able to increase inventory turns by 34 percent and reduce working capital by $12 million by introducing new inventory system parameters, all grounded in data found in the existing systems and business processes. Mining the existing data sets with some skill can capture low-hanging fruit for immediate impact.

Eaves recalls working with a professional services business that did contracting work for the government; according to the ERP, it was taking over a month to invoice a client. Delivering an accurate invoice in a timely manner is a major driver of working capital, so the company worked to automate the process: as work is completed, approved, and audited, that data flows directly to the invoice. “Now, it’s less than 10 days to issue an invoice and that makes a huge difference,” says Eaves.
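As a bare-bones illustration of the metric in question, the sketch below computes the days from work completion to invoice issuance from ERP-style records; the dates and field names are hypothetical:

```python
# Hypothetical invoice records: (work_completed, invoice_issued)
from datetime import date
from statistics import mean

invoices = [
    (date(2017, 6, 1), date(2017, 7, 6)),
    (date(2017, 6, 12), date(2017, 7, 18)),
    (date(2017, 6, 20), date(2017, 7, 24)),
]

# Tracking this one number before and after automating the
# completed -> approved -> audited -> invoiced flow shows whether
# the working-capital improvement is real.
cycle_times = [(issued - completed).days for completed, issued in invoices]
print(f"Average days to invoice: {mean(cycle_times):.1f}")
```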

Of course, the better one understands the drivers, tracks the relevant data, and introduces new initiatives, the better chance any long-term strategy has of succeeding as well. Like any other technology, transactional-based systems are still subject to the limitations and the creativity of the people using them. In the right hands, they are less a black box and more of a Rosetta Stone, helping to translate the countless data points into a course of action.

Launch faults

Before companies can make the most of any data system, it’s crucial to avoid these common mistakes.

Treat it as an IT project: IT staffs rarely understand the business sufficiently to dictate the key metrics and priorities of any system.

Appoint a part-time leader: Someone should own the data system project, and that role should be a full-time responsibility.

Go it alone: Outside consultants bring invaluable experience that comes from multiple system implementations, when most companies only want to do this once.

Trust one’s instincts: Any data system approach should be developed through a deep dive into the current numbers, rather than merely an intuitive understanding of a company’s priorities.

This article is sponsored by Crowe Horwath. It was published in a supplement with the October issue of pfm magazine.