M&A: The Devil is in the Data
The mergers and acquisitions (M&A) market had a relatively quiet 2022, but most analysts expect activity to ramp back up in the second half of this year. That’s good news, but there’s often so much focus on getting the transaction itself to the finish line that planning for the post-M&A environment takes a back seat. That invites trouble on several fronts, and it is likely why some research indicates that roughly half of post-merger integration efforts fall short or fail outright.
The importance of data
It goes without saying that pre-transaction due diligence should include a careful review of all departments and systems at both entities, but one area where we often see things go wrong is inadequate attention to the data.
Data takes many forms, of course: customer lists that may overlap, sales resources, warehouse and inventory records, costs and pricing, and much more. A thoroughly planned integration across all of those areas is a big undertaking, which is perhaps why many organizations simply hope for the best in the post-merger environment … and why so many fail.
What can go wrong?
Failing to plan is planning to fail, as the saying goes, and the consequences of poor data integration can affect the new organization in some fundamental ways:
Wasted time and energy: Duplicate or incomplete data, and the frustration of manual workarounds to marry disparate systems, can take a huge toll on productivity (a simple illustration of spotting duplicate records follows this list). And it doesn’t stop there: those frustrations directly affect morale, which has never been more important than amid today’s employee engagement crisis.
Customer experience: Few things will annoy customers more than when your ability to serve them takes a hit due to data quality issues. The last thing any business needs in this (or any) economy is clients wondering whether they need to find a new solution because their needs aren’t being met.
Decision making: It’s impossible to know where you’re going if you can’t tell where you are now. Data that’s out of date or not actionable impedes planning and nimble, proactive decision making.
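To make the duplicate-data problem concrete, here is a minimal sketch of flagging overlapping customer records across two merged systems. The field names (`email`, `name`) and the normalization rules are illustrative assumptions, not a prescription for any particular CRM or matching tool.

```python
# Minimal sketch: flagging likely duplicate customer records after a merger.
# Field names and matching rules are illustrative assumptions only.

def normalize(record):
    """Normalize the fields most likely to identify the same customer."""
    return (
        record.get("email", "").strip().lower(),
        " ".join(record.get("name", "").lower().split()),
    )

def find_overlap(system_a, system_b):
    """Return records from system B that appear to already exist in system A."""
    seen = {normalize(r) for r in system_a}
    return [r for r in system_b if normalize(r) in seen]

# Example: two small customer lists from the merging organizations.
legacy_crm = [{"name": "Acme Corp", "email": "AP@acme.com"}]
acquired_crm = [
    {"name": "ACME CORP", "email": "ap@acme.com"},   # same customer, different casing
    {"name": "Brightside LLC", "email": "info@brightside.com"},
]

print(find_overlap(legacy_crm, acquired_crm))
# -> [{'name': 'ACME CORP', 'email': 'ap@acme.com'}]
```

Real-world matching is messier than this, of course, but even a simple pass like this one shows how quickly overlap surfaces once the two customer lists sit side by side.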
Getting it right
Proper data integration starts with a comprehensive understanding of each organization’s data and systems. That means asking questions about the skills each entity possesses, the tools and technologies already in place, where those areas overlap between the two organizations, and where gaps will remain. The ideal result is a plan that eliminates both the gaps and any redundancies.
It’s worth noting that these questions need to be asked not just of upper management but all the way to the front lines. There are often substantial differences between management perceptions and front-line realities in regard to exactly what the existing systems are capable of and how they’re being used.
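To make that overlap-and-gap assessment a little more concrete, here is a minimal sketch that treats each organization’s capabilities as a simple inventory and compares them against a target list for the combined business. The capability names and the target list are hypothetical placeholders for illustration only.

```python
# Illustrative sketch: comparing two organizations' system capabilities
# to surface redundancies (both have it) and gaps (neither has it).
# Capability names below are hypothetical placeholders.

org_a = {"cloud storage", "CRM", "inventory tracking", "e-invoicing"}
org_b = {"CRM", "payroll", "inventory tracking"}

# Capabilities the combined organization has decided it will need post-close.
required = {"cloud storage", "CRM", "inventory tracking", "payroll",
            "e-invoicing", "consolidated reporting"}

redundancies = org_a & org_b   # both sides bring it; consolidate to one system
covered = org_a | org_b        # at least one side has it today
gaps = required - covered      # neither side has it; needs a new solution

print("Redundant systems to consolidate:", sorted(redundancies))
print("Gaps to fill with new solutions:", sorted(gaps))
```

The point is not the tooling but the discipline: an explicit, side-by-side inventory forces the overlap and gap questions to be answered before integration spending begins.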
In some cases, this process reveals substantial gaps that will still exist post-transaction. Closing them might require pressing the proverbial reset button and adopting technology that neither organization currently has. It’s far better to discover that now, and to know it needs to be addressed, than to uncover those gaps after large amounts of time and money have been invested in integrating systems. A prime example is two organizations that are still running local, server-based applications or storage when a move to cloud solutions might offer clear advantages.
At the time of the transaction, it’s a safe bet that at least some employees will have to learn systems or processes they haven’t used previously. Again, that makes it the right time for deploying new systems across the board if necessary. Figuring this out later, after those team members have already had to change their behaviors once, is a surefire recipe for employee disengagement.
It’s easy to view all of this as a series of large challenges, and it is. But these moments are also opportunities not only to avoid pain and pitfalls but to create a new organization that is greater than the sum of its parts.