You might recall that about three weeks ago, there was a mini-spate of articles about how leaks imperil M&A deals. Here’s a typical one, from CFO.com (not to pick on them in particular, there were many others). Here’s how it reported the contents of a new study from Cass Business School:
The Cass research — which included results among more than 350,000 mergers and acquisitions between 1994 and 2007 — found that 49 percent of all leaked deals are completed, for example, compared to a 72-percent completion rate for those not leaked.
It’s fair enough reporting, since all the reporters had to go on was a press release. In the pitch that was sent to most of Portfolio, we were told this:
In a study analyzing over 350,000 global M&A deals, it was found that less than half of all leaked deals complete, compared to 72 percent of non-leaked deals.
It was enough to pique our interest, at any rate, and Portfolio’s Caitlin Roman went back and forth a few times with the PR agency trying to get a bit of color on the whole thing, and/or to get a copy of the actual report. We were vouchsafed an executive summary, along with this note:
Please note that the contents are only for background; we do not want them used in coverage.
Were they trying to hide something? Turns out, yes they were. And now that the report has finally appeared online, the whole world can see exactly what they were trying to hide:
Look, they have figures down to the nearest basis point! They must be super accurate!
"The exhaustive research project" – words dutifully quoted in many news articles – turns out not to have analyzed "more than 350,000 mergers and acquisitions" after all. In fact, the total number of M&A deals that it examined was, er, fifty-nine.
Here’s what the study did not do: it didn’t take a universe of deals, work out which were leaked and which weren’t, and then calculate how many of each group were completed. Instead, it took a (very small) universe of leaked deals, most of which were never completed, and compared that to an enormous universe of "total deals", most of which were completed. As far as I can tell, the researchers never even checked whether their 59 leaked deals were included in the "total deals" in the first place.
In other words, it’s almost a textbook case of an apples-to-oranges comparison. The study specifically targeted pre-announcement leaks: that is, deals which haven’t been done yet and could fall apart at any time and for any reason. It found just 59 such leaks, over a period of 13 years.
The researchers then took a deals database of more than 350,000 deals over the time period in question. The database is in no way intended to include every instance of one company looking at or talking to another company: it’s a database of deals, after all, not a database of failed M&A bankers’ fantasies. And it turns out that most of the deals in the deals database were completed; some fell apart, but only after they had been formally announced.
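The mismatch above is easy to make concrete. Here's a minimal Python sketch: the 59 leaked deals and the ~350,000-deal database are from the study, but the completion counts below are hypothetical, chosen only to reproduce headline-sized rates (about 49% and 72%).

```python
# Hypothetical illustration of the apples-to-oranges comparison.
# The 59 leaked deals and ~350,000 total deals come from the study;
# the "completed" counts are made up for illustration.

leaked = {"n": 59, "completed": 29}               # separately collected sample
all_deals = {"n": 350_000, "completed": 252_000}  # the deals database

def rate(group):
    """Completion rate of a group of deals."""
    return group["completed"] / group["n"]

# The study's comparison: a pre-announcement leaked sample vs. the
# entire database of (mostly announced, mostly completed) deals.
gap_bp = (rate(all_deals) - rate(leaked)) * 10_000
print(f"headline gap: {gap_bp:,.0f} basis points")

# A sound design would partition ONE universe by leak status, after
# first verifying the leaked deals actually appear in that universe.
non_leaked = {
    "n": all_deals["n"] - leaked["n"],
    "completed": all_deals["completed"] - leaked["completed"],
}
print(f"non-leaked rate within the same universe: {rate(non_leaked):.2%}")
```

With these toy numbers the "gap" comes out near 2,300 basis points, not because leaks kill deals but because the two groups were drawn from different populations in the first place.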
And by comparing these two utterly incomparable figures, the researchers concluded that the probability of a deal going through was 2,286 basis points higher if it wasn’t leaked than if it was. What on earth were they thinking? Oh, hang on, let’s have a quick look at the introduction to the study:
Welcome to this important new research study commissioned by IntraLinks from Cass Business School –
M&A Leaks: Issues of Information Control.
In the current M&A environment, getting your deal process right has never been more necessary…
As the market leader in secure online document exchange, IntraLinks is committed to delivering a better
way to control and monitor the exchange of confidential information involved in the entire deal lifecycle.
We look forward to continuing to work with all participants in the M&A marketplace over the coming year and helping to ensure the highest level of security and control.
Aha, things are getting clearer. This isn’t a proper academic study at all: it’s a report commissioned by a document security company to try to demonstrate the need for document security. Of course it always helps if such reports get press, so they might as well tell reporters that the study is "exhaustive" (without actually showing them the study, of course) even when it simply isn’t.
I’ve never quite worked out why professors at reasonably high-profile schools like Cass allow themselves to be pimped out in this manner. But it would be nice at least if journalists insisted on seeing research before writing about it. That way the rest of the public might not end up being fed misleading information.