Writer108’s thoughts…

Predicting the future: Big data’s killer application?

At the turn of the last century, a group of gamblers operating in America’s horseracing circuit seemingly achieved the impossible. The group ‘predicted’ the winner of every race they wagered upon, reaping handsome rewards for their efforts and confounding peers and bookies in the process.

In an industry where everyone searches for an edge, the group appeared to have harnessed the ultimate one: omniscience. However, their God-like powers owed little to crystal balls or mathematical models; instead, they exploited a simple quirk to their benefit.

Due to the illegal nature of horserace betting, bookmakers were forced to work away from the gaze of authorities – outside of racecourses, relying upon associates at the course to feed them results. This created a delay between the time a race finished and when the bookies were aware of the result – the more inefficient the messaging system, the greater the delay and opportunity for those with more efficient means of communication.

The group exploited this inefficiency, betting on races AFTER they knew the results, but BEFORE bookies had been passed the information from the courses – giving the illusion of precognition. However, bookmakers soon got wise to the scam and the invention and implementation of the telegraph across the bookmaker network brought the advantage (and streak) to an abrupt end.

Today, information is instant and ubiquitous, legislation prevents financial gain from inside knowledge, and any (legal) advantages in predicting the future with more timely and accurate information appear largely spent – forecasters now rely upon more esoteric and less accurate means of divining the future.

And yet, a new paradigm is emerging, breaking the old rules about the limited value of publicly available information. Enter stage left: Big Data – the tech buzzword of the year, whose killer use case appears to be predicting the future.

Big Data differs from (what one must assume is) small data in three distinct ways: greater volume, complexity and speed. And everyone from the BBC to McKinsey is evangelising about its numerous applications and potential.

So, how does Big Data help predict the future? By exploiting three basic concepts: firstly, there is always someone out there with more information about the likelihood of a particular event than you. Secondly, sentiment often precedes outcome. And thirdly, the emergence of social technology has created an environment where thoughts and opinions shared online have global reach and permanence.

To consider the application of Big Data in predicting the future, let's take a question that is close to my heart: ‘Which players will Tottenham Hotspur sign before the close of the transfer window?’ Before the advent of Big Data, one was forced to rely upon the scattergun approach of tabloid back pages, or some insider knowledge from the manager’s next-door neighbour’s second cousin. But with the emergence of Big Data we can now use the ‘wisdom of crowds’ to more accurately predict an answer.

For example, one basic measure is the frequency with which a particular player’s name is linked to the club – the higher the frequency of mention, the more likely the deal.
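As a toy sketch of this measure – the headlines and names here are invented for illustration; a real system would pull articles from a news archive or search API – counting co-occurrences is straightforward:

```python
from collections import Counter

# Hypothetical sample of article headlines (illustrative only).
headlines = [
    "Adebayor linked with Tottenham move",
    "Tottenham chase Adebayor on loan",
    "Spurs eye defender Gallas",
    "Adebayor to Tottenham talks progress",
]

def link_frequency(headlines, club_aliases, players):
    """Count how often each player is mentioned alongside the club."""
    counts = Counter()
    for headline in headlines:
        text = headline.lower()
        # Only consider headlines that mention the club (by any alias).
        if any(alias in text for alias in club_aliases):
            for player in players:
                if player.lower() in text:
                    counts[player] += 1
    return counts

counts = link_frequency(headlines, ["tottenham", "spurs"], ["Adebayor", "Gallas"])
print(counts.most_common())  # [('Adebayor', 3), ('Gallas', 1)]
```

The player with the highest co-occurrence count is, on this crude measure, the likeliest signing.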

The following graph, taken from Google Insights, shows the number of times Adebayor’s name was mentioned in the same article as Tottenham in 2011:

It should come as no surprise that in mid-August 2011 (well after the upward trend had begun) Spurs’ manager Harry Redknapp confirmed the club were trying to buy the player. The phrase ‘there is no smoke without fire’ comes to mind, and when it comes to transfers, there are often multiple parties involved and information gets leaked. Prior to the advent of social technology these leaks had limited reach, but now a single tweet or article can reach millions in an instant, and once information is on the Internet, it is there forever.

Another way in which Big Data is transforming forecasting methods is through sentiment analysis. The basis of this approach has been around for some time – economists have used leading indicators to make predictions for decades. However, with the emergence of Big Data, the complexity of data being used has radically shifted; whereas economists use structured, numerical data such as production, or export numbers to predict macro-economic growth, Big Data has facilitated the use of subjective and unstructured data to see into the future.

One application of this is its use to predict the success of a particular product. By mining social media sites such as Twitter and Facebook, marketers are able to build up a view of the collective sentiment on particular products and predict their relative market success. A recent academic paper demonstrated this approach by predicting box office sales for blockbuster movies, showing that the frequency of positive mentions of an upcoming movie on Twitter was correlated with its future performance with unnerving accuracy.
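As a toy illustration of the approach – the mention counts and box-office figures below are invented, not taken from the paper – a simple Pearson correlation captures the claimed relationship:

```python
# Invented weekly counts of positive tweets about hypothetical films,
# paired with their (also invented) opening-weekend box office in $m.
positive_mentions = [1200, 450, 3100, 800, 2200]
box_office_m = [55, 18, 140, 35, 95]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"r = {pearson(positive_mentions, box_office_m):.2f}")  # strongly positive
```

A correlation near 1 on held-out data is what would justify using mention frequency as a predictor; on real data the relationship is, of course, far noisier.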

It should come as no surprise that businesses have begun to sit up and take notice of Big Data, and industries where success is directly linked to predicting the future, such as investment management, banking and government, have led the way. Recently, the Bank of England announced a collaboration project with Google to use search statistics (such as the number of times the word ‘unemployment’ is searched for) as a leading indicator to predict unemployment figures. The CIA has invested heavily in Big Data – including in one firm that claims to have predicted unrest in the Middle East nearly two years ago. And we now have the first Twitter hedge fund, whose strategy is based upon an academic paper demonstrating that the frequency with which companies and stocks are mentioned on Twitter is positively correlated with price direction.

However, not everyone is convinced about Big Data’s predictive potential, and critics highlight two major issues that stunt its usefulness. Firstly, there is the issue of noise – whilst there will be people ‘out there’ with more information than you, there are also those with less (or no) insight, and the Internet is not gated with a ‘true knowledge’ filter. As a result, a great deal of ‘information’ on the Internet is misinformation. Secondly, there is the problem of context and the imprecision of the English language, which does not lend itself naturally to sentiment analysis. For example, consider the Twitter post ‘The iPad is bad.’ Is the poster making a disparaging comment about Apple’s tablet, or are they saying it is baaaad (as in good)? With basic mining tools, how can you find out?
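To make the problem concrete, here is a minimal lexicon-based scorer (the word lists are invented for illustration); it flags the literal ‘bad’ but is blind to the slang spelling:

```python
# Tiny sentiment lexicons -- illustrative only; real systems use far
# larger lists plus context models.
POSITIVE = {"good", "great", "love", "amazing"}
NEGATIVE = {"bad", "terrible", "hate", "awful"}

def naive_sentiment(text):
    """Score text by counting lexicon hits; ignores context entirely."""
    words = [w.strip(".!?,") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(naive_sentiment("The iPad is bad."))     # negative
print(naive_sentiment("The iPad is baaaad!"))  # neutral -- slang missed entirely
```

The second call shows the failure mode: the scorer cannot even see the slang word, let alone know it means the opposite of its literal cousin.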

Whilst proponents will argue the law of large numbers will eradicate idiosyncrasies such as those described above, the charges against Big Data’s usefulness are clearly serious. So much so that a cottage industry has emerged, with numerous start-ups claiming to have developed complex lexical and mathematical algorithms to reduce the effect of ‘noise’ on Big Data predictions. Companies such as Recorded Future have secured millions of dollars in venture capital on the promise of unleashing the predictive power of Big Data with their sophisticated algorithms.

However, one company whose name is not yet associated with Big Data may emerge as the true powerhouse in the field. Gallup, the research-based consultancy, carries out and captures data from millions of questionnaires per year. The questions are consistent over time, the answers are contextualised and specific, and information about who answered the question is tightly coupled. For example, every day Gallup asks a thousand people: ‘Is your company hiring or firing?’ This is a very specific question, unlike a noisy proxy such as the number of times ‘unemployment’ is searched for. The question is directed at people in the know – the guys and girls directly on the ‘shop floor’ of companies – and the answer is very specific: either hiring or firing. Imagine the potential of this information as a true leading indicator of a country’s employment health.

Gallup has built up a vast dataset of answers over time to thousands of questions. The following graph illustrates declining consumer confidence in the US market, based upon Gallup’s analysis:

The days of profiting by predicting events after they have occurred are behind us – and long may that continue. However, the legacy of how this problem was solved – the implementation of technology for instant and ubiquitous information transfer – has given rise to a new predictive capability. Big Data’s potential is vast, and as with the simple betting scam, it may not necessarily be those with the most complex models who harness its true power.


Written by Brijesh Malkan

September 5, 2011 at 7:26 am

On changing Srila Prabhupada’s books

Look, I really do not want to write this. You really do not want to read this. So why don’t we just get this over with as quickly and painlessly as possible so that we can get back to more constructive and pleasurable uses of our time – like pulling out our fingernails with a set of pliers, shall we?

I don’t want to write this article – I said that already, right? Let me try to explain why: arguing about the rights and wrongs of changing Srila Prabhupada’s books is one of the greatest exercises in futility I can think of. Surpassed, perhaps, only by writing blog articles that no-one ever reads(!). You see, trying to ascertain whether it is right to edit/rewrite/correct/defile (delete as appropriate) Srila Prabhupada’s books through debate and dialogue is akin to trying to decide whether the Mona Lisa is smiling by polling a group of five-year-olds. The truth is, (almost) everyone who debates the issue is speaking from a position of near total ignorance – how many of the people who take part in the ‘discussion’ actually spoke to Srila Prabhupada on the topic and received unequivocal instruction? Ultimately, the matter comes down to faith in the one person who may (or may not – how can you be certain you have not misinterpreted, or heard what you wanted to?) have received instruction – Jayadvaita Swami. And trying to persuade someone to have faith through debate and discussion is as useful as trying to convince someone to marry you by listing out pros and cons – neither gratifying nor ultimately fruitful. Imagine telling the story to the kids: hey, your mum really didn’t want to marry me – but I just whipped out my list and persuaded her through my superior debating skills! Hardly the ‘hearts and minds’ realm that marriage (or faith) should occupy.

So there you have it – you do not know for certain what Srila Prabhupada wanted with respect to the editing/rewriting/corr… (oh, you know the rest) and I state unequivocally you can never know (short of divine revelation). So, stop it. Now. Please.

But you won’t stop, will you? No. You are upholding Srila Prabhupada’s wishes and legacy, right? Ok, so let me try a different tack: here are some of the arguments used by both sides of the debate. I present these in the hope you will see them for the nonsense they truly are:

1) Jayadvaita Swami is defiling Srila Prabhupada’s books and changing them to fit his own/GBC/Zionist/Illuminati/CIA/(please fill in missing conspiracy) agenda.

Now, admittedly, I have not met Jayadvaita Swami on many occasions. But I have met him. And I have listened to his classes and read some of his writings, and if Maharaja really is a GBC-apologist, he has done a damn good job of hiding it over the years – speaking against the GBC’s conduct in the past, maintaining financial transparency and showing no obvious signs of establishing a New World Order.

However, that really isn’t the point. Maybe Maharaja is a particularly good actor, maybe he is sincere. I don’t know. And you know what? Neither do you! Unless you have divine vision to see the truth of a person’s heart, stop wasting time trying to ‘prove’ conspiracy theories. And if you do have divine vision – use it for something more befitting this gift!

However, what may approach the boundaries of usefulness is providing specific examples of changes that may demonstrate a bias. However, I have yet to see many. The only one I know of is the removal of the line “greatest exponent of Krsna consciousness in the western world” from the description of Srila Prabhupada. Anti-change protesters claim this is direct evidence of the GBC’s jealousy and desire to usurp their spiritual master’s position. Whilst this is not necessarily a natural conclusion, the removal of the line does raise questions. But if you have such examples and want to protest or find out the reasons for such changes, channel your time and energy into speaking to someone who has the influence and ability to answer the question.

2) Srila Prabhupada specifically approved Jayadvaita Swami to make the changes.

Pro-changers almost always revert back to Srila Prabhupada’s ‘whatever he does is approved by me’ line to substantiate their argument. But how do you know this line actually means ‘whatever changes he makes to my books for all time are correct and authorised by me’? Guess what? Again, you don’t! So, stop using it and insulting everyone’s intelligence. And you, yes, you with your ‘rascal editors’ taglines, stop trying to prove every change is invalid with more out-of-context quotes of the same vein.

Both the pro- and anti-change debaters often get caught up in peddling their book-change arguments as proxies for other issues. The pro-GBC folk use Srila Prabhupada’s words to establish the primacy of the institutional structure and leadership, and brand any naysayers as anti-Prabhupada, fried-out, irrational individuals with axes to grind. And whilst there are many with serious concerns and objections to the changes, there are also those who use the changes to demonstrate the validity of their ritvik/conspiracy theories.

The point is not that all is well in Iskcon – clearly there are plenty of concerns and issues with respect to management, institutional reform and all the rest of it, but hiding behind book changes to make your arguments is likely to lead to confusion and ineffectual outcomes.

3) Hey, what’s your problem, both versions of the books are in publication.

The pro-change and the ambivalent will argue that as both versions of the books are in publication, there really is no argument to be made against the changes. Great in theory, but this is a point of practice, not principle. Windows XP is still in production, alongside Windows 7. However, how many people stock, purchase and develop applications for XP today? Next to none. This is not due to production, but to how standards are established. Are there temples with policies where the old books are not to be used in classes? Are people who question which version a book is labelled trouble-makers? How easy is it to purchase or access the old versions? How long will we publish the old ones for – a generation, two? What happens then?

You see, it is not that simple. Standards naturally evolve over time, and merely keeping the original versions in publication does not, by itself, matter a fig.

4) It’s Jayadvaita Swami’s service – butt out.

Pro-changers argue that people should mind their own business – Srila Prabhupada gave this service to Jayadvaita Swami and he is carrying out the personal instruction of his spiritual master. How would you like it, if someone interfered in your service to Guru and Gauranga?

This is a valid point, but only to a limit, as the nature of the service must also be considered. This is not quite the same as being given the instruction to wash pots and pans in Govinda’s – there are lasting and fundamental ramifications for us all. If Maharaja is wrong, or God forbid implementing some agenda, what does that mean for the Krishna Consciousness movement?

Final points

The point of this piece has not been to minimise the seriousness of the issue of changing Srila Prabhupada’s books, just to highlight how futile it is to engage in the attempt to persuade one another of our respective positions over the Interweb. From my observation, those who take up any opportunity to engage in such debates are those who love the sound of their own keyboard taps and rejoice in inflicting their own opinions and self-righteousness on others (something I am undoubtedly going to be accused of, or subject to).

As stated at the beginning, this ultimately comes down to a matter of faith, as we do not know for certain one way or the other – for every pro-argument/quote there is an equally compelling anti proposition. With such ambiguity, should we be continuing to push ahead with the changes? I don’t know. I have an opinion, but I do not know for certain. Do you? Really?

Written by Brijesh Malkan

July 6, 2011 at 5:44 pm

Skill or Luck: How beach-balls and Fergie-time affect Premier League results.

Whether it’s a beach-ball-assisted goal for Darren Bent, the infamous ‘Fergie-time’ at the end of a match at Old Trafford (when Manchester United are not winning) or the ball ricocheting off an unsuspecting Pirlo to win the greatest prize of them all for AC Milan – the 2007 Champions League Final – luck has always played its part in the beautiful game. But how much?

The airwaves of radio phone-ins are often full of supporters bemoaning a dodgy penalty decision or piece of bad luck that has robbed their team of a result, and they are often consoled by the advice that over the course of a season these random events even out – this week’s bad luck will be next week’s good luck: a statistical concept known as reversion to the mean. Common wisdom therefore argues that, at the end of the season, it is skill alone that determines league placing, and random events of luck have no tangible bearing on the overall outcome. But is this true? Some simple back-of-the-envelope statistics can shed some light on the debate.

Adapting a model originally used by Tom Tango to understand the contribution of luck in the NBA, we can estimate the contribution of luck to Premier League results.

Firstly, we need to calculate the win percentages of each team in the Premier League. Let’s look at the most recently completed season (2009/2010):

The average win percentage was approximately 37% and the standard deviation was 17% – so there was roughly a 68% chance (one standard deviation either side) of picking a team with a win percentage between 20% and 54%. The variance was 0.0273.

Now, if skill played no part at all in deciding game outcomes, the probability of winning a game would be 50%, and for a 38-game season the variance of a team’s win percentage would be 0.5 × 0.5 / 38 ≈ 0.0066.

From here, we can solve the equation:

Variance(skill) = Variance(observed) – Variance(luck)
                = 0.0273 – 0.0066
                = 0.0207

And the ratio of Variance(luck) to Variance(observed) can be used to determine the contribution of luck:

= 0.0066/0.0273 * 100
= 24%

So, according to this analysis, in the 2009/2010 Premier League Season, nearly one quarter of all results could be attributed to luck – a beach-ball, a slip in the six-yard box or another random event.

I ran the same analysis for 3 seasons and the outcomes were pretty consistent:

Put another way, skill determines 75% of all Premier League outcomes, and a more skilful team should beat a less skilful team 3 out of 4 times.
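The decomposition above fits in a few lines of Python; the observed variance is the 0.0273 figure quoted for the 2009/2010 season, and the luck variance assumes each of the 38 games is a fair coin flip:

```python
# Tango-style luck/skill decomposition of season win percentages.
games_per_season = 38
p = 0.5  # win probability if skill played no part at all

# Binomial variance of a win rate over a 38-game season.
var_luck = p * (1 - p) / games_per_season

# Variance of actual team win percentages, 2009/2010 (from the text).
var_observed = 0.0273

var_skill = var_observed - var_luck
luck_share = var_luck / var_observed

print(f"luck variance:  {var_luck:.4f}")    # 0.0066
print(f"skill variance: {var_skill:.4f}")   # 0.0207
print(f"luck share:     {luck_share:.0%}")  # 24%
```

Swapping in another season’s observed variance reproduces the per-season figures; only `var_observed` changes.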

How much of a difference does this make? Well, if (my beloved) Spurs had had all the luck go their way in the 2009/2010 season, they would have finished on 87 points and been Champions of England. COYS!

Written by Brijesh Malkan

March 22, 2011 at 12:07 pm

Why John Lewis’s strategy works – to a limit.

Yesterday’s news that John Lewis’ profits are up 20% will come as no surprise to anyone who has been on the receiving end of the John Lewis shopping experience.

At a time when bricks-and-mortar retailers are coming under pressure from a multitude of forces that shrink margins and profitability, John Lewis has continued to thrive and to operate at the premium end of the industry, avoiding the price war that has ensnared many of its competitors. This success has been due in large part to a competitive advantage that distinguishes the John Lewis shopping experience – outstanding customer service.

Founded in 1864, the John Lewis Partnership was built upon a set of principles about how business should be conducted and how the relationship between employers and employees should be structured – the mission of the Partnership is dedicated to ‘the happiness of its staff’, a far cry from the shareholder-value edict espoused by most firms. Its founder, John Spedan Lewis, argued that whilst capitalism had done enormous good, it had created an unstable society, with executive reward far outstripping contribution and agency issues leading to the misuse of capital and the destruction of value – a message as relevant today as it was 150 years ago. These beliefs and principles underpin both John Lewis’ corporate structure and strategy.

As the UK’s largest co-operative, the Partnership has a unique corporate structure, with its 69,000 employees all partners, sharing profits, information and decision-making power. Decisions are made based upon a constitution and not, proponents argue, upon short-term shareholder whims.

The Partnership highlights its unique organisational structure as the source of its competitive advantage, arguing that it reduces agency risk and creates strong incentives for employees to act in the interests of profit creation, the proceeds of which they share. As former group chairman Sir Stuart Hampson put it: ‘Customers do notice the difference in the level of service they receive from an owner of the business.’ This has enabled the Partnership to remain profitable despite its relatively small market share.

However, this same competitive advantage has also been an obstacle for a firm that operates in a mature market and in an industry with shrinking margins, and which must look to new markets and industries for growth.

Firstly, as an organisation that has built its reputation on quality of service, the expertise of its employees and its unique corporate structure, its competitive advantage is inherently firm-specific, which necessitates direct entry into new markets. This requires either relocating employees or extensively training local staff, at the risk of damaging the brand by using employees who do not share the same ethos. The reluctance of staff to relocate has meant overseas expansion has been modest at best – in 2008, John Lewis announced plans to open a store in Ireland, with the €50m Dublin store expected to open in 2013. However, John Lewis announced there were no immediate plans to expand into other parts of Europe.

Furthermore, whilst the Partnership’s unique corporate structure incentivises the pursuit of profit and has led to prudent decision-making, the Partnership is accused of an ultra-conservative ‘twin-set and pearls’ approach, owing to the undiversified risk of partner-employees, who have ‘all their eggs in one basket’; this curtails value creation through risk-averse and slow competitive moves. This aversion to risk and pedestrian approach has meant the Partnership has missed out on opportunities and profits in new markets, such as China and India, where its competitors have begun to invest and reap rewards.

So, whilst John Lewis has undoubtedly done well this year, whether it continues to flourish depends largely on its ability to leverage the strengths of its competitive advantage whilst managing its clear shortcomings. Given its significant exposure to the UK economy, it remains to be seen whether a continuing economic downturn will force its hand to increase the speed of expansion and the pursuit of profitable industries and markets, or whether its unique structure, the source of its competitive advantage, will, ironically, hinder those plans.

Written by Brijesh Malkan

March 11, 2011 at 12:10 pm

Why Aditya Chakrobaty is wrong on Facebook

Aditya Chakrobaty’s recent piece in The Guardian on the effect of social networking sites on our relationships struck a chord with many. To highlight the degrading effect of Facebook on our friendships, Chakrobaty provides an emotionally charged rendition of Simone Back’s story – a woman who was found dead on Boxing Day from an overdose. Ms Back’s last words had been a status update on Facebook: ‘Took all my pills be dead soon so bye bye every one’. Chakrobaty writes: ‘Of the 1,048 people listed on Facebook as a friend of Back, not one checked up on her’.

Ms Back’s story has startling similarities to another sad loss of life – that of Kitty Genovese. Ms Genovese did not commit suicide, but was violently raped and murdered. Her death did not occur in 21st-century British suburbia, but on the streets of 1960s New York. And her pleas were not conveyed over the Internet, but were physical screams in the open street. However, like Ms Back’s, Ms Genovese’s pleas were heard – in Ms Back’s case by her 1,048 listed friends, and in Ms Genovese’s by multiple bystanders who bore physical witness to her plight. And like Ms Back, Ms Genovese received no help.

Then, as now, sensationalist reporting of the incident claimed this was a damning indictment of modern society – in the 1960’s the blame was pointed at a decline in moral values, today it is the rise of social media.

The truth however, is much simpler – then as now, the blame lies not with external corrupting influences, but with each of us.

In the 1960s, two American psychologists, John Darley and Bibb Latane, moved by Genovese’s death and unconvinced by the ‘declining moral standards’ argument, carried out a series of experiments that changed the way we understand human decision-making. Darley and Latane staged a number of events in public places, in which a ‘stooge’ suffered an emergency such as a seizure or injury, and observed public reactions. The results were surprising – the chances of the stooge receiving help were not related to the seriousness of the ailment, how much effort was required or even whether there was a perceived risk in offering assistance. Instead, Darley and Latane found the number of witnesses to the event was the key differentiator – the more witnesses, the LESS likely help would be forthcoming. The effect was far from trivial – when one person witnessed the seizure, there was an 85% assistance rate; with five witnesses, the rate dropped to just 30%!

Since Darley and Latane’s experiments, a number of further studies have consistently demonstrated the same phenomenon – known as the bystander effect – the greater the number of active witnesses (those in a position to make a decision), the lower the likelihood of positive action.

Why would this be the case? Behavioural psychologists explain that when we are faced with a decision, we often look to ‘shortcuts’ or cognitive ‘rules of thumb’ for guidance. These shortcuts (known as heuristics) provide a mechanism to reduce complexity and help us reach a decision in a timely fashion. One such shortcut is to ‘follow the crowd’: the assumption being that when people all act in the same way, they must have a good (rational) reason for doing so, and therefore this behaviour can be safely emulated. By following the crowd we can ‘free-ride’ on the rational analysis of others to make a quick decision. Sometimes this type of decision-making results in good outcomes – for example, going to watch a movie ranked highly in the charts, or buying a ‘popular’ car, can be a good decision. However, in other situations it can lead to terrible decisions. For example, when multiple people are expected to make a decision in real time, each decision-maker looks to the others for a signal on how to respond, and our innate reluctance to stand out from the crowd creates a danger of ‘paralysis by observation’ – as we all look to one another for signs on how to react, we collectively do nothing. Or worse, the sheer number of witnesses leads each of us to assume that ‘someone will surely help’ – and when everyone thinks and acts this way, no-one helps.

In the sad cases of Kitty Genovese and Simone Back, the easy shot is to blame an uncaring or selfish society, or social media. We all, after all, love a villain (especially when it’s not ourselves). Finding a ‘big bad’ to blame helps to fulfil our innate desire to make sense of the world and answer the question of ‘why’ – because if we can understand the why, we can control the what: make society more caring, stop using social media to replace real friendships, and so on.

However, the truth is that causality is often a little more complex than we would like to believe. And the danger is that a poor understanding (and reporting) of ‘the facts’ could lead us on a witch hunt that is as useless as it is gratifying.

Written by Brijesh Malkan

January 13, 2011 at 8:38 pm

The seven worst reasons for paying a premium on an acquisition

Managers use a multitude of reasons to justify paying excessive premiums on acquisitions – here are some of the worst:

1. Acquiree managers will not recommend the deal to shareholders unless a premium is paid.
It’s great when an acquirer states they will respect the status quo and ‘not change a thing’. Great – sure. True – almost never. You see, in order to realise synergies something has to change – if both companies are run precisely as they were independently, by definition, there is no value creation. As such, M&A deals almost always result in reorganisation – and for incumbent management of the acquiree (and there is always an acquiree even in a ‘merger’ of supposed equals), this normally means loss of power, income or ultimately jobs.

This principal–agent issue creates conflicting incentives for owners, who would be expected to accept an offer that reflects a premium over fair value, and managers, who face potential job losses. As such, managers may feel no pressure to recommend a deal unless a substantial premium is paid.

However, acquisition premiums are not a replacement for credible corporate governance – and do not be surprised if your market cap drops if you pay a premium for this reason.

2. Mistaking value transfer for value capture.
One of the most common mistakes made in M&A is confusing value transfer with value creation. In May 2004, General Electric agreed to buy Universal Entertainment from the French media conglomerate Vivendi. Historically, the two companies had a long-standing relationship stretching back to the 1950s – NBC would purchase and distribute Universal content. In its 2003 annual report, General Electric (NBC’s owner) commented on the deal: ‘Over the years, we have built a strong NBC franchise. However, broadcast television will be impacted by changes in entertainment technology and distribution, and we felt that NBC could not stand still. Universal adds tremendous assets to NBC, including great content, attractive cable services, a leading film studio’. A leading pundit wrote in the New York Times at the time: ‘I think there is real synergy…you will see a company that has both ends of distribution and syndication which creates synergy’.

The argument put forward was that by purchasing Universal, NBC would have control over Universal content ‘for free’. However, NBC was sucked into a control illusion – instead of paying for content through individual transactions in the market, NBC simply paid upfront in the form of the acquisition price. There was no value creation here; General Electric simply moved money from one pocket to another. Furthermore, whereas prior to the deal Universal was able to sell content to the highest bidder – be that NBC, Fox or any other network – the ‘merger’ prejudices the price Universal receives, destroying value.

3. Our Consultants/Investment Bankers say this is the right price.
Full disclosure: I have worked as a management consultant (I am pleased to say I have put this shady past behind me) and have advised companies on M&A deals. So, I am not saying a consultant or banker’s recommendation should lead a manager to avoid a deal – however, it is vital to understand the incentives of those whose counsel you keep.

M&A is a caveat emptor world – sure, take advice when doing a deal, but make sure you do your due diligence and run your own numbers. When the deal is done and dusted, you will be the one left to realise value from the acquisition (with the help of ten 28-year-old consultants from McKinsey, charged at £5,000 p/d, of course).

4. We can ‘afford’ to pay a premium as the value of synergies is more than the premium we are paying.
Synergies are at the heart of M&A deals. Acquirers know this – and more importantly, so do acquirees, and rightly or wrongly, they may well want a slice of your pie.

However, before you start paying away synergies, remember two facts: firstly, there has been a historic tendency to overvalue synergies; and secondly, realising synergies has proven hard – very hard. History is full of high-profile M&A deals that started with the promise of synergies but ended bitterly, without any of the promised fruits ever being tasted.

For example, Quaker Oats bought drinks maker Snapple in 1994 for $1.7bn, promising distribution efficiencies and revenue growth opportunities – Snapple was sold on three years later at under 20% of the price. In 2000, AOL and Time Warner agreed a $350bn merger – the biggest deal in history, promising to transform the media and entertainment industries. However, write-downs and losses resulted in a near 100% loss in shareholder value.

The point is that deals fail – according to research, three-quarters of M&A deals fail to create shareholder value. Paying away synergies to an acquiree on the assumption they will be realised in the future is an exercise in gross naivety.

5. We have excess cash
The Economist recently ran a story arguing that macroeconomic conditions are ripe for a surge in M&A activity. The rationale: during the crisis, firms built up record cash reserves – which must now be spent, and ‘If they don’t spend them, investors will demand bigger dividends or share buy-backs’.

However, in most cases, returning this excess cash to shareholders is precisely what should be done – if a firm has no new projects available that can earn at least a return equal to its cost of equity, then doing anything but returning the cash to shareholders will destroy value. Not doing so is an abuse of management’s agency obligation to act in the interests of the owners.
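The arithmetic here is worth making explicit. In this back-of-the-envelope sketch (all figures invented for illustration), the reinvested cash is valued as a perpetuity discounted at the cost of equity, which shows how much value a below-hurdle project destroys relative to simply handing the cash back:

```python
# Illustrative only: excess cash reinvested below the cost of equity.
cash = 1_000_000_000      # hypothetical excess cash, in $
cost_of_equity = 0.10     # shareholders' required return
project_return = 0.06     # what the new project earns, as a perpetuity

# Value today of the reinvested cash: a perpetuity of (cash * project_return),
# discounted at the cost of equity.
project_value = cash * project_return / cost_of_equity

# Returning the cash is worth its face value; the shortfall is value destroyed.
value_destroyed = cash - project_value
```

On these assumed numbers, a $1bn reinvestment is worth only $600M to shareholders – the remaining $400M is the cost of managers hoarding cash rather than returning it.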

6. Diversification
The ‘diversification’ argument is often cited by executives for M&A deals. For example, firms operating in an industry with poor growth prospects or a high degree of volatility attempt to control macroeconomic forces by purchasing a firm in an unrelated industry. However, diversification with the aim of controlling macroeconomic forces does not create value for shareholders, as purchasing stock in unrelated companies easily (and less expensively) achieves the same aim.

7. To win
‘Murders and executions’. This is how Patrick Bateman describes his job as an M&A banker in the film American Psycho. And for many managers, M&A is war – normally between the acquirer and the acquiree, or between multiple potential suitors and the darling acquiree. In either case, managers often become embroiled in a battle to win – at whatever price. Paradoxically, the appearance of rival bidders normally leads to a situation where each bidder believes it must win – justifying higher and higher bids on the grounds that if the competitor wants it so badly, there must be a massive pot of gold there somewhere. What you end up with is a classic example of the ‘winner’s curse’ – where the only winners are the acquiree’s owners (and the bankers – lest we forget them).

So, there you have it – the seven worst reasons for paying a premium on an acquisition. If you hear any of them being used (and you will), it’s time to re-evaluate your investment decisions.

Written by Brijesh Malkan

November 22, 2010 at 9:39 am


M&A Deal-making: An enemy at the gates or an enemy within?


Recent coverage of the Terra Firma vs Citigroup case and Hewlett Packard’s (HP) decision to pay a 265% premium for a small technology company (hey, they do stuff in the cloud, man!) has placed Mergers and Acquisitions (M&A) deal-making back under the spotlight – and even for a market accustomed to derision, this light is particularly unflattering.

In the battle for growth, M&A has long been the weapon of choice for executives, with the recent economic downturn showing little sign of altering this trend: in a recent Deloitte survey over 50% of executives expected M&A to add over 5% to revenue growth over the next two to five years – an amazing statistic considering average revenue growth over the past 10 years for S&P 500 companies has been under 4% per annum.

This enthusiasm is undermined by evidence that shows M&A outcomes are at best underwhelming and at worst catastrophic. For example, in January 2000, the GlaxoSmithKline merger created a pharmaceutical titan, expected to be a catalyst for ‘stunning growth’. In January 2000, GSK was valued at £114bn. Today the company is worth £66bn – a fall of almost £50bn. Stunning, yes. Growth, no.

Or consider the poster-boy for destructive M&A strategy – taught in almost every MBA class as the classic example of ‘how not to do things’ – the 1998 ‘merger’ of German car manufacturer Daimler-Benz and American counterpart Chrysler to form DaimlerChrysler. At the time, the deal was heralded as a ‘marriage made in heaven’. In 2007, after destroying billions of dollars in value, the nine-year ‘marriage’ came to an end when Daimler sold Chrysler to private equity firm Cerberus Capital Management. Two years later, still reeling from the merger, Chrysler filed for Chapter 11 bankruptcy. Research tells a familiar tale – between half and two-thirds of all M&A deals destroy value.

The search to explain this dichotomy has occupied some of the brightest minds in business and academia, and whilst many conclusions are offered, most agree on one major contributing factor – companies pay too much for acquisitions. Research shows companies pay an average premium of 50% for new acquisitions. As for the reason for this wanton destruction of capital? Two enemies of value stand accused – an enemy at the gates, and an enemy within.

The Terra Firma case (unsuccessfully) attempted to bring to justice the enemy at the gates – the Investment Bankers. The suit alleged Citigroup banker David Wormsley (aka The Worm) lied to Terra Firma chairman Guy Hands, claiming Cerberus Capital Management (remember them from the Chrysler deal – private equity is an incestuous business!) remained in the contest even after the rival had informed EMI of its withdrawal. This misinformation forced Terra Firma to increase its bid and overpay for EMI – a firm now worth a quarter of what Terra Firma paid in 2007.

The reason for this alleged deceit? M&A is big business for Investment Banks – fees associated with deals account for nearly 50% of revenues – the EMI deal alone cost Terra Firma £100M in investment banking fees. Moreover, fees are tied to deal value – the more a company pays for a target, the larger the fee. In the view of Hands – and many who share his perspective, it is in the Bankers’ interests for firms to overpay.

There are those who take a counter perspective. Supporters of the banking community argue reputation is everything, and a bank operating against client interests would quickly find itself starved of deal-flow – the lifeblood of its revenues. Furthermore, it is the responsibility of the buyer to validate the price paid – if an acquirer overpays, it has no-one to blame but itself. Quoted in the Financial Times recently, leading business commentator Luke Johnson responded to Terra Firma losing its case against Citigroup with the following words: ‘I’m not sorry. You have to be grown up in private equity and know it’s a caveat emptor world. You do due diligence and back your judgment. Hindsight is not allowed.’

Mr Johnson’s comments are the basis of the argument that value destruction is not the result of an enemy at the gates, but an enemy within – the villains of this piece are not vampiric bankers who suck the value out of deals, but egocentric and incompetent executives who persistently fail to learn from prior mistakes. HP’s decision to pay a 265% premium for 3PAR is a deal that bears all the hallmarks of value destruction by the enemy within.

Firstly, executives are charged with overpaying for deals due to overestimation of synergies (I have managed to write 800 words on an M&A article before mentioning the word synergy – surely, some type of record!). As John Nofsinger, Associate Professor of Finance at Washington State University states ‘People are overconfident. Psychologists have determined that overconfidence causes people to overestimate their knowledge, underestimate risks, and exaggerate their ability to control events. Does overconfidence occur in investment decision making?…It is precisely this type of task at which people exhibit the greatest overconfidence’.

Synergies sit at the heart of M&A strategy. According to corporate finance theory, a company’s financial value is the sum of all its future cash-flows, discounted at a rate that takes into account risk – the greater the risk, the deeper the discount. Assuming both an acquirer and acquiree agree on cash-flows and the discount rate, rational expectations theory states a target shareholder will be indifferent between receiving returns in the future (as dividends or stock buy-backs) and now (in the form of an acquisition offer). Furthermore, for a publicly listed company, the Efficient Markets Hypothesis (a central tenet of investment management) states that market price IS fair value.
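As a minimal sketch of that valuation logic (the cash-flows and discount rates are invented for illustration), the discounting mechanics look like this:

```python
# Present value of a stream of annual cash-flows, discounted at a
# risk-adjusted rate: the greater the risk, the deeper the discount.
def dcf_value(cash_flows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

flows = [100, 110, 120, 130, 140]   # hypothetical annual cash-flows, in $M

low_risk_value = dcf_value(flows, 0.08)    # shallow discount
high_risk_value = dcf_value(flows, 0.15)   # deeper discount

# The same cash-flows are worth less to a buyer who judges them riskier.
assert high_risk_value < low_risk_value
```

The point of the sketch is simply that a ‘fair’ price is pinned down once cash-flows and risk are agreed – leaving synergies as the only legitimate source of extra value in a deal.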

This theory has two implications. Firstly, if a company is acquired at a price equal to its discounted future cash-flows, then the deal has not created value – in order to create value, the acquirer must be able to realise synergies, which in layman’s terms means the combination of the two companies must either increase revenues or reduce costs by an amount greater than the two companies could achieve independently. Secondly, when a company pays a premium above fair value, it must realise synergies equal to at least the value of the premium in order to break even – in the case of HP and 3PAR, HP will have to realise synergies of an eye-watering $1.5bn just to break even – and as research shows over 75% of estimated synergies fail to be realised, it faces a considerable task ahead.
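The break-even condition can be checked in a few lines. This sketch uses the $1.5bn premium from the HP/3PAR example above; the 25% realisation rate follows the research cited (over 75% of estimated synergies go unrealised):

```python
premium_paid = 1.5e9          # premium over market price, in $
estimated_synergies = 1.5e9   # synergies needed just to break even

# If roughly 75% of estimated synergies fail to materialise:
realised_synergies = estimated_synergies * 0.25

# A deal creates value only if realised synergies exceed the premium.
value_created = realised_synergies - premium_paid
assert value_created < 0  # on these assumptions, value is destroyed
```

On those assumptions the acquirer is over $1bn under water before any integration costs are counted.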

Whilst an acquiree’s shareholders should be indifferent at fair value, in practice they rarely are – the existence of another variable muddies the water: a rival bidder. The second charge against executives is that, due to egocentric tendencies, they get caught up in a battle to win a deal at any cost – destroying value for their shareholders in the process.

Consider for example HP and Dell’s battle to acquire 3PAR. If both companies estimated the value of realisable synergies at $1bn, then were HP to offer fair value, Dell would be more than willing to offer fair value plus $100M – as even at this price, it would stand to realise $900M in value. But then surely HP would be willing to counter-bid at fair value plus $200M (still realising $800M in synergies) – and so a bidding war commences.

Knowing this, 3PAR would be unwilling to accept fair value, as the existence of a rival will bid up the price; at the extreme, one of the bidders would be expected to offer fair value plus $1bn – still breaking even on the deal while stopping the rival from realising any gains – paying away all the value to the acquiree’s shareholders in the process!
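The escalation can be sketched as a simple loop. Working in units of $100M and assuming, as above, a shared $1bn synergy estimate (the fair value figure is invented), each rival tops the standing bid as long as winning at the next price still leaves it no worse off:

```python
fair_value = 20   # hypothetical fair value of the target, in $100M units ($2bn)
synergies = 10    # each bidder's synergy estimate ($1bn)

bid = fair_value
# Keep topping the bid while winning at (bid + 1) still leaves
# non-negative value to the winner.
while fair_value + synergies - (bid + 1) >= 0:
    bid += 1

# The price is bid up to fair value plus the full synergy estimate,
# leaving the winner with nothing.
value_to_winner = fair_value + synergies - bid
```

Run to its logical end, the bid settles at fair value plus the entire $1bn of synergies – the whole surplus is handed to the acquiree’s shareholders before a single synergy has been realised.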

This ‘winner’s curse’ is the sad reality of M&A deal-making – and unfortunately, it does not end here. This should be the end of the story – the losing bidder should graciously walk away to fight another day. But what about all the time, effort and money expended to get to this stage of the process? Deal-making is an expensive business and the cost of bidding can run into millions of dollars – surely we need to keep on bidding to recoup some of these costs?! And so a bid of $1.1bn above fair value is offered. This is the ‘sunk-cost’ fallacy – the inclusion of money already spent in valuing a deal. But then what do you do as the rival bidder who has just seen victory pulled away at the final moment? How do you save face with your shareholders – to whom you sold this deal as vital to the survival of the company? You keep bidding, of course – $1.2bn above fair value, anyone? And before you know it, you are in HP’s position, having paid $1.5bn over market price for a little technology company that does something in the cloud.

It is easy to see from these examples why so many deals end up on the value scrapheap. However, whilst these examples explain why companies overpay, they fail to explain why smart managers, knowing the dangers of M&A, still engage in this perilous activity. Let’s face it – what has been described above is no great secret – you only have to follow the share prices of an acquirer and an acquiree after the announcement of a potential deal to understand the value dynamic: in almost every case, the price of the acquirer will fall whilst the price of the acquiree rises, as the market prices in the expected transfer of capital from the acquirer’s shareholders to the acquiree’s. So why then the voracious appetite for deal-making?

The answer lies in human nature’s ultimate frailty – the propensity to control. Whether it’s cars, houses or companies, history shows we are willing to pay a premium for ownership and control – even when it makes no economic sense to do so. Consider, for example, car ownership – research shows that for the vast majority, owning a car is not cost-effective: the cost of buying and maintaining one outweighs the cost of rental. And yet, despite the statistics, car rental firms such as ZipCar struggle to convince prospective customers of the futility of ownership.

Control has a price, one that we are quite willing to pay – it should come as no surprise the premium above fair value a company pays for an acquisition is commonly referred to as the ‘control premium’. Executives argue control affords dominion and dominion reaps its own rewards.

For example, one of the most common mistakes made in M&A is confusing value transfer with value creation. In May 2004, General Electric agreed to buy Universal Entertainment from the French media conglomerate Vivendi. The two companies had a long-standing relationship stretching back to the 1950s – NBC would purchase and distribute Universal content. In its 2003 annual report, General Electric (NBC’s owner) commented on the deal: ‘Over the years, we have built a strong NBC franchise. However, broadcast television will be impacted by changes in entertainment technology and distribution, and we felt that NBC could not stand still. Universal adds tremendous assets to NBC, including great content, attractive cable services, a leading film studio’. A leading pundit wrote in the New York Times at the time: ‘I think there is real synergy…you will see a company that has both ends of distribution and syndication which creates synergy’.

The argument put forward for the deal was that by purchasing Universal, NBC would gain control over, and access to, Universal content ‘for free’. However, NBC was sucked into the control illusion – instead of paying for content through individual transactions in the market, NBC simply paid upfront in the form of the acquisition price. There was no value creation here; General Electric simply moved money from one pocket to another. Furthermore, whereas prior to the deal Universal was able to sell content to the highest bidder – be that NBC, Fox or any other network – the ‘merger’ prejudiced the price Universal received, destroying value.

Or consider the ‘diversification’ argument often cited by executives for M&A deals. For example, firms operating in an industry with poor growth prospects or a high degree of volatility attempt to control macroeconomic forces by purchasing a firm in an unrelated industry. However, diversification with the aim of controlling macroeconomic forces does not create value for shareholders, as purchasing stock in unrelated companies easily (and less expensively) achieves the same aim.

Finally, even when an acquirer buys control over management, it is often faced with obstacles that make value creation difficult – poor assessment of synergies, competitor moves to counter an acquisition, changes in customer tastes and employee resistance to change can, and often do, conspire to inhibit value creation. The reality is that whilst the premium is real, the control is often a fairytale. Warren Buffett once famously quipped: ‘Many managements apparently were overexposed in impressionable childhood years to the story in which the imprisoned handsome prince is released from a toad’s body by a kiss from a beautiful princess. Consequently, they are certain their managerial kiss will do wonders for the profitability of Company T[arget] … We’ve observed many kisses but very few miracles. Nevertheless, many managerial princesses remain serenely confident about the future potency of their kisses – even after their corporate backyards are knee-deep in unresponsive toads’.

The truth is that the real enemy of value is an illusory temptress to whom we all fall prey – the propensity to control. Under its influence, executives continue to believe – in spite of the overwhelming evidence of value destruction associated with mergers and acquisitions – that ‘this time it will be different’. However, the only difference tends to be the executives involved, as shareholders punish and replace incumbents, hoping the new batch will be able to exert some self-control in the face of the M&A home (company)-wrecker.

Written by Brijesh Malkan

November 20, 2010 at 5:26 pm