Crisis lessons from a lead contamination scare

A high-profile product safety scare in Australia offers some important lessons about crisis communication and about standing firm in the face of risk allegations.

On 10 July the Queensland Building and Construction Commission (QBCC) issued a surprise report claiming that independent testing showed a kitchen mixer tap sold by German discount giant ALDI could contaminate drinking water with lead levels up to fifteen times the safety guideline.

A reported 12,000 taps had been sold as part of a “special buy” offer, and the story predictably went viral, generating very heavy media coverage throughout the country.

ALDI’s initial response was pretty much textbook. The company:

  • Placed a hold on further sales
  • Confirmed the taps had been tested prior to sale and were fully compliant
  • Pledged co-operation with authorities
  • Reiterated its return and refund policy on all products

But most importantly, ALDI did NOT commit to a recall. Instead, it suggested customers temporarily avoid using the Chinese-made tap while the company commissioned further testing. That investigation, it said, could take more than two weeks.

In the face of massive adverse media coverage, and with re-test results weeks away, many companies might have folded and ordered an immediate recall, perhaps invoking that foolish and largely meaningless phrase “an abundance of caution.” (Managing Outcomes Vol 4, No. 18)

But ALDI stood firm.  “Unfortunately further testing is a lengthy process that can’t be short cut . . . If these results present any indication that a health risk exists for our customers, we will take appropriate action. ALDI will always remove any product from sale if it is identified as a risk to our customers.”

This bold statement relied very heavily on the word “if” and could have gone badly wrong. It might, for example, have provoked an involuntary, government-mandated recall, though that seemed increasingly unlikely once it emerged that the QBCC’s shock results had been based on testing just a single tap. In the event, the strategy paid off and the story effectively disappeared from the public radar within 48 hours.

Then, on 27 July, the company proudly announced that new tests showed the taps were “safe for use” and that the results confirmed the tests conducted prior to sale. But ALDI didn’t just issue a statement. CEO Tom Daunt personally fronted a video on Facebook, demonstrating both authority and confidence. And not only did he provide assurance to consumers, he also took the opportunity to rebuke the QBCC for what he called its premature report, which had generated “unnecessary concern and inconvenience.” The Commission’s tests, he said, were “not conducted in accordance with the Australian Standard and were not conducted by an appropriately accredited laboratory.”

One tabloid newspaper tried to raise continued doubt, suggesting the QBCC had been gagged by threatened legal action. However, the media reaction was almost universally supportive. For ALDI it was a crisis communication slam dunk.


Are crises really inevitable?

Just about everyone has heard the assertion: It’s not a question of IF you will have a crisis, only a question of WHEN. That might sound like a clever maxim, much loved by consultants and commentators. But is it necessarily true? And is it helpful?

This very common statement might be true, but equally it might not be. Major crises are in fact relatively rare, which is one reason why they still generate such headlines. Indeed, many organisations operate for decades and never face a genuine, existential crisis (as opposed to short-term challenges and embarrassments).

Needless to say, that doesn’t mean they won’t face a crisis tomorrow. But if no serious problem has arisen recently, it’s easy to downgrade the likelihood of one arising in the future. In management terms that leads to the dangerous fallacy of “We’ve never had a crisis, so why worry about it now?”, which is one of the major barriers to crisis proofing.

So, to opine that “every organisation will have a crisis one day” is not very helpful. What sort of crisis? How serious? How damaging? Will it affect the whole organisation or just one division? And over what time period? The current strategic planning cycle? The life of the organisation? The tenure of the incumbent CEO?

There’s another problem too. Although the threat of supposed inevitability is probably intended to jolt you into action, it may also have the opposite effect: “If a crisis is inevitable at some vague time in the distant future there’s not much I can do about it, so I’ll focus on the here and now.”

Perhaps more useful than the generalised idea of crises being inevitable is the notion of crises as predictable. The former can produce inertia and hopelessness; the latter helps to create a clear path for action. If crises can be predicted, then there ought to be clear steps leaders can take towards prevention (or at least mitigation).

This alternative approach is captured in the concept of Predictable Surprises, championed by Harvard Professors Max Bazerman and Michael Watkins. They argue that many surprises, in all types of organisations, are predictable and avoidable. Moreover, they say predictable surprises represent a failure of leadership: they happen when leaders have all the data and information they need to recognise the potential, or even inevitability, of major problems, yet fail to respond with effective preventative action. They label these ‘the disasters you should have seen coming.’

It’s true that some crises are unpredictable and genuinely strike out of the blue. The problem, say Bazerman and Watkins, arises when the event was foreseeable and preventable, yet no action was taken. Indeed, the Institute for Crisis Management calculates that about two-thirds of organisational crises are not sudden, unexpected events at all but ‘smouldering crises’ which develop after warning signs that could and should have prompted earlier intervention.

The answer, of course, lies in effective processes to recognise the red flags which precede just about every crisis, and to take proactive steps to ensure the organisation is properly prepared before a crisis strikes. That’s the core element of the process called Crisis Proofing.

In his best-selling book The Black Swan, Nassim Nicholas Taleb popularised the idea of Black Swan events, which are ‘highly improbable’ but can produce enormous shocks. Taleb’s advice to managers? “Invest in preparedness, not prediction.”


How safe is your online information?

Cybercriminals and hackers get most of the attention; think no further than the WannaCry and Petya ransomware attacks. But private information is equally at risk when trusted organisations carelessly mishandle sensitive data, jeopardising reputations and confidentiality.

Just last month it was revealed that an SAS trooper’s secret evidence, given “in camera” to the Australian Senate inquiry examining the military’s use of resistance-to-interrogation training, was mistakenly sent to the very organisation he was criticising.

A transcript of the soldier’s evidence, which disclosed the identity of a senior intelligence official and revealed highly controversial training methods, was mistakenly distributed to every witness who appeared before the inquiry, including military and civilian personnel. The secretariat for the Senate standing committee apologised, saying it was an administrative error and that it was “dealing with the individual concerned.” Which was hardly helpful.

A few weeks earlier, home-schooling families in Victoria were distressed to find that details about their children, who had been pulled out of school because they were bullied, had mental health issues or received inadequate support for disabilities, had been posted online. The blunder occurred when hundreds of submissions to the Victorian Education Department were uploaded to the department’s website without personal information being redacted.

Around the same time, the Department of Parliamentary Services in Canberra admitted that personal mobile numbers of many federal politicians, their staff and former prime ministers were inadvertently published on the Parliament House website.

Sadly, such failures are all too common and, needless to say, are not confined to government agencies. Think back no further than last December, when the National Australia Bank mistakenly sent information including the names, addresses and account details of about 60,000 migrant banking customers to the wrong email account. The bank blamed human error and said 40 per cent of those customers had closed or had not used their accounts that year, and just under a third had balances of less than $2. That might have provided some reassurance, but the central issue is not the detail but how and why such human errors keep happening and what’s being done to prevent them.

Of course, genuine online errors sometimes occur, and they can be very damaging to security and reputation. Consider United Airlines, still reeling from global condemnation over the violent removal of a passenger from an overbooked flight. Just weeks later the embattled company came under renewed scrutiny when a flight attendant inadvertently posted secret cockpit access codes on a public website. What followed were the usual apologies and promises to improve. But when sensitive information is disclosed and lives are affected, it’s too easy to fall back on phrases such as “No-one is perfect” and “Mistakes do happen.”

Most importantly for managers everywhere, such errors are never “just an IT problem.” While hacking and cybersecurity tend to grab the headlines, and the usual focus is system integrity, simple human error is a massive issue and a major crisis vulnerability. To crisis-proof the organisation, what’s needed is detailed attention from the executive suite. Improving data security demands better resourcing, better systems, better training, better supervision, better personal awareness and greater accountability. The old joke was that the most dangerous component in a motor car is the ‘nut behind the steering wheel.’ In the online world, the most dangerous vulnerability just might be the careless individual behind the keyboard.


Whistleblowers and the risk to reputation

Whistleblowers are a well-recognised possibility in most organisations. But how the organisation responds to a whistleblower can help determine whether the problem raised becomes a reputational crisis.

Take the recent case of Jes Staley, CEO of Barclays Bank, who provoked regulatory ire after he ordered bank security staff to try to identify a whistleblower who had raised questions about a friend Mr Staley had appointed. The issue was relatively minor, but regulators began investigating whether the CEO had breached rules relating to the treatment of whistleblowers. Staley apologised, but the bank now faces investor disquiet and unwanted reputational damage.

Or consider the quality recall currently under way at Hyundai, involving well over a million vehicles, which arose after a corporate whistleblower released internal documents about alleged engine defects. The recall is serious enough, but the New York Times reported that the whistleblower was fired for allegedly leaking trade secrets, then reinstated by Hyundai following a South Korean government ruling protecting whistleblowers. The company subsequently filed a criminal complaint against the man, which it dropped after he quit the carmaker last month.

Finally, consider the Fairfax Media report late last year which claimed senior executives at Leightons (later renamed CIMIC) were under investigation by regulatory authorities after failing to respond to an employee who raised allegations of bribery in India. The report said the whistleblower was sacked in 2014 and his concerns were ignored. It added that the Federal Government and the Opposition both subsequently urged consideration of whistleblower reform in the private sector.

The important lesson from such cases is that how a whistleblower is handled can sometimes trigger reputational damage even greater than the original issue. Management experts advise that whistleblowers typically try to raise concerns through the “normal channels,” and resort to the “nuclear option” only when they believe no-one is paying attention.

So what can be done to protect the organisation? Effective issue management and crisis prevention involve identifying problems early and taking steps to address them. This means whistleblowers should not be seen as a threat, but as a warning that something is not right. Organisations should encourage blame-free upward communication rather than a culture of “shooting the messenger,” and should have a formal process by which individuals can legitimately raise concerns.

The American writers Nystrom and Starbuck suggest that top managers should listen to and learn from what they call “dissenters, doubters and bearers of warnings” to remind themselves that their own beliefs and perceptions may well be wrong.

Similarly, the Australian disaster expert Andrew Hopkins argues that leaders should not rely on assurances from subordinates that all is as it should be. He describes problems as “lying in wait to pounce,” and says mindful managers should use every means available to probe for these problems and expose them before they can impact detrimentally on the organisation.

Indeed, Hopkins goes even further, arguing that mindful leaders should actually welcome bad news. “They recognise that it is often difficult to convey bad news upwards,” he says, “and they develop systems to reward the bearers of bad news.”

Welcoming bad news and rewarding Jeremiahs might seem a stretch for some executives, but there is no doubt that crisis-proof organisations have formal systems in place to identify and respond to concerns before whistleblowers feel the need to act. That’s how planned issue management and crisis prevention help protect reputation.


Celebrity or scientist: Who do you believe?

Some celebrities are famous for having kooky ideas – think no further than Gwyneth Paltrow. But when these ideas potentially threaten the health of thousands of citizens, it can be a real challenge for issue managers and risk communicators.

Television chef Pete Evans seems to be emerging as the latest example of that age-old conflict between opinionated celebrities and experts who actually know the facts.

As one of the stars of Australia’s top-rating My Kitchen Rules, Evans has long courted controversy with his promotion of the Paleo Diet. Matters came to a head when a publisher had to withdraw and pulp a book he co-authored which promoted bone broth and chicken liver pâté as superfoods for young babies.

But his continued claims that dairy strips calcium from your bones, that fluoride does not prevent cavities and that sunscreen is toxic have now prompted the Australian Medical Association to warn that his “extreme advice” endangers lives: “Celebrity chefs shouldn’t dabble in medicine.”

Managing Outcomes has no particular opinion about the veracity of his claims. But what makes this case of special interest here is Evans’ stout defence of what has been called a “celebration of ignorance.”

In an extraordinary TV interview, Evans was asked why he gave medical advice when he had no qualifications. “What do you need a qualification for to talk common sense? Why do you have to study something that is outdated, that is industry-backed, that is biased, that is not getting the results? That would be insane to study something that you’re gonna waste your time with? That’s just crazy.”

Evans is certainly not the first to take such a bold stance, and it’s ground which has been well travelled. However, as with anti-vaccine activists, anti-windfarm campaigners and others, such open antagonism towards science creates a major hurdle for issue managers and policy makers.

In an insightful essay in the latest issue of Foreign Affairs,* Tom Nichols wrote that people have reached the point where ignorance – at least regarding what is generally considered established knowledge – is seen as an actual virtue.  “To reject the advice of experts is to assert autonomy, a way for people to demonstrate their independence from nefarious elites – and insulate their increasingly fragile egos from ever being told they are wrong,” he says.

“I fear we are moving beyond natural scepticism regarding expert claims, to the death of the ideal of expertise itself: a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laypeople, teachers and students, knowers and wonderers – in other words, between those with achievement in an area and those with none.”

The reality is that managing issues is hard enough at the best of times, and legitimate scientific disagreement makes it even more difficult. Celebrity intervention in the face of scientific orthodoxy can make it virtually impossible. So what’s the answer? While recognising that issues are by definition contentious and often revolve around emotion and unfounded opinion, never stray from the known facts and the testimony of genuine experts.

As Nichols concluded: “Experts are the people who know considerably more about a given topic than the rest of us. They don’t know everything, and they’re not always right, but they constitute a minority whose views on a topic are more likely to be right than the public at large.”

* Tom Nichols, “How America Lost Faith in Expertise,” Foreign Affairs, Vol. 96, No. 2, March/April 2017.


Bean counters are vital for crisis prevention

Why are accountants and other money managers not always included as core players in crisis planning and preparedness? Why are they sometimes seen simply as tactical contributors, concerned mainly with business continuity and system recovery?

The reality is that their full participation is vital. Crises cause a variety of impacts (organisational, reputational, operational and political), but no impact is typically as immediate as the financial fallout.

The cost impact of a major crisis can be staggering. Think no further than BHP Billiton, which lost $8.9 billion of market value in a single day when the Brazilian government announced a mega-claim against the company arising from the Samarco dam collapse. Or think of the Costa Concordia disaster off the coast of Italy, where refloating and removing the sunken cruise ship took total costs to well over £1 billion. Or the Volkswagen emissions crisis, which has already cost the company more than $20 billion . . . and counting.

It might be tempting to think that such massive crises are exceptional and apply only to big multinational corporations. But a study of Australian crises over a ten-year period showed that one in four crises cost the organisation affected in excess of $100 million, and that more than 25% of those organisations went out of business or ceased to exist in their current form.

In the face of such stark numbers, you would hope that any organisation’s bean counters would be intimately involved in crisis planning. But here again the facts suggest otherwise.

One indication of the worrying reality can be seen in a survey of financial analysts and investor relations officers at companies across Canada and the United States. Of the responding analysts, 85% said a corporate crisis (fraud resulting in an accounting restatement) has the greatest negative impact on a company’s value, yet over 50% said their company’s plan prepared them only for an operational crisis. Furthermore, 50% didn’t even know whether their company conducted crisis simulations.

Commenting on these conclusions, Tom Enright, President of the Canadian Investor Relations Institute, said investor relations officers needed to play a much larger role in developing the crisis communications plan, executing crisis drills and regularly updating the document. He said they should be involved in the process from beginning to end, but our question is: why isn’t that always so?

Anyone who doubts the importance of this challenge need only look at the dark history of financial and accounting-related crises. Accounting fraud and financial mismanagement represent very high levels of risk, sometimes with massive losses, yet the resulting crises continue to grab the headlines: choose your favourite example. And most often, such cases go right to the heart of the new concept of Crisis Proofing, with its increased focus on prevention.

The crisis-proof organisation demands executives and managers at all levels who understand that crisis management is not just about what to do when a crisis strikes, but also about how to identify potential crises and how to act to reduce the chances of a crisis happening in the first place.

While companies have auditors, fraud units, risk assessors and forensic accountants, these functions are typically seen as part of asset protection and risk management rather than as core elements of crisis preparedness. However, industry experts know that most crises are preceded by red flags, and detecting those warning signs should be a vital role for accountants and financial managers. If they are not intimately involved, it’s time for a major rethink.


Crisis or disaster? IT has helped blur the language

It’s time business stopped misusing the word disaster, and the IT industry needs to take a good share of the blame.

Most recently, an April post on the Hewlett Packard Insights blog declared: “In general, anything that significantly impairs day to day work can be considered a disaster.” The reality is: no, it can’t!

Writer Wayne Rash went on to say: “It’s worth noting that a disaster in this (IT) context does not necessarily mean widespread destruction, loss of life, or general catastrophe. What a disaster means to you is defined by what interferes with your operations to the point that it endangers your business and thus requires a disaster recovery response.”

What Mr Rash is saying might just make sense in the IT world, where such language is common. But it’s bleeding into general management usage, and that’s a big problem.

Of course the IT industry can’t take all the blame for devaluing the word disaster. Contrary to typical news media headlines, losing a crucial football match is not a disaster, nor is a temporary fall in a company’s share price. In fact, in recent times, the word ‘disaster’ has progressed from being devalued to being entirely trivialised.

A celebrity posting an unwise twitter message is now labelled as a ‘PR disaster’ or a ‘social media disaster,’ while a Hollywood star choosing the wrong dress for a red-carpet event becomes a ‘fashion disaster.’

This language is genuinely unhelpful and distracts attention from serious matters of real concern. Consider, by contrast, the United Nations definition of a disaster: “A serious disruption of the functioning of a society, causing widespread human, material or environmental losses and exceeding the coping capacities of the affected communities and government.” Or, within a business context, the Dutch crisis experts Arjen Boin and Paul ’t Hart say: “A disaster is a crisis with a devastating ending.” Anything less just doesn’t qualify.

While there is clearly a massive difference between a pop culture ‘disaster’ and a true societal or organisational disaster, contamination of broader business language by misuse of the word has serious consequences for issue and crisis managers.

A key consequence arises from the widespread belief in the IT world that the answer to just about every such problem is a disaster recovery plan. As Mr Rash put it: “A disaster recovery response is the set of actions your organisation must take to continue operations in the face of an unforeseen event.”

Business continuity and operational recovery are vital, but they are just one tactical element of an organisation’s crisis management process. The modern approach recognises that crisis management should encompass crisis preparedness and prevention; crisis response; and post-crisis management (of which operational recovery is one part). It also recognises that crisis management applies to every type of crisis: financial, organisational, legal, political and reputational, not just operational.

We all love IT and the wonders the digital world can bring to issue and crisis management. But any organisation which says: “We have a great business continuity plan so we are crisis-prepared” is in line for a very big and very costly surprise.
