If you enjoyed my articles on the lying and cheating culture of academia and the corporate world, you might have been wondering to yourself: how can we do better? And that’s exactly where behavioural risk management comes in.
Behavioural risk is the possibility of undesirable risk outcomes caused by poor workforce behaviours (or conduct) and the cultural factors that drive them.
At the end of the day, the whole is bigger than the sum of its parts. And behaviour within an organisation is no different. Organisational culture is the sum of employee behaviour, and then some. It is complex, very difficult to observe and even more difficult to change. There is possibly no ‘one size fits all’ for what a desirable culture looks like – it’s likely quite unique to each organisation and the organisation’s purpose, strategy and values, as well as external factors such as the competitive context.
Now you might be wondering: where does risk come into play here, beyond just organisational culture? The interaction of organisational culture with risk management is referred to as risk culture. This is a key focus area, including in regulation, to ensure the organisational culture supports strategy execution within risk appetite and promotes sound risk responses.
Potential risks come to life as issues because of human behaviours - people either doing (or not doing) things that are inconsistent with the strategy, values and risk appetite the organisation desires.
And here we have behavioural risk: the merger of the organisation’s culture and its risk appetite – as created through people’s behaviours.
Looking at the PwC example again, we can see what went wrong in terms of behavioural risk: there are (external) regulations in place to prevent illegal behaviours from happening, but there’s something in the organisational culture that somehow makes it worth the risk. With this example specifically, it was a lot of money. Billions were made through lucrative business deals in which multinational for-profits were made aware of multinational tax laws before they ever hit the governmental registry. There’s a lot of money in there. Now you might think: but doesn’t that mean there are just a few rotten apples in the bunch? Sure, there might be. But what did the good apples do? Seemingly, nothing. And that’s also a part of behavioural risk.
Feeling unable to speak up when you see something bad happen, or knowing for a fact that the messenger gets it worse than the perpetrator, means there’s no social opportunity to speak up. Social opportunity is one of the six key drivers in the COM-B model, which specifically identifies the barriers to (good) behaviours – here, the act of speaking up, or speaking out against someone. Social opportunity heavily relies on there being no social backlash for the behaviour (e.g. being fired, ostracised, disbelieved). People always seem to condemn whistleblowers, but I often wonder what other choice they had. It also makes me wonder, when organisations are *surprised* by a whistleblower, whether they really and truly are. For me the question remains: how did you NOT know?
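COM-B isn’t software, of course, but the logic of the model can be caricatured in a few lines of code. This is purely illustrative – the class and field names below are my own labels for the six sub-components (physical/psychological capability, physical/social opportunity, automatic/reflective motivation), not any official API. The point it sketches: knock out one driver, like social opportunity, and the behaviour doesn’t happen.

```python
from dataclasses import dataclass

@dataclass
class ComB:
    """Caricature of the COM-B model for one behaviour (here: speaking up).
    The behaviour only occurs when all six sub-components are in place."""
    physical_capability: bool
    psychological_capability: bool
    physical_opportunity: bool
    social_opportunity: bool      # e.g. no backlash expected for speaking up
    automatic_motivation: bool
    reflective_motivation: bool

    def behaviour_occurs(self) -> bool:
        # Every driver must hold; a single missing one blocks the behaviour.
        return all(vars(self).values())

# A capable, motivated employee who expects backlash stays silent:
employee = ComB(True, True, True, social_opportunity=False,
                automatic_motivation=True, reflective_motivation=True)
print(employee.behaviour_occurs())  # → False
```

Modelling it as a hard AND is a simplification – in reality the drivers interact and compensate – but it captures why fixing motivation alone won’t help if the social opportunity isn’t there.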
So how do you make sure an organisation manages its behavioural risk well? How do you manage an organisational culture? Step 1 is always clarity. What is it that you want, and what is it that you don’t want? From the PwC perspective, what PwC wanted was an ethical organisation that behaves in accordance with the law. Undesired was anything that wasn’t that – so the current situation. From the academic perspective, desirable behaviour is good and robust science: findings that replicate and aren’t based on fraud. A solid foundation from which others can build further.
Step 2 is identifying barriers to what you want. Often they can be found in the incentives of the individuals themselves: individual incentives are massively misaligned with these positive organisational outcomes. I think most people who work the crazy work weeks in consulting firms want to move up quickly, and preferably at great pay. How do you do that? Attract amazing contracts, have a lot of work, and have people seeking you out to work with them – and pay you. So you need a niche, a unique selling point that makes you that person. Well, insider information would do just the trick. Academia is no different. People at the bottom of it (early career researchers) are severely underpaid and overworked. The people at the top very often got there because they found something really novel, interesting, unlikely or ground-breaking. But that is really difficult to do, and is based more on luck than skill. Once you find something like this, you can build a theory around it, continue publishing on it, get some speaking gigs, start a company exploiting this one thing. You can make good money for being an expert. Money and vanity – who could resist?
But let’s not blame the individuals exclusively, because these incentives have to come from somewhere.
Step 3 is identifying if your current culture even enables the desired behaviour. And with both the examples just outlined above, the answer is no. There is no counterweight to the massive benefit that you can get from cheating. Now you’re probably thinking: ‘But Merle, there are plenty of counterweights! There’s the law, punitive measures, the risk of being found out, losing your career, never working again, your own conscience.’
Well, are there? Really?
The mere existence of counterweights is not enough for them to actually function as counterweights. We are in the science of behaviour – it’s all about perception.
If someone is, let’s say, *easily persuaded* into making millions, or into fast-tracking their career through not-so-legitimate means, do they perceive risk in the same way as someone who would absolutely not do that?
Maybe it’s even less behavioural than we think. Maybe this was a decision arrived at through cold hard logic: an expected value calculation. Take the years you can profit off the bad behaviour and the amount of profit, and weigh that against the risk of being caught and the negative consequences if you are.
Two moving parts here: the profit made in the good times, and your perception of the risk of actually being caught and having to face the bad times.
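The back-of-the-envelope calculation above can be sketched in a few lines of code. All the numbers here are invented for illustration – the interesting part is how much the *perceived* probability of being caught swings the answer, with profit and penalty held fixed.

```python
def expected_value(annual_profit, years, p_caught, penalty):
    """Naive expected value of misconduct: profit accrued over the years
    it goes unnoticed, minus the penalty weighted by the (perceived)
    probability of being caught. Illustrative only."""
    upside = annual_profit * years
    downside = p_caught * penalty
    return upside - downside

# Someone who perceives getting caught as very unlikely:
print(expected_value(2_000_000, 5, p_caught=0.05, penalty=50_000_000))
# → 7500000.0  (cheating "pays")

# Someone who takes the counterweights seriously:
print(expected_value(2_000_000, 5, p_caught=0.60, penalty=50_000_000))
# → -20000000.0  (cheating clearly doesn't)
```

Same profit, same penalty – the sign of the outcome flips entirely on perception. Which is exactly why counterweights that exist only on paper don’t deter anyone who doesn’t believe in them.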
You can see where this is starting to go wrong…
Step 4 then becomes fixing the issues just identified. And that’s hard. If you’ve found that there are people in your organisation not taking the ethics, legislation or strategy put in place remotely seriously, it’s time to start weeding out bad apples – and you do that by taking down the tree they come from, at the root. If the reason no ‘good apples’ spoke up is that the CEO or someone really high up is involved in this behaviour, get rid of them and whoever supported them. Because keeping them showcases that this is the way to the top. It emphasises, rather too obviously, that the left-hand side of the expected value equation is bigger than the right.

If things have already gone wrong, it’s time for an enquiry and a half, and proper punitive measures. You’ve found yourself in a status quo you don’t want to be in, and a status quo reset where some people will most definitely function as ‘examples’ will do more good than damage.

If things haven’t gone completely south yet and this is a preventive exercise, still do an enquiry, but more of a pre-mortem style one: where could this possibly go wrong? Here you might want to apply the sunk cost fallacy as a tactic – or almost the IKEA effect, if you will. Have everyone in the company take part in building the ethics guidelines. What you’re involved in, what you yourself put effort into and feel control over, you’re much more likely to abide by. Not abiding by your own standards would cause a healthy amount of cognitive dissonance, after all.
Oh, and rather obviously, if you do have behavioural risk and ethics committees in place, they should be independent. They should fulfil all criteria of the COM-B model, meaning that they have the power to put new standards and practices in place that NEED to be followed, and the opportunity to speak up and condemn. So that’s behavioural risk in a nutshell.
Don’t make the mistake of thinking behavioural risk management is only for heavily regulated industries, by the way (e.g. tax law, banking, medicine). Non-profits, academia and government also (need to) consider behavioural risk – where there are people, there are incentives, and you’d better make sure those are all on the same page!