Decision Making and Observation Bias… exactly what I expected!



The February 2023 GMDP symposia marked the first time since the Covid-19 pandemic that the event could be held live. Nothing quite beats a live show, and this return after three years was a great display of the MHRA's drive for more meaningful interaction with industry - several times the speakers asked whether more symposia-style events would be helpful. A very candid discussion about the recruitment challenges the agency faces may, at face value, have set a rather dour tone for the rest of the event. However, I prefer to see it as the agency's effort to be more open and transparent, a quality that more junior industry members may not associate with the MHRA.


Those in the industry who have attended previous MHRA symposia will have come to expect a variety of presentation styles, peppered throughout with anecdotes, analogies, questions and polls. However, I was not expecting a faux talk show about the Inspection Action Group (hosted by Ewan Norton, of course!) or two inspectors in full Star Trek uniforms demonstrating good decision making in Root Cause Analysis!


The latter presentation is what I will focus on here because, however silly it may have appeared, it offered a great way to understand a somewhat complicated and often misunderstood topic. Root Cause Analysis is not a new concept, and many companies have opted for simple tools to support investigations. The key point highlighted by Christine Gray at the opening of this section was that while the tools may be sound, the decision making that drives them may be flawed. The psychology of decision making, especially in high-pressure environments, is a topic that could fill an entire symposium of its own. So the Star Trek role-played scenario, in which two engineers problem-solve an equipment failure, was an interesting choice for concisely portraying the difficulties of consistently good decision making - although I consider myself a huge sci-fi geek, so this may be my own personal bias.


Recognising when we are blinded by our own bias, whether through leaning too heavily on historical events as a basis for current findings or through an over-inflated sense of confidence in a topic, is only possible with very careful self-reflection. Christine Gray did a great job of explaining the different biases that may influence decision making, and we saw precisely the consequences of this on the starship, with several playful digs at Christine herself. We should all try to be conscious of observation, cognitive and confirmation biases, the availability and representativeness heuristics, and anchoring - not only in relation to investigations but in our broader interactions.


Overall, the message is simple: remain objective. Only use facts and empirical data to support root cause analyses, especially where decisions are being made that could lead the investigation down the wrong path. Another intriguing suggestion from Christine was to investigate a failure alongside someone you don't get on with. Your differing opinions and perspectives can be extremely useful here, where two people with similar viewpoints may overlook something pertinent to the true root cause. Working as a cohesive team to solve a common problem is something we have always encouraged at Broughton; however, like many things, I am sure we can do even better at utilising each other's experiences and expertise.


I, for one, am committing to greater self-reflection during routine investigations to understand where my own biases may be at play. The uncomfortable thought that Christine highlighted - "our own thinking may not be as good as we think it is" - is a challenge I look forward to overcoming, or at the very least, acknowledging!