
How to plan for the ‘unplannable’: human error


The traditional approach to risk management does not take into account human error. This, in itself, is a mistake.

Traditional risk management is narrowly focused on the technical and commercial aspects of project management, but that is only part of the project dynamic. Managing human error is crucial if you want your project to succeed.

“People are emotionally driven,” says Edward Moore, chief executive at project consultancy Resolex. “Each of us has individual filters and perspectives. It shouldn’t be a surprise that, when things go wrong, human ‘error’ is often found to be the cause. It therefore makes sense to include human behaviour in the risk equation.”

Behavioural risks occur when the people involved in your project don’t behave as you expect them to. It could be as simple as the wrong switch being flipped at a critical moment. It could be people on the project leadership team ignoring evidence that they don’t want to see. There is a very wide scope for human error. So how do you plan for it?

Understand the common factors

“While we all have individual quirks and preferences, we tend to react to our circumstances in a consistent manner,” says Moore. “When people feel threatened or disrespected, they will typically disengage from the situation. This can lead to a drop in the care and attention that they would otherwise pay to their work.”

There are three common factors that often cause a project to fail:

  • lack of alignment: individuals in the project team have different objectives to those of the sponsor or stakeholder;
  • lack of engagement with the project: leading to lower levels of commitment and minimal creative problem-solving; and
  • lack of resilience in the team: pressure from repeated deadlines and a lack of capable resources reduces productivity, and increases stress and the likelihood of errors.

Create a feedback tool

The simplest way to monitor behavioural risk is to create a feedback tool, usually in the form of a questionnaire. What kind of data you collect will vary according to where you are in the project cycle. Many web-based tools can help you collect data. Simple DIY options such as SurveyMonkey might do the job, but if you want something more complex, you might want to consider a bespoke system such as RADAR.

“Successful mitigation requires the identification of the appropriate behavioural risk and then monitoring on a regular basis,” says Moore. “It is better to collect small amounts of data on a regular basis than to do periodic temperature tests that ask too many questions.”
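This "small amounts of data, collected regularly" approach can be sketched in code. The following is a minimal, hypothetical example (the dimension names follow the three risk factors above; the scores, floor and drop thresholds are illustrative assumptions, not Resolex or RADAR values): each survey round records an average 1-5 score per dimension, and a dimension is flagged if its latest score is low in absolute terms or has fallen noticeably since the first round.

```python
# Hypothetical pulse-survey results: average score (1-5) per behavioural
# dimension for each survey round, oldest first.
ROUNDS = [
    {"alignment": 4.2, "engagement": 4.0, "resilience": 3.9},
    {"alignment": 4.1, "engagement": 3.6, "resilience": 3.4},
    {"alignment": 4.0, "engagement": 3.2, "resilience": 2.8},
]

def flag_behavioural_risks(rounds, floor=3.5, drop=0.5):
    """Flag a dimension when its latest score is below `floor`, or when it
    has fallen by more than `drop` since the first survey round."""
    first, latest = rounds[0], rounds[-1]
    return {
        dim: latest[dim] < floor or (first[dim] - latest[dim]) > drop
        for dim in latest
    }

print(flag_behavioural_risks(ROUNDS))
# → {'alignment': False, 'engagement': True, 'resilience': True}
```

The point of the trend check is exactly Moore's: a single "temperature test" would miss that engagement and resilience are sliding even while their early scores looked healthy.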

Moore uses the example of a multimillion-pound residential scheme, planned for a large site in north London. “The site put a lot of constraints on the design; it needed to deal with difficult underground conditions, as well as being subject to strict planning criteria. Between 100 and 150 people were involved over the two-year design period, drawn from many design and development backgrounds.”

The project team took an active approach to risk management and commissioned a RADAR survey to help mitigate human error. Questions covered the alignment, engagement and resilience within the different teams on the project, including resourcing, communication, team morale and project complexity.

“The responses were sent out to the whole team to coincide with project leadership meetings and were collated by a third party to allow for anonymity. The resulting report was presented to the project leadership team, who then agreed a course of action to respond to issues that were likely to be counterproductive to the design delivery process.”

Edward Moore discusses failure in more detail in the Spring edition of Project journal, which is available free to APM members.

Image: MJgraphics/Shutterstock.com

2 comments


  1. Donnie MacNicol, 21 June 2019, 08:52 AM

    Mark, many thanks for sharing this. I would go further and say that “risk management does not take into account the human dimension”. We have aimed to address this through research done with Praxis Framework to identify how different types of people perceive and then practise risk management. You can see an example of the thinking here - https://www.praxisframework.org/en/ima/risk-management This picks up on aspects such as “It could be people on the project leadership team ignoring evidence that they don’t want to see”, as naturally they would not prioritise or focus on the same aspects that a team made up of different personalities would. We are looking to address this by providing leadership teams with personal and collective insight that allows them to make more informed decisions after understanding preferences, biases, etc. It is ultimately the bringing together of the process and the people or, using the terminology that many of us dislike but that does sum it up rather well, the hard and the soft.

  2. Nicola Read, 27 June 2019, 04:18 PM

    This is such an important and often overlooked aspect of projects and programmes. The assumption seems to be that the delivery team always follow the correct protocols and are highly skilled, which is not always the case. I'd like to see more on how we account for this when managing risk, as it's very difficult to quantify or predict. It's an interesting acknowledgement by this article that the problem actually exists.