
November 2015

Let’s take you for a spin

Imagine that you are a caterpillar munching on a succulent leaf at the tip of a lush tree in the woods. Life is simple. You look back and remember a purposeful climb to reach that perfect leaf, not realising the millions of possible options you left behind at every branch you picked. In hindsight, your climb was a linear journey from the root to the ultimate lunch. However, the hovering crow that is about to eat you knows that your leaf is not so unique.

This is a tale about navigating chaos in engineering. Engineers, trained as deterministic problem solvers, love to apply procedures to find the answer: sequences of actions or instructions to be followed in solving a problem or accomplishing a task. This mindset is everywhere in our methodologies. My contention is that we need to be more like that hovering bird than the inadvertent caterpillar.

Iteration
Something we believe must be avoided at all costs, and yet something we always fail to avoid in our madness to pursue a linear path to the solution. Contradictory as it sounds, we must introduce iteration into our methods; we need to plan for iteration by recognising the intrinsic nature of our thought process. We need analysis and synthesis in cycles of divergent and convergent thinking. We must build iteration into our conceptual design process because iteration creates value. It is, however, a diminishing-returns function: each successive iteration has less potential to create incremental value, to the point where the cost outweighs the benefit. So we must control the number of iterations we perform but exploit the value they create.

When you believe that engineering is a perfect deterministic process, you believe there is no value in iterations, only waste: the same set of data fed into the same process will always lead to the same answer. The caterpillar could not be happier on his leaf! But design engineering is far from perfect and far from deterministic. Two factors conspire against Cartesian solutions: complexity and uncertainty. Our engineering is riddled with chaos.

Complexity
We design complex systems installed in extreme environments using very expensive marine equipment. One bad decision could cost millions. Going back to the hardware store to replace a mismatched bolt and nut can cost a million dollars in standby rates. The million-dollar nut forces us to plan at a very fine level of detail without losing sight of the huge magnitude of our projects. This makes our engineering at least as complex as aerospace engineering.

When our proverbial caterpillar crosses eight branching nodes, he has haphazardly picked one leaf out of 256 (2^8). In other words, an eight-level yes/no decision tree creates a domain of 256 possible outcomes, and it is clear that the choices we face are far more complex than that. No engineering team is perfect and many suboptimal choices will be made. Every time we navigate the decision tree we may follow a different path. In some cases we may end up in a completely different configuration by changing just one choice: the butterfly effect. Given that engineering design is an imperfect process, we will hit a different solution on every iteration, some better than others. The greater the variability, the greater the potential for improvement; a justification for thinking outside the box. Without a proper selection process this variability would create a random walk around the optimum target. With a proper concept selection process, each iteration provides an option to find a better solution. The process is analogous to Darwinian evolution: the best random variations are selected, improving adaptation and the survival of the fittest. In that selection process any good option is kept and any bad one is discarded; after N iterations we have picked the best of N solutions. Iterations allow us to make better decisions by comparison, by picking the best solution out of many options. Time and cost are of the essence, so the number of iterations must be optimised.
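As a quick sanity check on that arithmetic, here is a minimal Python sketch; the number of decisions and the example paths are illustrative, not from the article. It shows how fast a yes/no decision tree grows and how flipping a single early choice lands on a different configuration.

```python
# Size of a yes/no decision tree: n binary decisions give 2**n configurations.
for n in (8, 10, 20):
    print(f"{n} yes/no decisions -> {2 ** n:,} possible configurations")

# Flipping one early choice selects a different leaf of the tree entirely.
path_a = (1, 0, 1, 1, 0, 0, 1, 0)   # one arbitrary walk through 8 decisions
path_b = (0,) + path_a[1:]          # same walk with the first choice flipped
print("Same configuration?", path_a == path_b)   # False: the butterfly effect
```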

Figure 1 – Option Value of Iterations

Now let's be more rigorous with this explanation. Let's leave the caterpillar and the crow alone and use a mathematical example. Given that engineering is an imperfect stochastic process that yields a different outcome on each independent trial, and assuming the results follow a normal distribution, the value created or destroyed can be represented as a bell curve (mean 0, sigma 30 in this example). If we repeat the trial N times and select the best outcome, we modify the probability distribution. The curves for 1, 5, 10, 15 and 20 iterations show the probability distribution of value after that many iterations, with mean values of 0, 36, 47, 53 and 57 respectively, clearly showing the option value of extra iterations. The standard deviation also drops as iterations accumulate: 30, 20, 18, 17, 16. In a stochastic engineering process, iterations create value and reduce risk. Because extra engineering can be costly, the net gain has a maximum at a finite number of iterations. The chart below clearly shows a learning curve. The very first iterations prune the bad options; no negative-value solutions survive the selection process. Subsequent iterations create value and reduce risk. This is the mathematical justification of a learning curve.

Figure 2 – Learning Curve
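For readers who want to reproduce Figure 1, here is a minimal Monte Carlo sketch in Python (NumPy assumed; the seed and trial count are arbitrary). It draws the value of each design trial from a normal distribution with mean 0 and sigma 30, keeps the best of N trials, and lands close to the means and standard deviations quoted above.

```python
import numpy as np

rng = np.random.default_rng(0)

SIGMA = 30          # spread of value created/destroyed in a single trial
TRIALS = 100_000    # Monte Carlo repetitions of the whole experiment

for n in (1, 5, 10, 15, 20):
    # n independent design outcomes per experiment; selection keeps the best one
    best = rng.normal(0.0, SIGMA, size=(TRIALS, n)).max(axis=1)
    print(f"{n:2d} iterations: mean {best.mean():5.1f}, std dev {best.std():4.1f}")
```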

Uncertainty
In the previous discussion a single set of data created different outcomes. We also know well that a big source of uncertainty is the input data itself. Input data uncertainty comes from several sources:

  • Limited sampling – fluid data from one test well representing the properties of a reservoir
  • Non site specific – no available data from the job site
  • No time history – limited historical records
  • Faulty models – imperfect reservoir models
  • Circular references – the chicken and egg conundrum – our mathematical models are full of circular references that span multiple disciplines
  • Wrong assumptions – humans constantly changing their mind
  • Prediction of future scenarios
  • And many more.

As the project progresses, nature reveals itself in a mischievous fashion, constantly changing our assumptions and results. Every iteration creates a new set of results, which feed back through circular references to create yet another set of results. Are you dizzy yet? Have you already abandoned all hope of following a linear deterministic path? Are you ready to embrace the complexity and uncertainty that make conceptual engineering a stochastic process? At this point you should realise that iterations are not that bad. Iterations can be a powerful tool that gives us solutions we cannot find otherwise. Iterations gave us powerful methods like Newton-Raphson and genetic algorithms. Are you ready to accept the iterations creed?

Wait! Not so fast. Too many iterations can be really bad too. Let's look at some of the perils of iterations:

  • Falling into a divergent loop that never converges on a solution
  • Multiple solutions – need to select optimum
  • Data uncertainty blurring the selection process
  • Diminishing returns on cost/benefit of each iteration
  • Schedule delays
  • Cost overruns
  • Cascade effects on other disciplines.

What is the right answer? Is there a Goldilocks number? Yes – 5! No, I am kidding. Or maybe not. The point is that we must accept the iterative nature of engineering and plan for it.
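There is no universal Goldilocks number, but the trade-off is easy to sketch: take the best-of-N value from the simulation above and subtract a cost for each extra iteration. The peak of the net gain is the "right" number for that cost. The cost per iteration used here is a made-up assumption, so treat the answer as illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

SIGMA = 30               # spread of value per trial, as in the Figure 1 example
COST_PER_ITERATION = 4   # hypothetical cost of one extra iteration (assumption)
TRIALS = 200_000

net_gain = {}
for n in range(1, 16):
    best = rng.normal(0.0, SIGMA, size=(TRIALS, n)).max(axis=1)
    net_gain[n] = best.mean() - COST_PER_ITERATION * n

best_n = max(net_gain, key=net_gain.get)
print(f"Net gain peaks at {best_n} iterations ({net_gain[best_n]:.1f}) for this cost")
```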

Iterate with a purpose. Learn from each iteration and improve the solution on each lap. Do this in a controlled, methodical way, just like a Newton-Raphson algorithm. Here is my engineering training begging for some order in this chaos.
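For the record, here is a minimal Newton-Raphson sketch in Python; the function names, tolerance and example are illustrative, but the loop shows exactly the kind of controlled, convergence-checked iteration the method is known for.

```python
def newton_raphson(f, dfdx, x0, tol=1e-10, max_iter=50):
    """Iteratively refine a root estimate of f; each lap improves on the last."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:   # converged: further iterations add little value
            return x
    raise RuntimeError("No convergence: a divergent loop, one of the perils above")

# Example: solve x**2 = 2 starting from a rough first guess
print(newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0))  # ~1.4142135
```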

The stage-gate process provides a good framework for controlled iterations. Early iterations focus on system-wide optimisation; later, subsystems and components come into focus. So it is indeed five:

  • Explore
  • Appraise
  • Select
  • Define
  • Execute.

But each phase is itself iterative, in a recursive fashion (Recursive: see recursive): Framing, Select, Define, Detail. As the project progresses, the cost of each iteration increases and the big decisions are cast. Early screening considers multiple options and data scenarios at minimum cost and with maximum value creation. Exploring a wide range of solutions is only possible during early conceptual engineering.

Iterations are a powerful value-creation tool but must be managed carefully to avoid value erosion. Iterations must be structured within the project realisation process. Each phase is an iteration and there are iterations within each phase. Early iterations address 'big' system-level decisions; late iterations focus on 'smaller' subsystems and components. In a recursive fashion, a system, subsystem or component can be designed with the same iterative methodology. Early iterations cost very little, so do as many of them as you can. Late systemic iterations cost a lot of money and can create havoc in a project. Second-guessing early decisions can be fatal, and spending too much engineering effort optimising components is wasteful. Each iteration should be based on a consistent set of data: changing data in the middle of an iteration creates a lot of confusion, and all disciplines must work on the same version of the data.
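To make the recursion concrete, here is a toy Python sketch with hypothetical names: the same iterate-and-select loop is applied first to the system and then, locally and more cheaply, to each subsystem and component.

```python
import random

def design(item, n_iterations, depth=0):
    """Toy iterate-and-select loop, applied recursively down the system breakdown."""
    # Each iteration is an imperfect trial; the selection step keeps the best one.
    best = max(random.gauss(0, 30) for _ in range(n_iterations))
    print("  " * depth + f"{item['name']}: best of {n_iterations} trials = {best:.1f}")
    # Recurse: subsystems and components get their own, later and smaller, iterations.
    for child in item.get("children", []):
        design(child, n_iterations, depth + 1)

system = {
    "name": "system",
    "children": [
        {"name": "subsystem A", "children": [{"name": "component A1"}]},
        {"name": "subsystem B"},
    ],
}
design(system, n_iterations=5)
```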

Is your head spinning now?