Design Age - Part 2

In Part 1 of this series we identified the core issue with design for today's consumer-centric client as one that is complex, if not wicked.(1) We also noted that first solutions carry a high probability of failure, and that because there is no single right or wrong solution, there is a high probability of unintended consequences unforeseen by anyone.

In this installment of the series we are going to compare problem-solution outcomes under the traditional single-mind approach versus a collaborative-mind approach. We begin with the premise that
  • Solutions are found through progressively refined possible solutions, not simple linear decision-making.
 This premise is based on the idea that the linear solutions of the traditional Scientific Age can handle only a limited set of variables, whereas the Design Age uses a parallel solution method in which many minds approach a problem from varied points of view.

Let's look at a graph of the traditional Scientific Age solution method.
From left to right, the colored area under the graph line represents the unknown values. The solution is worked out in a linear, progressive manner, eliminating each unknown with a scientific hypothesis that is tested against a controlled set of data. As more information about modality, reaction, and effect is discovered, a solution is gradually revealed in the stepped form the graph shows, until a final solution is reached when all the variables are accounted for.

When we look at the emerging Design Age solution method we see a different kind of graphic representation of the solution.
Here we see a jagged line of resolution, which suggests that every time a near solution is derived, more unknowns are revealed and a new solution is sought. It is a progressive revelation of possibilities until, at some point, a solution that is 'good enough' is proposed. We may have used a scientific method of testing and proof through data, but not necessarily; in some cases the data may have been approximations or even outright guesses. In the end, some kind of resolution was accepted by the solution team through consensus.
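To make that jagged line concrete, here is a toy simulation of the process, my own illustrative sketch, not a model from Rittel or from this series. Every function name and number in it is invented for illustration: each pass resolves some unknowns, reveals some new ones, and the loop stops when the team's 'good enough' threshold is reached.

```python
import random

def iterate_until_good_enough(unknowns=100, good_enough=10, seed=1):
    """Toy model of the Design Age solution path: each pass resolves
    some unknowns but reveals new ones, until the remainder is judged
    'good enough'. All quantities here are purely illustrative."""
    rng = random.Random(seed)
    history = [unknowns]
    while unknowns > good_enough:
        resolved = rng.randint(20, 40) * unknowns // 100  # progress this pass
        revealed = rng.randint(0, 15) * unknowns // 100   # new unknowns exposed
        unknowns = max(0, unknowns - resolved + revealed)
        history.append(unknowns)                          # the jagged line
    return history

print(iterate_until_good_enough())
```

The returned list is the jagged line of resolution: it trends downward but not smoothly, and it ends not at zero but at whatever the team agreed was acceptable.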

A second premise deals with the time constraints of more complex and wicked problems. Time estimates for difficult problems always seem to be difficult for project managers; the time to deliver acceptable solutions always seems to be more than what was anticipated. Therefore the second premise is:
  • Time required for more complex problems grows exponentially.
The next graph shows the complexity curve of time required to solve more difficult problems. Notice the curve is not linear; it expresses some coefficient of difficulty that produces an exponential curve.
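A toy calculation shows why such a curve bends upward. This is my own illustrative model, not one given in the article: assume the Scientific Age view that each unknown adds a fixed cost, versus the wicked-problem view that each unknown multiplies the cost by some coefficient of difficulty. Both functions and all numbers are hypothetical.

```python
def linear_time(variables, per_var=1.0):
    # Scientific Age assumption: each unknown adds a fixed cost.
    return per_var * variables

def exponential_time(variables, difficulty=2.0):
    # Wicked-problem assumption: each unknown multiplies the cost
    # by a coefficient of difficulty (here, an illustrative 2.0).
    return difficulty ** variables

for v in (1, 5, 10, 20):
    print(f"{v:2d} unknowns: linear {linear_time(v):6.0f}, "
          f"exponential {exponential_time(v):10.0f}")
```

At 20 unknowns the linear estimate is 20 units of time while the exponential estimate is over a million, which is the shape of the curve the graph depicts and the reason estimates for wicked problems blow past their schedules.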

This curve is a depiction of the "no stopping rule" Horst Rittel mentions as part of the definition of a wicked problem. Failing to recognize a wicked problem often makes a project manager look bad: for every solution that looks like a resolution, more unknowns become evident, and more time is required to find new solutions that account for the newly revealed variables. Stated in other terms, an infinite amount of time would be needed to arrive at the complete 'right' solution. It is diabolical, and close to insanity, to continue applying Scientific Age solutions to this increasingly difficult problem type. Rittel suggested that for a single, traditional problem solver, insanity must surely be imminent.

The final premise we will look at in this article states:
  • Resources grow more scarce as complexity and time are increased.
This time we have two curves. One represents resource availability and the other the time needed to study the problem and arrive at a solution. Notice that as we move from left to right, the complexity of the problem increases from 'Simple' to 'Wicked'.

The first line, representing resource availability, declines from Simple to Wicked for the simple reason that knowledgeable and experienced solution providers become scarcer as the complexity of the problem increases.

The second line shows that possible solutions are further constrained by the expansion of facts available for consideration and by the lack of time and resources to evaluate an ever-expanding set of data. Multiple solutions are possible, and the lack of consensus within the solution team further extends the time to an acceptable solution.

At this point, if you think all is lost, you are close to the mark, for the simple reason that most of us are steeped in belief in the infallible nature of the Scientific Method. Yet we see here a case for its fallibility when it faces an unknown set of variables and an ever-increasing set of interrelated complexities. Designers truly have a difficult transition to make. We are taught to think in a da Vincian method of scientific proof, yet no substantial proof seems to exist as more interrelated, previously unaccounted-for data appear with each successive solution.

Horst Rittel suggested that a group of interested stakeholders approach the problem knowing that a single, fully thought-out solution is not probable; rather, they are to reach some solution based on a set of documented assumptions and data, arriving at a consensus position on what action will be taken. The group is also to be made aware that there will be unintended consequences of its decisions and actions, but that is part of the price of consensus and a 'best solution' result.

For some, this is a bitter pill to swallow, but if you look closely and honestly at wicked problems you find this is their nature: the solutions are of the 'good enough' variety, not perfection tested to the nth degree.

In the next article we will take a further look at the traditional versus emerging problem-solving methods, pitting the singular master solver against the collective mind of collaboration. Until then, remember: "Collaboration is the glue of success."

Continue to Part 3

(1) Horst W. J. Rittel and Melvin M. Webber, "Dilemmas in a General Theory of Planning," Policy Sciences 4:2 (June 1973), p. 155.
