Wicked problems#

In brief#

A class of problems for which science provides insufficient or inappropriate resolution [1].

More in detail#

Wicked problems are not objectively given: their formulation already depends on the viewpoint of those presenting them [2].

In the spatial planning literature a distinction is made between tame problems and wicked problems. The former are problems with a set of well-defined rules and a clear goal, e.g. solving a sudoku (see the sketch below). There is, however, another set of problems that resists being framed in terms of search spaces, constraints, rules, and goal settings.
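
To make the contrast concrete, here is a minimal, purely illustrative sketch (in Python; it is not part of the original entry) of what makes a problem like sudoku tame: the state space, the permissible moves, the constraints, and the stopping condition can all be written down exhaustively, so a simple backtracking search is guaranteed to terminate with a correct answer.

```python
# Illustrative sketch: a tame problem (sudoku) has explicit rules, a finite
# search space, and an unambiguous goal test, so brute-force backtracking
# suffices. Wicked problems offer none of these ingredients.

def candidates(grid, r, c):
    """Values 1-9 not yet used in the row, column, or 3x3 box of cell (r, c)."""
    used = set(grid[r]) | {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    used |= {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return [v for v in range(1, 10) if v not in used]

def solve(grid):
    """Fill empty cells (marked 0) in place; return True when the goal test is met."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in candidates(grid, r, c):   # enumerable, permissible operations
                    grid[r][c] = v
                    if solve(grid):                # recurse on the remaining gaps
                        return True
                    grid[r][c] = 0                 # undo: trial-and-error is cheap here
                return False                       # dead end, backtrack
    return True                                    # no empty cells left: definite stopping rule
```

Every ingredient the solver relies on, an enumerable set of moves, a reversible trial-and-error loop, a definite stopping rule, is exactly what the ten markers below deny to wicked problems.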

This class of problems, named wicked problems [1], is largely determined by the professional skill of framing and addressing the problem in a particular way. These problems are political in nature, like policy around poverty. Their setting and solutions are contingent, depending on political view, available information, and formulation. There are ten markers that indicate wickedness [1]:

  1. There is no definite formulation of a wicked problem.

  2. Wicked problems have no stopping rule.

  3. Solutions to wicked problems are not true-or-false, but good-or-bad.

  4. There is no immediate test of a solution to a wicked problem.

  5. Every solution to a wicked problem is a 'one-shot operation'; because there is no opportunity to learn by trial-and-error, every attempt counts significantly.

  6. Wicked problems do not have an enumerable set of potential solutions, nor is there a well-described set of permissible operations that may be incorporated into the plan.

  7. Every wicked problem is essentially unique.

  8. Every wicked problem can be considered to be a symptom of another problem.

  9. The existence of a discrepancy representing a wicked problem can be explained in numerous ways. The choice of explanation determines the nature of the problem’s resolution.

  10. The planner has no right to be wrong.

Spatial planning may at first sight seem far removed from engineering, but with the embedding of technology in society we move towards a situation where framing problems as tame, with well-defined rules, falls short of capturing the potential impact implementations may have. In fact, many algorithms can already be regarded as policy in one way or another, as a particular implementation may privilege a certain train of thought or a particular set of actions. A policy that gives financial benefits to those in dire straits (e.g. those below the poverty line) can have a similar effect as an algorithm that allocates resources to those in need. The point being, we should not underestimate the similarity between administration and algorithm.
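
As a hypothetical illustration of that similarity (the threshold, the ranking rule, and the grant size below are invented for the example, not drawn from the entry), consider how a resource-allocation routine hard-codes what would otherwise be written down as policy:

```python
# Hypothetical sketch: an allocation routine whose parameters are, in effect,
# policy decisions. The poverty line, the ranking criterion, and the budget
# cut-off are all contestable choices framed by whoever writes the code.

from dataclasses import dataclass

POVERTY_LINE = 15_000   # eligibility threshold: a political choice, not a technical one

@dataclass
class Household:
    name: str
    income: float
    need: float  # e.g. number of dependants; which proxy to use is itself a framing choice

def allocate(households, budget, grant=1_000):
    """Give a fixed grant to eligible households, neediest first, until the budget runs out."""
    eligible = [h for h in households if h.income < POVERTY_LINE]
    eligible.sort(key=lambda h: h.need, reverse=True)   # who counts as 'most in need'?
    supported = []
    for h in eligible:
        if budget < grant:
            break           # the stopping rule is fiscal, not a solved problem
        budget -= grant
        supported.append(h.name)
    return supported
```

Changing POVERTY_LINE or the sort key changes who receives support, which is precisely the kind of decision an administration would otherwise debate as policy.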

If we apply these ten markers to algorithmic implementations, we can draw a number of inferences. First, they show the immediate impact of implementation: not only is an implementation a one-shot operation because it may skew public perception [3], it may also influence other potential solutions. Second, such solutions are political and not to be framed in terms of optimization: they are good-or-bad rather than true-or-false. Third, there is a variety of equally effective solutions that would be permissible, meaning that the choice for this particular one carries a certain political or personal weight.

The first point latches onto something else in planning theory: path dependency. When an implementation becomes embedded in society it is hard to remove [4]. We also see this in the philosophy of technology through the Collingridge dilemma [5]. In planning theory it shows that an implementation may affect future implementations, as it opens certain doors and closes others. Consider, for example, our use of the QWERTY keyboard: it persists partly because of its widespread adoption rather than its efficiency (DVORAK is more effective) [6]. The claim of path dependency is that such implementations (to which many others would be equivalent) can create a path that is hard to step away from.

This last inference is one that engineers should take into account when designing algorithms for a societal context. It means that they themselves become political players in the scheme of things rather than mere executors of the wishes of certain stakeholders. In essence, the role of the engineer and that of the policy designer are interlinked by their societal impact. Wicked problems are a way of showing the impact they have when dealing with bureaucracy, algorithmic or otherwise. Accountability comes into play when we consider that the engineer is not a neutral player within this game; they carry some of the blame for the outcome, as it was their framing of the problem that led to a particular solution. Of course, there are many ways to alleviate some of these problems.

Bibliography#

[1] Horst W. J. Rittel and Melvin M. Webber. Dilemmas in a general theory of planning. Policy Sciences, 4(2):155–169, 1973.

[2] Richard Coyne. Wicked problems revisited. Design Studies, 26(1):5–17, 2005.

[3] Bas Verplanken. Beliefs, attitudes, and intentions toward nuclear energy before and after Chernobyl in a longitudinal within-subjects design. Environment and Behavior, 21(4):371–392, 1989.

[4] Richard J. Lazarus. Super wicked problems and climate change: restraining the present to liberate the future. Cornell Law Review, 94:1153, 2008.

[5] David Collingridge. The Social Control of Technology. St. Martin's Press, New York, 1982.

[6] Paul A. David. Clio and the economics of QWERTY. The American Economic Review, 75(2):332–337, 1985.

This entry was written by Sietze Kuilman.