Feb 14, 2012

Complexity, Evolution and Near Decomposability

I have pointed out before the relevance of (nearly) decomposable systems, modularity, and hierarchy for dealing with complex systems. Some of you had asked for a summary at some point.

I would like to know what you think of the concept and where you deem it useful for our discussions. 

•         It is Simon's account of how variety generation can produce structured (organized), complex results relatively quickly by employing hierarchical relationships in evolution.
•         It is potentially interesting with respect to how the costs of interaction between interfaced systems composed of (sub-)systems can be explained and measured.
•         It provides a potential basis for the measurement of "complexity" and risk based on the interaction strengths between systems (see the sketch below).
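On the last point, here is a minimal sketch of what such a measurement could look like. The index, the example matrix and the two-block partition are my own illustrative assumptions, not a measure Simon proposes: it simply compares the average absolute interaction strength between subsystems to the average strength within them.

import numpy as np

def coupling_index(A, blocks):
    """Crude indicator of near decomposability: mean absolute interaction
    strength BETWEEN subsystems divided by the mean strength WITHIN them.
    `blocks` is a list of index lists, one per subsystem.  Values near 0
    suggest weak external coupling; values near 1 suggest no block
    structure at all."""
    A = np.asarray(A, dtype=float)
    membership = {i: b for b, idx in enumerate(blocks) for i in idx}
    within, between = [], []
    n = A.shape[0]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            (within if membership[i] == membership[j] else between).append(abs(A[i, j]))
    return np.mean(between) / np.mean(within)

# Two subsystems {0, 1} and {2, 3}: strong internal, weak external interactions.
A = [[0.0,  0.9,  0.05, 0.0],
     [0.8,  0.0,  0.0,  0.1],
     [0.0,  0.05, 0.0,  0.7],
     [0.1,  0.0,  0.9,  0.0]]
print(coupling_index(A, [[0, 1], [2, 3]]))   # ~0.045: nearly decomposable

Whether such a ratio is a sensible proxy for "complexity" or risk is exactly the kind of question I would like to discuss.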

Here is a summary of one of Simon's core articles on the evolution of complex systems based on hierarchy and nearly decomposable systems.

It gives a summary of his thinking on the subject, inspired by natural systems and applied to social systems.

His view of the evolution of complex structures is based on the following assumption:

If the existence of a particular complex form increases the probability of the creation of another form just like it, the equilibrium between complexes and components could be greatly altered in favor of the former.
This process (obviously) makes the evolution of complex structures from building blocks in hierarchic structures much faster than the independent evolution of complex structures. The result of evolution "employing" such an approach is near decomposability: the resulting systems are built from simpler elements that interact more strongly internally than externally.
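Simon illustrates this speed-up with his watchmaker parable: stable subassemblies survive interruptions, whereas a flat, all-at-once assembly has to start over. The following toy calculation is my own sketch of that arithmetic, with assumed numbers (1000 parts, interruption probability 0.01 per operation, stable subassemblies of 10), not Simon's exact figures.

def expected_ops(n_parts, p):
    """Expected elementary operations needed to achieve n_parts consecutive
    uninterrupted assembly steps, when each step is interrupted with
    probability p and an interruption scraps the unstable partial assembly."""
    q = 1.0 - p
    return ((1.0 / q) ** n_parts - 1.0) / p

p = 0.01                      # assumed interruption probability per operation
flat = expected_ops(1000, p)  # assemble all 1000 parts in one go

# Hierarchical strategy: 100 stable subassemblies of 10 parts each,
# 10 stable mid-level assemblies of 10 subassemblies, 1 final assembly.
hierarchical = (100 + 10 + 1) * expected_ops(10, p)

print(f"flat: {flat:.3g} ops, hierarchical: {hierarchical:.3g} ops, "
      f"ratio: {flat / hierarchical:.0f}x")

With these assumptions the hierarchical strategy needs on the order of a thousand operations where the flat one needs millions; the exact ratio is immaterial, the point is the qualitative gap that stable intermediate forms create.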

Simon defines near decomposability as follows:

(a) in a nearly decomposable system, the short-run behavior of each of the component subsystems is approximately independent of the short-run behavior of the other components; 
(b) in the long run, the behavior of any one of the components depends in only an aggregate way on the behavior of the other components.
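A minimal numerical sketch of these two properties (the matrix, the initial values and the two-block partition are my own illustrative assumptions, not Simon's example): four variables repeatedly average over their neighbors, with strong weights inside each of the two blocks {x0, x1} and {x2, x3} and a weak 0.01 weight across blocks. Within a handful of steps each block has essentially equilibrated internally, almost as if the other block did not exist (property a); over hundreds of steps only the two block averages still move, and they influence each other only through those aggregates (property b).

import numpy as np

# Row-stochastic interaction matrix: strong within-block weights (0.49/0.50),
# weak cross-block weight (0.01).  Blocks: {x0, x1} and {x2, x3}.
W = np.array([[0.50, 0.49, 0.01, 0.00],
              [0.49, 0.50, 0.00, 0.01],
              [0.01, 0.00, 0.50, 0.49],
              [0.00, 0.01, 0.49, 0.50]])

x = np.array([1.0, 0.0, 10.0, 6.0])
for t in range(301):
    if t in (0, 5, 50, 300):
        # internal disagreement within each block, and the two block aggregates
        print(f"t={t:3d}  spreads: {abs(x[0] - x[1]):.4f} / {abs(x[2] - x[3]):.4f}"
              f"   aggregates: {x[:2].mean():.3f} / {x[2:].mean():.3f}")
    x = W @ x

# The within-block spreads collapse almost immediately (short-run behavior is
# essentially internal), while the aggregates a and b converge toward each
# other only slowly, following a' = 0.99*a + 0.01*b and b' = 0.01*a + 0.99*b
# (exact here because of the symmetric weights).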

He defines hierarchy as follows:

By a hierarchic system, or hierarchy, I mean a system that is composed of interrelated subsystems, each of the latter being, in turn, hierarchic in structure until we reach some lowest level of elementary subsystem. 

And he qualifies the arbitrariness of the scale at which a system is measured, i.e. how its basic elements (themselves defined as systems) are measured / classified, by saying:

In most systems in nature, it is somewhat arbitrary as to where we leave off the partitioning, and what subsystems we take as elementary.

He further qualifies it by comparing it to social organizations (which he described before):

Etymologically, the word "hierarchy" has had a narrower meaning than I am giving it here. The term has generally been used to refer to a complex system in which each of the subsystems is subordinated by an authority relation to the system it belongs to. More exactly, in a hierarchic formal organization, each system consists of a "boss" and a set of subordinate subsystems. Each of the subsystems has a "boss" who is the immediate subordinate of the boss of the system. We shall want to consider systems in which the relations among subsystems are more complex than in the formal organizational hierarchy just described. We shall want to include systems in which there is no relation of subordination among subsystems.

which relates to 

NEARLY DECOMPOSABLE SYSTEMS. We can distinguish between the interactions among subsystems, on the one hand, and the interactions within subsystems. ... As a second approximation, we may move to a theory of nearly decomposable systems, in which the interactions among the subsystems are weak, but not negligible.

THE DESCRIPTION OF COMPLEXITY. The fact, then, that many complex systems have a nearly decomposable, hierarchic structure is a major facilitating factor enabling us to understand, to describe, and even to "see" such systems and their parts.

The definition of near decomposability can be made more formal (with the system's variables ordered such that strongly interacting elements are placed near each other):

This article treats of systems that are nearly decomposable--systems with matrices whose elements, except within certain submatrices along the main diagonal, approach zero in the limit. Such a system can be represented as a superposition of (1) a set of independent subsystems ... and (2) an aggregate system having one variable for each subsystem. ...

From the abstract of Simon and Ando's 1961 "Aggregation of Variables in Dynamic Systems"
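In matrix terms (my own notational sketch of the quoted statement, not Simon and Ando's notation), a nearly decomposable system has a coefficient matrix of the form

\[
A \;=\; A^{*} + \varepsilon\, C,
\qquad
A^{*} \;=\;
\begin{pmatrix}
A_{1} &        &        \\
      & \ddots &        \\
      &        & A_{k}
\end{pmatrix},
\qquad \varepsilon \to 0,
\]

where each diagonal block A_i holds the (strong) interactions within subsystem i, the entries of \varepsilon C hold the (weak) interactions between subsystems, and the long-run behavior is approximated by an aggregate system with one variable per block, whose cross-couplings are themselves of order \varepsilon.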
This formalization then allows us to deal with the analysis and measurement of complex systems:

There is redundancy in complexity which takes a number of forms: Hierarchic systems are usually composed of only a few different kinds of subsystems, in various combinations and arrangements. Hierarchic systems are, as we have seen, often nearly decomposable. 

Hence only aggregative properties of their parts enter into the description of the interactions of those parts. By appropriate "recoding," the redundancy that is present but unobvious in the structure of a complex system can often be made patent. 

This allows us to reduce "the complexity" of the measurement of complex systems by compartmentalizing interactions and effects into areas or "boxes".
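As a small sketch of what such a "recoding" into boxes can look like in practice, reusing the example matrix from the sketch near the top (the aggregation rule, a plain average over each box, is my own illustrative assumption, not Simon's procedure):

import numpy as np

def aggregate(A, blocks):
    """Recode an n-variable interaction matrix into a k-box one: entry (r, s)
    is the average interaction strength from the variables of box r to the
    variables of box s."""
    k = len(blocks)
    B = np.zeros((k, k))
    for r, rows in enumerate(blocks):
        for s, cols in enumerate(blocks):
            B[r, s] = A[np.ix_(rows, cols)].mean()
    return B

A = np.array([[0.0,  0.9,  0.05, 0.0],
              [0.8,  0.0,  0.0,  0.1],
              [0.0,  0.05, 0.0,  0.7],
              [0.1,  0.0,  0.9,  0.0]])
print(aggregate(A, [[0, 1], [2, 3]]))
# Large diagonal entries (within-box), small off-diagonal ones (between boxes).

The recoded 2x2 matrix has large diagonal entries and small off-diagonal ones, so only two aggregate variables, and their weak mutual coupling, need to be tracked when reasoning across boxes.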

N.B.: If one reads several of Simon's papers on causality and measurement, there are a number of Machian perspectives and topics; however, Mach's evolutionary / developmental / genetic perspective gets crushed in a systemic classification of the (Simonian) world, where the classification scheme seems to introduce a static character. (Please correct me if you think I am wrong!)

I would greatly appreciate your feedback and thank you for your attention!
