Global Systems and the Mechanism of Entropy
A city is a multidimensional system of individuals in a common geographical location who rely on various types of interaction to ensure the cohesion of the collective. This collective coherence should sustain an operational functionality that ideally reflects the well-being of all involved, and the dynamics of interaction must be continually optimized to maintain it. Substandard maintenance and inadequate management result in operational failures: pathologies in which the systemic structure loses its integrity. A system unable to sustain an optimal performative mode is compromised in its health, and the resulting systemic illnesses, driven by increasing dispersal within the system, take spatialized forms that can be (and currently are) detrimental to the infrastructures of survival. Such a system is undergoing a process of entropy.
The concept of entropy carries different meanings in different domains. Within the scientific paradigm concerned with the nature of information, it serves as a relative measure of the degree of structuralization of a system, addressing characteristics of functionality, relativity and quantification. Entropy measures the relationship of structural differences at a specific level of the system from the formal aspect: it is not aimed at the information itself, but at the structural characteristics of its carrier.
Ambiguity still prevails around the implications of entropy for the technologies that define global society. Despite long-standing appeals from the professionals involved, pointing out the inevitable consequences of insufficient system maintenance, we are now facing the collapse of the global (eco)systemic infrastructures: the forecast has become our present. Current global system management is dealing with uncontrolled development trends resulting in system failures on a planetary scale. The condition of concentrated system structures (large metropolitan areas) as nodes of operations, together with their operative extensions, is crucial in determining the collective future of our civilization. This article attempts to explain this process and situate it within the framework of current planetary events.
Clausius Principle of Entropy
After formulating the Second Law of Thermodynamics in 1850, the German physicist Rudolf Clausius put forward the concept of entropy [1]. In 1867 he restated the law as the principle of entropy increase, which determines the direction and limits of physical processes in the macro state. The term originates from the Greek 'entropia', 'transformation content', from 'trope', meaning 'transformation' [2]. It was assigned to the fraction of energy in a body unable to produce work, corresponding to the transformation-value content of an energy transfer. The implication of this formula is that entropy is ever-increasing and that energy is always degraded when a transfer occurs, implying irreversibility.
dS = dQ / T
S - entropy
Q - heat
T - absolute temperature of the body
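As a minimal numerical illustration of Clausius' relation (the temperatures and heat quantity below are illustrative assumptions, not values from the article), consider heat flowing from a hot body to a cold one:

```python
# Illustrative sketch of dS = dQ / T for a finite heat transfer.
# Values are assumed for illustration only.

def entropy_change(dQ, T):
    """Entropy change (J/K) for heat dQ (J) exchanged at absolute temperature T (K)."""
    return dQ / T

dQ = 1000.0                            # joules flowing from hot to cold
dS_hot = entropy_change(-dQ, 400.0)    # hot body loses heat at 400 K
dS_cold = entropy_change(dQ, 300.0)    # cold body gains heat at 300 K
dS_total = dS_hot + dS_cold

print(round(dS_total, 4))  # positive: total entropy has increased
```

The cold body gains more entropy than the hot body loses, so the total can only grow, which is the irreversibility the text describes.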
Boltzmann’s Statistical Physical Entropy
In Boltzmann's statistical formulation of entropy from 1877, we look at the number of possibilities the system can assume, i.e. the number of possible arrangements of its states. The greater this number, the greater the entropy, and hence the greater the disorder. With greater disorder comes a lower quality of energy, in line with what Clausius stated as a fundamental law of the universe: the entropy of the universe tends to a maximum [3].
S = k ln W
S - entropy
k - Boltzmann constant
W - number of possibilities the system can assume
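A brief sketch of how S = k ln W behaves; the microstate counts below are illustrative assumptions:

```python
import math

# Boltzmann's S = k ln W: entropy grows with the number of microstates W
# (possible arrangements) the system can assume.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """Entropy for a system with W equally probable microstates."""
    return K_B * math.log(W)

# More possible arrangements -> greater entropy, i.e. greater disorder:
assert boltzmann_entropy(10**6) > boltzmann_entropy(10**3)
```

A single possible arrangement (W = 1) gives zero entropy: perfect order is the limiting case.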
Shannon’s Entropy of Information Sources
In 'A Mathematical Theory of Communication' (1948), Shannon gave a statistical character to entropy, considering the absolute limit of the best possible lossless encoding of a digital message. The particles are the bits composing the symbols, and entropy is the logarithm of the number of their possible arrangements weighted by the relative proportion of the symbols [4]:
H = -Σ p(xi) log p(xi)
H - entropy
p(xi) - the probability that the variable x assumes the value xi (over its possible values x1 ... xk)
Shannon's entropy was deemed to have a deeper meaning, independent of thermodynamics, and great significance for statistical mechanics. From this concept, the principle of maximum entropy was derived in 1957 as a new type of subjective statistical inference for setting up probability distributions on the basis of partial knowledge [5]. The principle allows the least biased estimate, which made entropy coherent with mechanical hypotheses and quantum mechanics. The older concepts were re-interpreted as uncertainty, while entropy maximization was used as a statistical inference to support spatial models simulating interaction [6].
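Shannon's formula, and the fact that the uniform distribution is the maximum-entropy (least biased) case, can be sketched as follows; the example distributions are assumptions for illustration:

```python
import math

# Shannon entropy H = -sum p(xi) * log2 p(xi), measured in bits.

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]  # maximum uncertainty over 4 symbols
skewed = [0.97, 0.01, 0.01, 0.01]   # one almost certain outcome

print(shannon_entropy(uniform))  # 2.0 bits: the maximum for 4 symbols
print(shannon_entropy(skewed))   # much lower: little remaining uncertainty
```

Given only partial knowledge ("there are four symbols"), the maximum-entropy principle selects the uniform distribution as the least biased estimate.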
Schrödinger’s Negative Entropy & Prigogine’s Adjustment
In 1944, Erwin Schrödinger, one of the founders of quantum mechanics, proposed the concept of negative entropy. In his book What is Life? he focused on the question of how the events in space and time taking place within the spatial boundary of a living organism can be accounted for by physics and chemistry [7]. He determined that "life feeds on negative entropy", meaning that entropy with a negative sign is a measure of order. How does the living organism avoid the decay to equilibrium? By maintaining its operative system (the metabolism): eating, drinking and so on. Referring to the exchange of matter, since every substance of the organism is also found in its environment, he states that living organisms constantly generate entropy, and that the only way to stave off death is to continuously draw negative entropy from the environment.
So in order to live, an organism is driven to eliminate entropy by continually maintaining itself at a stable, low entropy level, which stabilizes it at a correspondingly high level of order. Organisms of various degrees of origin and complexity (from a being, through a city, to a galaxy) live entirely in the order of absorption. A being takes in food and releases its degraded forms, which could make us think that it consumes food in order to obtain matter and energy. What the organism really does is absorb negative entropy from the environment to obtain order, or organization. Considering order and organization in the context of communication and control theory, we can draw a parallel with the functional interpretation of information: in this regard, life does not feed on negative entropy so much as it feeds on information. This opens up the subject of entropy in open systems, as opposed to the isolated systems of classical thermodynamics. The openness of the system, i.e. a sustained degree of exchange between the system and its environment, is the basis for the opposing conditions of entropy and negative entropy. If entropy describes a measure of chaos, disorder and uncertainty, then negative entropy constitutes organization, order and certainty.
In the 1960s, Ilya Prigogine re-adjusted the second law of thermodynamics and proposed the theory of dissipative structures [8], determining that the principle of entropy increase only holds for isolated systems, and that for open systems there are two additional factors to consider: (1) the external entropy flow caused by the exchange between the system and the environment, and (2) the entropy generated within the system itself. The first arises from interaction, the second spontaneously within.
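Prigogine's two factors are conventionally written as an entropy balance (standard notation, not quoted verbatim from the article):

```latex
dS = d_e S + d_i S, \qquad d_i S \ge 0
```

where d_eS is the entropy flow exchanged with the environment, which can be negative for an open system, and d_iS is the entropy produced irreversibly within the system, which is always non-negative. An open system can thus hold its entropy steady, or even lower it, only by importing enough negative entropy flow to offset its internal production.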
Entropy in Spatial Context
The thermodynamic concept of entropy was adapted to the evolution of landscape in 1962 by Leopold and Langbein [9], using the example of the geomorphology of a river basin. Their study assessed the energy distribution in a river system through a statistical-thermodynamic formulation of entropy. They demonstrated that the energy flow follows the most probable condition, a uniform distribution subject to physical constraints, thereby applying the concept of maximum entropy to a geographical context.
The concept was introduced into urban studies in 1970 with Wilson's publication 'Entropy in Urban and Regional Modeling' [6]. He proposed a framework for constructing spatial interaction models, reformulating the gravity models associated with transportation into maximum-entropy models estimating urban transport. Entropy thereby became subject not only to physical but also to social and economic constraints, and its applicability later expanded to various aspects of urban planning [6]. In the 1970s, Henri Theil offered a broader interpretation of his concept of relative entropy as a measure of division and spatial dispersion: it indicates the proportion of the maximum possible dispersion with which a variable is spread among spatial categories (spatial zones).
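Relative entropy as a proportion of maximum dispersion can be sketched as follows; the zone counts are illustrative assumptions, not data from the article:

```python
import math

# Relative entropy in the spirit of Theil: H / H_max, where H_max = ln(n)
# for a variable spread among n spatial zones. 0 = fully concentrated,
# 1 = maximally dispersed.

def relative_entropy(counts):
    """Proportion (0..1) of maximum possible dispersion across zones."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    H = -sum(p * math.log(p) for p in probs)
    return H / math.log(len(counts))

concentrated = [970, 10, 10, 10]   # activity piled into one zone
dispersed = [250, 250, 250, 250]   # evenly spread across four zones

print(round(relative_entropy(concentrated), 3))  # close to 0
print(relative_entropy(dispersed))               # 1.0: maximal dispersion
```

The same index applies to any spatial variable (population, employment, land use) partitioned into zones.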
Socio-economic and Cultural Entropy
Urban systems and their spatial structures emerge along with spatial entropy, maintaining optimal levels of aggregation depending on scale and a fractal law. However, entropy is no longer a concern of physics and urban geography alone; it also concerns economy, culture and society. Entropy in the sociological context was described in 1990 in Social Entropy Theory [10], which defines a correlation in the distribution of a population among subcategories of social variables (socio-economic status, technology, area, cultural affiliation, information etc.). Social entropy is at its minimum when the correlation among the variables is at its maximum: groups of individuals sharing the same level of wealth, education, occupation, values and assets. A high level of entropy corresponds to minimal correlation among the variables, bordering on or amounting to extreme segregation, dissociation and alienation of the individuals and groups living in the area. To maintain a high-quality coexistence and standard of living, entropy has to stay at a low level.
Another use of the concept of entropy in the sociological context is to measure the number of possible system structures for a given level of energy [11]. The total resources of the social system (culture, knowledge, assets, skills etc.) constitute the social system's energy, and just as in thermodynamics, this entropy tends to increase with time, while the probability of fluctuations decreases with the size of the social system.
In this context, we can interpret entropy as a degree of adaptability or resilience of the system itself and of its response to external events. In describing the structural order and behavior of systems, entropy has been immensely useful in demonstrating their elusiveness, ambiguity and volatility. It quantifies a measure on the spectra of organization versus chaos, order versus disorder, diversity versus uniformity, or usefulness versus uselessness on a large scale, with multiple possible sub-spectra on smaller scales. Applied to urban structures and urban behavior, it addresses the three big features [12]: location/position, flow/staticity and system/size.
Global System as Dissipative Structure
For a long time, the world has been discussing an impending crisis of our civilization: the economic bubbles, the debt, the inflation; the use of resources, the levels of their exploitation, and the burden this places on further economic growth. We see segregation in every possible domain and at every scale, collapsing ecosystems and social infrastructures, and extremely weak resilience.
The global society is a dissipative structure: it can only exist because it dissipates energy in order to stock information. The more energy, the more complexity, and the more complexity, the larger the required energy flow. At the beginning, our civilization appropriated energy (livestock, commodities, carbon etc.) while throwing entropy into the biosphere (pollution, extinction, warming that transforms the ecosystem) and into society (war, migration, divergent developments in the class hierarchies). Rapid growth and globalizing trends gradually covered the entire planet, making it difficult for the system to dissipate energy outwards. In terms of Prigogine's dissipative structure theory, if we consider our planet a dissipative system, it needs a growing energy flow and an exchange between the system and the 'outside of the system' to function properly. Since the planetary systems have reached their full capacity in area and complexity coverage and are no longer able to execute this exchange (interaction), entropy is spontaneously generated within the system.
Today's phenomena, such as global pollution, mass migration or endeavors to exit the planet, are evidence that the global system is no longer able to expel entropy outwards; the entropy returns as a diminishing dynamic and crashes the system. By this reasoning, the multi-level crisis, the global disruptions and the lack of resilience are just manifestations of the growing entropy inside the planet's own meta-system. The size and complexity of the global system, with its many subsystems, long allowed it to 'store the problem', i.e. to concentrate entropy in systemically weak and hierarchically subordinate areas such as developing countries, the lower classes or the young population. But as the inwardly directed entropy increases, we are now at the crucial point where the capacity of these weaker spots is filling up, and entropy is entering developed countries and higher social classes. These cumulative developments produce social instability and shake the core of the system. In practice this means a loss of control over the internal dynamics, an increase in the randomness of variables and an acceleration of events, while the system loses predictability. The system's loss of the capability to dissipate energy condemns it to fatal disruption and a deterioration of its operative functioning.
Multi-leveled Order
The operative deterioration is now evident on multiple scales. Water and air pollution, the acidification of the oceans and the extinction of species are signs of ecosystem collapse. We can observe similar phenomena on the scale of multi-layered sub-systems, for instance a single organism and its ability to function and operate within systems of larger scale.
The multi-layered dissipative complex system of systems, reaching up to the interplanetary scale, maintains its operative equilibrium. Our planet itself does not have a problem with its source of energy, the Sun: the energy received from the Sun is stable, increasing only gradually over very long periods of time. But the Earth's biosphere is currently collapsing, and the cause is not a disturbance of this interplanetary equilibrium. It is the influence of planetary sub-systems originating in human activity, causing one of the most serious crises in the planet's entire history. At this stage of advanced internal entropy, we can expect a reduction of complexity and eventually a systemic implosion on a global scale.
Mathematically, only a drastic reduction of the energy input could divert the increasing entropy. A reduction of the energy input, though, necessarily means a reduction of the complexity and information stored in the human subsystems. Current developments can thus follow these possible timelines, depending on the variables:
(1) Continuation of current trends, with energy input, information storage and complexity increasing > high probability of progressing inner entropy and system implosion, leading to collapse.
(2) Diversion of current trends through a considered reduction of energy input, information storage and complexity, leading to a radical simplification (and the shutdown of a large portion) of human-made systems while maintaining the current systemic order > high probability of entropy decrease and implosion diversion, leading to preservation.
(3) Continuation of current trends, with energy input, information storage and complexity decreasing through system replacement and re-organization of the variables > the possibility of maintaining equilibrium remains to be evaluated.
Whether or not humanity ceases to exist, the future adaptability of the biosphere is more probable than the persistence of civilization. Should human civilization collapse, a new civilization can emerge as long as the biosphere persists, and the biosphere is the more likely to persist, regardless of the degree of environmental livability humans leave behind. That degree of livability, however, may be determined by the speed of the collapse of human civilization and by the level of ecosystem depletion during the interval leading up to the collapse itself.
Strategies of Complexity Decrease
Let’s look at the working definition of complexity:
complexity = the number of components in a system + the variety of relationships among these components + the pace of change of both the components and the relationships
Complex systems, as this definition illustrates, are characterized by diversity, ambiguity and unpredictability of outcomes relative to inputs or to changes in conditions; they arise from the interaction of three dimensions: the number of components, the variety of relationships and the pace of change in both. This means we cannot easily tell what a complex system is going to do, and that it is more difficult to control. The number of components is obviously important: larger systems are often more complex, but they may just be more complicated. To distinguish between complicated and complex, we need to understand the relationships between the components and the rate at which they change.
An airplane is a complicated system, with many interconnected parts and interactive control systems, but there is a high level of predictability in how the overall system will react to changes in external conditions (temperature, pressure, wind), internal conditions (cargo, moving drinks trolleys, passengers) or the controls (engine thrust, elevators, ailerons). A city is an example of a complex system: while it also contains many interconnected and interactive systems, the pace of change of the components and their relationships is much harder to regulate. To decrease the measure of complexity reflected in spatial systems (particularly urban systems), the information input distributed into the system should involve a trade-off between the density of the distribution and the number of events that characterize it. This changes the pace: when the distribution pace changes faster than the number of events, the information input can decrease even as the city (the system) grows. These qualities are representations of entropy varying asymmetrically in time.
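The working definition above can be caricatured in code; the flat additive scoring and all of the numbers are assumptions for illustration only, not a calibrated metric:

```python
# Toy index following the article's working definition:
# complexity = components + variety of relationships + pace of change.
# All values below are invented for illustration.

def complexity_score(components, relationships, pace_of_change):
    """Naive additive index over the three dimensions of the working definition."""
    return components + relationships + pace_of_change

# An airplane: many parts and links, but relationships change slowly,
# so its behavior stays predictable (complicated rather than complex).
airplane = complexity_score(components=100_000, relationships=50_000, pace_of_change=10)

# A city: comparable counts, but components and relationships change fast.
city = complexity_score(components=100_000, relationships=50_000, pace_of_change=50_000)

assert city > airplane  # pace of change, not size, drives the difference here
```

The point of the sketch is that two systems of similar size can sit far apart on the complexity scale purely because of the third term, the pace of change.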
References:
[1] [3] Clausius, R. (1867) The Mechanical Theory of Heat: With its Applications to the Steam-Engine and to the Physical Properties of Bodies; J. Van Voorst: London, UK, p. 376.
[2] Oxford Dictionary
[4] Shannon, C.E. (1948) A Mathematical Theory of Communication. Bell Syst. Tech. J., 27, p.379–423.
[5] Jaynes, E.T. (1957) Information Theory and Statistical Mechanics. Phys. Rev. 106, p.620–630.
[7] Schrödinger, E. (1944) What is Life? The Physical Aspect of the Living Cell; Cambridge University Press: Cambridge, UK.
[6] Wilson, A. (2010) Entropy in Urban and Regional Modeling: Retrospect and prospect. Geogr. Anal. 42, p.364–394.
[8] Batty, M. (1974) Spatial entropy. Geogr. Anal., 6, p.1–31.
[9] Leopold, L.B.; Langbein, W.B. (1962) The Concept of Entropy in Landscape Evolution; United States Geological Survey: Washington, DC, USA.
[10] Bailey, K.D. (1990) Social entropy theory: An overview. Syst. Pract. 3, p.365–382.
[11] Stepanic, J.; Stefancic, H.; Zebec, M.S.; Perackovic, K. (2000) Approach to a Quantitative Description of Social Systems Based on Thermodynamic Formalism, p.98–105.
[12] Cabral, P., Augusto, G., Tewolde, M., Araya Y. (2013) Entropy in Urban Systems, Entropy 15.