A very first attempt at a theory of metasystem transitions. Argues that science, art, philosophy and mystical experiences each provide complementary first steps towards a higher level of cognition, which would transcend the present conceptual-symbolic way of thinking.
It is shown that the existence of faster-than-light signals
(tachyons) would imply the existence and detectability of a privileged
inertial frame and that one can avoid all problems with reversed-time order
only by using absolute synchronization instead of the standard one. The
connection between these results and the EPR paradox is discussed.
"Equal causes have equal effects" is reformulated by defining causality as a distinction-conserving relation. Unpredictable, respectively irreversible, processes are analysed as processes in which distinctions are created, respectively are destroyed. Different types of partially causal and pseudo-causal relations are examined. Time order is derived from distinction conservation. It is argued that the emergence of macroscopic distinctions and causal relations is due to a self-organizing evolution, characterized by natural selection. The relationship between "physical" and "observer-dependent" factors in determining causal relations is discussed.
Rational cognitive processes are defined as processes controlled by an external system of rules. This control is represented by the conservation of distinctions, where a distinction is conceived as an element of cognitive structuration. Four classes of distinctions (patterns, states, rules, and values) and four classes of distinction processes (conservation, destruction, creation, and creation-and-destruction of distinctions) are defined. The resulting 4 x 4 grid is used to classify cognitive processes. This allows one to model "non-rational" phenomena, such as creativity, emotions, mystical experiences, ..., in a relatively simple way, as incompletely distinction conserving processes.
Dynamical representations used in physics are analysed from a "second order" viewpoint, as distinction systems constructed by an observer in interaction with an object. The creation, conservation and destruction of distinctions can be understood on the basis of a distinction dynamics. The fundamental mechanism is the variation through recombination and selective retention of closed combinations. The conservation of all distinctions is shown to provide a demarcation criterion, distinguishing classical from non-classical representations. Different non-classical representations (thermodynamics, quantum mechanics, relativity theory, ...) are classified on the basis of which distinctions they do not conserve. It is argued that the specific structures of these non-classical representations can be reconstructed by studying the properties of non-trivial closure.
The conceptual and formal structure of quantum mechanics is analysed from the point of view of the dynamics of distinctions occurring during the observation process. The Hilbert space formalism is simplified with the help of the concept of closure: closure of an eigenstate under an operator is generalized to the linear closure of a subset of states, and this is further simplified to orthogonal closure, meaning that a set of states can be distinguished by a single observation. Quantum states can be seen as (overlapping) subsets of unobservable infra-states, with the transition probability between two states proportional to the number of infra-states they have in common. This makes it possible to reconstruct the superposition principle. An analysis of the observation process leads to the interpretation of closed sets of infra-states as attractors of the dynamics induced by the interaction with the observation apparatus. This interaction is always partially indeterminate, because of the unobservable micro-state of the apparatus.
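As a minimal illustration of the stated proportionality, the following LaTeX sketch writes it out explicitly; the normalization by |A| is an assumption chosen for illustration, not a convention taken from the paper.

```latex
% State $a$ corresponds to a set $A$ of unobservable infra-states, state $b$ to a set $B$.
% The abstract only asserts proportionality to the number of shared infra-states;
% dividing by $|A|$ is one possible normalization, assumed here for illustration.
P(a \to b) \;\propto\; |A \cap B|,
\qquad
P(a \to b) \;=\; \frac{|A \cap B|}{|A|}
```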
It is argued that the difficulties in establishing foundations for a unified physical theory are due to the predicative structure of traditional scientific languages, whose descriptions reduce all phenomena to static, independent elements. A new language is therefore proposed, whose descriptions are fundamentally dynamic and holistic. It is based on the concept of the "arrow": a relational entity which is completely determined in a bootstrapping fashion by the other arrows it is connected with, so that it has no independent meaning. An arrow represents an elementary process, and connected assemblies of arrows represent physical structures. It is shown how the fundamentals of space-time geometry can be expressed in this extremely simple, "structural" language. It is argued that this description could be extended to the observation process, and thus to the fundamentals of quantum mechanics, by introducing cognitive structures.
Emergence is defined as a process which cannot be described by a fixed model, consisting of invariant distinctions. Hence emergence must be described by a metamodel, representing the transition from one model to another by means of a distinction dynamics. The dynamics of distinctions is based on the processes of variation and selection, resulting in an invariant distinction, which constrains the variety of, and thus defines, a new system. A classification of emergence processes is proposed, based on the following criteria: amount of variety, internality/externality of variation and selection, number of levels, and contingency of constraint. It is argued that traditional formal and computational models are incapable of representing the more general types of emergence, but that it is possible to generalize them on the basis of the dynamics of distinctions.
It is argued that in order to efficiently tackle complex problems, the user and the support system should interact intimately, complementing each other's weaknesses. The strengths and limitations of human intelligence can be derived from the mechanism of associative memory, and those of computer intelligence from "chunk-based" memory. A good interactive interface should hence allow one to translate between associative (context-dependent) and chunk-based (formal) representations. Associative knowledge can be expressed more explicitly through hypermedia, consisting of a network of connected chunks. Different mechanisms for supporting the creation of networks are reviewed: check-lists, outlining, graphic representations, search functions, ... These mechanisms can be complemented by looking for "closed" subnetworks, which define invariant, formal constraints that can be used to guide inferences. A prototype implementation of an interactive interface, the CONCEPTORGANIZER, is sketched, and some potential applications in the areas of idea processing, knowledge elicitation, decision support, and CSCW are outlined.
Maslow's need hierarchy and model of the self-actualizing personality are reviewed and criticized. The definition of self-actualization is found to be confusing, and the gratification of all needs is concluded to be insufficient to explain self-actualization. Therefore the theory is reconstructed on the basis of a second-order, cognitive-systemic framework. A hierarchy of basic needs is derived from the urgency of perturbations which an autonomous system must compensate in order to maintain its identity. It comprises the needs for homeostasis, safety, protection, feedback and exploration. Self-actualization is redefined as the perceived competence to satisfy these basic needs in due time. This competence has three components: material, cognitive and subjective. Material and/or cognitive incompetence during childhood creates subjective incompetence, which in turn inhibits the further development of cognitive competence, and thus of self-actualization.
It is argued that replicators evolving through natural selection on the basis of fitness are intrinsically selfish. Though the synergy resulting from cooperation is generally advantageous, selfish or subsystem optimization precludes the reaching of a globally optimal cooperative arrangement. This predicament is exemplified by the "Prisoner's dilemma". Different proposals to explain the evolution of cooperation are reviewed: kin selection, group selection, reciprocal altruism ("tit for tat"), and moralism. It is concluded that the proposed mechanisms are either too limited in scope, unstable, or insufficiently detailed, and that the analysis must therefore go beyond the level of purely genetic evolution if human "ultrasociality" is to be explained.
A new, integrated model for the evolution of cooperation is proposed, based on the concept of a meme, as replicating unit of culture. Meme evolution is much faster and more flexible than genetic evolution. Some basic selection criteria for memes are listed, with an emphasis on the difference between memetic and genetic fitness, and the issue of memetic units is discussed. The selfishness of memes leads to conformity pressures in cultural groups that share the same meme. This keeps group cooperation conventions (ethical systems), resulting from reciprocal agreements, from being invaded by selfish strategies. The emergence of cooperative systems is discussed in general as a "metasystem transition", where interaction patterns between competing systems tend to develop into shared replicators, which tend to coordinate the actions of their vehicles into an integrated control system.
Löfgren's criticisms of the "structural language", based on his "linguistic complementarity", are considered. Though the impossibility of complete description and the "non-detachability" of language are acknowledged, it is argued that a complementaristic conception does not provide a sufficiently clear understanding of the limitations encountered when determining the meaning of a representation. In the structural language, meaning is represented by distinctions, which are determined in a bootstrapping way by the other distinctions to which they are connected. The number of distinctions that can be included in the description is open-ended. That makes it possible to continuously adapt or extend the description, thus overcoming some of the limitations imposed by languages based on combinations of primitive distinctions.
Similar to the "Notes on the Principia Cybernetica Project"
This paper examines the proposition that covariation information guides judgments about the dimensionality of attributions on the basis of causal principles of contrast and invariance, which are derived from Mill's methods of difference and agreement respectively. It is argued that the standard attribution categories specified in earlier research (e.g., person, occasion and stimulus) represent just one extreme of the attributional dimensions and require the principle of contrast, whereas additional attributional categories reflecting the opposite extreme of the dimensions (e.g., external, stable, general) require the principle of invariance. In three studies, subjects were given covariation information, and were asked to rate the properties of the likely cause along the dimensions of locus, stability, globality and control. In line with the predictions, consensus with others, consistency in time, distinctiveness between stimuli and contingency of one's actions showed the strongest effects on judgments of locus, stability, globality and control respectively. Similar results were obtained in a fourth study, where subjects had to judge the influence of eight causes with varying dimensional properties. Moreover, these judgments were rated somewhat higher given causes requiring the principle of invariance rather than the principle of contrast.
A new conceptual framework is proposed to situate and integrate the parallel theories of Turchin, Powers, Campbell and Simon. A system is defined as a constraint on variety. This entails a 2 x 2 x 2 classification scheme for "higher-order" systems, using the dimensions of constraint, (static) variety, and (dynamic) variation. The scheme distinguishes two classes of metasystems from supersystems and other types of emergent phenomena. Metasystems are defined as constrained variations of constrained variety. Control is characterized as a constraint exerted by a separate system. The emergence of hierarchical systems is motivated by evolutionary principles. The positive feedback between variety and constraint, which underlies the "branching growth of the penultimate level", leads to the interpretation of metasystem transitions as phases of accelerated change in a continuous evolutionary progression toward increasing variety. The most important MSTs in the history of evolution are reinterpreted in this framework: mechanical motion, dissipative structuration, life, multicellular differentiation, sexuality, simple reflex, complex reflex, associating, thinking, metarationality and social interaction.
This paper examines to what extent Turchin's concept of metasystem transition, as the evolutionary integration and control of individual systems, can be applied to the development of social systems. Principles of collective evolution are reviewed, and different types of competitive or synergetic configurations are distinguished. Similar systems tend to get involved in negative-sum competition, and this precludes optimization at the group level. The development of shared controls (e.g. through conformist transmission) may overcome the erosion of group-level cooperation, and thus facilitate the emergence of a division-of-labor organization. The resulting social metasystem transition is exemplified by the emergence of multicellularity, insect societies and human sociality. For humans, however, the on-going competition between the cooperators produces an ambivalent sociality, and a weakly integrated social metasystem. Strengths and weaknesses of the main social control mechanisms are reviewed: mutual monitoring, internalized restraint, legal control and market mechanisms. Competition between individuals and (fuzzily defined) groups at different levels of aggregation very much complicates evolutionary optimization of society. Some suggestions are made for a more effective social organization, but it is noted that the possible path to social integration at the world level will be long and difficult.
Preface to "The Quantum of Evolution".
Formality, arguably the most important dimension of stylistic variation, is subdivided into "deep" formality and "surface" formality, which inherits most stylistic features from the more fundamental deep variant. Deep formality is defined as avoidance of ambiguity by minimizing the context-dependence and fuzziness of expressions. This is achieved by explicit and precise description of the elements of the context needed to disambiguate the expression. A formal style is characterized by detachment, accuracy, rigidity and heaviness; an informal style is more flexible, direct, subjective, and involved, but less informative. An empirical measure of formality, the F-score, is proposed, based on the frequencies of different word classes in the corpus. Nouns, adjectives, articles and prepositions are more frequent in formal expressions; pronouns, adverbs, verbs and interjections are more frequent in contextual expressions. It is shown that this measure (and related ones), though coarse-grained, adequately distinguishes more from less formal genres of language production, for some available corpora in Dutch, French, Italian, and English. A factor similar to the F-score automatically emerges as the most important one from a factor analysis of different language samples.
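A rough sketch of how such a word-class-based measure can be computed is given below; the grouping of categories follows the abstract, but the specific formula (difference of percentage frequencies, rescaled to a 0-100 range) is an illustrative assumption rather than the paper's exact definition.

```python
# Illustrative sketch of an F-score-like formality measure based on
# part-of-speech frequencies; the exact weighting is an assumption.

def formality_score(pos_counts):
    """pos_counts: dict mapping part-of-speech tags to raw counts in a text."""
    total = sum(pos_counts.values())
    freq = {tag: 100.0 * n / total for tag, n in pos_counts.items()}
    formal = sum(freq.get(t, 0.0) for t in ("noun", "adjective", "article", "preposition"))
    contextual = sum(freq.get(t, 0.0) for t in ("pronoun", "adverb", "verb", "interjection"))
    return (formal - contextual + 100.0) / 2.0   # higher score = more formal

# A noun-heavy sample scores higher than a pronoun/verb-heavy one.
print(formality_score({"noun": 30, "adjective": 10, "article": 12, "preposition": 14,
                       "pronoun": 8, "verb": 16, "adverb": 6, "interjection": 0, "other": 4}))
```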
Deep formality, measured on the basis of the frequencies of different word categories, is correlated with several variables. Deeply formal language appears to be characterized by high richness of distinctions, but low fluency (and thus diminished surface formality) at the level of unprepared speech. This can be measured as an increase in lexical richness, word length, utterance length, frequency of filled pauses, and a decrease in speech accuracy and speech rate. Among the causes of formality are need for unambiguous understanding, lack of feedback, and lack of shared context. The latter entails positive correlations between formality and the situational variables of audience size, difference in setting and in background between senders and receivers, and time span between sending and receiving. At the level of personality, formality appears to be correlated with gender (women tend to speak less formally), introversion, and academic level. Preliminary empirical evidence and theoretical explanations for these propositions are presented.
It is argued that the Internet computer network provides an almost ideal communication medium for systems researchers. Different Internet services are reviewed, with an emphasis on the World-Wide Web (WWW), a distributed hypermedia system that has recently become very popular. WWW allows researchers to publish complex, integrated knowledge systems electronically over the network. This knowledge can be interactively consulted, extended and edited by users anywhere in the world. The Principia Cybernetica Project, which aims at the collaborative development of an evolutionary-systemic philosophy, has set up such a WWW server: Principia Cybernetica Web. The architecture, contents and use of this system are described, with an emphasis on the tools (annotation, editing) available for co-operative development. Evolutionary methods for automatically reorganizing such a system are discussed, and the first results of an experiment with an adaptive hypertext web that learns from its users are reported.
On the basis of the perceptual control theory of Powers, the market mechanism is analysed as a negative feedback loop which controls the deviation between demand (goal) and supply (perception) by adjusting the amount of effort invested in the production process (action), through the setting of the price. The interconnection of distributed control loops for the different products and services facilitates the allocation of production factors over the different products. The resulting global control system becomes more efficient by learning how to be more sensitive to deviations from the goal, and less dependent on the availability of resources. In that way, it resembles the nervous system of a supra-individual organism, characterized by socially distributed cognition.
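The negative feedback structure can be illustrated with a toy simulation; the linear price and production responses and the gain values below are assumptions for illustration, not the paper's formal model.

```python
# Toy sketch of the market as a negative feedback loop: the price rises with
# the shortage (demand minus supply), and production effort follows the price,
# so the deviation between goal (demand) and perception (supply) shrinks.

def simulate_market(steps=200, demand=100.0, supply=60.0,
                    base_price=1.0, price_gain=0.02, production_gain=1.0):
    price = base_price
    for _ in range(steps):
        error = demand - supply                           # goal minus perception
        price = base_price + price_gain * error           # price signals the shortage
        supply += production_gain * (price - base_price)  # effort invested in production
    return supply, price

print(simulate_market())  # supply converges toward demand as the loop closes
```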
A list of the most relevant publications on complex, evolving systems is produced by counting the number of times each publication is cited in a collection of texts on the domain. The importance of these books and papers is summarized and put into its historical context by noting their authors' main contributions to the field, categorized by the research tradition from which they originated. These traditions include biology, physics, chemistry, mathematics, cybernetics, systems theory, economics and complex adaptive systems.
It is argued that the acceptance of knowledge in a community depends on several, approximately independent selection "criteria". The objective criteria are distinctiveness, invariance and controllability, the subjective ones are individual utility, coherence, simplicity and novelty, and the intersubjective ones are publicity, expressivity, formality, collective utility, conformity and authority. Science demarcates itself from other forms of knowledge by explicitly controlling for the objective criteria.
The symbol-based, correspondence epistemology used in AI is contrasted with the constructivist, coherence epistemology promoted by cybernetics. The latter leads to bootstrapping knowledge representations, in which different parts of the cognitive system mutually support each other. Gordon Pask's entailment meshes and their implementation in the ThoughtSticker program are reviewed as a basic application of this methodology. Entailment meshes are then extended to entailment nets: directed graph representations governed by the "bootstrapping axiom", determining which concepts are to be distinguished or merged. This allows a constant restructuring and elicitation of the conceptual network. Semantic networks and frame-like representations with inheritance can be expressed in this very general scheme by introducing a basic ontology of node and link types. Entailment nets are then generalized to associative nets characterized by weighted links. Learning algorithms are presented which can adapt the link strengths, based on the frequency with which links are selected by hypertext browsers. It is argued that these different bootstrapping methods could be applied to make the World-Wide Web more intelligent, by allowing it to self-organize and support inferences through spreading activation.
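A minimal sketch of such a frequency-based adaptation of link strengths is given below; the additive update and renormalization are assumed for illustration and are not the exact algorithms presented in the paper.

```python
# Each time a browser follows the link a -> b, that link is reinforced and the
# outgoing strengths of a are renormalized, so frequently used associations
# gradually dominate.

from collections import defaultdict

weights = defaultdict(dict)   # weights[a][b] = strength of the link a -> b

def reinforce(a, b, learning_rate=0.1):
    weights[a][b] = weights[a].get(b, 0.0) + learning_rate
    total = sum(weights[a].values())
    for node in weights[a]:
        weights[a][node] /= total   # keep outgoing strengths summing to 1

reinforce("cat", "pet")
for _ in range(5):
    reinforce("cat", "mammal")   # repeated use strengthens this association
print(weights["cat"])
```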
Collective intelligence is defined as the ability of a
group to solve more problems than its individual members. It is argued
that the obstacles created by individual cognitive limits and the difficulty
of coordination can be overcome by using a collective mental map (CMM).
A CMM is defined as an external memory with shared read/write access that
represents problem states, actions and preferences for actions. It can
be formalized as a weighted, directed graph. The creation of a network
of pheromone trails by ant colonies points us to some basic mechanisms
of CMM development: averaging of individual preferences, amplification
of weak links by positive feedback, and integration of specialised subnetworks
through division of labor. Similar mechanisms can be used to transform
the World-Wide Web into a CMM, by supplementing it with weighted links.
Two types of algorithms are explored: 1) the co-occurrence of links in
web pages or user selections can be used to compute a matrix of link strengths,
thus generalizing the technique of "collaborative filtering"; 2) learning
web rules extract information from a user's sequential path through
the web in order to change link strengths and create new links. The resulting
weighted web can be used to facilitate problem-solving by suggesting related
links to the user, or, more powerfully, by supporting a software agent
that discovers relevant documents through spreading activation.
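As an illustration of the last step, the sketch below runs spreading activation over a small weighted, directed graph; the decay factor, weights and node labels are illustrative assumptions rather than values from the paper.

```python
# Activation starts at the seed nodes and flows along weighted links, decaying
# at each step; highly activated nodes are the documents most related to the seed.

def spread_activation(graph, seeds, decay=0.5, steps=3):
    """graph: dict node -> {neighbour: weight}; seeds: dict node -> initial activation."""
    activation = dict(seeds)
    for _ in range(steps):
        updated = dict(activation)
        for node, act in activation.items():
            for neighbour, weight in graph.get(node, {}).items():
                updated[neighbour] = updated.get(neighbour, 0.0) + decay * weight * act
        activation = updated
    return sorted(activation.items(), key=lambda item: -item[1])

web = {"global brain": {"collective intelligence": 0.8, "superorganism": 0.6},
       "collective intelligence": {"stigmergy": 0.7},
       "superorganism": {"stigmergy": 0.4}}
print(spread_activation(web, {"global brain": 1.0}))
```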
a theoretical, philosophical discussion of the formality
concept expounded in a more empirical paper, with
emphasis on the intrinsic limitations of scientific modelling, such as
the Gödel theorem and the Uncertainty principle.
two subsequent papers arguing first on empirical, then on theoretical
grounds that the state of humanity as a whole is progressing.
Progress is defined as increase in subjective Quality-Of-Life (QOL) or
happiness. QOL is found to correlate with a range of socio-economic indicators
that turn out to represent the basic values or human rights: health, wealth,
security, knowledge, freedom and equality. Statistics are gathered to show
that each of these indicators has progressed over the last half century.
This provides a very strong indication that progress objectively occurs.
The second paper examines the evolutionary reasons for this progress: natural
selection leads to increasing fitness of people and ideas, and is further
boosted by knowledge and virtuous cycles. Negative side-effects and apparently
negative developments are discussed, including exhaustion, pollution, overshoot,
parasitism, and information overload. It is concluded that they can be
tackled without really endangering progress, but that there exists a "bad
news bias" in the media which creates a needlessly pessimistic mood.
A joint review of 5 books (by Pettersson, Maynard
Smith & Szathmary, Coren, Stewart and Turchin) discussing the
evolution of complexity levels.
a summary of PCP's philosophical assumptions, as discussed
in more detail on its website, and of their
implications for the practical development of the project.
a detailed exposition of the superorganism/global brain
view of society, and an examination of the underlying evolutionary mechanisms,
with applications to the on-going and future developments in a globalizing
world.
a slightly shorter version of our unpublished paper on formality of language.
An "adaptive representation" is defined as the information-proicessing structure a system uses to anticipate environmental changes. To understand this mechanism, we need an adaptive metarepresentation. This can be based on thebconcept of "distinction", which leads to a Boolean algebra structure. Dynamics is represented in this framework by morphisms between Boolean algebras. "Classical" processes are represented by automorphisms, which conserve all distinctions, "non-classical" processes by general morphisms, which delete or create distinctions. [Abstract of the main results of "Representation and Change"].
It is argued that in order to solve complex problems we need a new approach, which is neither reductionistic nor holistic, but based on the entanglement of distinction and connection, of disorder and order, thus defining a science of complexity. A model of complex evolution is proposed, based on distributed variation through recombination and mutation, and selective retention of internally stable systems. Internal stability is then analysed through a generalized mathematical closure property. Examples of closure in self-organizing and cognitive systems are discussed.
It is argued that in order to tackle a complex problem domain the first thing to do is to construct a well-structured problem formulation, i.e. a "representation". Representations are analysed as systems of distinctions, hierarchically organized towards securing the survival of an agent with respect to his situation. A preliminary variation-selection model is proposed for the generation of new distinctions. A research project for building a general model of representation construction is outlined, combining theoretical, computational and empirical-psychological approaches.
It is argued that the problems of emergence and the architecture of complexity can be solved by analysing the self-organizing evolution of complex systems. A generalized, distributed variation-selection model is proposed, in which internal and external aspects of selection and variation are contrasted. "Relational closure" is introduced as an internal selection criterion. A possible application of the theory in the form of a pattern directed computer system for supporting complex problem-solving is sketched.
Shorter version of the IJMMS paper on hypermedia interfaces.
Two paradigms for studying the relation between autonomy and cognition are reviewed and contrasted: the "artificial" paradigm, which sees autonomous systems as linear, information-processing organizations, and the "autopoietic" paradigm, which sees them as circular, self-producing organizations. It is argued that these two paradigms are not inconsistent but complementary, and that they can be synthesized in an encompassing paradigm based on the self-organization of complex systems through variation-and-selective retention, leading to the emergence of relatively autonomous subsystems. Some implications of such an encompassing paradigm on the level of science, technology, individual persons and society are outlined, with reference to the papers in this collection. It is argued that the further development of such a transdisciplinary approach will lead to a new "science of complexity".
New cybernetics is characterized by its concern for autonomous and cognitive systems, in contrast to classical cybernetics, which studies mechanistic systems. The latter systems are basically causal, i.e. they conserve distinctions, and can hence be easily controlled. The former systems, however, intrinsically destroy and create distinctions. Autonomy implies the active maintenance of the system-environment boundary, by compensating external perturbations. In order to do this efficiently, the autonomous system must formulate and solve adaptation problems, by making the adequate distinctions. It is proposed to study this dynamics of distinctions by means of an "adaptive metarepresentation", characterized by a hierarchy of adaptation levels, and a variation-selection type of process.
Complexity is defined as the combination of distinction and connection. Analysing a complex problem hence demands making the most adequate distinctions, taking into account connections existing between them. The concept of closure in mathematics and cybernetics is reviewed. A generalized formal concept is introduced by reformulating closure in a relational language based on connections. The resulting "relational closure" allows one to reduce low-level, internal distinctions and to highlight high-level, external distinctions in a network of connections, thus diminishing the complexity of the description.
The principle of natural selection is taken as a starting point for an analysis of evolutionary levels. Knowledge and values are conceived as vicarious selectors of actions from a repertoire. The concept of metasystem transition is derived from the law of requisite variety and the principle of hierarchy. It is defined as the increase of variety at the object level, accompanied by the emergence of a situation-dependent control at a metalevel. It produces a new level of evolution, with a much higher capacity for adaptation. The most important levels are discussed, with an emphasis on the level characterizing man as distinct from the animals. An analysis of the shortcomings of this "rational" system of cognition leads to a first sketch of how the next higher "meta-rational" level would look like.
reprint of journal paper.
A set of fundamental principles for the cybernetics domain is sketched, based on the spontaneous emergence of systems through variation and selection. The (mostly self-evident) principles are: selective retention, autocatalytic growth, asymmetric transitions, blind variation, recursive systems construction, selective variety, requisite knowledge and incomplete knowledge. Existing systems principles, such as self-organization, "the whole is more than the sum of its parts", and order from noise can be reduced to implications of these more primitive laws. Others, such as the law of requisite variety, the 2nd law of thermodynamics, and the law of maximum entropy production are clarified, or restricted in their scope.
It is argued that the analysis and control of complex systems demands a completely new, non-classical framework, based on a distinction dynamics. Dynamical representations are analysed as distinction systems. Classical representations are characterized by the fact that all distinctions are conserved. The creation, conservation and destruction of distinctions can be understood on the basis of a distinction dynamics. The fundamental mechanism is the variation through recombination and selective retention of closed combinations. The fact that the same process may be constrained by several independent closures is emphasized. Complex dynamics is analysed as an example of a theory with a limited dynamics of distinctions: distinctions can be destroyed but not created. It is sketched how a more general theory might be applied in solving complex problems in the form of a computer program based on variation and selection.
Short exposition of the principles underlying a cybernetic-evolutionary epistemology: function, units and development of knowledge. Overview of the main criteria that select newly generated knowledge units: distinctiveness, invariance, learnability, survival, reproduction, salience, formality, ease of expression, contagiousness, consensus.
Given that knowledge consists of finite models of an infinitely complex reality, how can we explain that it is nevertheless reliable most of the time? Survival in a variable environment requires an internal model whose complexity (variety) matches the complexity of the environment that is to be controlled. The reduction of the infinite complexity of the sensed environment to a finite map requires a strong mechanism of categorization. A measure of cognitive complexity (C) is defined, which quantifies the average amount of trial-and-error needed to find the adequate category. C can be minimized by "probability ordering" of the possible categories, where the most probable alternatives ("defaults") are explored first. The reduction of complexity by such ordering requires a low statistical entropy for the cognized environment. This entropy is automatically kept down by the natural selection of "fit" configurations. The high probability, "default" cognitive categorizations are then merely mappings of environmentally "fit" configurations.
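The effect of probability ordering on the expected amount of trial-and-error can be illustrated with a short sketch; this is a simplified illustration, not the paper's formal definition of C.

```python
# If categories are tested one by one, the expected number of trials is
# sum(rank * probability); trying the most probable ("default") categories
# first minimizes this expectation.

def expected_trials(probabilities):
    return sum(rank * p for rank, p in enumerate(probabilities, start=1))

skewed = [0.6, 0.25, 0.1, 0.05]                        # low-entropy environment
print(expected_trials(sorted(skewed, reverse=True)))   # defaults first: 1.6 trials
print(expected_trials(sorted(skewed)))                 # worst ordering: 3.4 trials
```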
If society is viewed as a super-organism, communication networks play the role of its brain. This metaphor is developed into a model for the design of a more intelligent global network. The World-Wide Web, through its distributed hypermedia architecture, functions as an "associative memory", which may "learn" by the strengthening of frequently used links. Software agents, exploring the Web through spreading activation, function as problem-solving "thoughts". Users are integrated into this "super-brain" through direct man-machine interfaces and the reciprocal exchange of knowledge between individual and Web.
Shorter version of Development and Publication of Systems Knowledge on the Internet
This paper describes our attempts to devise a number of algorithms that can make distributed hypertext networks such as the World Wide Web self-organise according to their users' knowledge. A number of experiments were conducted in which experimental networks of English nouns were browsed via the Internet by several thousand participants. These experimental networks evolved into a stable state that represented the participants' shared knowledge structure and associations.
Although the growth of complexity during evolution seems obvious to most observers, it has recently been questioned whether such increase objectively exists. The present paper tries to clarify the issue by analysing the concept of complexity as a combination of variety and dependency. It is argued that variation and selection automatically produce differentiation (variety) and integration (dependency), for living as well as non-living systems. Structural complexification is produced by spatial differentiation and the selection of fit linkages between components. Functional complexification follows from the need to increase the variety of actions in order to cope with more diverse environmental perturbations, and the need to integrate actions into higher-order complexes in order to minimize the difficulty of decision-making. Both processes produce a hierarchy of nested supersystems or metasystems, and tend to be self-reinforcing. Though simplicity is a selective factor, it does not tend to arrest or reverse overall complexification. Increase in the absolute components of fitness, which is associated with complexification, defines a preferred direction for evolution, although the process remains wholly unpredictable.
It is argued that the future of the senses is best studied through the metaphor of society as a superorganism. The precise correspondence of functions between society and a multicellular organism is outlined. The information processing functions (nervous system) of the social superorganism are strongly enhanced by the on-going network revolution. This leads to the view of the future world-wide web as a "global brain", encompassing many individuals.
an extensive, non-technical review of the basic concepts
and principles developed in theories of self-organization, such as order
from noise, attractors, entropy, fitness landscapes, bifurcations, feedback,
closure, etc.
Heylighen F. (2001): "Mining Associative Meanings from the Web: from word disambiguation to the global brain", in: Proceedings of the International Colloquium: Trends in Special Language & Language Technology, R. Temmerman & M. Lutjeharms (eds.) (Standaard Editions, Antwerpen), p. 15-44.
applications of associative networks to problems of ambiguity and meaning in language; the networks learn associations through Hebbian-style rules, either by measuring the co-occurrence of words in text or by tracking patterns of usage.
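A minimal sketch of such a Hebbian-style co-occurrence rule is shown below; the window size, increment and example sentence are assumed purely for illustration.

```python
# Words appearing together within the same window have the strength of their
# association incremented, yielding an associative network over the vocabulary.

from collections import defaultdict

associations = defaultdict(float)   # association strength between word pairs

def learn(tokens, window=5, increment=1.0):
    for i, word in enumerate(tokens):
        for other in tokens[i + 1 : i + window]:
            pair = tuple(sorted((word, other)))
            associations[pair] += increment   # Hebbian-style reinforcement

learn("the global brain emerges from the web of associations".split())
print(associations[("brain", "global")])   # 1.0: the two words co-occurred once
```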
shorter version of Global Progress I, with some additional material on why the measurement of subjective well-being or happiness is prone to "relativistic" distortions, and how these could be minimized
a dense introduction to and review of the basic ideas of cybernetics, including relational concepts, information and entropy, cyclical processes such as feedback, self-organization and autopoiesis, goal-directedness and control, the laws of requisite variety, requisite hierarchy and requisite knowledge, and a constructivist view of cognition and model-building.