Review

Emergent Intelligence in Generalized Pure Quantum Systems

by Miroslav Svítek 1,2

1 Czech Institute of Informatics, Robotics and Cybernetics, Czech Technical University in Prague, 160 00 Prague, Czech Republic
2 Department of Computer Science, University of Matej Bel, 974 01 Banska Bystrica, Slovakia
Computation 2022, 10(6), 88; https://doi.org/10.3390/computation10060088
Submission received: 17 April 2022 / Revised: 20 May 2022 / Accepted: 23 May 2022 / Published: 31 May 2022

Abstract

This paper presents the generalized information system theory, which is enlarged into pure quantum systems using wave probabilistic functions. The novelty of this approach is based on analogies with electrical circuits and quantum physics. Information power was chosen as the relevant parameter, which guarantees the balance of both components: information flow and information content. Next, the principles of quantum resonance between individual information components, which can lead to emergent behavior, are analyzed. For such a system, adding more and more probabilistic information elements can lead to better convergence of the whole to the resulting trajectory due to phase parameters. The paper also offers an original interpretation of information "source–recipient" or "resource–demand" models, including the not yet implemented "unused resources" and "unmet demands". Finally, possible applications of these principles are shown in several examples, from the quantum gyrator to the hypothetical possibility of explaining some properties of consciousness.

1. Introduction

Information theory was founded by Claude Shannon [1] and his colleagues in the 1940s and was associated with coding and data transmission, especially in the newly emerging field of radar systems.
Syntactic (Shannon) information was defined as the degree of probability of a given event and answers the question of how often a message appears. The probability model of information [2] defined in this way has been used for the design of self-repairing codes, digital modulations and other technical applications.
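As a minimal illustration of this definition (the function name and the numeric example are ours, not from the cited works), the syntactic information of an event with probability p is −log₂ p bits, so frequent messages carry little information:

```python
import math

def shannon_information(p: float) -> float:
    """Syntactic (Shannon) information of an event with probability p, in bits."""
    return -math.log2(p)

# A rare message (p = 1/1024) carries 10 bits; a certain one carries none.
print(shannon_information(1 / 1024))  # 10.0
print(shannon_information(1.0))       # -0.0, i.e., zero information
```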
Bar-Hillel and Carnap [3] first completed the model-theoretic work on semantic information. There are a number of other works on this topic, such as [4]. In this approach, semantic information asks how often a message is true. Zadeh [5] expanded this way of thinking into the theory of fuzzy sets, a tool that maps the degree to which an element is or is not a member of a set, expressed as a number between zero and one.
Knowledge subsystems organized into various interconnections can lead to the controlled dissemination of macroscopic work [6]. There is a huge literature on self-organizing systems and their applications in biology, e.g., [7,8,9].
Quantum informatics employs Hilbert spaces for mixed quantum states [10]. A summary of this area is given, for example, in [11]. For pure quantum states, quantum informatics describes environmental properties as the phase space [12] of interconnected events. The entanglement [13] of quantum states leads to space–time synergies between particular events, which can be considered as a form of ordering.
It is clear that living organisms communicate with their environment to benefit from it. If their activities offer the possibility and opportunity to organize the environment for other organisms, too, we can speak of ethical dealing and collective intelligence [14].
In general, we deal with changes and events, i.e., with what happened. We deal very little, if at all, with what has not happened, something that can be called non-events. Yet the monitoring of unrealized possibilities and opportunities in our environment means exploring its largest part.
What has not happened is a necessary complement to what has happened, and it is important to realize that it is sometimes more important than what has happened. Finding a method for creating a complementary map of what has happened and what has not happened is a major task in building a new, more comprehensive view of our environment. This is close to a description of the Frame Problem, probably the most famous task in Artificial Intelligence [14].
Section 2 and Section 3 summarize the basic definition of generalized system properties and generalized information systems. Section 4 enlarges this approach to generalized pure quantum systems with quantum emergent behavior. Section 5 shows the applicability of quantum emergent intelligence to explain complex system behavior. Section 6 includes a discussion of quantum consciousness and Section 7 concludes the paper.

2. Generalized System Properties

When modeling complex systems, generalized system properties describing an abstract view are often used. We can find a number of analogies in which generalized effort (strength, pressure, stress, etc.) and generalized flow (current, flow, velocity, etc.) play an important role. In Table 1, different examples of physical systems are presented.
The integral of a generalized flow is a generalized accumulation (electric charge, volume of fluid or gas, stretched spring, accumulated heat, etc.), and the integral of the generalized effort is the generalized momentum (kinetic energy, inertance). In Figure 1, parameters R, C and L represent proportionality constants between individual generalized system properties [15], which correspond to, e.g., resistance, capacitance or inductance.
The advantage of the proposed approach lies in the fact that the same principles as used in mechanics, electrical engineering, hydraulics, thermodynamics, etc. are applied for the modeling of generalized information systems. Extension to quantum models, where individual parameters are captured by complex functions, enables the modeling of various types of soft systems such as altruistic human behavior [16].
Thanks to the proposed analogies, it is possible to use many existing theories and procedures, such as the analysis and synthesis of electrical circuits, including positive or negative feedback, and thus describe emergent features of complex information systems.

3. Generalized Information Systems

3.1. Definition of Information Circuits

The concept of data means a change of state, for example from 0 to 1 or from 1 to 0, where the state vector is not necessarily only digital or one-dimensional. Every such change can be described with the use of a quantity of information in bits.
Information flow refers to the frequency of state/signal changes in bits per second, i.e., how often a change is carried out (quantity of information). Information content, on the other hand, characterizes the quality of information, i.e., how valuable the content is, and is measured in Joules per bit [15]. In information systems, we do not use Joules per bit but rather success events per bit, where a success event means an event or process completed in the information system due to received bits of information. It is evident that the relation between information flow and information content can take many time-dependent forms [17].
The concept of information power [15] has been constructed as the product of information flow and information content in order to study the problem of a system's response to certain information. The definition of information power follows the generally known formula: power is equal to the amount of work per time unit.
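As a unit-keeping sketch of this product (the numeric values are illustrative, not taken from the paper): multiplying a flow in bits per second by a content in success events per bit yields a power in success events per second, by analogy with electrical power P = U·I.

```python
def information_power(flow_bits_per_s: float, content_events_per_bit: float) -> float:
    """Information power = information flow * information content.

    Units: (bits/s) * (success events/bit) = success events per second.
    """
    return flow_bits_per_s * content_events_per_bit

# Hypothetical channel: 200 bits/s, each bit on average completing 0.05 processes.
print(information_power(200.0, 0.05))  # 10.0 success events per second
```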
For the sake of simplicity, let us imagine an information subsystem as an input–output information gate, as shown in Figure 2, that issues from a matrix representation in the following form:
$$\begin{pmatrix} I_2 \\ \Phi_2 \end{pmatrix} = \begin{pmatrix} t_a & t_b \\ t_c & t_d \end{pmatrix} \cdot \begin{pmatrix} I_1 \\ \Phi_1 \end{pmatrix} = \mathbf{T} \cdot \begin{pmatrix} I_1 \\ \Phi_1 \end{pmatrix} \tag{1}$$
where the matrix T is called the transmission matrix.
Between the input ports, input information content is available, and input information flow enters the system. Between the output ports, it is possible to obtain output information content, and output information flow leaves the system.
Let us now examine the input–output information gate we have created. Input quantities can describe purely intellectual operations. Input information content includes our existing knowledge, and input information flow describes the change to the environment in which our gate operates and the tasks that we want to be carried out (target behavior).
Long-term information gained in this way can be used for the targeted release of energy, where at the output of the input–output gate, there may be information content in the order of millions of Joules per bit (or profits in millions of dollars). The output information flow serves as a model for the provision of such services or knowledge.
The basis of generalized information systems is the ability to interconnect individual information subsystems, or in our case, input–output information gates. It is very easy to imagine the serial or parallel ordering of these subsystems into higher units. A very interesting model is the feedback of information subsystems, because this leads to non-linear characteristics, information systems defined at the limit of stability and other interesting properties.
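As a minimal sketch of such interconnection (the matrix entries and port conventions are illustrative assumptions, not prescribed by the paper), serial ordering of two gates composes their transmission matrices by multiplication, as in two-port circuit theory:

```python
import numpy as np

# Transmission matrices of two information gates: (I2, Phi2)^T = T (I1, Phi1)^T.
T_gate1 = np.array([[1.0, 0.2],
                    [0.0, 1.0]])
T_gate2 = np.array([[0.5, 0.0],
                    [0.1, 2.0]])

# Serial (cascade) connection: the output of gate 1 feeds the input of gate 2.
T_cascade = T_gate2 @ T_gate1

x_in = np.array([1.0, 3.0])   # input (I1, Phi1)
print(T_cascade @ x_in)       # output (I2, Phi2) of the cascade
```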

3.2. Information Environment

In the theory of information physics, the link between the information source and the recipient must be analyzed in more detail and generalized for the purposes of complex systems. On the side of the source, each event u can be described by the source information content and flow $I_S(u)$, $\Phi_S(u)$.
The recipient tries to process the event u in its environment, providing its registration and understanding. This process results in the representation of the received information content and flow $I_R(u)$, $\Phi_R(u)$ on the side of the recipient.
For many events u, we can suppose that there is no difference between the source information and the received information. We rightfully suppose that:

$$I_S(u) = I_R(u), \qquad \Phi_S(u) = \Phi_R(u) \tag{2}$$
For more complex events, this assumption is not valid. In social sciences, for example, we must have a lot of data available (information flow) to identify some social event (information content).
A typical information environment can be described with a gyrator matrix [18]:
$$\begin{pmatrix} I_S(u) \\ I_R(u) \end{pmatrix} = \begin{pmatrix} 0 & -D \\ +D & 0 \end{pmatrix} \cdot \begin{pmatrix} \Phi_S(u) \\ \Phi_R(u) \end{pmatrix} \tag{3}$$
The first equation explains that the more information content $I_S(u)$ the source contains, the greater the information flow $\Phi_R(u)$ to the recipient. The negative sign means that the information flow $\Phi_R(u)$ goes from the gate to the recipient and not the other way round, as is the usual convention in electrical circuits (Figure 2). The second equation states that the greater the information flow $\Phi_S(u)$ from the source, the higher the information content $I_R(u)$ of the recipient.
If the information gyrator with tuning parameter D is connected to the information capacitors $C_1$, $C_2$ modeling the source and recipient knowledge (Figure 3), a resonance with frequency f can occur [18]:
$$f = \frac{D^2}{2 \pi \cdot C_1 \cdot C_2} \tag{4}$$
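A numeric sketch of this resonance condition, assuming the reconstructed form of Equation (4) above (the parameter values are arbitrary illustrations):

```python
import math

def gyrator_resonance(D: float, C1: float, C2: float) -> float:
    """Resonance frequency of an information gyrator with tuning parameter D
    loaded by information capacitors C1 and C2, per Equation (4)."""
    return D**2 / (2.0 * math.pi * C1 * C2)

print(gyrator_resonance(D=0.5, C1=2.0, C2=4.0))  # ~0.005
```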
In the context of the gyrator, it is appropriate to deal with the problem of teaching, because the information subsystem called a teacher may be regarded as a source of information content $I_1$. The teacher has prepared this information content for years, so that the maximal information flow $\Phi_2$ can be passed on to a subsystem known as a student.
The students listen to the teacher's information flow $\Phi_1$, influenced by his/her pedagogic skills, and it changes their information content $I_2$. If the students are not in a good mood, or if the information flow $\Phi_1$ from the teacher is confused, the students are unable to understand the information received and to process it in order to increase their information content $I_2$.

3.3. Information Alliances

It is hard to find an appropriate system to combine the characteristics of the different information subsystems, but it is possible to create a group of subsystems—a system alliance [19], where these characteristics can be combined appropriately. In this way, one can model a company or a society of people who together create information output that is very effective and varied, leading to improved chances for the survival and subsequent evolution of the given group.
To model generalized information systems, multi-agent technologies can alternatively be used [20]. All requirements and resources are represented by demand agents and resource agents, which can negotiate among themselves. In a multi-agent environment, we can organize negotiations among agents through different modeling and simulation tools. Each model plays the role of a dynamic digital marketplace [14] with limited time-varying resources. Different demand agents negotiate in each time interval to capture the requested resources. The best system structure is created by combining requests and resources to satisfy all the demands, so that every match of resource and demand has its time slot.

4. Generalized Pure Quantum Systems

4.1. Definition of Pure Quantum System

Let us define N discrete events $A_i$, $i \in \{1, 2, \ldots, N\}$, of a sample space S, with time-dependent probabilities $P(A_i, t)$. The quantum state $|\psi, t\rangle_\eta$ represents a description of the pure quantum system given by the superposition of the N discrete events at location η and time instant t:

$$|\psi, t\rangle_\eta = \psi(A_1, t) \cdot |A_1\rangle_\eta + \psi(A_2, t) \cdot |A_2\rangle_\eta + \cdots + \psi(A_N, t) \cdot |A_N\rangle_\eta \tag{5}$$
with N wave probabilistic functions defined as [21]:
$$\psi(A_i, t) = \alpha_i(t) \cdot e^{j \cdot \upsilon_i(t)}, \quad i \in \{1, 2, \ldots, N\} \tag{6}$$

where $\alpha_i(t) = \sqrt{P(A_i, t)}$ is the modulus and $\upsilon_i(t)$ is the phase of a wave probabilistic function. We suppose that the reference phase is assigned to event $A_1$ at time t = 0 and is typically chosen as $\upsilon_1(0) = 0$.
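A minimal numeric sketch of Equations (5) and (6) (the probabilities and phases are illustrative): the moduli are square roots of the event probabilities, so the squared amplitudes recover the probabilities and the state stays normalized.

```python
import numpy as np

P = np.array([0.5, 0.3, 0.2])            # P(A_i, t) at a fixed time t
phases = np.array([0.0, 1.2, -0.7])      # upsilon_i(t); reference phase of A_1 is 0
psi = np.sqrt(P) * np.exp(1j * phases)   # wave probabilistic functions psi(A_i, t)

print(np.abs(psi) ** 2)                  # [0.5 0.3 0.2] -- probabilities recovered
print(np.sum(np.abs(psi) ** 2))          # 1.0 (up to rounding) -- normalization
```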
Mixed quantum systems generally employ Hilbert spaces, not only phase spaces, but for simplicity, we will show our results only for pure quantum states. The extension from pure to mixed states can follow [11] or, in matrix form, [22].

4.2. Quantum Emergent Properties

In the simplest case, we can analyze a pure quantum system with two states A, B. The only difference compared to the classical probability union (7) is that in the quantum case (8), the intersection $P(A \cap B)$ can have either a negative or a positive sign due to the cosine function [17]:
$$P(A \cup B) = P(A) + P(B) - P(A \cap B) \tag{7}$$

$$P(A \cup B) = \left| \sqrt{P(A)} + \sqrt{P(B)} \cdot e^{j \varphi} \right|^2 = P(A) + P(B) + 2 \cdot \sqrt{P(A) \cdot P(B)} \cdot \cos \varphi \tag{8}$$
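The difference between (7) and (8) can be checked numerically; in this sketch (values illustrative), the phase φ = π/2 reproduces disjoint events, φ > π/2 yields a positive intersection (union below P(A) + P(B)), and φ < π/2 a negative one (union above it).

```python
import numpy as np

def union_quantum(PA: float, PB: float, phi: float) -> float:
    """Equation (8): union with interference term 2*sqrt(PA*PB)*cos(phi)."""
    return PA + PB + 2.0 * np.sqrt(PA * PB) * np.cos(phi)

PA, PB = 0.2, 0.3
for phi in (np.pi / 2, 2 * np.pi / 3, np.pi / 3):
    print(round(phi, 3), round(union_quantum(PA, PB, phi), 3))
# 1.571 0.5   -> no intersection, as in (7) with P(A and B) = 0
# 2.094 0.255 -> positive intersection (left half-plane)
# 1.047 0.745 -> negative intersection (right half-plane)
```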
In system analysis, we examine various overlaps and correlations between subsets and try to identify and eliminate the common intersection $P(A \cap B)$ in (7). This makes sense when optimizing a frequency band, compressing data, etc. After such optimization, the probabilities $P(A)$ and $P(B)$ carry unique, non-overlapping information.
In system synthesis, we can extend $P(A \cup B)$ even if the probabilities $P(A)$ and $P(B)$ do not change. Let us assume that $P(A)$ and $P(B)$ represent the probabilities of two approaches A and B. Thanks to human creativity, we can assume the existence of a negative intersection $-P(A \cap B)$ due to the phase parameter φ, which modifies the probability union:

$$P(A \cup B) = P(A) + P(B) - P(A \cap B) = P(A) + P(B) + \left| P(A \cap B) \right| \tag{9}$$

It should be noted that without the existence of A and B, $P(A \cup B)$ could not be extended in this way.

4.3. Quantum Resource—Demand Model

By Equation (8), the probability union $P(A \cup B)$ can include both positive and negative probabilistic intersections $\pm P(A \cap B)$. According to Equation (8), a resource corresponds to the positive probabilistic intersection $+P(A \cap B)$, representing phases φ in the left half of the complex plane; we then obtain the same result as in (7). If the intersection is negative, $-P(A \cap B)$, corresponding to the right half of the complex plane, we can speak of a demand.
This simple example can be extended to a higher-dimensional set of components A, B, C, D, …, among which some intersections are positive and others negative. Within an encapsulated system, partial resources and demands cancel each other, but there still remain unused resources and unmet demands of the system as a whole.
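A toy bookkeeping sketch of this cancellation (the component names and signed intersection values are invented for illustration):

```python
# Signed pairwise intersections inside an encapsulated system:
# positive values model resources, negative values model demands.
intersections = {("A", "B"): +0.10, ("A", "C"): -0.06,
                 ("B", "D"): -0.09, ("C", "D"): +0.03}

resources = sum(v for v in intersections.values() if v > 0)   # 0.13
demands = sum(-v for v in intersections.values() if v < 0)    # 0.15

# Partial resources and demands cancel; the residue is what the system as a
# whole presents to its environment as unused resources or unmet demands.
net = resources - demands
print("unused resources" if net > 0 else "unmet demands", round(abs(net), 6))
```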
The unification of ideas does not necessarily have to be limited to the consciousness of the individual but also works for a team of people who understand each other and listen to each other. Presenting the ideas of different participants can generate more and more ideas that would never have come up without a suitable environment. These methods are commonly known as brainstorming.
Quantum physics transforms this situation into an energy model of a whole system (square of wave probabilistic model), thus eliminating all positive and negative phases. The forecast of a quantum model is correct because it shows how the system evolves externally, taking into account all inner synergies between partial resources and demands.
Statistically, the external demands will sooner or later be satisfied (steady states), and thus, the quantum model predicts the system behavior well. Sometimes (with some probability), a demand is not fully satisfied due to, e.g., a lack of energy or the experimental setup. Such a situation still corresponds to the statistical expectation of quantum physics.
In our environment, there exist counterparts such as lack/surplus and demand/resource. For surplus or resources, statistics need not be used; a statistical description remains only for lack or demand, which is indeed of probabilistic (intangible) origin in nature.
The presented concept can be applied to conscious reasoning. Information processing starts both with the identification of common similarities (critical analysis of positive intersections) and with the search for demands (negative intersections) in order to broaden or supplement current knowledge. The more information we process (interconnect together), the greater the demands for new knowledge that appear.
This principle can easily explain the emergent force caused by entropy non-equilibrium (similar to the entropic field in a hypothetical gravitation theory [23]) that leads us to constantly study and seek new and better theories for a greater understanding of the world around us.

4.4. Emergent Resonance in Pure Quantum Systems

With respect to generalized information systems, we can define the wave information flow and the wave information content as the wave probabilistic functions:
$$\psi(\Phi) = \alpha_{\Phi,1} \cdot |\Phi_1\rangle + \alpha_{\Phi,2} \cdot |\Phi_2\rangle + \cdots + \alpha_{\Phi,N} \cdot |\Phi_N\rangle \tag{10}$$

$$\psi(I) = \alpha_{I,1} \cdot |I_1\rangle + \alpha_{I,2} \cdot |I_2\rangle + \cdots + \alpha_{I,N} \cdot |I_N\rangle \tag{11}$$

where $\Phi_1, \ldots, \Phi_N$ and $I_1, \ldots, I_N$ are the possible values of information flow and information content, respectively. The complex parameters $\alpha_{\Phi,1}, \ldots, \alpha_{\Phi,N}$ and $\alpha_{I,1}, \ldots, \alpha_{I,N}$ represent wave probabilities.
The wave information power can be expressed through wave probabilistic functions as follows:
$$\psi(P_I) = \psi(\Phi) \otimes \psi(I) = \alpha_{\Phi,1} \cdot \alpha_{I,1} \cdot |\Phi_1, I_1\rangle + \cdots + \alpha_{\Phi,1} \cdot \alpha_{I,N} \cdot |\Phi_1, I_N\rangle + \cdots + \alpha_{\Phi,N} \cdot \alpha_{I,1} \cdot |\Phi_N, I_1\rangle + \cdots + \alpha_{\Phi,N} \cdot \alpha_{I,N} \cdot |\Phi_N, I_N\rangle \tag{12}$$

where the symbol ⊗ denotes the Kronecker (tensor) product of vectors transformed into multiplication, while each (i, j)-th component $|\Phi_i, I_j\rangle$ represents a particular value of information power that characterizes measuring the information flow $\Phi_i$ together with the information content $I_j$.
The multiplication of different combinations of information flows and contents, $(\Phi_i, I_j)$ and $(\Phi_k, I_l)$, can achieve the same (or similar) information power $K_r$:

$$\Phi_i \cdot I_j \approx \Phi_k \cdot I_l \approx K_r \tag{13}$$
It can be seen that interferences of wave probabilities can emerge as a wave resonance. Finally, an information power in renormalized form can be expressed as:
$$\psi(P_I) = \beta_1 \cdot |K_1\rangle + \beta_2 \cdot |K_2\rangle + \cdots + \beta_r \cdot |K_r\rangle + \cdots \tag{14}$$
This approach leads to a wave resonance principle between the received/transmitted information flow and information content, which causes emergent behavior in generalized pure quantum systems [17].
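A small numeric sketch of Equations (12)–(14) (amplitudes and values are illustrative): the Kronecker product enumerates all flow–content pairs, pairs with equal product Φ·I fall into the same power level K_r, and their complex amplitudes add, i.e., interfere.

```python
import numpy as np
from collections import defaultdict

# Superposed flows and contents, Equations (10)-(11).
flows, contents = np.array([1.0, 2.0, 4.0]), np.array([4.0, 2.0, 1.0])
a_flow = np.sqrt([0.5, 0.3, 0.2]) * np.exp(1j * np.array([0.0, 0.4, 1.1]))
a_content = np.sqrt([0.4, 0.4, 0.2]) * np.exp(1j * np.array([0.2, -0.3, 0.9]))

amps = np.kron(a_flow, a_content)            # Equation (12): all pair amplitudes
powers = np.outer(flows, contents).ravel()   # information power of each pair

beta = defaultdict(complex)                  # renormalized form, Equation (14)
for K, a in zip(powers, amps):
    beta[K] += a                             # equal powers K_r interfere
for K in sorted(beta):
    print(K, round(abs(beta[K]) ** 2, 4))    # resonance weight per power level
```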

5. Emergent Intelligence

5.1. Quantum Physics

Contemporary quantum physics [10] distinguishes between bosons and fermions. For bosons (with integer spin), the principle applies that they attract each other, cluster together into individual spatial areas, and are a source of kinetic energy (generalized flow). Photons are canonical bosons and display exactly this clustering behavior.
On the contrary, for fermions (with half-integer spin), the well-known Pauli exclusion principle applies: no two fermions can occupy the same quantum state. Fermions therefore form spatial structures and are responsible for the formation of matter (generalized effort).
A photon is a boson (with unit spin) and is its own antiparticle. The electron, on the other hand, has spin polarization and obeys the exclusion principle, because otherwise all electrons would cluster around the nucleus, could not be separated, and all atoms would have the same properties. A photon does not have mass, but it does have momentum. Feynman diagrams can accurately calculate the probabilities of particle creation/annihilation, i.e., the conversion of energy to matter and vice versa [24]. It is to these relatively simple properties of electrons and photons that we owe the complexity and diversity of our world.
Several Nobel prizes have been awarded throughout history for demonstrating the breaking of the symmetry principle that is so popular with physicists. If there were no symmetry breaking, all electron–positron pairs would have been converted into photons at the origin of the universe and would have radiated their energy completely. We owe the fact that there is more matter than antimatter in the universe to the principle of symmetry breaking.
Stonier [25] compares the information content of a crystal and the genetic code and considers the impact of thermal energy as opposition to information. He demonstrates this principle on living organisms, which very consistently regulate their temperature in order to maintain their information content. In his predictions, he goes even further, claiming that there is a class of other hypothetical particles that consist only of information, and he calls these particles infons.
Infons cannot manifest themselves in physical experiments because they have neither matter nor energy; their effect is manifested only by a change of orderliness. Here, one can see an analogy with our demand model, which also does not manifest itself in the physical world but leads to a new future arrangement.
In terms of information physics, even the absence of structure within the overall form can carry information as well as the structure itself. For example, a hole caused by the loss of an electron in the orbit of an atom creates a particle form of information.
Based on these principles and the relationships between particles and gaps in the structure, more general information laws can be considered at the borderline between the system and its environment. If we make a change in the system under study by taking away a part of it, that part then becomes part of its environment. Thus, the information in the system itself as well as in its surroundings is changed.
A Markov blanket seems to be an appropriate instrument to represent the boundaries of a system (e.g., a cell or a multi-cellular organism) in a statistical sense. It is a statistical partitioning of a system into internal and external states, where the blanket itself consists of the states that separate the two—external states are conditionally independent of internal states, and vice versa, as internal and external states can only influence each other via sensory and active states. The autonomous organization of living systems [26] can consist of the hierarchical assembly of Markov blankets through adaptive active inference.
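A toy linear-Gaussian sketch of this conditional-independence structure (the chain external → sensory → internal and all coefficients are invented for illustration, not taken from [26]):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
external = rng.normal(size=n)
sensory = 0.8 * external + 0.3 * rng.normal(size=n)   # blanket state
internal = 0.9 * sensory + 0.3 * rng.normal(size=n)

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out the blanket variable z."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(rx, ry)[0, 1]

print(np.corrcoef(external, internal)[0, 1])       # strongly correlated
print(partial_corr(external, internal, sensory))   # ~0: independent given blanket
```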

5.2. Deterministic Chaos at the Quantum Borderline

Let us imagine a simple numerical problem: a function that we plot on a two-dimensional graph. The graph (Figure 4) has two maxima on the y-axis, at two different values of the x-coordinate (x1, x2).
Let us use any numerical method to find the nearest maximum of the given function. Regardless of the complexity and sophistication of the method used, every such method works in the same way: starting from an arbitrary initial point on the x-axis, the algorithm gradually approaches the nearest maximum. The method ends by recognizing that it is currently at the nearest maximum of the function, and its result is the x-coordinate of the maximum just found.
However, what happens if we have two maxima? For example, let us mark in blue the initial values of the x-coordinate, from which we run our numerical algorithm and which lead to finding the first nearest maximum—f(x1). Thus, for such a closest maximum, let us choose, for example, the first maximum from the left with respect to the x-axis. We can mark the initial values of the x-coordinate in red, for which our algorithm ends in the second nearest maximum—f(x2).
Let us try to investigate the x-axis marked in this way with color-coded initial conditions. Near the first maximum, the x-axis will only be blue. The same will apply to the neighborhood of the second maximum, where the x-axis will be colored red. This meets our expectations, as our algorithm always proceeds from the specified initial value to the nearest maximum where it ends.
However, what will the situation look like at the borderline of the two maxima, i.e., in places where we can no longer unambiguously determine whether a point is fully red or fully blue? Here, we approach the theory of deterministic chaos. If we are to draw colored points on the x-axis, we must first choose a certain precision, resolution, or step on the x-axis. For example, let us start with one decimal place on the x-coordinate scale. An image of variously alternating blue and red colors is created on our borderline for a given resolution level.

If we increase the resolution, for example, from one decimal place to two, the image changes completely, and we obtain a new reality and a new layout of the blue and red colors. This can be repeated again and again, with each increase in the level of discrimination opening up a new reality and new knowledge about the given borderline. At each distinctive level, we obtain a completely unique distribution of blue and red colors. It is possible to prove mathematically that if we repeat this procedure to infinity, then even at infinity we do not find a layout where two points next to each other have the same color [27].

If we go from a one-dimensional problem to a two-dimensional one, it is possible to color spatial images at a given resolution level, which can have up to fractal complexity. Of course, a similar situation arises for 3D and other nD models.
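The color-coding experiment is easy to reproduce numerically. The sketch below (function, interval, and tolerances are illustrative choices of ours) uses f(x) = x²/2 − x⁴/4, which has two maxima at x = ±1, and a Newton-type search for stationary points; refining the grid near the borderline keeps revealing new interleaved red/blue bands.

```python
import numpy as np

def newton_target(x0: float, iters: int = 100) -> str:
    """Newton search for stationary points of f(x) = x**2/2 - x**4/4."""
    x = x0
    for _ in range(iters):
        d1, d2 = x - x**3, 1.0 - 3.0 * x**2   # f'(x), f''(x)
        if abs(d2) < 1e-12:
            return "?"
        x -= d1 / d2
    if abs(x - 1.0) < 1e-6: return "b"        # blue: maximum at x = +1
    if abs(x + 1.0) < 1e-6: return "r"        # red: maximum at x = -1
    return "."                                # minimum at x = 0, or undecided

# Color-code initial conditions near the borderline at two resolutions.
for n in (41, 401):
    print("".join(newton_target(x) for x in np.linspace(0.40, 0.50, n)))
```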
An almost infinite sensitivity to initial conditions may lead to the assumption that the multiple-world existence of quantum physics arises from very rapid switching between different worlds on the brink of chaos, which we objectively perceive as parallel existing realities represented by probabilistic wave functions.
It can be seen that a large amount of information can be stored on the borderline. Here, it is also possible to combine different behaviors associated with differentiation levels and to use infinite variability to obtain new energy resources and emergent properties resulting from different variants of orderliness. This is a life-giving cocktail. After all, our brains also work by using properties on the brink of chaos to find new connections for much-needed creativity (variability of possibilities), from which, thanks to natural selection, a new non-traditional solution can emerge.

5.3. Quantum Natural Selection

Recall that quantum computers are based on mass-parallel computation, which means that all states/events are interconnected by phase parameters (the so-called quantum superposition), and thus, one operation can be applied to the whole set of mutually superposed states.
Assume the existence of two quantum agents defined by superposed states/events. Let the first one represent all possible superposed lock types (resource quantum agent) and the second one represent the superposition of all possible keys (demand quantum agent). When these two quantum agents meet or negotiate, they form a new quantum system that represents the mutual superposition of all phase-entangled combinations of locks and keys.
Suppose that if a particular key fits into the correct lock, the door opens, and a sequence of downstream processes is executed. Of course, the quantum approach can also be used for a multidimensional search, thus greatly speeding up natural selection, which can be called mass-parallel natural selection. For example, Grover's quantum algorithm [28] can find an element with the desired property in an n-element list in about $\sqrt{n}$ computation steps, whereas a non-quantum algorithm requires on the order of n steps.
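A toy statevector simulation of Grover's square-root speed-up (the list size and marked index are illustrative): after roughly (π/4)·√n oracle/diffusion rounds, nearly all amplitude sits on the marked item.

```python
import numpy as np

def grover_search(n_items: int, marked: int):
    """Simulate Grover search over n_items entries; return (iterations, success prob.)."""
    psi = np.full(n_items, 1.0 / np.sqrt(n_items))    # uniform superposition
    iters = int(round(np.pi / 4 * np.sqrt(n_items)))  # ~ (pi/4) * sqrt(n) steps
    for _ in range(iters):
        psi[marked] *= -1.0                           # oracle: flip marked phase
        psi = 2.0 * psi.mean() - psi                  # diffusion: invert about mean
    return iters, np.abs(psi[marked]) ** 2

print(grover_search(1024, marked=123))  # ~25 iterations vs ~1024 classical probes
```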
It is important to mention that, e.g., the opening of a door does not have to be quantum and can take place in many parallel macro-worlds. In this way, natural selection could be greatly accelerated, which could explain how living organisms came into existence through emergent quantum behavior.

5.4. Quantum Gyrator

Let us imagine the quantum example of the information gyrator given in Figure 3 with a set of superposed information flows and contents:
$$\psi(\Phi_1) = \alpha_{1,1} \cdot |\Phi_{1,1}\rangle + \alpha_{1,2} \cdot |\Phi_{1,2}\rangle + \cdots + \alpha_{1,N} \cdot |\Phi_{1,N}\rangle \tag{15}$$

$$\psi(\Phi_2) = \alpha_{2,1} \cdot |\Phi_{2,1}\rangle + \alpha_{2,2} \cdot |\Phi_{2,2}\rangle + \cdots + \alpha_{2,N} \cdot |\Phi_{2,N}\rangle \tag{16}$$

$$\psi(I_1) = \beta_{1,1} \cdot |I_{1,1}\rangle + \beta_{1,2} \cdot |I_{1,2}\rangle + \cdots + \beta_{1,N} \cdot |I_{1,N}\rangle \tag{17}$$

$$\psi(I_2) = \beta_{2,1} \cdot |I_{2,1}\rangle + \beta_{2,2} \cdot |I_{2,2}\rangle + \cdots + \beta_{2,N} \cdot |I_{2,N}\rangle \tag{18}$$

where $\alpha_{1,1}, \ldots, \alpha_{1,N}$, $\alpha_{2,1}, \ldots, \alpha_{2,N}$ and $\beta_{1,1}, \ldots, \beta_{1,N}$, $\beta_{2,1}, \ldots, \beta_{2,N}$ are wave probabilistic functions.
The gyrator input can be composed of a set of i-th input components $(I_{1,i}, \Phi_{1,i})$ and the output of a set of j-th components $(I_{2,j}, \Phi_{2,j})$. Theoretically, a set of parallel working gyrators, each with its unique resonance frequency, emerges [17]. Each resonance maximizes the information content assigned to the combination $(I_{1,i}, \Phi_{1,i}, I_{2,j}, \Phi_{2,j})$.
In summary, this yields a superposition of different resonance frequencies. From radio-electronics, it is known that besides the pure resonance frequencies assigned to $(I_{1,i}, \Phi_{1,i}, I_{2,j}, \Phi_{2,j})$, combined frequencies known as higher harmonic components are also created. The more gyrator variants $(I_{1,i}, \Phi_{1,i}, I_{2,j}, \Phi_{2,j})$ there are, the more different frequencies can occur, and the greater the information coding capability becomes.
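A minimal sketch of how mixing enriches the frequency alphabet (base frequencies and mixing orders are invented for illustration): nonlinear interaction of pure resonances produces sum, difference, and higher-order combination components.

```python
import itertools

base = [3.0, 5.0, 11.0]   # pure resonance frequencies of parallel gyrator variants
combos = sorted({abs(a * fa + b * fb)
                 for fa, fb in itertools.combinations(base, 2)
                 for a, b in ((1, 1), (1, -1), (2, 1), (1, 2))})
print(combos)             # the richer web of frequencies available for coding
```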
Considering all the frequency variants, it is evident that a complex web of frequencies can be created. If we take each frequency as the carrier of modulated information, we can advance the speculative hypothesis that our consciousness is linked to this brain network.

6. Quantum Consciousness

Based on the theories known so far, the constructive idea is proposed that consciousness may be a multidimensional (internal) space in which we store states/events, acquired information or created knowledge, including their interrelationships captured by phase parameters. The interrelationships represent not only four-dimensional space–time but also a variety of difficult-to-describe sensations such as color similarity, emotional context, characteristic smell, etc.
The different components of consciousness may be in different relationships to stories lived in certain circumstances or at other times in life. Other stories can be read from books or acquired through interactions with other people. This creates a plethora of often redundant components that our consciousness tries to sort into higher logical units, stories, symbols, etc.
It is possible to hypothesize that this sorting takes place in a purely quantum environment of brain microtubules, as suggested by Roger Penrose [29]. On the other hand, even without the need for a quantum environment, as proposed here, the highly redundant arrangement of all possibilities means that each of the variants is physically stored in a neural network, including all phase relations to the other components. This would be a physical realization of the many-worlds interpretation of quantum physics [10].
In line with the many-worlds interpretation of quantum physics, extra dimensions represent possible subsystems. The main difference is that we can use our free will to select one of the possible subsystems, i.e., to move in these additional dimensions. Our will cannot change anything in the 3D world, but what we probably can do is move between possible subsystems. Our free will can select in which of the possible worlds we will be in the future.
Some components of consciousness are naturally attracted to each other and are compatible with each other, are close to each other in some way, and can form a communicable and graspable story. Other components, on the other hand, repel each other, are incompatible and cannot be part of a common story. This corresponds to the presented theory of generalized quantum systems.
The degree of subjectivity of consciousness lies in the component from which consciousness starts to create its internal picture of the world and in what way other components are gradually selected for inclusion in this model. Hypothetically, we can assume the existence of the same components of consciousness for several people. The difference in the internal model of consciousness may be shaped simply by the fact that everyone has a different preference as to which component to start with. Moreover, each has a different path of gradual phase linkage to other components. This creates various complex and unique structures, which are available only to a specific individual (intrinsic model). Only he/she can correct, add to, go through and change the structure of the intrinsic model according to other knowledge and conditions.
The external observer (extrinsic model) does not see into this internal structure of consciousness and is dependent only on measuring and evaluating possibilities using probability functions, as the so-called Copenhagen interpretation of quantum physics teaches us [10]. This is reminiscent of Bayesian statistics [30], which also uses measured data to build a model, thereby gradually increasing knowledge of the system under study, which may be fully deterministic in nature. Unfortunately, the algorithm for its behavior is hidden from us, and we are therefore left to evaluate only the input–output statistical characteristics in a stepwise fashion.
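As a minimal sketch of such stepwise extrinsic modeling (a textbook Beta–Bernoulli update; the data are invented), each measurement refines the observer's probabilistic model of a system whose inner algorithm stays hidden:

```python
alpha, beta = 1.0, 1.0            # uniform Beta prior over an unknown rate
data = [1, 0, 1, 1, 0, 1, 1, 1]   # observed binary outcomes of the hidden system
for x in data:                    # Bayesian update after each measurement
    alpha += x
    beta += 1 - x
print(alpha / (alpha + beta))     # posterior mean estimate, here 0.7
```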
Entanglement makes it possible to connect chains of states/events into higher and more complex components of consciousness in such a way that they can no longer be broken down into lower parts. We must take such indivisible components as wholes and work with them in this way. The more states/events an indecomposable component contains, the qualitatively higher the individual’s consciousness, as presented, for example, in the Integrated Information Theory of Consciousness [31]. In this way, virtual complex structures can emerge, where the partial states/events of different stories are newly entangled to create an entirely new hypothetical story.
Many references try to describe the consciousness phenomenon [32,33,34,35,36,37], or at least come close to explaining it. To date, however, no theory is widely accepted by the scientific community. The problem with contemporary science is that it is limited to an external observer (extrinsic model) and to processing the results of his observations according to the Copenhagen interpretation of quantum physics. It should be noted that even Einstein was not reconciled to this interpretation and kept looking for hidden (intrinsic) parameters. His sentence that "God does not play dice" is well known. Ian Stewart and many others, on the contrary, have shown that God not only "plays dice" sometimes but is also a pretty big gambler [38].
If we include in our considerations an intrinsic observer (the intrinsic model), which represents a model of our consciousness based on the demand/resource principle, we can assume switching between different virtual scenarios (different worlds with deterministic chaos at the borderline). Based on emotional states and other hidden parameters, this can lead to synchronicity demonstrated at the quantum level, e.g., orchestrated decoherence [39] or the emergence of a Noetic field [40]. These phenomena are virtually indistinguishable by existing quantum physics, and their functioning is still a mere hypothesis. It can be expected that as the accuracy of measuring and detection instruments increases, we will soon see a clarification of these phenomena.

7. Conclusions

In order to create a model of complex systems, we need to extract the necessary knowledge. These ideas raise the question of the relationship between our knowledge and reality. As a partial problem, the question arises of the difference between the information form of an actual event and the information available to the observer [17]. The observer may receive information with delay or distortion, which significantly disrupts his knowledge of the surrounding reality. Quantum models can partially correct these disproportions using phase parameters.
There is a significant hypothesis that if we had a good description of the information received and a good description of the structure of information circuits with all the feedback, we could quite successfully reconstruct the real events. At the end of this thought chain, there should be an attempt to acquire knowledge of reality itself. This process is very complicated, but fortunately, we have the opportunity to use simulation experiments in virtual space, which can represent for us a picture of the world of information at our current level of knowledge.
Going back in history, sometime around 600 BC, the Cretan philosopher Epimenides declared, "All Cretans are liars". The statement is undecidable because Epimenides was himself a Cretan. Once we believe that the one who utters this sentence is telling the truth, we must accept the fact that he is a liar; and if he is a liar, then he means the opposite of what he claims, and therefore tells the truth.
Thanks to this liar's paradox, the foundations of such an untouchable field as mathematics trembled in the 20th century, when the Brno-born and brilliant mathematician Kurt Gödel concluded that in any system initially described by axioms, we can reach undecidable conclusions by formal logic within the system [41,42]. More precisely: in any axiomatic system that is at least complex enough to contain the axioms of arithmetic, a theorem can be formulated that is not provable in that system. In other words, Kurt Gödel is saying with this theorem that mathematical provability is a weaker notion than truth.
It is a fact that we are living in a revolutionary time full of paradigm shifts in our perceptions of our environment. It was a great adventure to navigate through different interdisciplinary worlds and find different forms of inspiration for connecting partial knowledge, trying to see the world as a complex system with all its possible and impossible connections and emergences.
It brings us back to the legendary Jewish "pardes" [43], the garden of knowledge, where different fields pose all sorts of challenges and pitfalls. These are the inner and outer worlds of the 'obvious area of what is obvious', 'the obvious area of what is hidden', 'the hidden area of what is obvious', and 'the hidden area of what is hidden'. It is beautiful to pass freely through these worlds, but it requires not only an open mind but also a lot of learning and training [43]. However, it is one of the very few ways to gain a comprehensive holistic experience from the garden of knowledge.

Funding

This work was supported by the European Regional Development Fund under the project AI & Reasoning (reg. no. CZ.02.1.01/0.0/0.0/15_003/0000466).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Acknowledgments

Czech Institute of Informatics, Robotics and Cybernetics, Czech Technical University in Prague, and the European Regional Development Fund under the project AI & Reasoning (reg. no. CZ.02.1.01/0.0/0.0/15_003/0000466).

Conflicts of Interest

The author declares no conflict of interest.

References

1. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
2. Lombardi, O.; Holik, F.; Vanni, L. What is Shannon information? Synthese 2015, 193, 1983–2012.
3. Bar-Hillel, Y.; Carnap, R. Semantic Information. Br. J. Philos. Sci. 1953, 4, 147–157.
4. Barwise, J. Information and semantics. Behav. Brain Sci. 1983, 6, 65.
5. Zadeh, L.A. Fuzzy Sets. Inf. Control 1965, 8, 338–353.
6. Kauffman, S. Molecular autonomous agents. Philos. Trans. R. Soc. Lond. Ser. A Math. Phys. Eng. Sci. 2003, 361, 1089–1099.
7. Maturana, H.R.; Varela, F.G. The Tree of Knowledge; Shambhala: Boston, MA, USA, 1987.
8. Varela, F. Principles of Biological Autonomy; Elsevier-North Holland: New York, NY, USA, 1979.
9. Mingers, J. The Cognitive Theories of Maturana and Varela. Syst. Pract. 1991, 4, 319–338.
10. Feynman, R.; Leighton, R.; Sands, M. The Feynman Lectures on Physics; Addison Wesley Longman: Boston, MA, USA, 1966.
11. Nielsen, M.A.; Chuang, I.L. Quantum Computation and Quantum Information; Cambridge University Press: Cambridge, UK, 2000; p. 23.
12. Vedral, V. Introduction to Quantum Information Science; Oxford University Press: Oxford, UK, 2006.
13. Svítek, M. Wave probabilities and quantum entanglement. Neural Netw. World 2008, 5, 401–406.
14. Rzevski, G.; Skobelev, P. Emergent Intelligence in Large Scale Multi-Agent Systems. Int. J. Educ. Inf. Technol. 2007, 1, 64–71.
15. Svítek, M.; Votruba, Z.; Moos, P. Towards Information Circuits. Neural Netw. World 2010, 2, 241–247.
16. Cheon, T. Altruistic contents of quantum prisoner's dilemma. Europhys. Lett. 2004, 62, 149.
17. Svítek, M. Information Physics: Physics-Information Analogies for Complex Systems Modelling; Elsevier: Amsterdam, The Netherlands, 2021; ISBN 978-0-323-91011-8.
18. Moos, P.; Svítek, M.; Novák, M.; Votruba, Z. Information Model of Resonance Phenomena in Brain Neural Networks. Neural Netw. World 2018, 3, 225–239.
19. Votruba, Z.; Novák, M. Alliance Approach to the Modelling of Interfaces in Complex Heterogenous Objects. Neural Netw. World 2010, 5, 609–619.
20. Amirkhani, A.; Barshooi, A.H. Consensus in multi-agent systems: A review. Artif. Intell. Rev. 2021, 55, 3897–3935.
21. Svítek, M. Quantum System Modelling. Int. J. Gen. Syst. 2008, 5, 603–626.
22. Jordan, T.F. Quantum Mechanics in Simple Matrix Form; John Wiley and Sons: New York, NY, USA, 1986.
23. Plastino, A.; Rocca, M.C.; Ferri, G.L. Quantum treatment of Verlinde's entropic force conjecture. Phys. A Stat. Mech. Appl. 2018, 511, 139–142.
24. Feynman, R. QED: The Strange Theory of Light and Matter; Princeton University Press: Princeton, NJ, USA, 1985.
25. Stonier, T. Information and the Internal Structure of the Universe; Springer: London, UK, 1990.
26. Kirchhoff, M.; Parr, T.; Palacios, E.; Friston, K.; Kiverstein, J. The Markov blankets of life: Autonomy, active inference and the free energy principle. J. R. Soc. Interface 2018, 15, 20170792.
27. Gleick, J. Chaos: Making a New Science; Penguin Books: London, UK, 2008.
28. Grover, L.K. A fast quantum mechanical algorithm for database search. In Proceedings of the 28th Annual ACM Symposium on the Theory of Computing, Philadelphia, PA, USA, 22–24 May 1996; p. 212.
29. Penrose, R. Shadows of the Mind: A Search for the Missing Science of Consciousness; Oxford University Press: Oxford, UK, 1994.
30. Peterka, V. Bayesian Approach to System Identification. In Trends and Progress in System Identification; Eykhoff, P., Ed.; Pergamon Press: Oxford, UK, 1981; pp. 239–304.
31. Tononi, G. Consciousness as Integrated Information: A Provisional Manifesto. Biol. Bull. 2008, 215, 216–242.
32. Marcel, A.J. Conscious and unconscious perception: Experiments on visual masking and word recognition. Cogn. Psychol. 1983, 15, 197–237.
33. Velmans, M. Is human information processing conscious? Behav. Brain Sci. 1991, 14, 651–726.
34. Crick, F.; Koch, C. Towards a neurobiological theory of consciousness. Semin. Neurosci. 1990, 2, 263–275.
35. Dehaene, S.; Changeux, J.-P.; Naccache, L.; Sackur, J.; Sergent, C. Conscious, preconscious, and subliminal processing: A testable taxonomy. Trends Cogn. Sci. 2006, 10, 204–211.
36. Tononi, G.; Koch, C. The neural correlates of consciousness: An update. Ann. N. Y. Acad. Sci. 2008, 1124, 239–261.
37. Kriegel, U.; Williford, K. Self-Representational Approaches to Consciousness; MIT Press: Cambridge, MA, USA, 2006.
38. Stone, A.D. Einstein and the Quantum; Princeton University Press: Princeton, NJ, USA, 2016; 344p; ISBN 9780691168562.
39. Hameroff, S.; Penrose, R. Orchestrated Reduction of Quantum Coherence in Brain Microtubules: A Model for Consciousness. Math. Comput. Simul. 1996, 40, 453–480.
40. Amoroso, R.L. (Ed.) Complementarity of Mind and Body: Realizing the Dream of Descartes, Einstein and Eccles; Nova Science: New York, NY, USA, 2010.
41. Casti, J.L.; DePauli, W. Gödel: A Life of Logic, the Mind, and Mathematics; Basic Books: New York, NY, USA, 2000.
42. Svítek, M.; Žák, L. What is life…? World Wide J. Multidiscip. Res. Dev. 2021, 7, 12–20.
43. Svítek, M.; Žák, L. Know thyself. J. Multidiscip. Eng. Sci. Technol. 2020, 7, 11398–11404.
Figure 1. Generalized system properties: e—generalized effort, f—generalized flow, q—generalized accumulation, p—generalized momentum.
Figure 2. Information gate (Φ—information flow of data measured in bits per second, I—information content measured in Joules per bit).
Figure 3. Gyrator with parameter D, input parameters $I_1$, $\Phi_1$, output parameters $I_2$, $\Phi_2$, and information capacitors $C_1$, $C_2$; $I_1 = I_S(u)$, $\Phi_1 = \Phi_S(u)$, $I_2 = I_R(u)$ and $\Phi_2 = \Phi_R(u)$.
Figure 4. Numerical method in the case of a function with two maxima (the method is started at a specific point and terminates when a maximum is reached).
Table 1. Physical systems.

Different Systems        | Generalized Effort (e) | Generalized Flow (f)
Mechanical (translation) | Force                  | Velocity
Mechanical (rotation)    | Torque                 | Angular Speed
Electrical               | Voltage                | Electrical Current
Hydraulic                | Pressure               | Flow Rate
Thermodynamic            | Temperature            | Entropy Change
Chemical                 | Chemical Potential     | Molar Flow
Magnetic                 | Magneto-motive Force   | Magnetic Flux