February 2023



Question: You say our intelligence comes from a greater power of choice than the inanimate substance of physics has, and yet that speech is more meaningful when it is more definite, less scattered. Isn't that contradictory?


Answer: No. Information (the quantity of options) is equivalent to action (the product of energy and duration). It therefore behaves like potential energy and useful work: precisely because an object has been raised, and thus has potential energy, it can perform useful work when lowered.

Similarly, organizing (increasing the efficiency of the system, focusing on the meaning of the words of a text) can go toward reducing the given information (the amount of options) only when those options were in excess (Adaptation). That is why lesser life cannot express itself the way we do.

Vitality, or life, call it what you will, has an excess of information compared to the inanimate matter of physics. It is an excess of choices, of possibilities of action; lacking it, the subjects of theoretical physics are forced to follow the trajectories that solve the Euler-Lagrange equations.


Question: If they can evade the laws of physics, can the "vital" also break the laws of probability?


Answer: It is a good question, useful among other things for establishing clearer comparisons between our exact sciences and the new theory of information. First of all, the information of the normal (Gauss) distribution increases when the dispersion increases (Information Theory I, 36. Dispersion):

I = ln σ + const.

This means that an increase in the dispersion σ includes more of the less likely, previously ignored events, so that the total information I of the normal distribution increases. This does not "fall out" of probability theory, and it is only a matter of time before physics and the other sciences join that story.
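
The relation above is the differential entropy of the normal distribution, I = ln σ + ½ ln(2πe), so doubling the dispersion always adds the same constant ln 2. A minimal check in Python (the function name gauss_entropy is mine):

```python
import math

def gauss_entropy(sigma):
    """Differential entropy of N(mu, sigma^2): ln(sigma) + 0.5*ln(2*pi*e)."""
    return math.log(sigma) + 0.5 * math.log(2 * math.pi * math.e)

# Doubling the dispersion adds the same constant, ln 2, to the information.
for sigma in (1.0, 2.0, 4.0):
    print(round(gauss_entropy(sigma), 4))
```

Running it shows the information climbing by ln 2 ≈ 0.693 with each doubling of σ.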

Colloquially, this increase in information increases a given (non)living system's freedom of action (number of options), and in a more pronounced case it becomes a "defiance" of natural laws (physics) similar to the behavior of living beings.


Question: When does a normal probability distribution occur?


Answer: The normal distribution is N(μ, σ²), with mean value μ, the so-called expectation of the random variable X, and variance σ², the average squared scatter around the expectation. Its applicability comes from the central limit theorem, which builds on the weak law of large numbers of probability theory (Khinchin). Such laws state that the sample average (x̄) converges to the expected value (μ), and the theorem states that (normalized and centered) sums of a large number of independent random variables, of any but mutually identical probability distribution, tend to a normal distribution.
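
The central limit theorem mentioned here can be watched at work numerically. A small Python sketch (the sample sizes are illustrative): sums of fifty die rolls pile up into an approximately normal shape.

```python
import random, statistics

random.seed(0)

# Sum of 50 rolls of a fair die: by the central limit theorem this sum is
# approximately normal with mean 50*3.5 = 175 and variance 50*35/12.
sums = [sum(random.randint(1, 6) for _ in range(50)) for _ in range(20000)]

mu = statistics.mean(sums)
sd = statistics.pstdev(sums)

# Empirical share of sums within one standard deviation of the mean;
# for a normal distribution it approaches about 0.68.
inside = sum(abs(s - mu) <= sd for s in sums) / len(sums)
print(round(mu, 1), round(sd, 2), round(inside, 3))
```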

The density of the normal distribution of the random variable X is

φ(x) = exp[-(x - μ)²/(2σ²)] / [σ√(2π)].

Integrating (summing) all corresponding values of this density over the interval x ∈ (-∞, +∞) gives a probability of 1, because the variable x must take one of those values. Integrating φ over an interval a ≤ x ≤ b, we find the probability that the random variable turns out to be one of the numbers in (a, b).

For example, in a crop of apples with an average weight of 110 grams, a randomly picked apple has approximately a 50 percent probability of weighing less (or more) than 110 grams, whatever the dispersion, because the (correctly) assumed normal distribution is symmetric about the mean. Intuitively, we estimate that the chances for the weight of an individual apple decrease away from the mean value, toward smaller weights (from 110 toward 0) as well as toward larger ones (from 110 toward 220).

Another example. The characteristic velocity of exhaust gases from rocket engines is about μ = 1333 m/s for the monopropellant hydrazine (N2H4). If the average deviation from that value is σ, the probability that the velocity of an individual particle lies in the interval (μ - σ, μ + σ) is about 68 percent. About 2/3 of all outcomes of a normal distribution lie in the dispersion interval (μ ± σ), whatever the values of the parameters of N(μ, σ²).
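
Both examples reduce to evaluating the normal CDF, which the standard error function gives directly. A sketch in Python (the σ for the rocket example is my illustrative assumption, since the text leaves it unspecified):

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) expressed through the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Apples: P(weight < mean) = 1/2 for any dispersion, by symmetry.
print(normal_cdf(110, 110, 15))           # 0.5

# Hydrazine exhaust: share of particle speeds within (mu - sigma, mu + sigma).
mu, sigma = 1333.0, 100.0                 # sigma is an illustrative assumption
p = normal_cdf(mu + sigma, mu, sigma) - normal_cdf(mu - sigma, mu, sigma)
print(round(p, 4))                        # 0.6827, for any mu and sigma
```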

However, for a given dispersion σ, the normal distribution has the most information of all possible distributions (the theorem in title 41 of the script Information Theory). Consequently, there will be a tendency to spontaneously decrease the information (ln σ) by reducing the dispersion of this distribution and/or by leaving the distribution itself. The ultimate divisibility (Packages) is an obstacle to such yielding to the principle of thriftiness of information, especially in the micro-world; then there is also the law of conservation, and the fact that all the surrounding substance is already filled with information and also wants to get rid of it.

Nature will not simply give up the mentioned "law of large numbers", nor probability theory, but it will give up the assumptions of the normal distribution. It will strive for variations of the basic distributions, in general for multiplicities wherever it can, but also for memory. That is why it is increasingly difficult to apply simple forms of probability to increasingly complex systems.


Question: What kind of "pursuit of multiplicity" is this and, second question, what do you mean by "remembering"?


Answer: Because the normal distribution N(μ, σ²), for a given dispersion σ, has the most information (the logarithm of the dispersion) relative to all other distributions, the assumptions of the normal distribution will spontaneously break down, owing to the principled minimalism of information. This first means that the component distributions will lose their mutual uniformity.

Among the probabilities themselves, diversification will increase, because it reduces the total information of the system. A similar phenomenon occurs in the encoding of ever longer alphabets, when the set of all letters begins to separate into a significantly smaller subset of letters highly represented in communications versus a large "rest" of rarely used letters (see the theorems under the title "81. Ergodic source" in the script "Informatics Theory III").
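
The effect described, a few highly represented letters against a rarely used "rest", can be quantified with Shannon entropy: any departure from uniformity lowers the information. A toy check in Python (both distributions are invented for illustration):

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a discrete distribution p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

uniform = [1 / 8] * 8    # eight equally used letters
skewed = [0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03]  # a few dominate

print(entropy(uniform))           # 3.0 bits, the maximum for 8 letters
print(round(entropy(skewed), 3))  # strictly less than 3 bits
assert entropy(skewed) < entropy(uniform)
```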

In this way, the "dilution" of information, reducing its value in parts of the system's components while giving greater chances to some local participant, will be accompanied by the appearance of independent elements, due to the law of conservation. Such an excess is statistically insignificant, but note that it would not exist if the information itself were not in excess. It reminds us of the useful work we get from potential energy by lifting a load in order to let it fall, or of the vitality (intelligence) that, given enough information, can descend to better choices (smarter decisions) by focusing.

The reduction of freedoms may or may not produce safety, efficiency, or meaning, just as lowering a load may not produce useful work.

The second question concerns the dilution of current information by means of other times: removing some part of the information from the present into the past, from where such reduced information continues to affect "reality". That memory is also a natural phenomenon, arising, among other things, from a (local) desire for a lesser emission of information.


Question: What is the role of independence of random variables in the development of information?


Answer: The sums of a large number of independent and identically distributed random variables tend to a normal (Gaussian) probability distribution. That is, let's say, the first role of independence. It will be abandoned for the same reason as uniformity: so that the resulting distribution would not be normal, in order to reduce the information for a given dispersion.

The second reason is similar, and concerns the exponential distribution. In title "41. Limits" of my script "Information Theory" you will find a theorem establishing that, among densities of continuous distributions with a given expectation μ (equal to 1/λ), the exponential distribution has the most information; before that it was proven that independent random variables have such a distribution. Independent realizations are those unaffected by previous outcomes, such as tosses of a coin or a die. We can conveniently say that these are processes that "do not remember".
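
Both properties of the exponential distribution mentioned here, memorylessness and maximal information for a given expectation, are easy to state in code. A sketch (the rate λ = 2 is arbitrary):

```python
import math

lam = 2.0                       # rate; the expectation is mu = 1/lam

def survival(t):
    """P(X > t) for the exponential distribution with rate lam."""
    return math.exp(-lam * t)

# Memorylessness: P(X > s + t | X > s) equals P(X > t).
s, t = 0.7, 1.3
lhs = survival(s + t) / survival(s)
rhs = survival(t)
print(math.isclose(lhs, rhs))   # True: the process "does not remember" s

# Differential entropy 1 - ln(lam) = 1 + ln(mu): the maximum among
# continuous densities on (0, inf) with the given expectation mu.
print(round(1 - math.log(lam), 4))
```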

Therefore, striving for less information, independent systems will become dependent: they will begin to remember, acquiring memories capable of influencing their future. Like the abandonment of excess options for the sake of better organization, efficiency, or meaningfulness, the loss of independence gives focus, direction and, let's face it, inertness to the flows of development.

For example, microbes are thought to multiply exponentially (Growth of bacterial populations). However, this cannot be true in the long term, because by increasing their numbers microbes significantly change the environment: destroying the host, creating immunity against themselves, running out of food, and the like. It is a process "with memory", and that is why it ceases to be exponential.
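
The transition from a memoryless exponential model to one that "remembers" the environment is usually modeled by logistic growth. A toy comparison (the capacity K and rate r are invented):

```python
# Exponential growth with no memory vs logistic growth, where the
# environment "remembers" the population through the carrying capacity K.
K, r = 1000.0, 0.5          # illustrative capacity and growth rate
n_exp = n_log = 1.0

for _ in range(30):
    n_exp = n_exp * (1 + r)                      # keeps multiplying forever
    n_log = n_log + r * n_log * (1 - n_log / K)  # slows as n_log nears K

print(n_exp > K)        # True: the memoryless model overshoots any capacity
print(round(n_log))     # saturates near K = 1000
```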


Question: When nature spontaneously departs from the steep exponential distribution, does that mean it otherwise strives for equality?


Answer: No, its optimum is somewhere in between. When there is such an extreme distribution, exponential in the case of a given expectation or normal in the case of a given dispersion, nature will try to "loosen up", just as it does in the case of (perfect) equality.

The image on the left, with its explanation, is from "Information of Perception", title "2.2 Equality", and one of the proofs is again the aforementioned theorem of "Information Theory", title "41. Limitations", now its first part. When the probability is given by a density on an interval (a, b) of random numbers, then log(b - a) is the largest information, and it is achieved only by the uniform distribution.

For example, in order for information to flow faster through a "free network" (graph theory), in which nodes link without constraint, the nodes will not all have the same number of links; a small number of nodes will have a large number of links, versus a large number of nodes with few of them. Free-market money flows fit such a model, and that is why they develop into a small number of holders (nodes) of money, goods, and services with a large volume of monetary transactions (links), as opposed to many relatively poor owners.
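
Such "rich get richer" networks are commonly modeled by preferential attachment. A minimal simulation sketch (node counts and the seed are arbitrary), in which a new link prefers already well-linked nodes:

```python
import random
from collections import Counter

random.seed(1)

# Preferential attachment sketch: each new node links to an endpoint drawn
# uniformly from the list of all existing link endpoints, so nodes that are
# already well linked are more likely to gain further links.
endpoints = [0, 1]                 # start with a single link between 0 and 1
for new_node in range(2, 2000):
    target = random.choice(endpoints)
    endpoints += [new_node, target]

degree = Counter(endpoints)
degrees = sorted(degree.values(), reverse=True)
print(degrees[:5])                 # a few hubs with many links...
print(degrees[len(degrees) // 2])  # ...while the median node has very few
```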

Another example of the same conclusion is the interpretation of the theorem of the script "Information Theory III", title "84. Completeness": when we have many options, a small number of them are very likely versus a large number of unlikely others. Consequently, we have a large number of football teams that could participate in the World Cup, but everything revolves around a small number of world champions; it is similar in tennis, and in phenomena much further afield, because the applications come from the mathematical formalism.

In other words, when we insist on equality, as we would like in matters of legal science and jurisprudence, we oppose the natural desire for diversity; the result is that we have more and more lawyers, that the law costs us more and more, and that we still cannot come to grips with, say, corruption and injustice.

The specificity of vitality is resistance to that minimalism, to the spontaneous flow of nature towards deadness, so we can say that the rule of law is almost a living being. However, the spontaneous flows win and, in the end, every living individual dies.


Question: You said "the rule of law is almost a living thing"?


Answer: Yes, in the sense of defying that universal parsimony of information, that minimalism. But also, on the other hand, because of the brazen intention of "fixing" nature by ascribing to it our laws that it is "missing".

Wars and competitions in general are likewise properties of living beings. Nature by itself does not compete with anyone; it yields to everything, and the appearance of "struggle" arises from its Multiplicities. Such is the inanimate substance of science: everything that we do not call life literally adheres to the "principle of least action", from which all the trajectories of theoretical physics known today are derived.

Living beings are somehow snatched away from that embrace of death, though not for long, and they constantly desire it. All successful civilizations first developed with a hunger for information similar to that of young children: more interested in novelties, in the forbidden and unexplored, without long interest in the same thing, because repeated "news" ceases to be news. The old usually attribute this to the poor concentration of the young.

Throughout their development, however, they are accompanied by a ubiquitous but mild tendency towards information minimalism, that is, towards the more probable outcomes of all systems and all their parts. Like a stone that we can throw high in spite of the universal force of gravity, into a flight that its attraction will gradually overcome, life can for a time be far removed from mere physics. As it slowly matures, it loses vitality, the excess of options, and gains in security, efficiency and, we believe, wisdom.

As we grow older, we over-legalize like tired civilizations, valuing order over risk and comfort more than freedom, which could be signs of losing our own (proper) information. I believe that this is precisely what is at stake: an excess amount of possibilities is "kidnapped" to become vitality, but is also surrendered in the end. I also see the processes of yielding to the (spontaneous) "natural course of things" in submission to authorities (God, the state, tradition), in general everywhere that information can have a place. I remind you, information is ubiquitous, it is the fabric of the universe, yet such that nature does not seem to want it.

That is why societies strive to become sects, and states endeavor to strengthen their regimes, contrary to the constitution, which should actually defend the people from themselves and their leaders. If individuals did not have this natural tendency toward subjugation, no amount of repression would help a dictatorship tighten its grip on its subjects; moreover, the rulers would not have that tendency either.


Question: We know that not everyone communicates with everyone. How does the new theory of information deal with this?


Answer: It is basically a property of Multiplicities. A world of endless differences surrounds us, including contrasts such as having and not having, knowing and not knowing, truth and lies. It is a property of nature obvious enough that we can take it for granted.

In the extreme case, if a subject were to communicate with everything, it would be infinitely oversaturated and, among other things, it would have to give up the law of conservation of information. Subjects do not communicate with much: whether at all, or rarely, or only at a given moment or in a given place, as the case may be. In addition to this known inconstancy, we should not overlook the characteristic of news that it can sometimes be multiplied tirelessly.

Prohibition of communication is one of the ways multiplicity has of suppression, its most immediate method. But there are other ways, like the aforementioned theorem ("Informatics Theory III", title "84. Completeness"), according to which, among a multitude of options, only a small number are likely compared to a large number of improbable others. Applied to the environment, a small number of senses is enough to perceive what is happening, to avoid dangers and survive.

However, not all biological species on Earth have vision, or they recognize different parts of the spectrum of light (electromagnetic radiation). It is the same with hearing, with volumes and frequencies of sound, with the sense of smell, and so on, as we know. What we (officially) do not know is the existence of countless other possibilities that are irrelevant to us, because even in millions of years they may never happen.


Question: Do you have any assumptions about truths and falsehoods, in the algebra of logic?


Answer: Yes, told popularly, let's say, in the contribution "Dualism of lies". In short, the idea is that the world of truths has no boundaries: according to Russell "there is no set of all sets", or according to Gödel "there is no theory of all theories". Therefore, the truths are infinitely many.

On the other hand, the algebra of logic knows the proposition that all of its statements, each either true or false, can be reduced to no more than three operations, say negation, disjunction, and conjunction. The last two listed are the first two in the table on the right. In general, there can be only 2⁴ = 16 such binary operations, of which the third specified column is exclusive disjunction and the fourth is equivalence.

Example 1. The sentence a ∨ b is the disjunction of the variables a and b, which can take the values of the constants "true" (1) or "false" (0). Thus, the disjunction of two falsehoods is a falsehood (0 ∨ 0 = 0), and of two truths a truth (1 ∨ 1 = 1). ❏

The point is that by replacing true with false (1 → 0) and vice versa, within each of those tables, we again get the set of the same 16 tables. This maps every sentence of the algebra of logic into some sentence of the algebra of logic; in particular, every tautology (a sentence that is always true) into some contradiction (a sentence that is always false) and every contradiction into some tautology. I thereby prove that the "world of truth" is equivalent to the "world of lies", that there is a bijection (a one-to-one mapping) between them.

Example 2. By replacing 1 and 0 in the table of the exclusive disjunction ⊕ we get the table of the equivalence ≡, and vice versa. ❏
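
This duality can be verified mechanically by listing all 16 truth tables and applying the 1 ↔ 0 swap to each. A short Python check (the table encoding is mine):

```python
from itertools import product

# A binary operation on {0, 1} is a truth table: a 4-tuple of outputs for
# the input pairs (0,0), (0,1), (1,0), (1,1). There are 2**4 = 16 tables.
inputs = list(product((0, 1), repeat=2))
tables = set(product((0, 1), repeat=4))

def dual(table):
    """Swap true and false (1 <-> 0) in both the inputs and the output."""
    out = dict(zip(inputs, table))
    return tuple(1 - out[(1 - a, 1 - b)] for (a, b) in inputs)

# The swap maps the 16 tables bijectively onto the same 16 tables...
assert {dual(t) for t in tables} == tables

# ...sends every tautology to a contradiction and vice versa...
assert dual((1, 1, 1, 1)) == (0, 0, 0, 0)

# ...and exchanges exclusive disjunction (XOR) with equivalence.
xor, eqv = (0, 1, 1, 0), (1, 0, 0, 1)
assert dual(xor) == eqv and dual(eqv) == xor
print("duality verified")
```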

The worlds of truth and lies are equally infinite and partially overlap, but it is dead nature that always tells the truth, while the vital can also lie. Additionally, we learn that a lie is attractive but weak (The Truth); that again is a consequence of the principled minimalism of information and of the fact that reality, more precisely dead nature, is woven from truths alone.

The second part of the answer concerns the problem of lie detection. Simply, a living being can tell the truth as well as lie, while an inanimate one "speaks" only the truth. For now, I do not take seriously mechanical aids such as lie detectors, skin temperature, body language and the like, except for the method of contradiction, that is, the well-known procedures of the algebra of logic. I gave an unusual example in the mentioned "Dualism of Lies"; the next two are better known.

Example 3. We are at a crossroads of two roads, only one of which we want. We don't know which one it is, but at the crossroads stands one of two twin brothers, one of whom always tells the truth while the other always lies. We don't know which brother he is. However, we may ask him just one question to find out which path is the one we want. What is that question? ❏

Example 4. There are two people in front of us, Aco and Branko. We don't know who is who, but we know that at least one of them is lying. The first declares "I am Aco", and the second "I am Branko". Which of them is Aco, and which is Branko? ❏

The solutions to these questions are above, in the right margin (column). They should demonstrate the power of mathematics itself in detecting lies, as well as the reasons why we do not use it enough (we underestimate it). The use of its logic is inevitable, because the liar need not be disturbed by the questioning, need not even be aware that he is lying, and the truth itself can be hidden from the most knowledgeable.
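
For Example 4, the "power of mathematics" amounts to checking the two possible worlds for consistency; a brute-force sketch of mine (it does give away the margin solution):

```python
# Example 4 as a consistency check over the two possible worlds.
solutions = []
for first, second in [("Aco", "Branko"), ("Branko", "Aco")]:
    first_true = (first == "Aco")         # the statement "I am Aco"
    second_true = (second == "Branko")    # the statement "I am Branko"
    if not (first_true and second_true):  # keep worlds where someone lies
        solutions.append((first, second))

print(solutions)  # [('Branko', 'Aco')]: both are lying; no other world fits
```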

For example, Newton (Opticks, 1704) argued that light is of particle nature. Among others, Fresnel (1817) presented his strong reasons for the wave nature of light, and Einstein (1905) again for the particle nature, until De Broglie (1924) published the theory of the wave-particle nature of all matter.

If we really knew what was true, we would rule the cosmos. That is approximately the opinion of science today.


Question: Why don't two rational people want to cooperate?


Answer: Roughly, one might say, due to a misjudgment. Many reasonable selfish persons will betray one another in the situation described by the "prisoner's dilemma".

In the picture on the left is a sketch of two prisoners, A and B, separated into two cells, who are suspected of having participated in a crime. Each is offered these options: 1) if you confess and the other does not, we release you and he gets 20 years in prison; 2) if you both confess, you each get 5 years in prison; 3) if you both deny the crime, you each get 1 year in prison.

Confession will seem the most rational to a hasty, selfish, or distrustful defendant, although mutual denial would be better. He does not consider that the former cooperation may continue. If such defendants were really rational, albeit isolated, they would adhere to the strategy of "reciprocity" (good for good, and evil for evil), let us note.
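
The misjudgment can be exhibited directly from the sentences quoted above: confessing is the dominant choice even though mutual denial is jointly better. A small check in Python:

```python
# Sentences (years in prison) as stated in the text, indexed by
# (my_move, other_move) with moves "confess" or "deny".
years = {
    ("confess", "deny"): 0,     # I confess, the other does not: released
    ("deny", "confess"): 20,    # the reverse: 20 years
    ("confess", "confess"): 5,  # both confess: 5 years each
    ("deny", "deny"): 1,        # both deny: 1 year each
}

# Whatever the other does, confessing gives me fewer years...
for other in ("confess", "deny"):
    assert years[("confess", other)] < years[("deny", other)]

# ...yet mutual denial beats mutual confession: the dilemma.
assert years[("deny", "deny")] < years[("confess", "confess")]
print("confession is dominant, but mutual denial is jointly better")
```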

In the spontaneous case (a dead, natural course of events), these statements would be exclusively true: the guilty would admit guilt and the innocent would deny it. An inanimate substance cannot lie.


Question: Isn't the economy also a product of vitality?


Answer: Yes, but the chance that the evolution of life leads to an economy is almost nil, at least by our reckoning. I agree, though, that it can be considered a subject of game theory and is worth mentioning here.

The two basic trading parameters are item price and quantity. In sales, when we try to capture as large a share of customers as possible, the price (a) is generally lower the higher the quantity sold (x), but the total revenue (the product ax) grows with the latter. Production has a similar form: the higher the price (b) per item, the fewer items (y) there are, so the cost (by) increases with quantity. As with "perception information", the gain is the sum of these two products (S' = ax + by), which is actually a difference, because revenue and cost (a and b) are numbers of opposite sign.

When the seller and the producer have competition and share the market, each of them has its own (i-th) revenue and cost (Sᵢ = aᵢxᵢ + bᵢyᵢ), so that the total profit is the sum of the individual ones (S'' = ∑ᵢ Sᵢ). Studying the difference (ΔS = S'' - S') tells us about the benefit of monopoly (when the market belongs to only one) for the individual, and conversely about the benefit to society of prohibiting monopoly.

Market contests do not end with the aforementioned competition of participants within the nature of sales and production flows, but continue with struggles against the opponents themselves. Say, the bigger buy up the smaller and, by reducing the pressure of competition, also reduce society's profit (the difference ΔS), after which the state intervenes in its favor, and this "competition" continues.

Societies in which one of the mentioned parties dominates the other lose their vitality, because they lose the competitive element. If the owners win, there will be an over-organized rule of companies, banks, and corporations; if the statesmen win, a dictatorship will arise, again with a denial of freedoms. Both cases slide into a "regime", in a sense negative for the opportunities of citizens.

Question: What did you mean by "at least according to us, the economy is next to nothing"?

Answer: I meant the small chance of the emergence of intelligent life forms, judging by the multitude of biological species that have arisen and disappeared on Earth; what else would it be.


Question: How does vitality arise and disappear?


Answer: The question is not general enough for the topic; I don't know the answer. The ways life is created and ceases will, I believe, long be the subject of research by various sciences and, let us not fool ourselves, with tasks that we cannot even imagine now. But that does not mean that this part, with which we are now paving the way to them, is not very important.

For some as yet unknown reasons, say, life arises. It rushes upward like a stone thrown towards the sky, resisting the gentle but persistent force of gravity that gradually overcomes it until it finally lands. Some such flights, or trajectories, are models of the processes of youth, maturity, and old age. And that is that.

The mild but persistent force of information minimalism gradually dominates vitality, down to its minimum. For now, with great certainty, we can consider that these processes go through a reduction of (intermediate) information, for example through better organization, increased safety, efficiency, or thoughtfulness, the tendency to give in. Intelligence gives us the ability to slow this down through scientific and technological development.

We notice that the increase in knowledge gives us new freedoms, previously unknown workplaces, ways of transportation and communication, work methods, increasing comfort, but at the same time it reduces some of our other options. We strive for development in order to live better, to eliminate uncertainties, and thereby vitality. I believe that we achieve this in the mean values: the increase in liberties we gain through civilization is less than the loss of liberties through comforts.

That is how we "tame" ourselves. Like domestic animals that evolve to rely more on their owners, over generations we adapt to ongoing civilizations, to each other, to better ways of life. Just as simple living cellular forms tend to grow into more complex, larger individuals, we become more collective, surrendering individual freedoms to a higher organization, at best barely changing the overall information of the whole.

From that aspect, the path of civilization is expansion and subsidence. Individually we are less and less relevant, and collectively more and more present. The more successful the intelligent development of means becomes, the more room opens for the dilution of vitality at the level of individuals. The chances of a (moderate) decline in everything we consider life are increasing, including intelligence itself.

This is a general phenomenon. We can see its models in other types of living beings, in one form or another, almost everywhere, including such possible places in space that we now have no idea that they exist.


Question: What can you say about the vitality of artificial intelligence and robots?


Answer: Vitality is an excess of options over substance. In this sense, classical computers and the algorithms they are based on have no vitality. What it is about in the case of people, I will explain briefly.

The logic "from A follows B", or "if it is ... then it is", with a unique series of causes and effects, does not leave causality. In it there is no "information theory" (mine), nor its concept of "vitality". At most we have a countable set of events, of cardinal number ℵ₀ (aleph-zero) in the case of an infinitely long process.

However, if in some infinite subsequence of a countable sequence each member has at least two options, then, by comparison with the binary notation of the real numbers greater than zero and less than one, we conclude that there must be a continuum of options. That is the infinity of the real numbers, of a higher order than the countable. Nothing more is achieved if there are ten options per step, because we write down the same real numbers in decimal digits as well.

From this analysis of cardinals (the numbers of infinities from set theory) we learn that our thoughts and ideas belong to a continuum, although the substance that could come up with them remains within a countable set of quantities. This paradox for classical physics is no paradox for quantum physics. Quantum mechanics views the outcomes of its events through "superpositions", distributions of probabilities that may or may not be completely "unpacked" and realized into individual outcomes. That small world of micro-sizes comes to us filtered by the laws of large numbers (probability theory), behind which, mostly, we see only causalities.

In other words, we think in the way of eventual quantum computers, not in the way of existing classical computers. What can "break through" the barrier of the law of large numbers may be explained by "chaos theory", which is admittedly still only causal, but can easily be made ready as part of the (future) information theory.


Question: Can small actions produce big reactions?


Answer: Yes. Chaos theory, for example, deals with such things in mechanics and mathematics. Seemingly random or unpredictable behaviors within determinism are its topics.

Edward Lorenz (1961), instead of the original 0.506127, entered the rounded number 0.506 into a weather-forecasting calculation while expecting an approximate result. However, he got something completely different, and he went on to study the phenomenon: small causes that lead to large consequences. He used the metaphor of the "butterfly effect" (a wing-beat in Brazil can cumulatively cause a tornado in Texas) in popularizing the theory of which he became one of the founders.
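
Lorenz's actual model is a system of differential equations; for a self-contained illustration of the same sensitivity, the chaotic logistic map is a standard stand-in. A sketch comparing his exact and rounded starting values:

```python
# The logistic map x -> 4x(1 - x) is chaotic on (0, 1): nearby starting
# points separate roughly exponentially fast.
x, y = 0.506127, 0.506     # Lorenz's exact and rounded starting values

for _ in range(30):
    x = 4 * x * (1 - x)
    y = 4 * y * (1 - y)

# After a few dozen steps the tiny rounding difference has typically
# grown to order one: the two trajectories no longer resemble each other.
print(round(abs(x - y), 3))
```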

The orderliness of the chaos of the new theory is observed through its effects over a long period of time, when periodicity is often revealed. I described the connection between such (non)periodicity and (non)determinism in the book "Space-Time" (2017); here is how, in a nutshell.

It is known that the natural numbers form a countably infinite set (of cardinality ℵ₀). Within them is an equally infinite set of prime numbers {2, 3, 5, 7, 11, 13, ...}, and the powers of each of these, e.g. {7, 7², 7³, ...}, form an infinite subset of the same natural numbers, where choosing a different base (instead of 7, say 11) gives disjoint subsets (without any common element). The set of fractions, the so-called rational numbers, is likewise countably infinite.

Every rational number can be written as a decimal with periodic repetition of digits. For example, 7/1 = 7.000..., where zero repeats periodically, and 1/7 = 0.142857142857..., where the sequence of digits 142857 repeats periodically with a period n = 6 digits long. To write a periodic decimal number (x = 0.1313..., period n = 2) as a fraction, we multiply it by 10 as many times as the length of its period (10² = 100), obtaining the same number increased that many times (100·x = 13 + x), and hence the fraction (x = 13/99).
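
The period of p/q can be computed by tracking the remainders of long division. A sketch in Python (decimal_period is my helper name):

```python
from fractions import Fraction

# The trick from the text: x = 0.131313... has period 2, so 100*x = 13 + x,
# giving x = 13/99.
assert Fraction(13, 99) == Fraction(13, 10**2 - 1)

def decimal_period(p, q):
    """Length of the repeating block of p/q, via long-division remainders."""
    r = p % q
    seen = {}
    pos = 0
    while r != 0 and r not in seen:
        seen[r] = pos
        r = (r * 10) % q
        pos += 1
    return 0 if r == 0 else pos - seen[r]

print(decimal_period(7, 1))    # 0: 7.000... terminates
print(decimal_period(1, 7))    # 6: the repeating block 142857
print(decimal_period(13, 99))  # 2: the repeating block 13
```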

This procedure proves the periodicity of countable sequences, and I have already mentioned their connection with determinism. Here is an example connecting the two with chaos theory. We square a two-digit number and keep only the last two digits, which we square again. Continuing, we get a chaotic-looking sequence of two-digit numbers that sooner or later must start repeating, because there are only 100 possible two-digit endings. For example, if we start squaring with the number 13, the series of endings is 69, 61, 21, 41, 81, 61, 21, 41, 81, ..., with a period of four steps (61, 21, 41, 81).
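
The squaring sequence is easy to reproduce by machine; note that the cycle starting from 13 has four members, 61, 21, 41, 81 (since 21² = 441 ends in 41):

```python
def ending_sequence(start, steps):
    """Repeatedly square, keeping only the last two digits."""
    seq, n = [], start
    for _ in range(steps):
        n = (n * n) % 100
        seq.append(n)
    return seq

seq = ending_sequence(13, 10)
print(seq)  # [69, 61, 21, 41, 81, 61, 21, 41, 81, 61]

# The tail repeats with period four: (61, 21, 41, 81).
assert seq[1:5] == [61, 21, 41, 81] and seq[5:9] == seq[1:5]
```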

However, if a generated array is nonperiodic, it takes up more space than countable sets can provide. It arises when there are options in an infinite subsequence of a given countable sequence: if each such step has at least two possibilities, then in N steps there are 2^N of them, and in the limiting case (N → ℵ₀) that is a continuum, as many as the real, or irrational, numbers.

From the point of view of information theory, it is an important and separate question whether these periodicities, and with them determinism, can always be reduced to lucky situations of rounding, or whether authentic determinisms really surround us.

Waves II

Question: How can something in the universal spontaneous disappearance survive?


Answer: How information, or its substrate, otherwise survives, the excess that I label "vitality", is one of my frequent questions (Waves). I will try not to repeat myself in the answers.

The basic method of that survival is waves, and the causes of waves are the omnipresent thriftiness of information and, opposed to it, the law of conservation. Emerging information disappears immediately, but its disappearance creates an attractive lack of repulsive uncertainty. "News" once spoken is no longer news, but its ending is, so the disappearance of such disappearances can continue indefinitely with the same number (amount) of options. Such are micro-oscillation processes, for example.

Physics has long dealt with macro-oscillations, so I will not dwell on them. Inherent to information theory, however, is the reality of these very subtle, not to say abstract, phenomena of "fluctuation". Mathematical truths are more real than the concepts we use to describe the objects around us, because they are more stable and objective, in the sense that we can influence them less.

Thus, the energy transmitted by a water wave is no less real than the consequences of a devastating tsunami for settlements along the coast, even though the water particles that carry that great energy circulate within a small space. Nor does the propagation of an (alternating) field along a conductor imply any long-distance journeys of its electrons. In the same way life is transmitted through reproduction, from generation to generation, working miracles on this planet and changing it greatly, while not a single living individual that contributed to those changes lasted long (compared to the duration of the planet).

The transmission of vitality through descendants, who carry on, live and die, is a kind of information, or action, that is, a product of energy and time, no less real than the individuals who were its carriers. It does not itself have to endure, just like a sea wave that breaks on the shore and dies out while handing information about itself to a new environment.


Question: Can you clarify "micro periodicity"? ("You interpret those formulas brilliantly," he praises me, and asks.)


Answer: Information is an expression of uncertainty, and as soon as it became certainty it would disappear. Thus it gives meaning to time, and then to space. Duration has the "length" of the path that light travels in the given time (x = ict). Information is also the tissue of matter, so it persists because of the law of conservation.

How would we know that what happened in a physical experiment really happened, if information could be created or could disappear just like that?

That is a short explanation of certainty as the absence of uncertainty, and vice versa: that the smallest packets of information (Packages), pure uncertainty, are again certainties of a kind; that the disappearance of news is itself news. It is not just a play on words in "information theory" (mine).

Rotation is a good model of such "emergence in disappearance". After the end of one cycle, full of "turns" (viewing the appearance abstractly, more broadly), it continues as if starting from the beginning. In the next "circle" not everything will be the same; moreover, the entire universe will have changed, so treating rotation as a process makes sense. In addition, the processes themselves are states of a kind.

For example, each of the Pauli matrices σ, pictured on the left, represents a process by which the state of a quantum system, the vector v, is translated into a new state, the vector u = σv. Acting on this one again, σu = σ(σv) = v, it returns it to its initial state, because each of these matrices is its own inverse: squared, it gives the unit matrix. The story of Pauli matrices as "rotations" does not end there, because processes are also states. These belong to the class of particles we call fermions.
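The self-inverse property σ² = 1 is easy to verify directly. A small sketch using plain Python complex arithmetic (the helper names are mine; this checks only the standard matrix algebra, not the broader interpretation):

```python
def matmul2(a, b):
    """Product of two 2x2 complex matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(m, v):
    """2x2 matrix acting on a column vector v."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# The three Pauli matrices and the 2x2 unit matrix.
sigma_x = [[0, 1], [1, 0]]
sigma_y = [[0, -1j], [1j, 0]]
sigma_z = [[1, 0], [0, -1]]
identity = [[1, 0], [0, 1]]

for sigma in (sigma_x, sigma_y, sigma_z):
    assert matmul2(sigma, sigma) == identity  # each sigma squared is the unit matrix

# Acting twice returns any state: u = sigma v, then sigma u = v again.
v = [1, 2j]
u = apply(sigma_y, v)
assert apply(sigma_y, u) == v
```

The list comparisons work because Python compares complex and integer entries by value, so (1+0j) counts as equal to 1.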

In the contribution on fermions, which I linked, it is shown that these matrices are no more ordinary rotations (of classical geometries) than a four-stroke internal combustion engine is. Besides their abstractness, not one turn (360°) but two such turns (720°) are needed to return to the beginning. And, to complicate matters further, these quantum matrix processes, in their world, pass substantially through additional dimensions.
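The 720° property can be illustrated with the textbook spin-1/2 rotation operator about the z-axis, exp(−iθσz/2), whose matrix is diagonal with entries e^(−iθ/2) and e^(+iθ/2). This is standard quantum mechanics, offered here only as a numerical check, not as part of the text's own argument:

```python
import cmath

def rotate_z(theta):
    """Diagonal entries of the spin-1/2 rotation exp(-i*theta*sigma_z/2)."""
    return [cmath.exp(-1j * theta / 2), cmath.exp(1j * theta / 2)]

one_turn = rotate_z(2 * cmath.pi)   # 360 degrees: multiplies a spinor by -1
two_turns = rotate_z(4 * cmath.pi)  # 720 degrees: the identity again

assert all(abs(d + 1) < 1e-12 for d in one_turn)   # entries close to -1
assert all(abs(d - 1) < 1e-12 for d in two_turns)  # entries close to +1
```

A single full turn flips the sign of the state, so only after two full turns does a fermion's state return to exactly where it began.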

Of course, not everything I have just said is known to physics, so do not be in a hurry to flaunt it.


February 2023 (Original ≽)