Compared with other organized clusters of cells, human beings consider themselves to be characterized by their intellect.
As parts of nature, humans try to form images, from their sensory impressions, of their own embedding in the surrounding nature. As time goes by, the wealth of these individual images consolidates into their personal conception of the world: memories of sequences of events which, step by step, condense into "if … then …" patterns. In some abstracted or even skewed way, these accumulated sequence patterns provide the basis of the intellect that their behaviour will follow.
Progressively, their behaviour comes to rest upon the experience of how previous events have unfolded – remembering those the brain has stored in some order of priority and now puts at their disposal as experience. "Experience" is based on reproducibility.
Consequently, the natural sciences are by definition based upon patterns of behaviour that are reproducible, while theologies essentially require support by "miracles", which are not reproducible:
The reproducibility of such a sequence pattern may fail either because of the complexity of its composition of details – a failure for statistical reasons (too many alternatives to be checked) – or for reasons of principle. The latter would include what we popularly ascribe to the result of a "free will".
While a process accountable to the natural sciences is considered inevitable, we attribute a decision made out of one's free will to an individual responsibility according to some moral category: actions are judged to be "good" or "evil" depending on the "public" welfare or harm they cause.
Historically as well as in the present day, the fuzziness of the notion of "public welfare" has pushed the gates wide open to the misuse of notions like "good" and "evil". In the historical context, interested "elites" have thus, again and again, restricted the "common" welfare to their own, subjective welfare. For the purpose of camouflage, people in daily life then shift the blame for their own responsibility for such corrupt behaviour onto alleged orders from superior authorities.
If there is no such "authority", they invent one. In prehistory, little goblins teasing us (compare the Burmese "Nats", e.g.) thus evolved into a world of gods on which the blame for everything inexplicable could be pinned. The intellectual clan chiefs, in particular, loved to make use of this convenient method in their role as shamans.
We are well familiar with that kind of patronizing arrogance of the stronger, or of the scoundrel. We ascribe barbaric excesses like the Inquisition, Kali rituals, or jihadism to the dark Middle Ages before the Age of Enlightenment, and we prefer to dissociate ourselves from such blind fanaticism.
On the other hand, even in our present era, said to be so objective, a failure of reproducibility for statistical reasons rooted in some personal insufficiency is often sold as a failure of principle, provided that this misinterpretation happens to fit the currently dominating zeitgeist. Such an arbitrary restriction of the generality of a more generally formulated thesis would thus be inadequate.
There are well-intentioned dogmas, born out of the arrogance of anticipatory obedience towards the currently prevailing mainstream, which often thwart scientific progress for decades, if not centuries to come (remember those "strings", e.g.).
Meanwhile, medical and scientific evidence points more and more in the direction that our "intellect" may be regarded merely as the (not yet perfectly elucidated) operating mode of some kind of "software" (the "soul") running on some basic physical "hardware" (the "body"), and that without that "hardware" substance it could not endure:
In order to maintain historically grown prejudices – whether of a religious nature ("good and evil") or of a generally dogmatic nature (for the sake of domination) – the myth of a "free will" is still cultivated today, thereby massively impeding progress in fundamental research.
In this respect, research on a scientific basis, which admits only truly reproducible events but nevertheless allows for statistical effects as well, will give results lying on the safe side – provided it defines:
The ultimate goal of fundamental research is to trace all natural sciences back to physics – as has already been achieved successfully for chemistry.
Now, physics itself is plagued by the conflict between simple, atomistic statements on the one side and often hopelessly complex statistical effects, arising cumulatively from assemblies of a great multitude of individual data, on the other. Classical continuum physics glues both aspects together. Except in the quantum theories, physics has always succeeded in reducing a continuum to its atomistic components. Continuous systems, then, according to the "law of large numbers", are to be interpreted as artificially continualized interpolations of a multitude of individual effects. – Topics: emergence (see below), measuring process (cf. later).
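As a minimal numerical sketch of this "law of large numbers" picture (assuming only the NumPy library; the +1/-1 contributions are an arbitrary illustrative choice, not taken from the text), one can average a growing number of discrete contributions and watch the result settle towards an effectively continuous value:

    import numpy as np

    rng = np.random.default_rng(0)

    # Each "individual effect" is a discrete +1/-1 contribution.
    # Their average fluctuates strongly for small N and settles down
    # like 1/sqrt(N) as N grows: the apparently continuous value.
    for n in (10, 1_000, 100_000):
        samples = rng.choice([-1.0, 1.0], size=n)
        mean = samples.mean()
        spread = samples.std() / np.sqrt(n)   # statistical uncertainty of the mean
        print(f"N = {n:>7}: mean = {mean:+.4f} +/- {spread:.4f}")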
Thus, one may dispute splendidly whether the limits of measuring accuracy are purely statistical or non-recoverable as a matter of principle. The history of physics teaches us that – up to the special case of the quantum theories – only recoverable statistical effects have been at work so far.
With the advance of technology, apparent continuity always turned out to be a temporary inability to separate neighbouring details from one another when their number was extremely great. Premature assumptions imposed by brute force reflected the results roughly for a while, but they did not withstand a later, more precise check.
Exactly this is still the case with Schrödinger's wave mechanics. At the top level of physics institutions that claim for themselves the right to call the shots across all research, a social class tending towards dogmatism tries to cut the Gordian knot of a "Quantum Gravity" – which, at least officially, is said not to have been discovered yet – with the totally inaccurate assertion that, by Bell's no-go theorem, quantum theory and General Relativity contradict each other.
"According to Bell", there are no "hidden parameters" in the quantum theories that would permit Schrödinger's wave statistics to be reduced to standard statistics. Their call for some totally "New Physics" amounts to enshrining a binding commitment to an asymptotic description, where precisely its decomposition into detailed substructures is the requirement of the hour! (Cf. Einstein's unsuccessful reminder: "God does not play dice.")
Now, in the course of a BBC interview in 1985, Bell himself had already pointed out that his no-go statements rest on the tacitly assumed existence of a free will:
Apart from eliminating a wealth of mathematical inconsistencies that have crept into the field theories over the last century [1], that "New Physics" would hence have to introduce exactly those "hidden parameters"!
Simply denoting those "atomistic" physical substructures connected with those hidden parameters as "quanta", we conclude:
They are the key elements admitting a consistent unification of Planck's quanta with Einstein's General Relativity into a common, uniform "Quantum Gravity" [1]! Physically this means strict adherence to reproducibility – excluding the classical assumption that a free will exists. (Otherwise, it would not be "free".)
This exclusion of a "free will", i.e. taking the scientific principle of reproducibility seriously, already removes the majority of the problems the current quantum theories are afflicted with.
Example: Their substructure, an underlying layer made of "quanta", calls for the logical splitting of a particle into a valence part and a non-valence part:
Its valence part provides its discrete quantum numbers (i.e., the mathematical "labels" of the field theories, such as spin, charge, lepton number, etc.), while its non-valence part [1] reproduces the quantum numbers hitherto considered "continuous" (i.e., the mathematical "arguments" of the field theories, such as location, time, momentum, mass, etc.) as superposition effects of a very great number of quanta in a statistical approximation. The latter "arguments" are continuous only apparently, by way of a statistical, approximate mode of consideration (law of large numbers). The literature calls them "emergent" parameters.
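As a toy illustration of this distinction (a sketch only, assuming NumPy; the numbers and assignments are invented and do not represent the construction of ref. [1]): a "particle" modelled as many discrete quanta keeps its discrete labels exact, while its averaged arguments merely look continuous:

    import numpy as np

    rng = np.random.default_rng(1)
    N = 1_000_000                                 # number of quanta in the toy "particle"

    # Discrete "label": each quantum carries an integer charge; the particle's
    # charge is their exact integer sum (no averaging involved).
    charges = rng.choice([0, 0, 1], size=N)       # arbitrary toy assignment
    total_charge = charges.sum()

    # "Emergent" argument: each quantum sits on a coarse grid of only 11 sites;
    # the particle's position, their mean, nevertheless looks quasi-continuous.
    grid_positions = rng.integers(-5, 6, size=N)
    mean_position = grid_positions.mean()

    print("discrete label (charge):      ", total_charge)
    print("emergent argument (position): ", mean_position)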
The composition of one particle out of a very great number of discrete "quanta" will then also explain the results of double-slit experiments, e.g. why, on repeating the experiment, a single particle is able to produce an interference pattern behind the slits. According to the current quantum theories this is absolutely inexplicable! In the New Physics, however, it is not complete particles but their individual quanta that pass through the different slits before reuniting into a compact particle again, as we identify it by a measurement!
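For comparison, a minimal sketch of how the fringe pattern builds up from repeated single detections (this reproduces only the standard textbook wave statistics of the pattern, not the quanta mechanism proposed here; NumPy assumed, slit geometry arbitrary):

    import numpy as np

    rng = np.random.default_rng(2)

    # Screen coordinates and a plane-wave toy amplitude for two slits
    # separated by d (in units of wavelength): |psi1 + psi2|^2 ~ cos^2.
    x = np.linspace(-1.0, 1.0, 400)
    d = 4.0
    intensity = np.cos(np.pi * d * x) ** 2
    prob = intensity / intensity.sum()        # normalized detection probability

    # One detected particle per run; many runs accumulate the fringes.
    hits = rng.choice(x, size=50_000, p=prob)
    counts, _ = np.histogram(hits, bins=80, range=(-1.0, 1.0))
    print(counts)                             # the counts rise and fall with the cos^2 fringes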
Once the existence of a "free will" is excluded, Bell's "superdeterminism" means that the structure of our entire world is fixed uniquely once and for ever – without any exit options. Trivially, this also resolves Schrödinger's cat paradox (is the cat dead or alive, depending on the state of some radioactive decay process). And in cosmology, it will reopen the discussion of the facts that "cosmic inflation" [1] is based upon.
When the rules of reproducibility are strictly observed, one "inexplicable peculiarity" of the quantum world after another turns out to be primitive standard physics. The quantum theories "put their trousers on one leg at a time", too! We only have to steer well clear of those miserable, elitist dogmas: religion has no business in physics! Still missing is the courageous step forward towards an "Enlightenment 2.0" that brushes those unbearable dogmas away.
The reason for the violent reaction to Einstein's theories of relativity was the philosophically explosive fact that the independence of space from time, which had been taken for granted for as long as anyone can remember, had been nullified at a stroke: Einstein had shown that the two – in principle, at least – can be converted into each other. As the barrier governing this conversion he had found the speed of light in vacuum.
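In formulas (the standard Lorentz transformation of special relativity, quoted here only as an illustration of that conversion), for a relative velocity v along the x-direction:

    t' = \gamma \left( t - \frac{v\,x}{c^{2}} \right), \qquad
    x' = \gamma \left( x - v\,t \right), \qquad
    \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}

The factor gamma grows without bound as v approaches c, which is how the speed of light in vacuum acts as the barrier just mentioned.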
Relativity, however, works not only between space and time but also, e.g., between energy and momentum, where to this day the existence of mass remains a philosophically unsolved mystery, to be clarified only by a Quantum Gravity still to come. In addition, the relation between electricity and magnetism proved rather laborious; only Einstein's introduction of the photon as a new elementary particle brought more clarity there.
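The corresponding relation between energy and momentum (standard relativistic kinematics, quoted for illustration) reads:

    E^{2} = (p\,c)^{2} + (m\,c^{2})^{2}

Energy and momentum mix under boosts just as time and space do; the mass m is the invariant of that mixing.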
The identification of time as a sort of 4th dimension, to be added to the 3 dimensions of space, still masked another special aspect: mathematically speaking, classical physics is essentially based on functional analysis in 1 dimension. Only rather hesitantly had the 3 dimensions of space been condensed into the 3-tuples of vector calculus. The associated algebra of matrices in 3 dimensions, however, soon prevailed in physics.
But that 4th component, Einstein's time, added to the 3 real dimensions of space, proved moreover to be of imaginary nature – rather an impertinence towards mathematically uneducated philosophers!
However, it got even worse. For the way matrices act also differs from that of the numbers we are familiar with: for (real and complex) numbers, the order of the factors does not matter ("commutative law"); this "commutativity" of factors in general no longer holds for matrices! A matrix, applied to a vector serving as a physical "state", has the meaning of an "action" changing that "state". And actions, of course, depend on their order!
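A minimal numerical sketch of this non-commutativity (assuming NumPy; the two matrices are arbitrary illustrative choices):

    import numpy as np

    # Two "actions" on a 2-dimensional state vector.
    A = np.array([[0.0, 1.0],
                  [1.0, 0.0]])     # swaps the two components
    B = np.array([[1.0, 0.0],
                  [0.0, -1.0]])    # flips the sign of the second component

    state = np.array([1.0, 2.0])

    print(A @ (B @ state))             # apply B first, then A  -> [-2.  1.]
    print(B @ (A @ state))             # apply A first, then B  -> [ 2. -1.]
    print(np.allclose(A @ B, B @ A))   # False: the order of the actions matters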
We are familiar with this non-commutativity of actions from the rotation of rigid bodies in 3-dimensional space. Mathematicians classify such rotations more elegantly as "orthogonal transformations in 3 dimensions". The set of all such transformations is abbreviated O(3). (For a better "understanding", this formalism, unfortunately, is indispensable!)
We come across those "rigid bodies" in the transition from point mechanics to multi-point systems. Their characteristic is that the "scalar products" of vector calculus that we can construct from any 2 of their point differences remain unchanged under orthogonal rotations.
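A small numerical check of this invariance (NumPy assumed; the rotation angle and the two vectors are arbitrary):

    import numpy as np

    theta = 0.7                      # arbitrary rotation angle about the z-axis
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])

    # Two "point differences" of a rigid body.
    u = np.array([1.0, 2.0, 3.0])
    v = np.array([-1.0, 0.5, 2.0])

    print(np.dot(u, v))                        # scalar product before the rotation
    print(np.dot(R @ u, R @ v))                # the same value after the rotation
    print(np.allclose(R.T @ R, np.eye(3)))     # True: R is orthogonal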
Contrasted with these are the so-called "unitary transformations", which do change these "scalar products" – they stay invariant, however, if we use the complex-conjugated form for one of the two vector differences in each case. Mathematicians show that this corresponds exactly to probability conservation. (In the scalar product of a unitary transformation, the complex components of a vector, paired with their complex conjugates, sum up according to Pythagoras to the square of its unchanged total length.) In n dimensions, a unitary transformation is abbreviated U(n). A U(n) is a (c-number) extension of O(n).
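Again a small numerical check (NumPy assumed; the unitary matrix and the state vector are arbitrary choices), showing that the complex-conjugated scalar product, i.e. the total probability, stays unchanged:

    import numpy as np

    phi = 0.4
    U = np.array([[np.cos(phi),      1j * np.sin(phi)],
                  [1j * np.sin(phi), np.cos(phi)]])    # an arbitrary unitary matrix

    psi = np.array([0.6 + 0.3j, -0.2 + 0.7j])          # a complex "state" vector

    norm_before = np.vdot(psi, psi).real               # conjugated scalar product
    norm_after  = np.vdot(U @ psi, U @ psi).real
    print(norm_before, norm_after)                     # equal: probability is conserved
    print(np.allclose(U.conj().T @ U, np.eye(2)))      # True: U is unitary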
Do not let yourself be intimidated by notions like SO(n) or SU(n). The letter "S" (= special) placed in front only points to a mathematical subtlety which generally does not specifically matter to the layman. (An O(n) includes reflections not present in an SO(n), e.g.)
An SO(n,m), however, belongs to a totally different class of transformations, connecting n imaginary dimensions with m real ("r-number") ones, giving a total of n+m dimensions. One of them is the "Lorentz group" SO(1,3), connecting the 4 components of Einstein's "spacetime" with one another. Its imaginary dimension is denoted the "time-like" direction, the 3 real ones the "space-like" directions – the reverse convention is familiar as well. (For the "S" placed in front, the same applies as said before.) An SO(n,m) is called "pseudo-orthogonal", an SU(n,m) "pseudo-unitary".
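A last numerical sketch (NumPy assumed; the boost velocity and the sample event are arbitrary): a Lorentz boost from SO(1,3) preserves the pseudo-orthogonal combination t^2 - x^2 - y^2 - z^2, while the ordinary sum of squares is not preserved. (The "imaginary" character of the time direction is represented here, equivalently, by the minus signs of the metric eta rather than by an explicit factor i.)

    import numpy as np

    beta = 0.6                                    # boost velocity in units of c
    gamma = 1.0 / np.sqrt(1.0 - beta ** 2)

    # Boost along x, acting on (t, x, y, z).
    L = np.array([[ gamma,        -gamma * beta, 0.0, 0.0],
                  [-gamma * beta,  gamma,        0.0, 0.0],
                  [ 0.0,           0.0,          1.0, 0.0],
                  [ 0.0,           0.0,          0.0, 1.0]])

    eta = np.diag([1.0, -1.0, -1.0, -1.0])        # Minkowski signature (+,-,-,-)
    event = np.array([2.0, 1.0, 0.5, -0.3])       # some spacetime point (t, x, y, z)

    def interval(e):
        return e @ eta @ e                        # t^2 - x^2 - y^2 - z^2

    print(interval(event), interval(L @ event))           # equal: the invariant interval
    print(event @ event, (L @ event) @ (L @ event))       # different: Euclidean length changes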
What is significant in all this is that particle reactions within a "closed system" of thermodynamics must in any case be (truly) unitary; for, according to the physical principle "nothing comes from nothing, and nothing gets lost", probability conservation is essential to them! Einstein's dynamics, on the other hand, is pseudo-orthogonal; hence it violates probability conservation and is therefore, in fact, a component of some "open system"! In Quantum Gravity [1] we shall call these thermodynamic systems "channels":
Publisher: BookRix GmbH & Co. KG
Text: © 2018. All rights reserved.
Translation: This is the English translation of the German original "Quantengravitation. Logik der Neuen Physik", released on the same date by the same publisher BookRix, Munich/Germany.
Date of publication: 02.01.2018
ISBN: 978-3-7438-4841-2
All rights reserved