Saturday, October 27, 2012

Preons are dead, but prequarks are fully alive

An article at ( ) wrote, “The latest Scientific American has a cover story about particle physics…. It’s called “The Inner Life of Quarks” and discusses models in which quarks and other elementary particles of the standard model are composites of more elementary objects called “preons”. The fact that the papers on the subject it refers to are from 1979 should make one suspicious: an idea that hasn’t had major developments in 33 years is a dead idea. Besides the overwhelming experimental evidence against preons (with the LHC bringing in many new much stronger negative results), the idea has huge inherent problems. The main issue is that one is trying to put together composites with masses as small as MeVs (or lower, if you try to do this with neutrinos) while the data says that things are point-like up to TeV scales, with just the forces you know about up to such scales.

The following is a summary of the Scientific American article.
1. In 1869 Dmitri Mendeleev created the periodic table of chemical elements by noticing that elements' properties fit into a repeating pattern, which physicists later explained as a consequence of atomic structure. A similar story may be playing out in particle physics again today.

2. The 12 known elementary particles have their own repeating patterns, suggesting they are not truly fundamental but actually tiny balls containing smaller particles, which physicists tentatively call preons.

3. Other evidence argues against this possibility. The Large Hadron Collider at CERN, along with several lesser-known experiments, may finally settle the question.

There are at least four major differences between the preon/rishon models and the prequark model.

One, the preon model (by Abdus Salam) was expanded into the rishon model (mainly by Haim Harari). The rishon model is very similar to my prequark model: it has two sub-quarks, T (Tohu, which means “unformed” in the Hebrew of Genesis) and V (Vohu, which means “void” in the Hebrew of Genesis). But Harari did not know what T is (it is just “unformed”). In the prequark model, on the other hand, the A (Angultron) is an innate angle, a base from which the Weinberg angle and Alpha are calculated.

Two, choosing (T, V) as the bottom was ad hoc, a result of reverse engineering. By contrast, there is a very strong theoretical reason for where the BOTTOM is in G-theory.
In G-theory, the universe is ALL about computation, computable or non-computable. For the computable, there is a two-code theorem. For the non-computable, there are the 4-color and 7-color theorems.
That is, the BOTTOM must consist of two codes. Any level below the two codes becomes a TAUTOLOGY, merely repeating itself.
Anything with more than two codes (such as 6 quarks + 6 leptons) cannot be the BOTTOM.
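The two-code claim echoes a standard fact of computation: any finite alphabet can be re-expressed losslessly in just two symbols, while a one-symbol alphabet can no longer distinguish messages. A minimal illustration (the encoding scheme here is mine, for illustration only, not G-theory's):

```python
def to_two_code(message, alphabet):
    """Re-encode a message over any finite alphabet using only the two codes '0' and '1'."""
    width = max(1, (len(alphabet) - 1).bit_length())  # bits needed per symbol
    index = {ch: i for i, ch in enumerate(alphabet)}
    return "".join(format(index[ch], f"0{width}b") for ch in message)

def from_two_code(bits, alphabet):
    """Recover the original message from its two-code form."""
    width = max(1, (len(alphabet) - 1).bit_length())
    return "".join(alphabet[int(bits[i:i + width], 2)]
                   for i in range(0, len(bits), width))

# Twelve stand-in symbols for the 6 quarks + 6 leptons: a 12-code alphabet
# collapses into a two-code one with nothing lost.
ALPHABET = "ABCDEFGHIJKL"
```

Nothing below two symbols works: with a single symbol, every message of the same length is identical, which is the tautology the text describes.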

Three, rishons (T or V) carry hypercolor in order to reproduce the quark colors, but this setup renders the model non-renormalizable and it quickly becomes a big mess; so it was abandoned on day one. Prequarks (V or A), on the other hand, carry no color; the quark colors arise from the “prequark SEATs”. In short, the rishon model cannot work out a {neutron decay process} different from the SM process.

This is one of the key differences between the prequark model on one side and the rishon model and the SM on the other.

Four, the preon/rishon models do not have gene-colors.

More details are available at .

Sunday, September 2, 2012

Quantum algebra and axiomatic physics

Since the 1930s, quantum mechanics has been viewed as one of the two foundations of physics, more fundamental than classical physics. Seemingly, without “quantum” added in front of it, a physics theory cannot be viable. Yet, beyond all those quantum theories, what is the ontological essence of the “quantum”?

In general, the Uncertainty Principle can be interpreted with two descriptions.

One --- The article “Ball on a Spring (Classic vs. Quantum physics)” wrote, “A is the amplitude of oscillation (which we are free to choose to be as large or small as we want).
In quantum mechanics, things change.  At first glance (and that’s the only glance we need, really, …) there’s really only one thing that changes, and that is the statement that ‘we are free to choose [the amplitude] to be as large or small as we want.’  It turns out this isn’t true.  And correspondingly, the energy stored in the oscillation cannot be chosen arbitrarily.
A single quantum of oscillation wouldn’t even make the ball move by a distance of an atomic nucleus! No wonder we can’t observe this quantization!!  If the ball moves an amount that we can see, it has an enormous number of quanta of oscillation — and for such large values of n, we can make A be anything we want, as far as we can tell; see Figure 2. We can’t measure A nearly well enough to notice such fine restrictions on its precise value.”

In this description, although the quantum is the foundation, it becomes insignificant at the large scale, which is ruled by determinism. That is, there is no conflict or contradiction between quantum facts as the foundation in the small region and the rule of determinism at the large scale.
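The quoted claim, that a single quantum of oscillation would move a visible ball by less than a nuclear diameter, is easy to check numerically. A sketch assuming an illustrative 1 kg ball on a 1 Hz spring (my numbers, not the article's):

```python
import math

HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s

def one_quantum_amplitude(mass_kg, freq_hz):
    """Amplitude a classical oscillator needs to hold a single quantum
    of energy E = hbar * omega, from E = (1/2) * m * omega^2 * A^2."""
    omega = 2 * math.pi * freq_hz
    energy = HBAR * omega
    return math.sqrt(2 * energy / (mass_kg * omega ** 2))

amp = one_quantum_amplitude(1.0, 1.0)  # ~5.8e-18 m
nucleus = 1e-15                        # rough nuclear radius, m
```

The single-quantum amplitude comes out roughly a hundred times smaller than a nucleus, consistent with the quote's "no wonder we can't observe this quantization".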

It says again, “… Even when there are no quanta of oscillation in the oscillator at all — when n = 0 — there’s still some amount of energy in the oscillator.  This is called zero-point energy, and it is due to a basic jitter, a basic unpredictability, that is at the heart of quantum mechanics.”

This shows that the “quantum zero” seems significantly different from the mathematical zero, which is a “static zero, the foundation of determinism”. In the quantum region, zero cannot be defined at a specific space-time point; it can only be defined by averaging over a range of space-time points. The larger this range is, the closer the quantum zero approaches the mathematical zero. The largest space-time range is this physical universe, and its quantum zero is the cosmological constant, the best-known measured “physical zero”, which is almost identical to the mathematical zero. Again, there is no conflict or contradiction between the quantum zero and determinism (the mathematical zero). Furthermore, no quantum dancing of any kind can help the quantum zero escape from the confinement of the deterministic zero.

Two --- It can be stated with the following statement.
Statement A --- “Quantum mechanics doesn't allow us to predict anything other than probabilities. So there's always some uncertainty about the answer to the question.”

If statement A is absolute, then it is itself a statement of determinism. If we add a statement B,

Statement B --- The validity of statement A depends on a probability.

Then, is the system of “statement A + statement B” absolute?

In fact, regardless of how many statements are added to the above, there is no way of any kind (even for God) to construct a quantum system that escapes the grip of determinism in the end. Is the recently discovered Higgs-like particle a quantum particle? Are we positively certain that we have caught its tail? I call this the “quantum-ness paradox”.

With the two points above, it is clear that the importance of quantum-ness has been overhyped for the past 80 years. I can show this further with the following three points.

A. Quantum algebra:
In mathematics, sugar + sugar = more sugar, and one apple + one apple = two apples (not two oranges). But in quantum algebra, there is a significantly different rule:
               Quantum + quantum = deterministic, such as,
       Quantum particle + quantum particle = deterministic particle; for example,
                   Proton + electron = hydrogen atom
                  Hydrogen atom + hydrogen atom = hydrogen molecule
                   Two hydrogen molecules + one oxygen molecule = two water molecules (wholly deterministic).
In fact, any kind of quantum dancing (fields, operators, etc.) eventually leads to determinism, and there is no escape from it. It does not take many steps to erase the quantum-ness.

B. All important quantum parameters can be derived deterministically from the axiomatic physics.
The most important quantum parameters for the construction of this universe are the Cabibbo angle, the Weinberg angle, and the fine structure constant (Alpha). They can all be derived deterministically in the axiomatic physics. Please read the article “Axiomatic physics, the revolutionary physics epistemology,” for details.

C. Both the proton and the neutron must be Turing computers in order to give rise to a biological universe. Please read the article “Quantum behavior vs. the Cellular Automaton determinism,” for details.

With the five points above, I have concluded that the quantum-ness is only a naughty child of the supreme Daddy of Axiomatic physics. 

Thursday, August 16, 2012

Quantum behavior vs. the Cellular Automaton determinism

In the article “’t Hooft on Cellular Automata and String Theory” ( ), it wrote, "Gerard ’t Hooft in recent years has been pursuing some idiosyncratic ideas about quantum mechanics; ...
His latest version is last month’s Discreteness and Determinism in Superstrings, which starts with cellular automata in 1+1 dimensions and somehow gets a quantized superstring out of it.  
Personally, I find it difficult to get at all interested in this.  ...
Looking at the results he has, there’s very little of modern physics there, including pretty much none of the standard model (which ’t Hooft himself had a crucial role in developing).  ...
Basing everything on cellular automata seems to me extremely unpromising: you’re throwing out deep and powerful structures for something very simple and easy to understand, but with little inherent explanatory power. That’s my take on this, ..."

Seemingly, the author’s key objection is his belief that quantum behavior is not compatible with cellular automaton determinism, while ’t Hooft’s inability to make contact with known physics (mainly the Standard Model) played only a small part in forming his opinion.

In fact, many attributes of any quantum particle follow exact determinism. Regardless of the complexity of the proton’s internal structure (with zillions of gluons, quarks, and antiquarks dancing randomly), the final outcome cannot go beyond the simple (u, u, d). With all that great mystic quantum power, no proton can acquire a different mass, electric charge, or spin. With all those great mystic quantum dancing powers, determinism is still the Supreme boss. There are exactly three quark colors, not three + uncertainty. There are exactly three generations, not three + uncertainty. The quantum mystic dancing power is just a small child of the Supreme Daddy of determinism.

The most important determined attribute of both the proton and the neutron is that they are Turing computers, a fact described in detail in the article “The Rise of Biological Life” ( ), which was written in 1993.

Saturday, May 5, 2012

Predictions from Axiomatic physics

All BSMs (Beyond Standard Model theories) are extensions of the SM, such as the SUSYs and the string theories. Yet there is a major theoretical difference between the SM and the AP (Axiomatic physics), which was discussed in the article “Neutron decay and proton’s stability --- the source of universe’s evolution” ( ). In the SM, the electroweak symmetry breaking is caused by the Higgs mechanism. In the AP, the Higgs mechanism is only a shadow of the Real-Ghost flip-flop mechanism (see the article “The Rise of Gravity and Electric Charge”, ). This Higgs mechanism issue will soon be answered by the LHC data. Thus, the first prediction of this AP is as follows.

Prediction one:
        a.  Higgs boson of any kind will be ruled out.
        b. All SUSY theories with s-particles will be ruled out.
        c. Any string theory with extra spatial dimension(s) will be ruled out.

Prediction two: The observation that the expansion of the universe is accelerating was made in 1998, but the prediction of that fact was published in 1984 in the book “Super Unified Theory”. See the article “Acceleration of the expanding universe, mystery no more! , “.

Prediction three: There are only 48 elementary particles in Nature, not counting the force carriers. See the article “48, the exact number for the number of elementary particles, “. This prediction includes the following.
            a. No Higgs boson of any kind.
            b. No s-particle of any kind.
            c. No fourth generation particles.

As this prediction was made in 1984, the top quark and the tau neutrino had not yet been discovered; thus, this AP also predicted them, as they are parts of the 48. While other theories, such as the SM, also predicted them, they did not and still do not include the a, b, and c predictions above. This prediction will separate these two types of theories.
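For reference, the arithmetic behind 48, assuming (as I read the “48” article) that the three quark colors and the antiparticles are counted while the force carriers are not:

```python
quark_flavors = 6   # u, d, s, c, b, t: two flavors in each of the three generations
colors = 3          # exactly three quark colors, "not three + uncertainty"
leptons = 6         # e, mu, tau and their three neutrinos (colorless)

matter = quark_flavors * colors + leptons  # 18 + 6 = 24
total = matter * 2                         # doubling for the antiparticles
```

A fourth generation would push the count past 48, which is why prediction c follows from the same arithmetic.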

Prediction four: The measured Alpha (fine structure constant) for old galaxies must be slightly different from that for younger ones. See the article “Axiomatic physics, the revolutionary physics epistemology, “.

Prediction five: The gravitational constant should vary during the evolution of the universe. See the article “The rise of gravitation, and hierarchy problem no more!", “. In that article, the hierarchy problem is also resolved. Then, the dark matter and dark energy issues are resolved. See the article “Dark matter, mystery no more! , “.

These five predictions are experimentally testable, and some of them have already been experimentally or observationally verified. In addition to these predictions, this AP provides answers to the following issues.

      i. The origins of flavor and of generation --- see the article “48, the exact number for the number of elementary particles, “.

      ii. The origin of space --- see the article “Origin of spatial dimensions, and the definition for dimension, “.

     iii. The origin of mass --- see the article “Origin of mass, gateway to the final physics, “.

      iv. The origin of time --- see the article “Origin of time, the breaking of a perfect symmetry, “.

       v. The theoretical calculation of the Cabibbo / Weinberg angles and Alpha --- see the article “Axiomatic physics, the revolutionary physics epistemology, “.

Friday, May 4, 2012

Neutron decay and proton’s stability --- source of universe’s evolution

The three generations of quarks and leptons in the Standard Model (SM) are based on a step-by-step advancement of phenomenological knowledge, not on a true theoretical framework. On the contrary, in the Axiomatic physics (AP), the origins of flavor and of generation are direct consequences of the space-time structure (see the article “48, the exact number for the number of elementary particles, “). Nonetheless, the SM became very successful after its correct predictions of the weak bosons and their masses by using a postulated Higgs mechanism. However, if the SM Higgs is not found this year, the SM will receive a deadly blow. While we are waiting for the verdict from the LHC on this SM Higgs issue, we can still review the entire framework of the SM with this AP.

In the SM, all force carriers are viewed as particles that have defined masses and measurable lifetimes. On the contrary, only quarks and leptons are space-time structurally defined particles in this AP. The three force carriers (photons, gluons, and gravitons) are viewed as the fibers of various kinds of envelopes in this AP. All photons of the universe weave out an envelope of an event horizon, the causal envelope. The gluons inside a particle weave out an envelope for that particle with a definite wavelength, which defines that particle’s mass. All gravitons in the universe weave out an envelope as the space-time front. In this sense, the three forces (electromagnetic, strong, and gravitational) are doing similar work and should be represented by the same mathematical function. The only difference among the three is the scale of application. These three are “constructive” forces, constructing different envelopes.

On the other hand, the weak force is completely different from the three above. Instead of being a constructive force, it is a “destructive” one, breaking some envelopes. If the three constructive forces are three forward gears, the weak force is the reverse gear. Any evolution process needs both the forward and the reverse gears, even the evolution of the universe; see the article “Sexevolution --- The Grand Design (rise of Intelligence)” ( ).

Thus, among the four forces, the weak force is fundamentally different from the other three in this AP, although it is unified with them by the unified force equation below.

F (unified force) = K ħ / (delta T * delta S)

K is the coupling constant, which is dimensionless. However, only by knowing the differences among these forces can we understand the true meaning of their unification.
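A quick dimensional check of this equation: ħ carries J·s, so K ħ / (ΔT · ΔS) carries J·s / (s·m) = J/m, which is newtons, as a force must. A numerical sketch (K = 1 and the nuclear-scale ΔT, ΔS below are illustrative placeholders of mine, not values from the AP):

```python
HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s

def unified_force(k, delta_t, delta_s):
    """F = K * hbar / (delta T * delta S).
    Units: (J*s) / (s * m) = J/m = N, so the result is a force."""
    return k * HBAR / (delta_t * delta_s)

# Illustrative nuclear-scale placeholders: delta T ~ 1e-23 s, delta S ~ 1e-15 m.
f = unified_force(1.0, 1e-23, 1e-15)  # ~1.1e4 N
```

Smaller ΔT and ΔS give a larger force, so in this formula the scale of application alone distinguishes the interactions, as the text asserts.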

In the Standard Model, the weak force is the culprit in decay processes. By definition, decay is a spontaneous “internal” process; that is, no external force acts upon it. For the neutron decay,

                               N    ->   P + E + v(e)-bar

In the SM, the above process is mediated by a weak force carrier, the W-boson. As the W is a force carrier, the process is, of course, spontaneous and internal. The two graphs below show the differences between the SM and the G-theory.

Then why does the proton not decay via the following equation, which is just as genuine as the N equation above?

                               P   ->   (e)bar + Pi(zero)

There is no good answer to this question in the SM.
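Whether or not one finds it a good answer, the two equations can at least be separated by standard bookkeeping: electric charge (Q), baryon number (B), and lepton number (L). A quick check (the quantum-number table below is textbook bookkeeping, not specific to the SM or to G-theory):

```python
# (Q, B, L) for each particle appearing in the two decay equations
NUMBERS = {
    "n":      (0, 1, 0),    # neutron
    "p":      (1, 1, 0),    # proton
    "e":      (-1, 0, 1),   # electron
    "ve_bar": (0, 0, -1),   # electron antineutrino
    "e_bar":  (1, 0, -1),   # positron
    "pi0":    (0, 0, 0),    # neutral pion
}

def conserved(initial, final):
    """Return a (Q, B, L) tuple of booleans: True where the number balances."""
    def totals(names):
        return tuple(sum(NUMBERS[n][i] for n in names) for i in range(3))
    return tuple(a == b for a, b in zip(totals(initial), totals(final)))

neutron_decay = conserved(["n"], ["p", "e", "ve_bar"])  # (True, True, True)
proton_mode   = conserved(["p"], ["e_bar", "pi0"])      # (True, False, False)
```

The neutron equation balances all three numbers, while the proton mode conserves charge but breaks baryon and lepton number, which is the usual accounting for why it is forbidden.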

On the other hand, there are two types of decay processes in this AP.

A. A particle decays after it “interacts” with the space-time vacuum. Thus, this decay process is not truly spontaneous and internal.
      a. The neutron decay is such a case. In this AP, neutron decay is described in terms of Prequark Chromodynamics. The neutron first “picks up” a d-quark/anti-d-quark pair from the space-time vacuum. Then this new five-quark compound goes through two steps:
         i. the flavor change --- the d-quark/anti-d-quark pair changes into a u-quark/anti-u-quark pair;
         ii. the exchange of two prequarks between two quarks.

Obviously, there is a significant difference between this AP description and the SM’s. For the SM, the W-boson is the force carrier (the mediating actor, the cause), and thus the process is spontaneous and internal. In this AP, the W-boson is the “result” (a transient state) of a space-time-induced process, not the cause. Thus, in this AP, the neutron decay is a space-time-induced flavor-change decay process, not a spontaneous and internal one.

       b. Another space-time induced decay is the muon decay.
                                      Muon    ->   e + v(e)bar + v(muon)

In Prequark Chromodynamics (PC), this decay is driven purely by the “generation force”, which is a “color charge” in PC. Thus, the muon decay is a “generation-change” process.

In this AP, both “flavor” and “generation” are consequences of the space-time structure (see the article “48, the exact number for the number of elementary particles”). Thus, the two decay processes above are space-time induced, not spontaneous and internal.
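The generation change in the muon decay can be tracked with the standard lepton-family bookkeeping (electron number L_e and muon number L_mu): the muon hands its muon-ness to v(muon), while the e and v(e)bar cancel in L_e. A small check (the quantum-number table is standard textbook bookkeeping, not specific to PC):

```python
# (electric charge Q, electron number L_e, muon number L_mu)
FAMILY = {
    "mu":     (-1, 0, 1),   # muon
    "e":      (-1, 1, 0),   # electron
    "ve_bar": (0, -1, 0),   # electron antineutrino
    "vmu":    (0, 0, 1),    # muon neutrino
}

def family_totals(names):
    """Sum (Q, L_e, L_mu) over a list of particles."""
    return tuple(sum(FAMILY[n][i] for n in names) for i in range(3))

before = family_totals(["mu"])                 # the decaying muon
after = family_totals(["e", "ve_bar", "vmu"])  # its decay products
```

Both sides total (-1, 0, 1): the family numbers balance even though the charged lepton itself has changed generation.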

The detailed diagrams and descriptions of these two decays are available in the article “Neutron Beta Decay, “. They were published on pages 19 and 20 of the book “Super Unified Theory” (ISBN 0-916713-02-4, Copyright # TX 1-323-231, Library of Congress Catalog Card Number 84-90325).

B. The proton decay equation below is a truly spontaneous and internal decay process.

                                         P   ->   (e)bar + Pi(zero)

The detailed proton decay diagram and description are available in the article “Proton's stability and its decay mode” ( ). In the diagram, the proton does not interact with the space vacuum, and there is no generation change. Without any external energy infusion from the space-time vacuum, the proton cannot decay, as it has lower energy than its decay products. Yet, in this AP, a proton will decay when the space-time vacuum has enough energy to crash the proton’s envelope.

With the examples above, there is no truly spontaneous and internal decay process in this AP. “All” decays are induced by the space-time vacuum or driven by the space-time structure: the changing of flavor and/or generation, which are traits of the space-time structure. Now the true essence of the “weak force” is understood in this AP.

Now the difference between this Axiomatic physics (AP) and the Standard Model (SM) is very clear. The validity of the SM hinges on the postulated Higgs mechanism, which is only a “shadow” of the space-time structure of this AP. See the article “Higgs Boson, a shadow of the Prequark field” ( ). The LHC data will soon give an answer on this issue.

Wednesday, May 2, 2012

Dark matter, mystery no more!

It is commonly accepted as fact now that the visible matter (made of known particles) accounts for only about 5% of the mass of this universe, while 95% of it is invisible. In recent years, the neutrinos, although not normally visible, have been ruled out as the key factor in this issue. Today, most of the hopes for the answer are placed on the s-particles postulated by the many SUSY (supersymmetric) theories. Yet the recent LHC data has ruled out many standard types of SUSY. Thus, many SUSY advocates are now desperately hanging on to some bizarre theories which require much fine-tuning.

In this Axiomatic physics (AP), the above SUSY arguments are simply wrong for the following reasons.

A. The Naturalness principle (NP) --- no fine-tuning is allowed in Nature’s physics. See the article “Axiomatic physics, the final physics”, at

B. The SUSY theories with s-particles are simply wrong in this AP; see the following articles:
     i. “Origin of time, the breaking of a perfect symmetry, at “.

     ii. “Supersymmetry, Gone with the wind, at “.

      iii. No s-particle is allowed in AP  --- see the article “48, the exact number for the number of elementary particles, “.

While the above articles showed that s-particles cannot be the dark matter, they do not give an answer to the issue. As the “dark matter” effect is very much a fact of our universe, it is a genuine issue in this AP as well. Thus, a solution must be derived axiomatically.

I will begin this axiomatic derivation by looking at the example of the proton’s mass. While the proton can be written as a composite of three quarks [u, u, d], its internal structure is in fact very complicated according to much test data. In addition to the three quarks, there are many quark/antiquark pairs and many gluons. There is no fixed number of them, as they change from time to time and, of course, differ from proton to proton. That is, every proton has a different internal structure; even the same proton’s internal structure evolves in time. However, the total number of up quarks minus the total number of up antiquarks is always 2, and the total number of down quarks minus the total number of down antiquarks is always 1.

Yet, all protons are still “identical” to the external world even with the internal structure as described above.
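The external identity of every proton, despite its churning interior, can be mimicked with a toy model (the “sea” here is my own illustration, not the AP's): however many random quark/antiquark pairs are added to the valence (u, u, d), the net quark counts and the total charge never change.

```python
import random

CHARGE = {"u": 2/3, "d": -1/3, "u_bar": -2/3, "d_bar": 1/3}

def snapshot(n_pairs, seed):
    """One proton 'snapshot': valence (u, u, d) plus n random quark/antiquark pairs."""
    rng = random.Random(seed)
    content = ["u", "u", "d"]
    for _ in range(n_pairs):
        q = rng.choice(["u", "d"])
        content += [q, q + "_bar"]  # pairs are always created together
    return content

def net(content, flavor):
    """Net count: quarks of a flavor minus the matching antiquarks."""
    return content.count(flavor) - content.count(flavor + "_bar")

def charge(content):
    """Total electric charge of a snapshot, in units of e."""
    return round(sum(CHARGE[q] for q in content), 9)
```

Every snapshot has a different interior, yet net u = 2, net d = 1, and charge +1 hold in all of them, which is the paradox the kaleidoscope model is about to resolve.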

These two facts seemingly form a paradox. However, this paradox can be resolved with a kaleidoscope model. Two identical kaleidoscopes (with identical internal structures and identical colored beads) will give out completely different images. Even the same kaleidoscope will give out different images as it evolves in time. The complexity does not arise from its internal structure per se but from its “void space”, which allows the random movements of those colored beads. With this kaleidoscope model, the apparent complexity of the proton’s internal structure gives hints about the space-time structure “inside” the proton.

A kaleidoscope has three parts:

a. a container, which carries the majority of its mass;

b. a set of mirrors, which have some mass too. Most importantly, they allow the “essence” of the empty space to be visualized: a mirror for the invisible;

c. a few colored beads, which account for a very small percentage of the total mass. Yet the existential essence of the void space is expressed through these beads’ existential presence.

This description of the kaleidoscope is almost identical to the proton’s. The proton’s three quarks (u, u, d) account for only about 1% of its total mass. Most of the proton’s mass is carried by its gluons and by the fluctuation of the vacuum inside the proton (in the form of quark/antiquark pairs).

In the Standard Model, the gluons are viewed as particles, such as rubber bands that bind the quarks together. Regardless of what the essence of gluons is, the end result is that an envelope is formed to confine those quarks and the internal vacuum of the proton. Thus, in this AP, the gluons of a particle are viewed as an envelope for that particle. In the proton’s case, that envelope accounts for about 80% of the proton’s mass, while the space-time (the vacuum) enclosed by that envelope accounts for the remaining 20%.

In fact, this AP cosmology is similar to this proton model and has three parts.

     i. The visible matter (the galaxies, etc.), similar to the [u, u, d] of a proton, accounts for less than 5% of the total mass.

     ii. The boundary (the space-time front) of the universe, similar to the proton’s envelope, accounts for the majority of the mass, perhaps over 80%.

     iii. In this AP, mass can be defined in terms of space and time. That is, the spacetime (vacuum) also has mass.  
    a. Ms (space-defined mass) = (h-bar/c) * (1/delta s), c (light speed), s (space)
    b. Mt (time-defined mass) = (h-bar/c) * (1/[c * delta t]), t (time)

So, M (mass) = (Ms * Mt)^ (1/2)

See the article “Origin of mass, gateway to the final physics, “.
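These two definitions can be sanity-checked: both (ħ/c)·(1/Δs) and (ħ/c)·(1/(c·Δt)) carry kilograms, and if Δs is chosen as a particle's reduced Compton wavelength (with Δt = Δs/c), the formulas return that particle's mass. A sketch using the electron as a test case (the choice of test values is mine, not the AP's):

```python
import math

HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
C = 2.997_924_58e8         # speed of light, m/s
M_E = 9.109_383_7015e-31   # electron mass, kg

def m_space(delta_s):
    """Ms = (hbar/c) * (1/delta s)  ->  kg"""
    return (HBAR / C) / delta_s

def m_time(delta_t):
    """Mt = (hbar/c) * (1/(c * delta t))  ->  kg"""
    return (HBAR / C) / (C * delta_t)

def mass(delta_s, delta_t):
    """M = sqrt(Ms * Mt), the geometric mean of the two definitions."""
    return math.sqrt(m_space(delta_s) * m_time(delta_t))

lambda_e = HBAR / (M_E * C)       # electron's reduced Compton wavelength, ~3.86e-13 m
m = mass(lambda_e, lambda_e / C)  # recovers ~9.11e-31 kg
```

Smaller Δs and Δt give larger mass, so in this picture a tightly confined region of space-time is a heavy one.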

Thus, in this AP, there is no “dark matter” in terms of any kind of particle per se, while the visible matter indeed accounts for only 5% of the total mass.

In addition to this proton/kaleidoscope model, the “gravitational constant evolution” also contributes to this dark matter issue, especially to its apparent amount. See the article “The rise of gravitation, and hierarchy problem no more! ,“. Furthermore, the total mass of the universe can be estimated from the difference between the measured Alpha values, as the Alpha of old galaxies must be slightly different from that of younger ones. See the article “Axiomatic physics, the revolutionary physics epistemology,“.

Update (March 26, 2013): The iceberg model is a precise description of the prequark field. After the new Planck data was released, I showed that the iceberg model fits the Planck data perfectly (see “Planck data, the last straw on Higgs’ back”, ).