Archive for the Science Category

The ABC Preon Model, Background: the Standard Model of Elementary Particle Physics

Posted in ATS Thread, Cosmos, Science on March 16, 2017 by betweentwopines

Post by delbertlarson at ATS
This thread is the first of what I plan to be a series of several threads concerning my ABC Preon Model. I look forward to everyone’s comments. To get started, I’ll first present a review of how the science of elementary particle physics got to where it is today.

The figure above shows a portion of the particles that were discovered using particle accelerators. The number of such particles became so large that the situation was termed “a particle zoo”. By the early 1960’s it was clear that there was likely some underlying pattern that could simplify our view of elementary particle physics.

A major step forward in simplifying mankind’s view of nature occurred in 1964 when Murray Gell-Mann and George Zweig proposed an underlying model. Gell-Mann had used the term quark for the elementary particles, while Zweig had used the term ace. Eventually, the term “quark” was accepted by the community. In the quark model, hadronic matter is proposed to be built from underlying quarks. Baryons are states that have three bound quarks, while mesons are a bound quark-antiquark pair. Leptons were identified as a separate type of matter. As a result, in 1964, simplicity was reestablished. Nature consisted of three quarks, named up, down, and strange, and four leptons, which were the electron, the muon, and their two associated neutrinos.

The initial simplicity of the quark model began to fade into complexity almost immediately. In 1965 Glashow and Bjorken proposed a fourth quark, the charm quark, which was discovered by Richter and Ting in 1974. In 1973 Kobayashi and Maskawa theorized that the CP violation seen in experimental results could be explained by adding two more quarks, and indeed these quarks were discovered by Fermilab researchers: the bottom quark in 1977 and the top in 1995. Also, over the period between 1974 and 1977, a new lepton, the tau, was discovered at SLAC by a team of collaborators.

In addition to the quarks and leptons, force carriers are a central part of today’s standard model. In the 1960’s, Glashow, Weinberg and Salam developed the electro-weak theory of particle interactions to unify the weak and electromagnetic forces in a single theoretical framework, work for which they shared the 1979 Nobel Prize. This work predicted the existence of three more particles, which were called the intermediate vector bosons. The weak bosons, called the W and Z, were discovered by a team at CERN led by Carlo Rubbia in 1983. Simon van der Meer enabled the discovery by leading the development of stochastic cooling of particle beams. Note that the W boson comes in two types, one with a positive electric charge and the other negatively charged, while the Z particle has zero electric charge.

In the figure above we see a depiction of the standard model for elementary particles as advertised by its proponents. The depiction shows a rather simple set of 16 particles, which includes six quarks, six leptons and four force carriers.

Despite the advertised simplicity of the standard model, the model has several problems that leave it rather unsatisfactory from a philosophical point of view. The first additional complication is that the rules used to form particles involve a color charge. It is of course perfectly acceptable that nature may employ otherwise identical particles that have one of three color charges, but the downside is that this means that there are actually three quarks for each one listed in the figure above. The theory also specifies that there are eight different gluons, not just the one shown above. Secondly, each quark and lepton shown in the figure above has an antimatter counterpart. This too is OK, even necessary, but it means that there are twice as many particles as the number advertised above. And beyond the counting sleight of hand, there are additional problems.

Fundamental to present theory is the result that no quark can be isolated. As quarks become separated from their partners, the theory stipulates that the force pulling them back in gets ever larger. Before a quark can be freed, separating it involves a force so large that the energy associated with it is capable of generating a quark-antiquark pair, and each member of the pair then associates with the fragments of what was being pulled apart, so no quark can ever be isolated. In light of this, as philosophers we should ask: How can something be proven to exist if it can never be isolated? I would submit that such existence can never be proven – only inferred.

Another problem is that the weak force has no direction. Typical forces such as the electric, magnetic and gravitational forces have both magnitude and direction. They are vector quantities. But the weak force is really a particle exchange phenomenon that has no direction associated with it. A final known problem is that there is no satisfactory calculational framework for the standard model. There are many good approximation schemes, but the mathematics is not anywhere close to the elegance and accuracy of quantum electrodynamics. This makes it hard to compare results against theory to test the model. (For instance, a pion is presumed to be a two body state. The two body problem is well known, yet there is no standard model prediction for pion masses.) For all of these reasons, despite its success, it took quite a while for the quark model to gain full acceptance in the physics community. I recall back in the early days speakers starting their comments by saying, “in what is now the standard way of doing things” and eventually “in what is becoming the standard model of our field”. The standard model was indeed the model that became the standard way of looking at things, but early on everyone was under the belief that something better would soon come along.

Above we see a figure that is more honest in its presentation of the existing standard model. Shown above are each of the three colors of each of the six quarks as well as their antimatter partners. Also shown are all six leptons along with their antimatter partners. Lastly, all the force carriers are shown. With this full accounting of particles it is seen that the standard model involves 61 elementary particles, since there are 18 quarks, 18 anti-quarks, 6 leptons, 6 anti-leptons and 13 force carriers. While some standard model proponents may argue that a red up quark is the same as a blue up quark, the rebuttal is that we certainly don’t believe that a positron is the same particle as an electron. Even though a positron is identical to an electron in every respect except for its electric charge and lepton number, we still recognize that any such difference means that the particles are different particles. Similarly, an up quark with a red charge should be recognized as a different particle than an up quark with a blue charge if we are going to have an honest appraisal of our elementary particles. With this honest appraisal it is clear from the diagram presented above that the standard model has reached the point in its development where a simpler underpinning is desirable.
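The tally above can be checked with simple arithmetic. The sketch below is an illustration added for this write-up, not part of the original post; it assumes the 13 force carriers are the 8 gluons, the photon, the W+, the W-, the Z and the Higgs:

```python
# Tally of standard model particles under the full accounting described
# above: three color charges per quark flavor, antimatter partners for
# all quarks and leptons, and the complete set of force carriers.
quark_flavors = ["up", "down", "strange", "charm", "bottom", "top"]
colors = ["red", "green", "blue"]

quarks = len(quark_flavors) * len(colors)  # 18
anti_quarks = quarks                       # 18
leptons = 6                                # e, mu, tau + three neutrinos
anti_leptons = 6                           # their antimatter partners
force_carriers = 8 + 1 + 3 + 1             # gluons, photon, W+/W-/Z, Higgs

total = quarks + anti_quarks + leptons + anti_leptons + force_carriers
print(total)  # 61
```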

The treatment I have presented here so far has focused solely on a mental picture of what particles and forces make up our world. But modern physics in general, and the standard model specifically, is considerably more than just a physical model. Indeed, there are many who take the position that a physical model of nature is not something we as mere mortals are even capable of understanding. That latter philosophy dates back to relativity and the early quantum theories, theories that originally seemed quite odd, but theories that survived every important test. And therefore, with the underlying physical modeling so difficult to mentally grasp, modern physics turned toward mathematics and underlying principles instead of physical mental pictures. The principle of relativity and the principle of least action have been blended very successfully into a Lagrangian approach that has produced a mathematical understanding of our world. The pictures above are rather gross simplifications of the true theory, and while those simplifications are useful to describe things to the public at large, the truer picture of the standard model comes from the Lagrangian, which is far more complex than even the rather complicated picture of 61 “elementary” particles.

Above we see the first of 30 equations, from a reference available here…, for the Lagrangian of the present Standard Model. The terms in the Lagrangian got a good start based on the work of Dirac, who successfully arrived at a covariant formalism (meaning it is manifestly consistent with relativity) for electrons and positrons. From there, the work of many others has been successfully incorporated into a theory of mammoth proportions. We have come a long way from the simple expressions used by Newton, Maxwell, Lorentz and Einstein. So now, with such vast complexity, I believe we should ask: “Is nature really that complex? Or might there be a simpler understanding?”

Of course, there are many good things about the standard model. First, it gets everything right. No known experiment is in violation of the standard model. And whenever new experiments indicate that something might not quite fit, the standard model has exhibited the room for growth needed to accommodate any new experimental results. Mixing angles and renormalization, as well as additional quarks and leptons, have been added to the model over time. The analysis techniques are extremely complex, and it takes a decade or more to master them. A full Ph.D. in physics, as well as postdoctoral training, is usually needed to fully grasp the intricacies of the model, and even then, practitioners may only be truly expert in a small portion of the overall model. Furthermore, development of the standard model has involved man-centuries of effort by some of the best, brightest and most trained members of the globe. As a result, the standard model is a monument to the creativity of man, and one that results in a complete modeling of all known particles and forces.

But at this moment, it is also important to note that there were many good things about the Music of the Spheres model… for celestial mechanics as well. First, it got everything right. No known observation of stellar or planetary motions was in violation of its tenets. And whenever new experiments indicated that something might not quite fit, the celestial mechanics model exhibited the room for growth needed to accommodate any new experimental results. Additional spheres, cycles and epi-cycles were added to the model over time as new observations became verified. The analysis techniques were extremely complex, and it took practitioners of the time a decade or more to fully grasp the intricacies of the model. Furthermore, development of the classical celestial model involved man-centuries of effort by some of the best, brightest and most trained members of the globe. As a result, the classical celestial model was a monument to the creativity of man that resulted in a complete modeling of all known stellar and planetary motions.

Please be advised that I am not attempting to mock the standard model by comparing it to the medieval and now discredited celestial model. I truly believe that the medieval celestial model was indeed a monumental achievement, and I feel it deserves much more credit than it presently gets. The credit should come because of its attention to detail, its coherent fundamentals, and its mathematically correct and exact derivations that led to explanations of all experimental data. It was indeed an impressive effort. However, Kepler and Copernicus showed us that a much simpler model was possible. And it is my belief that nature is simpler than the standard model as well, the details of which we will get into on my next thread in this series.


The ABC Preon Model. Modeling the Massive Leptons.

– By delbertlarson
The first observation relevant to the ABC Preon Model occurred to me when I was a graduate student in the very early 1980’s. I noticed that the decay of the muon was extremely similar to the decay of a hydrogen atom from an excited state into its ground state. Below we see drawings of the two events. In the first drawing, we can see a muon decay. The muon decays into an electron, and two neutrinos. Neutrinos were believed to be either massless or nearly so.

In the second drawing, we see hydrogen excited into its 2s state decaying into its ground state by emitting two photons. The photons are believed to be massless.

Notice that muon decay appears in many ways to be similar to the decay of hydrogen from its 2s state. A muon decays into an electron by emitting two neutrinos, while a hydrogen atom in its 2s state decays into a hydrogen atom in its 1s state by emitting two photons. Here we introduce the standard notation for neutrinos and photons by denoting a neutrino by the Greek letter nu, and a photon by the Greek letter gamma. It is known that the hydrogen atom is very effectively modeled as a proton and an electron bound by a photon, and therefore the starting point for the ABC Preon Model is to propose that the massive leptons consist of two new particles, called preons, bound by a neutrino. The word preon is meant to convey a precursor particle to the ones presently assumed to be elementary, and that is why I refer to this new model as a preon model.

Below we see pictures of the internal structure of the hydrogen atom and our proposed preon model of the massive leptons. In the Hydrogen atom, an electron orbits a proton, and the force is carried by a photon:

In our newly proposed massive lepton model, we will propose an analogous substructure with one particle orbiting another. From experiment, we observe that hydrogen decays into its ground state by emitting photons, and a photon is the carrier of the force that binds it. Hence, since muons decay into electrons by emitting neutrinos, it follows from our analogy that the force that binds the preons together to form massive leptons is carried by the neutrino. Therefore a neutrino is shown as the binding quantum in the picture. At this point in the development we will simply name the preons “A” and “B” and we will investigate their properties later on, in future threads:

I want to emphasize how simple the onset of this new elementary particle model is. We simply look at the decay processes of Hydrogen and muons and propose that the internal structure of the muon is composed analogously to the internal structure of hydrogen. Since the radiated particle is a neutrino instead of a photon, we replace the photon by the neutrino. It is all just a simple observation at this point.

We’ve just seen how our analogy with the hydrogen atom has led to a proposal that the muon is the second quantum state of a composite system, and that the electron is the first quantum state. Of course, there is a third massive lepton, the tauon, that also has properties nearly identical to the muon and the electron, but with an even heavier mass. In the model proposed here, it is easy to identify the tauon as being the next excited state of the same composite system. And while the force binding the preon particles together is quite strong, neutrinos can still flow freely through matter as long as the cross section for the interaction is low. This is similar to the fact that some photons flow relatively freely through glass.

What does all this mean to us, ordinary mortals not versed in the language of physics? As further explained by delbertlarson:
“Both the Standard Model and the ABC Preon Model are models for elementary particle physics, which is the study of the ultimate building blocks of nature. The quest to answer the question “What is the World Made of?” is one of the great philosophical undertakings that has occupied mankind’s thoughts since the ancient Greeks.

As for elements, in the ancient model, the world was thought to be composed of four elements – earth, fire, air and water. As history unfolded, additional elements were found and added to the list. Around the late middle ages, chemistry replaced the earth, fire, air and water model with a model of chemical elements. But the chemical element model was complex, as it involved around 100 elements, some of which came in multiple isotopic varieties. By the 1930’s it was appreciated that all elements were actually made up of three sub-atomic particles – the electron, proton, and neutron. However, by the 1950’s, experimentation with beams resulting from accelerators showed that there were vastly more sub-atomic particles – so many, that the situation was called “the particle zoo”. In the mid-1960’s it was theorized that the particles of the particle zoo could all be understood in terms of underlying particles called quarks and leptons. Today, the quark and lepton model has so many particle members that I call it “the quark and lepton zoo”, and the ABC Preon Model will show that there is a much simpler underpinning for all particles known to exist.

Along with our increasing knowledge of elements (particles) we have also evolved in our knowledge concerning the forces that act within our world. Newton proposed several valuable laws, including a law of the gravitational force. Several scientists contributed to the study of electricity and magnetism, leading up to Maxwell’s equations. Lorentz and others found modifications to Newton’s force laws in the early 1900’s. The Standard Model has increased our knowledge with the elucidation of two more forces – the strong and the weak – as well as modifications to gravity. The ABC Preon Model advances our knowledge further by identifying what is believed to be the weak force with simple quantum tunneling.

Now you might say, well, that’s nice, but what does it mean for me? And the answer is that if you evaluate technological advances made in the past two millennia, a great many of them can be seen to spring from our knowledge of chemistry, electricity, magnetism, and force laws. Understanding electricity and magnetism has led to electric power and electric lights. Understanding quantum mechanics, along with the prior advances in electromagnetism, led to the microchip and modern computing and phones. And these are of course just minimal examples – there are vastly more. If we next can more fully understand the nuclear force it may lead to further significant advances. For one thing, we know the sun to be powered by fusion, and if we know enough we could perhaps put fusion generation devices on a chip, resulting in clean, safe (if we use aneutronic fusion), and unlimited power. And that is just one example, as it is of course hard to predict what uses may come once we are in a position to use any newly found knowledge.”

The ABC Preon Model. Assigning Some Quantum Numbers.

– By delbertlarson

In the previous thread, we introduced the preonic modeling for the massive leptons, repeated here in the picture below:

At this point in our development, we can now move on to assign some quantum numbers to our preons. A first point of analysis is to note that, to date, no experiment has shown the existence of free electric charge in fractional amounts. For that reason, we will begin by arbitrarily assigning our new A preon zero electric charge and our new B preon a charge of minus one. Since the neutrino has zero electric charge, this leaves our leptons with a charge of minus one, as they must have. Note that the antimatter leptons will have a charge of plus one, but we will deal with that topic later. Next, it is known that the neutrino has half-integer spin. In this model I am assuming that one quantum of the binding particle is contained within the composite particle, and hence the A and B particles must be either both fermions or both bosons. Recall that fermions are particles with half-integer spin, while bosons are particles with integer spin.

Adding the spins of two fermions gives an integer value, and then adding the half integer of the neutrino results in an overall half-integer spin. Similarly, adding the spins of two bosons results in an integer spin, and then adding the half integer of the neutrino again results in an overall half-integer spin. Recall that in all of these additions, spin is a vector quantity. So if we add the half-integer spin of the A to the half-integer spin of the B we will get either one or zero. When we then add the half integer of the neutrino we will get either one half or one and a half. We will get one and a half if all three spins are aligned. Since leptons have a spin of one half, this means that all three spins cannot be aligned. A similar analysis can be done if the A and the B are bosons with integer values of spin, and that case will have similar constraints on the needed alignments.
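The spin addition above can be checked by brute force. The sketch below is an illustration added here (the function names are hypothetical); it uses the standard quantum-mechanical rule that coupling spins s1 and s2 yields totals running from |s1 - s2| up to s1 + s2 in integer steps, and shows that a total spin of one half is available in both the fermion-fermion and boson-boson cases:

```python
from fractions import Fraction

def couple(s1, s2):
    """Possible total spins from coupling s1 and s2:
    |s1 - s2|, |s1 - s2| + 1, ..., s1 + s2."""
    lo, hi = abs(s1 - s2), s1 + s2
    return [lo + n for n in range(int(hi - lo) + 1)]

half = Fraction(1, 2)

def lepton_spins(sA, sB):
    """Total spins of an A-B pair bound by one spin-1/2 neutrino."""
    totals = set()
    for s_pair in couple(sA, sB):       # couple the two preons first
        totals.update(couple(s_pair, half))  # then add the neutrino
    return sorted(totals)

# Two spin-1/2 (fermion) preons plus the neutrino:
print([str(s) for s in lepton_spins(half, half)])
# Two spin-1 (boson) preons plus the neutrino:
print([str(s) for s in lepton_spins(Fraction(1), Fraction(1))])
```

In both cases the list of allowed totals contains one half, and the fully aligned configuration (one and a half or more) is excluded by requiring the observed lepton spin of one half.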

Here we will also propose a new charge law for the preons. Since the force carrier has been proposed to be the neutrino, we will call this new charge the neutrinic charge. Following our analogy with the hydrogen atom, where an electrically negative particle orbits a positive nucleus, here we will have a particle with a negative neutrinic charge orbiting a particle that has a positive neutrinic charge. We can arbitrarily assign a negative neutrinic charge to the B particle we proposed earlier, and a positive neutrinic charge to the A particle we proposed earlier. Since the neutrinic charge is arbitrary, we are free to attach the electric charge to either of the particles, and we have already chosen to assign the B particle a negative electric charge, while leaving the A particle with zero electric charge. Here we see a picture of the massive leptons with their quantum numbers assigned:

The nomenclature introduced above is to have a trailing superscript indicating the electric charge on the preon and a preceding subscript indicating the neutrinic charge on the preon. With the total electric charge being equal to minus one, we see that our preon model for leptons gives the correct electric charge. With each constituent having the opposite neutrinic charge, we see that our constructs have overall zero neutrinic charge. The result that stable particles have zero total neutrinic charge is analogous to the fact that atoms have zero total electrical charge. Lastly, by having the A and B particles be either both fermions or both bosons, the total spin of the leptons can be arranged to be half integer, since the bound neutrino is itself a half-integer spin particle. Hence, all quantum numbers of the leptons are obtained in a model that readily allows for three generations of leptons. (At this point in the development, it is not known whether the preons are bosons or fermions, only that they are both one or the other, and their spins are constrained so that the total spin of the massive leptons is one half.)
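As a simple bookkeeping check, the charge accounting above can be summed directly. This sketch is illustrative only (the function name is hypothetical); the charge assignments are the ones proposed in the post:

```python
# Quantum-number bookkeeping for the proposed preon composite.
# Per the post: A has electric charge 0 and neutrinic charge +1;
# B has electric charge -1 and neutrinic charge -1; the bound
# neutrino carries neither electric nor neutrinic charge.
PREONS = {
    # name: (electric charge, neutrinic charge)
    "A": (0, +1),
    "B": (-1, -1),
    "neutrino": (0, 0),
}

def total_charges(constituents):
    q = sum(PREONS[p][0] for p in constituents)  # electric charge
    n = sum(PREONS[p][1] for p in constituents)  # neutrinic charge
    return q, n

# A massive lepton: one A, one B, and one bound neutrino.
q, n = total_charges(["A", "B", "neutrino"])
print(q, n)  # -1 0 -> correct lepton charge, zero net neutrinic charge
```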

Also introduced in the picture above are the anti-matter counterparts to the massive leptons, as well as anti-preons. A line (also called a bar) above the letter identifying the preon indicates it is an anti-preon. It will turn out in future analysis that the massive leptons are actually made up of a B and an anti-A, rather than a B and an A, so that improvement to the model is introduced above as well.

With massive leptons now modeled and their quantum numbers defined, we’ll see how hadrons get modeled in the next post.

Study reveals substantial evidence of holographic universe

Posted in Cosmos, Science, Technology on January 30, 2017 by betweentwopines

A sketch of the timeline of the holographic Universe. Time runs from left to right. The far left denotes the holographic phase; the image is blurry because space and time are not yet well defined.

A UK, Canadian and Italian study has provided what researchers believe is the first observational evidence that our universe could be a vast and complex hologram.

Theoretical physicists and astrophysicists, investigating irregularities in the cosmic microwave background (the ‘afterglow’ of the Big Bang), have found there is substantial evidence supporting a holographic explanation of the universe—in fact, as much as there is for the traditional explanation of these irregularities using the theory of cosmic inflation.

The researchers, from the University of Southampton (UK), University of Waterloo (Canada), Perimeter Institute (Canada), INFN, Lecce (Italy) and the University of Salento (Italy), have published findings in the journal Physical Review Letters.

A holographic universe, an idea first suggested in the 1990s, is one where all the information that makes up our 3-D ‘reality’ (plus time) is contained in a 2-D surface on its boundaries.

Professor Kostas Skenderis of Mathematical Sciences at the University of Southampton explains: “Imagine that everything you see, feel and hear in three dimensions (and your perception of time) in fact emanates from a flat two-dimensional field. The idea is similar to that of ordinary holograms where a three-dimensional image is encoded in a two-dimensional surface, such as in the hologram on a credit card. However, this time, the entire universe is encoded.”

Although it is not an exact example of holographic properties, the idea could be thought of as rather like watching a 3-D film in a cinema. We see the pictures as having height, width and, crucially, depth—when in fact it all originates from a flat 2-D screen. The difference, in our 3-D universe, is that we can touch objects and the ‘projection’ is ‘real’ from our perspective.

In recent decades, advances in telescopes and sensing equipment have allowed scientists to detect a vast amount of data hidden in the ‘white noise’ or microwaves (partly responsible for the random black and white dots you see on an un-tuned TV) left over from the moment the universe was created. Using this information, the team were able to make complex comparisons between networks of features in the data and quantum field theory. They found that some of the simplest quantum field theories could explain nearly all cosmological observations of the early universe.

Professor Skenderis comments: “Holography is a huge leap forward in the way we think about the structure and creation of the universe. Einstein’s theory of general relativity explains almost everything large scale in the universe very well, but starts to unravel when examining its origins and mechanisms at quantum level. Scientists have been working for decades to combine Einstein’s theory of gravity and quantum theory. Some believe the concept of a holographic universe has the potential to reconcile the two. I hope our research takes us another step towards this.”

The scientists now hope their study will open the door to further our understanding of the early universe and explain how space and time emerged.


DNA Dethroned – Inheritance is Protein-Based.

Posted in ATS Thread, Health, Medicine, Science, Technology on October 7, 2016 by betweentwopines

A post by soficrow at ATS.
Time to re-write those biology textbooks. Inheritance is protein-based – it’s epigenetic, not genetic.

Long story short, it’s NOT our DNA and genes that determine our traits and biological destiny – it’s proteins. Specifically, prions, aka “intrinsically disordered proteins” that can “pass heritable traits from cell to cell by their structure instead of by DNA.” These prions, and the traits they confer, can be inherited; in humans, some are conserved over hundreds of millions of years.

Revising the meaning of ‘prion’

…When the team examined the human cognates of the prion-proteins, the intrinsically disordered domains were conserved over hundreds of millions of years.

Intrinsically disordered proteins drive emergence and inheritance of biological traits

Prions (of Mad Cow disease fame) – and their role in evolution – have intrigued me for over a decade. Finally, the scientific proofs are rolling in.

Prions are all about rapid response to environmental change, and biological-evolutionary flexibility. Proteins can change their shape when they encounter new environmental conditions (external or internal) – when they do, they change their function, and can become infectious prions. Some prions cause disease; some (most?) are beneficial.

We all have inherited prion-based traits and memories of our ancestors’ responses to environmental changes – some dating back hundreds of millions of years. Our individual exposures can trigger a truly ancient memory-response – or one first developed by our great grandmother.

Cool, huh?

In summary:
* Inherited traits are passed on by prions – with some dating back hundreds of millions of years.

As well:
* Conscious memories are ‘stored’ in prions;
* Prions are airborne; and
* Transmitted human-to-human.

Does this information affect your understanding of life? Reincarnation and karma? What else?

(c) Lanie Patrick 9/6/16


2011: Mad Cow Disease Agent Can Infect Via the Air

2012: Prion Proteins Play Powerful Role in Survival, Evolution

2014: Prion-Like Protein Controls Long-term Memories

2015: Mad Cow-Like Prion Disease – Human-to-Human Transmission

And then there’s this:

2014: DARPA Funds Project to See How Meds Trigger Prion Diseases

NOTE: Big Pharma has been tinkering with proteins since 1950 when Linus Pauling identified the actin protein’s “a” and “b” shapes. But the pharmaceutical industry’s scientific results and knowledge are protected as “Intellectual Property” by “Confidentiality Agreements.” Including their ‘mistakes.’ [I wonder how many disease-causing prions they’ve ‘accidentally’ created and distributed over the past six and a half decades. (Think side-effects.)] Now though, the dam is breaking.

Now, the information is getting out into the public domain. Which is only fair considering we the public have been funding the research with our tax dollars and donations all the way along.

The Higgs Boson aka “The God Particle”, and The Problem of Unnatural Fine Tuning

Posted in ATS Thread, Computer, Conspiracy, Cosmos, Science, Technology, Warnings on September 5, 2016 by betweentwopines

A Post by AnkhMorpork at ATS.
The Higgs Boson: A Natural Disaster!


So, in a nutshell, the standard model of physics was rendered complete with the detection of the Higgs Boson particle.

However, there is a problem. Its mass was not what the standard model predicted, but was “fine-tuned” by trillions (quadrillions) of degrees, or what I’ve seen other physicists refer to as “reductions”, the result of which appears to be entirely unnatural.


Proposed solutions include the following:

1) Supersymmetry – a supersymmetric partner particle, or “sparticle”, for every particle. Problem: none detected. Also, such supersymmetry could mean that the universe should not exist, but would have self-annihilated the moment it was “born”, with each particle and supersymmetric partner annihilating each other. The LHC will continue to look for this, and an even larger collider may be built to test for it, but this solution isn’t looking promising.

2) Compositeness – the Higgs isn’t a fundamental particle, but a composite of still more fundamental particles that somehow self-adjust in their relationship to one another, resulting in the appearance of unnatural fine tuning. Problem: nothing else, and no other particles, are coming out of the LHC, and a larger collider might not produce them either.

3) Multi-Worlds, Multi-Verse Theory, with Strong Anthropic Principle. Problem – the end of science, whereby all subjectivity and analysis is rendered moot as an unfathomable “coincidence” whereby we just so happen to be measuring in the one universe of an infinite possible array wherein the Higgs Boson only appears to have been unnaturally fine-tuned by quadrillions of reductions. In other words, it just is what it is mon (ire), and well, if it were any other way, then we would not be here to ask the question. Wut?! Huh?! Yea, the end of science, with no more particles forthcoming because there aren’t any more.

4) String Theory. Problem: same as the multiverse. String theory seems to describe every other universe but our own, and does not appear to be subject to testing and verification by any empirical means, so it may forever remain an untestable hypothesis. It also appeals directly to the multiverse hypothesis, with the strong anthropic principle, to try to explain the fine-tuning problem.

5) God did it, by super-intelligent design and with intent (by anticipation).

One way of looking at number five, if you're uncomfortable with the idea of a creator God, would be to take on the idea of a field of infinite knowledge, via accumulated information arising in eternity, like a Godhead of absolute formless potential (uncreated). Beginning with the end in mind, having considered every possible outcome, and taking on the role of creator, "you" measure twice and cut once, since you could otherwise end up probing the impossible forever without arriving at what is now actualized. In other words, you can't get from there to here except by anticipation. This mind-of-God explanation is in alignment with the work of physicists such as Bernard Haisch and Ervin Laszlo. Haisch employs the idea of the Godhead using the allegory of a filtered white light that, in order to differentiate itself and to make this life possible, including our own place in it, must intelligently limit or reduce itself many, many times (not unlike the fine-tuning of the Higgs), filtering its infinite, absolute, formless potential. He then suggests that there's no need to draw a distinction between God and the Godhead, of which we ourselves are a chip off the OLD block, by anticipation and with intent, presumably so that a shared and varied experience would be possible.

Problem. This would have to include not only the quantum realm, but also the precise organization of the material universe to bring about our present circumstance and thus the entire cosmic framework all the way to our own earth-moon-sun configuration (not as a mere chance or random occurrence).

The implications of unnatural fine-tuning, if Supersymmetry, Compositeness, and the Multiverse (with Strong Anthropic Principle) must be discarded in favor of a new paradigm of some sort of intelligent creative X factor, cannot be overestimated, since they point to a whole new basis for our understanding of our place in the grand scheme of things, even as children of a loving and very generous God whom it pleased to share his eternal kingdom of light, life and love: "therefore fear not, little ones, nor let your hearts be troubled, for it pleased the father to share his kingdom".

Can these ancient understandings and bases of reason and logic move from the realm of "religion" into the logical assertions of the empirical evidence of modern science at the very cutting edge, even to the very point of science almost cutting off its own nose to spite its face amid the end of any reasonable inquiry into who and why the universe is the way it is?

Is this (the Higgs boson's unnatural fine-tuning) the end of scientific inquiry into the underlying nature of the material world, or will the new paradigm of an infinitely intelligent cosmological unity, a self-aware universe that has included us on purpose, prevail, no matter what its implications are?

By avoiding the implication of a type of God Theory, would science be willing to shoot itself in the foot? I don't think so. I have more faith in human curiosity and imagination than that. We go where the evidence leads us, no matter what the implications are.

Many of you will differ here, of course, also to avoid the implication of the unnatural fine-tuning of the Higgs boson, aka (ironically) "the God particle", but it (a God Theory) is the most reasonable position to take in the face of the alternative multiverse, strong-anthropic-principle "hypothesis", which goes nowhere fast and leaves us in a state of perpetual uncertainty about the nature of reality.

“To be is to be perceived”
~ an adage often attributed to Bishop George Berkeley, lately echoed in quantum physics. (I just hope we get some privacy when we're in the shower!)

Brilliant Disguise: Light, Matter and the Zero-Point Field, by Physicist Bernard Haisch

Bernard Haisch is an astrophysicist whose professional positions include Staff Scientist at the Lockheed Martin Solar and Astrophysics Laboratory, Deputy Director of the Center for Extreme Ultraviolet Astrophysics at the University of California, Berkeley, and Visiting Fellow at the Max Planck Institute for Extraterrestrial Physics in Garching, Germany. His work has led to close involvement with NASA; he is the author of over 130 scientific papers; and he was the Scientific Editor of the Astrophysical Journal for nine years, as well as the editor-in-chief of the Journal of Scientific Exploration.

The God Theory

Quoting Bernard Haisch from “The God Theory”

If you think of white light as a metaphor of infinite, formless potential, the colors on a slide or frame of film become a structured reality grounded in the polarity that comes about through intelligent subtraction from that absolute formless potential. It results from the limitation of the unlimited. I contend that this metaphor provides a comprehensible theory for the creation of a manifest reality (our universe) from the selective limitation of infinite potential (God)…

If there exists an absolute realm that consists of infinite potential out of which a created realm of polarity emerges, is there any sensible reason not to call this "God"? Or to put it frankly, if the absolute is not God, what is it? For our purposes here, I will identify the Absolute with God. More precisely I will call the Absolute the Godhead. Applying this new terminology to the optics analogy, we can conclude that our physical universe comes about when the Godhead selectively limits itself, taking on the role of Creator and manifesting a realm of space and time and, within that realm, filtering out some of its own infinite potential…

Viewed this way, the process of creation is the exact opposite of making something out of nothing. It is, on the contrary, a filtering process that makes something out of everything. Creation is not capricious or random addition; it is intelligent and selective subtraction. The implications of this are profound.

If the Absolute is the Godhead, and if creation is the process by which the Godhead filters out parts of its own infinite potential to manifest a physical reality that supports experience, then the stuff that is left over, the residue of this process, is our physical universe, and ourselves included. We are nothing less than a part of that Godhead – quite literally.

Ervin Laszlo

Ervin Laszlo is considered one of the foremost thinkers and scientists of our age, perhaps the greatest mind since Einstein. His principal focus of research involves the Zero Point Field. He is the author of around seventy-five books (his works having been translated into at least seventeen languages), and he has contributed to over 400 papers. Widely considered the father of systems philosophy and general evolution theory, he has worked as an advisor to the Director-General of the United Nations Educational, Scientific and Cultural Organization. He was also nominated for the Nobel Peace Prize in both 2004 and 2005. A multidisciplinarian, Laszlo has straddled numerous fields, having worked at universities as a professor of philosophy, music, futures studies, systems science, peace studies, and evolutionary studies. He was a successful concert pianist until he was thirty-eight.

In Laszlo’s view, the zero-point field (or the Akashic Field, as he calls it) is quite literally the “mind of God”.

Naming Hal Puthoff, Roger Penrose, Fritz-Albert Popp, and a handful of others as “front line investigators”, Laszlo quotes Puthoff who says of the new scientific paradigm:

"[What] would emerge would be an increased understanding that all of us are immersed, both as living and physical beings, in an overall interpenetrating and interdependent field in ecological balance with the cosmos as a whole, and that even the boundary lines between the physical and "metaphysical" would dissolve into a unitary viewpoint of the universe as a fluid, changing, energetic/informational cosmological unity."

An excerpt from Science and the Akashic Field: An Integral Theory of Everything

Akasha (a . ka . sha) is a Sanskrit word meaning "ether": all-pervasive space. Originally signifying "radiation" or "brilliance", in Indian philosophy akasha was considered the first and most fundamental of the five elements – the others being vata (air), agni (fire), ap (water), and prithivi (earth). Akasha embraces the properties of all five elements: it is the womb from which everything we perceive with our senses has emerged and into which everything will ultimately re-descend. The Akashic Record (also called The Akashic Chronicle) is the enduring record of all that happens, and has ever happened, in space and time.

Science and the Akashic Field: An Integral Theory of Everything, 2004

Science and the Reenchantment of the Cosmos: The Rise of the Integral Vision of Reality

Maybe at the end of science there's a joke at the expense of both science and religion that just keeps on getting better the more we come to grips with it and to better understand its implications.

Wouldn’t that be funny.. and interesting..?!

Scientists Caught ‘Undead’ Genes Coming Alive After Death

Posted in Health, Medicine, Science, Uncategorized on August 28, 2016 by betweentwopines

What really happens to us after death? Once a person stops breathing, and their heart ceases to pump blood, they’re what doctors consider “clinically dead.” On a biological level, the eventual decomposition of cells, organs, and brain tissue signal its final and irreversible stages.

But what if that's not actually the end? Two new studies claim that hundreds of genes actually kept expressing—and, in some cases, became more active—after death occurred. This came as a surprise to the researchers, because forensic pathologists have long assumed that gene activity simply degrades postmortem, which is why its rate of change is sometimes used to calculate time of death.
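The idea that a known decay rate can be inverted into a time-of-death estimate can be sketched with a toy model. Everything here is hypothetical for illustration: real postmortem transcript kinetics are far messier, and the half-life below is invented.

```python
import math

def hours_since_death(fraction_remaining: float, half_life_hours: float) -> float:
    """Invert simple exponential decay, N/N0 = 0.5 ** (t / half_life)."""
    return half_life_hours * math.log(fraction_remaining) / math.log(0.5)

# Example: if 25% of a transcript remains and we assume a 12-hour
# half-life, two half-lives have elapsed, i.e. about 24 hours.
print(hours_since_death(0.25, 12.0))
```

The surprise in the two studies is precisely that some genes do not follow any such monotonic decay, which would confound an estimator of this kind.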

According to the lead author of both papers, microbiologist Peter Noble of the University of Washington, the discovery of "undead" genes could help to improve the preservation of organs destined for transplantation. The two studies are currently available on the pre-print server bioRxiv, and it's important to note that neither has undergone peer review yet.


Respected Cornell geneticist rejects Darwinism in his recent book

Posted in Conspiracy, EDUCATION, History, Science on April 25, 2016 by betweentwopines

Genetic Entropy & the Mystery of the Genome
by John Sanford (October 2005)

Genetic Entropy

In retrospect, I realize that I have wasted so much of my life arguing about things that don’t really matter. It is my sincere hope that this book can actually address something that really does matter. The issue of who we are, where we came from, and where we are going seems to me to be of enormous importance. This is the real subject of this book.

Modern Darwinism is built on what I will be calling “The Primary Axiom”. The Primary Axiom is that man is merely the product of random mutations plus natural selection. Within our society’s academia, the Primary Axiom is universally taught, and almost universally accepted. It is the constantly mouthed mantra, repeated endlessly on every college campus. It is very difficult to find any professor on any college campus who would even consider (or should I say dare) to question the Primary Axiom.

Late in my career, I did something which for a Cornell professor would seem unthinkable. I began to question the Primary Axiom. I did this with great fear and trepidation. By doing this, I knew I would be at odds with the most “sacred cow” of modern academia. Among other things, it might even result in my expulsion from the academic world.

Although I had achieved considerable success and notoriety within my own particular specialty (applied genetics), it would mean I would have to be stepping out of the safety of my own little niche. I would have to begin to explore some very big things, including aspects of theoretical genetics which I had always accepted by faith alone. I felt compelled to do all this, but I must confess I fully expected to simply hit a brick wall. To my own amazement, I gradually realized that the seemingly “great and unassailable fortress” which has been built up around the primary axiom is really a house of cards. The Primary Axiom is actually an extremely vulnerable theory, in fact it is essentially indefensible. Its apparent invincibility derives mostly from bluster, smoke, and mirrors. A large part of what keeps the Axiom standing is an almost mystical faith, which the true-believers have in the omnipotence of natural selection. Furthermore, I began to see that this deep-seated faith in natural selection was typically coupled with a degree of ideological commitment which can only be described as religious. I started to realize (again with trepidation) that I might be offending a lot of people’s religion!

To question the Primary Axiom required me to re-examine virtually everything I thought I knew about genetics. This was probably the most difficult intellectual endeavor of my life. Deeply entrenched thought patterns change only very slowly (and I must add — painfully). What I eventually experienced was a complete overthrow of my previous understandings. Several years of personal struggle resulted in a new understanding, and a very strong conviction that the Primary Axiom was most definitely wrong. More importantly, I became convinced that the Axiom could be shown to be wrong to any reasonable and open-minded individual. This realization was exhilarating, but again frightening. I realized that I had a moral obligation to openly challenge this most sacred of cows. In doing this, I realized I would earn for myself the most intense disdain of most of my colleagues in academia not to mention very intense opposition and anger from other high places.

What should I do? It has become my conviction that the Primary Axiom is insidious on the highest level, having catastrophic impact on countless human lives. Furthermore, every form of objective analysis I have performed has convinced me that the Axiom is clearly false. So now, regardless of the consequences, I have to say it out loud: the Emperor has no clothes!

To the extent that the Primary Axiom can be shown to be false, it should have a major impact on your own life and on the world at large. For this reason, I have dared to write this humble little book which some will receive as blasphemous treason, and others revelation.

If the Primary Axiom is wrong, then there is a surprising and very practical consequence. When subjected only to natural forces, the human genome must irrevocably degenerate over time. Such a sober realization should have more than just intellectual or historical significance. It should rightfully cause us to personally reconsider where we should rationally be placing our hope for the future.

John Sanford

Sanford drew heavily from the work of Motoo Kimura, James Crow, and Walter ReMine. He featured a lot of data I had never seen, and he applied the concept of signal-to-noise ratios (from information theory) to show that the selection pressures are too weak for natural selection to transmit useful information into the genome. He made devastating critiques of naturalistic evolution using standard population genetics. It was a superb book, something one would expect from such a capable scientist. I'm surprised this book is relatively obscure; it ought to be required reading for serious IDers!

Sanford’s Bio: Cornell Professor of 25 years (being semi-retired since 1998). He received his Ph.D. from the University of Wisconsin in the area of plant breeding and genetics. He founded 2 successful biotech firms, Biolistics and Sanford Scientific. Most of the transgenic crops grown in the world today were genetically engineered using the gene gun technology developed by Sanford. He still holds a position of Courtesy Associate Professor at Cornell.

Here are some endorsements for the book:

In the Mystery of the Genome Cornell University researcher John Sanford lifts the rug to see what evolutionary theory has swept under it. He shows that, not only does Darwinism not have answers for how information got into the genome, it doesn’t even have answers for how it could remain there.

Michael Behe

I strongly recommend John Sanford’s Mystery of the Genome, which provides a lucid and bold account of how the human genome is deteriorating, due to the accumulation of mutations. This situation has disturbing implications for mankind’s future, as well as surprising implications concerning mankind’s past.

Phillip Johnson


CERN releases 300TB of Large Hadron Collider data into open access

Posted in Cosmos, Science, Technology on April 24, 2016 by betweentwopines

Cancel your plans for this weekend! CERN just dropped 300 terabytes of hot collider data on the world and you know you want to take a look.

Kati Lassila-Perini, a physicist who works on the Compact Muon Solenoid (!) detector, gave a refreshingly straightforward explanation for this huge release.

“Once we’ve exhausted our exploration of the data, we see no reason not to make them available publicly,” she said in a news release accompanying the data. “The benefits are numerous, from inspiring high school students to the training of the particle physicists of tomorrow. And personally, as CMS’s data preservation coordinator, this is a crucial part of ensuring the long-term availability of our research data.”

Amazing that this perspective is not more widely held — though I suspect it is, by the scientists at least, if not the publishers and department heads who must think of the bottom line.

The data itself is from 2011, much of it from protons colliding at 7 TeV (teraelectronvolts, you know) and producing those wonderful fountains of rare particles we all love to fail to understand. All told, it’s about half the total data collected by the CMS detector, and makes up about 2.5 inverse femtobarns. But who’s counting?
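For anyone wondering what "inverse femtobarns" buys you: integrated luminosity times a process's cross section gives an expected event count. A minimal sketch, assuming a total Higgs production cross section of roughly 17 picobarns at 7 TeV (an approximate figure I've supplied, not from the article):

```python
# Integrated luminosity (in inverse picobarns) times cross section
# (in picobarns) gives a dimensionless expected event count.

INV_FB_TO_INV_PB = 1000.0    # 1 fb^-1 = 1000 pb^-1

luminosity_fb = 2.5          # the dataset's ~2.5 inverse femtobarns
higgs_xsec_pb = 17.0         # assumed total Higgs cross section at 7 TeV

expected_events = luminosity_fb * INV_FB_TO_INV_PB * higgs_xsec_pb
print(f"expected Higgs events in the dataset: {expected_events:.0f}")
# -> 42500
```

So a dataset like this should contain tens of thousands of Higgs events, buried among vastly more common processes.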

There’s both the raw data from the detectors (so you can verify the results) and also “derived” datasets that are easier to work with — and don’t worry, CERN is providing the tools to do so, as well. There’s a whole CERN Linux environment ready for booting up in a virtual machine, and a bunch of scripts and apps (some are on GitHub, too).

Just messing around in the same computing environment used by researchers plumbing the depths of the universe would be an interesting way to spend a few labs in a college physics course. There are even “masterclasses,” data sets and tools specially curated for high school kids.

This is only the latest of several data dumps, but it’s also by far the largest. A more detailed explanation of the types of data and how they can be accessed is right here.
