Simulation Universe

Please follow the lines and images below, which take up an interesting question.

That is, why inferring design on functionally specific, complex organisation and associated information, e.g.:

[Image: an Abu 6500 C3 fishing reel] and equally:

[Image: cell metabolism network]

. . . makes good sense.

Now, overnight, UD’s Newsdesk posted on a Space.com article, “Is Our Universe a Fake?”

The article features “Philosopher Nick Bostrom, director of the Future of Humanity Institute at Oxford University.”

I think Bostrom’s argument raises a point worth pondering, one oddly parallel to the Boltzmann-brain argument (a brain popping up by fluctuation from an underlying sea of quantum chaos), as he discusses “richly detailed software simulation[s] of people, including their historical predecessors, by a very technologically advanced civilization”:

>>Bostrom is not saying that humanity is living in such a simulation. Rather, his “Simulation Argument” seeks to show that one of three possible scenarios must be true (assuming there are other intelligent civilizations):

  1. All civilizations become extinct before becoming technologically mature;
  2. All technologically mature civilizations lose interest in creating simulations;
  3. Humanity is literally living in a computer simulation.

His point is that all cosmic civilizations either disappear (e.g., destroy themselves) before becoming technologically capable, or all decide not to generate whole-world simulations (e.g., decide such creations are not ethical, or get bored with them). The operative word is “all” — because if even one civilization anywhere in the cosmos could generate such simulations, then simulated worlds would multiply rapidly and almost certainly humanity would be in one.

As technology visionary Ray Kurzweil put it, “maybe our whole universe is a science experiment of some junior high school student in another universe.”>>

In short, once the conditions are set up for a large distribution of possibilities to appear, you face a significant challenge: explaining why you are not in the bulk of the possibilities of a dynamic-stochastic system.

Let me put up an outline of the general model:

[Image: general dynamic-stochastic system/process model]

Such a system puts out an output across time that will vary based on mechanical and stochastic factors, exploring a space of possibilities. And in particular, any evolutionary materialist model of reality will be a grand dynamic-stochastic system, including a multiverse.
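As a concrete toy version of that outline (a minimal Python sketch of my own, not a model of any particular physical system), one can combine a deterministic drift with random kicks and sample the outcomes many times; the runs cluster in the bulk of the resulting distribution, which is the point at issue:

```python
# Toy dynamic-stochastic system: deterministic (mechanical) drift plus random kicks.
# A minimal illustrative sketch only; the parameter values are arbitrary assumptions.
import random

def run_system(steps=1000, drift=0.01, noise=1.0, seed=None):
    rng = random.Random(seed)
    state = 0.0
    for _ in range(steps):
        state += drift + rng.gauss(0.0, noise)  # mechanical + stochastic factors
    return state

# Sample the space of possible outcomes many times and see where runs cluster.
outcomes = sorted(run_system(seed=i) for i in range(10_000))
bulk_low, bulk_high = outcomes[250], outcomes[-250]   # roughly the central 95% of runs
print(f"Bulk of outcomes: roughly [{bulk_low:.1f}, {bulk_high:.1f}]")
print(f"Extremes ({outcomes[0]:.1f}, {outcomes[-1]:.1f}) are comparatively rare")
```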

Now, too, as Wiki summarises, there is the Boltzmann Brain paradox:

>>A Boltzmann brain is a hypothesized self aware entity which arises due to random fluctuations out of a state of chaos. The idea is named for the physicist Ludwig Boltzmann (1844–1906), who advanced an idea that the Universe is observed to be in a highly improbable non-equilibrium state because only when such states randomly occur can brains exist to be aware of the Universe. The term for this idea was then coined in 2004 by Andreas Albrecht and Lorenzo Sorbo.[1]

The Boltzmann brains concept is often stated as a physical paradox. (It has also been called the “Boltzmann babies paradox”.[2]) The paradox states that if one considers the probability of our current situation as self-aware entities embedded in an organized environment, versus the probability of stand-alone self-aware entities existing in a featureless thermodynamic “soup”, then the latter should be vastly more probable than the former.>>

In short, systems with strong stochastic tendencies tend to have distributions in their outcomes, which are dominated by the generic and typically uninteresting bulk of a population. Indeed this is the root of statistical mechanics, the basis for a dynamical understanding of thermodynamics i/l/o the behaviour of large collections of small particles.

For instance, one of my favourites (explored in Mandl) is an idealised two-state-element paramagnetic array, with atoms having N-pole up/down, a close atomic-scale physical analogue of the classic array-of-coins exercise. We can start with 500 or 1,000 coins in a string, which will of course follow a binomial distribution [3.27*10^150 or 1.07*10^301 possibilities respectively, utterly dominated by near 50-50 outcomes in no particular orderly or organised pattern], then look at an array in which each of the ~10^57 atoms of our solar system has a tray of 500 coins, flipped say every 10^-13 to 10^-15 s:

[Image: solar-system-scale, 500-coin flipping exercise]

The outcome of such an exercise is, highly predictably, that no cases of FSCO/I (meaningful complex strings) will emerge, as the number of possible observed outcomes is so small relative to the set of possibilities that it rounds down to all but no search, as the graphic points out.
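The arithmetic behind the graphic can be checked directly; here is a back-of-envelope sketch in Python using the figures above (500 coins, 10^57 atoms, a fast flip rate of 10^14 per second), together with an assumed ~10^17 s of available time:

```python
# Needle-in-haystack arithmetic for the 500-coin (two-state) string example.
# Figures from the text: 500 coins, 10^57 atoms, flips every ~10^-13..10^-15 s;
# the ~10^17 s duration is an added assumption (order of cosmic timescales).
from math import comb

n = 500
config_space = 2 ** n                       # number of distinct 500-coin strings
print(f"Possible configurations: ~{float(config_space):.2e}")

# Binomial concentration: fraction of all strings within +/-25 heads of 50-50.
near_even = sum(comb(n, k) for k in range(225, 276))
print(f"Fraction of strings near 50-50: {near_even / config_space:.3f}")

atoms, flips_per_second, seconds = 1e57, 1e14, 1e17
observations = atoms * flips_per_second * seconds   # trays observed in total
print(f"Total tray observations: ~{observations:.1e}")
print(f"Fraction of config space sampled: ~{observations / config_space:.1e}")
```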

This is of course an illustration of the core argument to design as credible cause on observing FSCO/I: once functionally specific complex organisation and associated information are present in a situation, they demand an observationally adequate explanation, one that does not require us to believe in statistical needle-in-vast-haystack search-challenge miracles:

[Image: islands of function, the search-challenge illustration]

Also:

[Image: islands of function and active information]

The Captain Obvious fact that serious thinkers make similar needle-in-haystack arguments should give reasonable people pause before they simply brush aside the inference to design on FSCO/I, including in the world of life and in the complex, fine-tuned physics of our cosmos that sets up a world in which C-chemistry, aqueous-medium, terrestrial-planet life is feasible.

But we’re not finished yet.

What’s wrong with Bostrom’s argument, and where else does it point?

Ppolish and Mapou raise a point or two:

>>  • ppolish: Simulated Universes scream Intelligent Design. Heck, Simulated Universes prove Intelligent Design.

    I can see why some Scientists are leaning in this direction. Oops/Poof does not cut it any more. Unscientific, irrational, kind of dumb.

  • Mapou (replying to ppolish): It’s a way for them to admit intelligent design without seeming to do so (for fear of being crucified by their peers). Besides, those who allegedly designed, built and are running the simulation would be, for all intents and purposes, indistinguishable from the Gods.

    Edit: IOW, they’re running away from religion only to fall into it even deeper.>>

In short, a detailed simulation world will be a designed world.

Likewise, High School student projects do not credibly run for 13.7 BY; nor do PhD projects, never mind Kurzweil’s remark.

So, what is wrong with the argument?

First, an implicit assumption.

It assumes that, unless races keep killing themselves off too soon, blind chance and mechanical necessity can give rise to life, and then to advanced, civilised, high-tech life that builds computers capable of detailed whole-universe simulations.

But ironically, the argument points to the only observed (and so likeliest) cause of FSCO/I, design, yet fails to address the significance of FSCO/I as a sign of design, starting with the design of computers, e.g.:

[Image: microprocessor (MPU) block model]

Where, cell based life forms show FSCO/I-rich digital information processors in action “everywhere,” e.g. the ribosome and protein synthesis:

Protein Synthesis (HT: Wiki Media)

So, real or simulation, we are credibly looking at design, and have no good empirical observational grounds to infer that FSCO/I is credibly caused by blind chance and mechanical necessity.

So, the set of alternative possible explanations includes implicitly questionable candidates while implicitly locking out credible but ideologically unacceptable ones, i.e. intelligent design of life and of the cosmos. That is, just maybe the evidence is trying to tell us that if we have good reason to accept that we live in a real physical world as opposed to a “mere” speculation, then intelligent design of life and cosmos sits at the table as of right, not sufferance.

And, there is such reason.

Not only is the required simulation vastly too fine-grained and fast-moving to be credibly centrally processed, but the logic of complex processing points to a vast network of coupled processors, which is tantamount to saying we have been simulating on atoms etc. In short, it makes good sense to conclude that our processing elements are real-world dynamic-stochastic entities: atoms, molecules etc. in real space.
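To see why, here is a rough order-of-magnitude sketch (my own round numbers: ~10^80 atoms for the observable universe and ~10^17 s of elapsed time are added assumptions; the ~10^-15 s update timescale is taken from the discussion above):

```python
# Back-of-envelope: how many atom-level state updates a whole-universe
# simulation implies. Round-number assumptions: ~1e80 atoms, updates on the
# ~1e-15 s timescale mentioned above, run for ~1e17 s. Illustrative only.
atoms = 1e80
updates_per_atom_per_second = 1e15
seconds = 1e17

total_updates = atoms * updates_per_atom_per_second * seconds
print(f"Atom-level state updates required: ~{total_updates:.1e}")

# Even a machine doing 1e18 operations per second (exascale territory) would
# need vastly more time than is available to process this serially:
serial_seconds = total_updates / 1e18
print(f"Serial runtime at 1e18 ops/s: ~{serial_seconds:.1e} s (vs ~1e17 s available)")
```

In round numbers, that simply restates the conclusion above: the processing elements would have to be as numerous and as fast as the atoms themselves.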

This is backed up by a principle that sets aside Plato’s Cave worlds and the like: any scheme that implies grand delusion of our senses and faculties of reasoning i/l/o experience of the world undermines its own credibility in an infinite regress of further what-if delusions.

Reduction to absurdity, in short.

So, we are back to ground zero: we have reason to see that we live in a real world in which cell-based life is full of FSCO/I and in which the fine tuning of the cosmos also points strongly to FSCO/I.

Thence, to the empirically and logically best warranted explanation of FSCO/I.

Design.

Thank you Dr Bostrom for affirming the power of the needle in haystack challenge argument.

Where that argument leads is to inferring design as the best current and prospective causal explanation of FSCO/I, in life and in the observed cosmos alike.

Any suggestions and comments?


#cosmology, #math, #metaphysics, #philosophy, #physics, #science, #science-news, #universe

Multiverse Hypothesis – New Theory


Though galaxies look larger than atoms and elephants appear to outweigh ants, some physicists have begun to suspect that size differences are illusory. Perhaps the fundamental description of the universe does not include the concepts of “mass” and “length,” implying that at its core, nature lacks a sense of scale.

This little-explored idea, known as scale symmetry, constitutes a radical departure from long-standing assumptions about how elementary particles acquire their properties. But it has recently emerged as a common theme of numerous talks and papers by respected particle physicists. With their field stuck at a nasty impasse, the researchers have returned to the master equations that describe the known particles and their interactions, and are asking: What happens when you erase the terms in the equations having to do with mass and length?

Nature, at the deepest level, may not differentiate between scales. With scale symmetry, physicists start with a basic equation that sets forth a massless collection of particles, each a unique confluence of characteristics such as whether it is matter or antimatter and has positive or negative electric charge. As these particles attract and repel one another and the effects of their interactions cascade like dominoes through the calculations, scale symmetry “breaks,” and masses and lengths spontaneously arise.
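To make “erasing the mass and length terms” concrete, here is a schematic, textbook-style comparison (an illustrative sketch, not drawn from the papers discussed): the usual Higgs potential carries an explicit mass parameter, while a classically scale-symmetric potential has only a dimensionless coupling, so any mass scale must arise dynamically when the symmetry breaks.

```latex
% Ordinary Higgs potential: a mass parameter \mu is put in by hand.
V_{\mathrm{SM}}(\phi) = -\mu^{2}\,\phi^{\dagger}\phi + \lambda\,(\phi^{\dagger}\phi)^{2}

% Classically scale-symmetric version: no mass or length parameters at all;
% only the dimensionless coupling \lambda appears, and a scale such as the
% Higgs mass must then be generated dynamically.
V_{\mathrm{scale}}(\phi) = \lambda\,(\phi^{\dagger}\phi)^{2}
```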

Similar dynamical effects generate 99 percent of the mass in the visible universe. Protons and neutrons are amalgams — each one a trio of lightweight elementary particles called quarks. The energy used to hold these quarks together gives them a combined mass that is around 100 times more than the sum of the parts. “Most of the mass that we see is generated in this way, so we are interested in seeing if it’s possible to generate all mass in this way,” said Alberto Salvio, a particle physicist at the Autonomous University of Madrid and the co-author of a recent paper on a scale-symmetric theory of nature.
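The “around 100 times” figure can be checked against rough particle-data values (a quick arithmetic sketch; the light-quark masses used are approximate and scheme-dependent):

```python
# Proton mass vs. the summed masses of its three valence quarks (uud),
# using approximate values in MeV; light-quark masses are only rough figures.
m_proton = 938.3            # MeV
m_up, m_down = 2.2, 4.7     # MeV (approximate current-quark masses)

valence_sum = 2 * m_up + m_down
print(f"Sum of valence quark masses: ~{valence_sum:.1f} MeV")
print(f"Proton mass / quark-mass sum: ~{m_proton / valence_sum:.0f}x")
```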

In the equations of the “Standard Model” of particle physics, only a particle discovered in 2012, called the Higgs boson, comes equipped with mass from the get-go. According to a theory developed 50 years ago by the British physicist Peter Higgs and associates, it doles out mass to other elementary particles through its interactions with them. Electrons, W and Z bosons, individual quarks and so on: All their masses are believed to derive from the Higgs boson — and, in a feedback effect, they simultaneously dial the Higgs mass up or down, too.


The new scale symmetry approach rewrites the beginning of that story.

“The idea is that maybe even the Higgs mass is not really there,” said Alessandro Strumia, a particle physicist at the University of Pisa in Italy. “It can be understood with some dynamics.”

The concept seems far-fetched, but it is garnering interest at a time of widespread soul-searching in the field. When the Large Hadron Collider at CERN Laboratory in Geneva closed down for upgrades in early 2013, its collisions had failed to yield any of dozens of particles that many theorists had included in their equations for more than 30 years. The grand flop suggests that researchers may have taken a wrong turn decades ago in their understanding of how to calculate the masses of particles.

“We’re not in a position where we can afford to be particularly arrogant about our understanding of what the laws of nature must look like,” said Michael Dine, a professor of physics at the University of California, Santa Cruz, who has been following the new work on scale symmetry. “Things that I might have been skeptical about before, I’m willing to entertain.”

The Giant Higgs Problem

The scale symmetry approach traces back to 1995, when William Bardeen, a theoretical physicist at Fermi National Accelerator Laboratory in Batavia, Ill., showed that the mass of the Higgs boson and the other Standard Model particles could be calculated as consequences of spontaneous scale-symmetry breaking. But at the time, Bardeen’s approach failed to catch on. The delicate balance of his calculations seemed easy to spoil when researchers attempted to incorporate new, undiscovered particles, like those that have been posited to explain the mysteries of dark matter and gravity.

Instead, researchers gravitated toward another approach called “supersymmetry” that naturally predicted dozens of new particles. One or more of these particles could account for dark matter. And supersymmetry also provided a straightforward solution to a bookkeeping problem that has bedeviled researchers since the early days of the Standard Model.

In the standard approach to doing calculations, the Higgs boson’s interactions with other particles tend to elevate its mass toward the highest scales present in the equations, dragging the other particle masses up with it. “Quantum mechanics tries to make everybody democratic,” explained theoretical physicist Joe Lykken, deputy director of Fermilab and a collaborator of Bardeen’s. “Particles will even each other out through quantum mechanical effects.”

This democratic tendency wouldn’t matter if the Standard Model particles were the end of the story. But physicists surmise that far beyond the Standard Model, at a scale about a billion billion times heavier known as the “Planck mass,” there exist unknown giants associated with gravity. These heavyweights would be expected to fatten up the Higgs boson — a process that would pull the mass of every other elementary particle up to the Planck scale. This hasn’t happened; instead, an unnatural hierarchy seems to separate the lightweight Standard Model particles and the Planck mass.
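Schematically (a standard textbook-style expression simplified to the dominant top-quark loop, not a quotation from the article), the “fattening up” is the statement that quantum corrections to the Higgs mass-squared grow with the square of the heaviest scale Λ the Higgs couples to:

```latex
% Dominant one-loop (top-quark) correction to the Higgs mass parameter,
% written schematically; \Lambda is the heaviest scale in the theory.
\delta m_{H}^{2} \;\sim\; -\frac{3\,y_{t}^{2}}{8\pi^{2}}\,\Lambda^{2}
% With \Lambda of order the Planck mass, |\delta m_H^2| \gg (125~\mathrm{GeV})^2
% unless something cancels it.
```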

With his scale symmetry approach, Bardeen calculated the Standard Model masses in a novel way that did not involve them smearing toward the highest scales. From his perspective, the lightweight Higgs seemed perfectly natural. Still, it wasn’t clear how he could incorporate Planck-scale gravitational effects into his calculations.

Meanwhile, supersymmetry used standard mathematical techniques, and dealt with the hierarchy between the Standard Model and the Planck scale directly. Supersymmetry posits the existence of a missing twin particle for every particle found in nature. If for each particle the Higgs boson encounters (such as an electron) it also meets that particle’s slightly heavier twin (the hypothetical “selectron”), the combined effects would nearly cancel out, preventing the Higgs mass from ballooning toward the highest scales. Like the physical equivalent of x + (–x) ≈ 0, supersymmetry would protect the small but non-zero mass of the Higgs boson. The theory seemed like the perfect missing ingredient to explain the masses of the Standard Model — so perfect that without it, some theorists say the universe simply doesn’t make sense.
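The “x + (–x) ≈ 0” protection can also be written schematically (again a textbook-style sketch, not taken from the article): the top loop and its superpartner (stop) loop enter with opposite signs, the Λ² pieces cancel, and what survives is set by the top/stop mass splitting.

```latex
% Schematic top + stop contributions to the Higgs mass parameter:
\delta m_{H}^{2} \;\sim\; \frac{3\,y_{t}^{2}}{8\pi^{2}}
  \Big[\underbrace{-\Lambda^{2}}_{\text{top loop}}
       + \underbrace{\Lambda^{2}}_{\text{stop loop}}\Big]
  \;+\; \mathcal{O}\!\Big((m_{\tilde t}^{2}-m_{t}^{2})
        \ln\tfrac{\Lambda^{2}}{m_{\tilde t}^{2}}\Big)
% The quadratic pieces cancel; the remainder is controlled by the top/stop
% mass splitting, which is why light superpartners were expected at the LHC.
```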

Yet decades after their prediction, none of the supersymmetric particles have been found. “That’s what the Large Hadron Collider has been looking for, but it hasn’t seen anything,” said Savas Dimopoulos, a professor of particle physics at Stanford University who helped develop the supersymmetry hypothesis in the early 1980s. “Somehow, the Higgs is not protected.”

The LHC will continue probing for convoluted versions of supersymmetry when it switches back on next year, but many physicists have grown increasingly convinced that the theory has failed. Just last month at the International Conference on High Energy Physics in Valencia, Spain, researchers analyzing the largest data set yet from the LHC found no evidence of supersymmetric particles. (The data also strongly disfavors an alternative proposal called “technicolor.”)


The implications are enormous. Without supersymmetry, the Higgs boson mass seems as if it is reduced not by mirror-image effects but by random and improbable cancellations between unrelated numbers — essentially, the initial mass of the Higgs seems to exactly counterbalance the huge contributions to its mass from gluons, quarks, gravitational states and all the rest. And if the universe is improbable, then many physicists argue that it must be one universe of many: just a rare bubble in an endless, foaming “multiverse.” We observe this particular bubble, the reasoning goes, not because its properties make sense, but because its peculiar Higgs boson is conducive to the formation of atoms and, thus, the rise of life. More typical bubbles, with their Planck-size Higgs bosons, are uninhabitable.
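The degree of “random and improbable cancellation” can be put in rough numbers (a sketch using the observed Higgs mass and the Planck mass; the point is the order of magnitude, not the exact figure):

```python
# How finely the bare Higgs mass-squared would have to cancel Planck-scale
# contributions if nothing protects it. Rough inputs: m_H ~ 125 GeV,
# M_Planck ~ 1.22e19 GeV.
from math import log10

m_higgs = 125.0        # GeV
m_planck = 1.22e19     # GeV

tuning = (m_higgs / m_planck) ** 2    # observed mass-squared vs. "natural" one
print(f"Required cancellation: roughly one part in 1e{-log10(tuning):.0f}")
```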

“It’s not a very satisfying explanation, but there’s not a lot out there,” Dine said.

As the logical conclusion of prevailing assumptions, the multiverse hypothesis has surged in begrudging popularity in recent years. But the argument feels like a cop-out to many, or at least a huge letdown. A universe shaped by chance cancellations eludes understanding, and the existence of unreachable, alien universes may be impossible to prove. “And it’s pretty unsatisfactory to use the multiverse hypothesis to explain only things we don’t understand,” said Graham Ross, an emeritus professor of theoretical physics at the University of Oxford.

The multiverse ennui can’t last forever.

“People are forced to adjust,” said Manfred Lindner, a professor of physics and director of the Max Planck Institute for Nuclear Physics in Heidelberg who has co-authored several new papers on the scale symmetry approach. The basic equations of particle physics need something extra to rein in the Higgs boson, and supersymmetry may not be it. Theorists like Lindner have started asking, “Is there another symmetry that could do the job, without creating this huge amount of particles we didn’t see?”

Wrestling Ghosts

Picking up where Bardeen left off, researchers like Salvio, Strumia and Lindner now think scale symmetry may be the best hope for explaining the small mass of the Higgs boson. “For me, doing real computations is more interesting than doing philosophy of multiverse,” said Strumia, “even if it is possible that this multiverse could be right.”

For a scale-symmetric theory to work, it must account for both the small masses of the Standard Model and the gargantuan masses associated with gravity. In the ordinary approach to doing the calculations, both scales are put in by hand at the beginning, and when they connect in the equations, they try to even each other out. But in the new approach, both scales must arise dynamically — and separately — starting from nothing.

“The statement that gravity might not affect the Higgs mass is very revolutionary,” Dimopoulos said.

A theory called “agravity” (for “adimensional gravity”) developed by Salvio and Strumia may be the most concrete realization of the scale symmetry idea thus far. Agravity weaves the laws of physics at all scales into a single, cohesive picture in which the Higgs mass and the Planck mass both arise through separate dynamical effects. As detailed in June in the Journal of High Energy Physics, agravity also offers an explanation for why the universe inflated into existence in the first place. According to the theory, scale-symmetry breaking would have caused an exponential expansion in the size of space-time during the Big Bang.

However, the theory has what most experts consider a serious flaw: It requires the existence of strange particle-like entities called “ghosts.” Ghosts either have negative energies or negative probabilities of existing — both of which wreak havoc on the equations of the quantum world.

“Negative probabilities rule out the probabilistic interpretation of quantum mechanics, so that’s a dreadful option,” said Kelly Stelle, a theoretical particle physicist at Imperial College, London, who first showed in 1977 that certain gravity theories give rise to ghosts. Such theories can only work, Stelle said, if the ghosts somehow decouple from the other particles and keep to themselves. “Many attempts have been made along these lines; it’s not a dead subject, just rather technical and without much joy,” he said.

[Photo: Marcela Carena, a senior scientist at Fermi National Accelerator Laboratory in Batavia, Ill.]

Strumia and Salvio think that, given all the advantages of agravity, ghosts deserve a second chance. “When antimatter particles were first considered in equations, they seemed like negative energy,” Strumia said. “They seemed nonsense. Maybe these ghosts seem nonsense but one can find some sensible interpretation.”

Meanwhile, other groups are crafting their own scale-symmetric theories. Lindner and colleagues have proposed a model with a new “hidden sector” of particles, while Bardeen, Lykken, Marcela Carena and Martin Bauer of Fermilab and Wolfgang Altmannshofer of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, argue in an Aug. 14 paper that the scales of the Standard Model and gravity are separated as if by a phase transition. The researchers have identified a mass scale where the Higgs boson stops interacting with other particles, causing their masses to drop to zero. It is at this scale-free point that a phase change-like crossover occurs. And just as water behaves differently than ice, different sets of self-contained laws operate above and below this critical point.

To get around the lack of scales, the new models require a calculation technique that some experts consider mathematically dubious, and in general, few will say what they really think of the whole approach. It is too different, too new. But agravity and the other scale symmetric models each predict the existence of new particles beyond the Standard Model, and so future collisions at the upgraded LHC will help test the ideas.

In the meantime, there’s a sense of rekindling hope.

“Maybe our mathematics is wrong,” Dine said. “If the alternative is the multiverse landscape, that is a pretty drastic step, so, sure — let’s see what else might be.”

Original story reprinted with permission from Quanta Magazine, an editorially independent division of SimonsFoundation.org whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

#multiverse-hypothesis, #research, #science, #universe