Supernovae Evidence for Foundational Change to Cosmological Models

URL: arxiv.org

This paper argues that the Timescape model [0] provides a better fit than the cold dark matter model when examining Type Ia supernovae. According to the Timescape model, clocks run faster in voids, where the gravitational field is weaker, so significant differences in elapsed time build up between a galaxy floating in a void and one like the Milky Way. The Timescape model suggests that other models, which fail to account for these differences, lead to less accurate calculations and less plausible solutions.

[0] https://en.wikipedia.org/wiki/Inhomogeneous_cosmology?useski...

> a better fit than the cold dark matter model

than the Lambda Cold Dark Matter model.

Lambda, i.e. the cosmological constant, a.k.a. dark energy, is what they do away with, not dark matter.

Thanks for saving me time in dismissing this paper, lol. Any time somebody wants to get rid of dark energy, I run into some garbage. Reminds me of the MOND nuts.

Just reading the rest of the comment section is enough to help me verify that.

For some reason, Hacker News always gets kooky when it comes to this stuff.

If clocks run slower in the presence of gravity, wouldn’t it stand to reason it runs more quickly in a void where there’s less gravity? Or is the model saying that clocks run even faster in a void than Einstein’s theory predicts?

Clocks run at "normal" speed (i.e. "1x" speed) in the absence of a gravitational field. The stronger the gravity, the slower they run (i.e. less than "1x" speed).
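
For a rough sense of scale, the static weak-field rate is sqrt(1 - 2GM/(r c^2)). A minimal sketch with standard constants (toy numbers only; the void-vs-galaxy effect the paper cares about is cumulative over cosmic history, not a single point estimate like this):

  # Toy illustration of gravitational time dilation via the weak-field
  # Schwarzschild factor sqrt(1 - 2GM/(r c^2)).
  import math

  G = 6.674e-11   # m^3 kg^-1 s^-2
  c = 2.998e8     # m/s

  def clock_rate(mass_kg, radius_m):
      """Proper-time rate relative to a far-away observer."""
      return math.sqrt(1.0 - 2.0 * G * mass_kg / (radius_m * c**2))

  print(clock_rate(5.97e24, 6.371e6))  # Earth's surface: ~1 - 7e-10
  print(clock_rate(1.99e30, 6.96e8))   # Sun's surface:   ~1 - 2e-6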

Right, so is the paper saying that Lambda CDM completely ignored clock differences due to heterogeneity in the mass distribution of the universe, where isolated galaxies would experience less time slowing than galaxies near other galaxies, which would experience more time dilation?

Pet theory is that our universe is run on some external computational substrate. A lot of the strangeness we see in quantum physics are side effects of how that computation is executed efficiently.

The inability to reconcile quantum field theory and general relativity comes from gravity being a fundamentally different thing from matter: matter is an information system that's run to execute the laws of physics, while gravity is a side effect of the underlying architecture being parallelized across many compute nodes.

The speed of light limitation is the side-effect of it taking a finite time for information to propagate in the underlying computational substrate.

The top-level calculation the universe is running is constantly trying to balance computation efficiently among the compute nodes in the substrate: e.g. the universe is trying to maintain a constant complexity density across all compute nodes.

Black holes act as complexity sinks, effectively "garbage collection." The matter that falls below the event horizon is effectively removed from the computational needs of the substrate. The cosmological constant can be explained by more compute power becoming available as more and more matter is consumed by black holes.

This can be introduced into GR by adding a new scalar field whose distribution encodes "complexity density." e.g. some metric of complexity like counting micro-states, etc. This scalar field attempts to remain spatially uniform in order to best "smooth" computation across the computational substrate. If you apply this to a galaxy with a large central supermassive black hole, you end up with almost a point sink of complexity at the center, then a large area of high complexity in the accretion disk, and then a gradient of complexity away towards the edges of the galaxy. That is, the scalar field has strong gradients along the radius of the galaxy, and this gives rise to varying gravitational effects over the radius (very MOND-like).

Some back of the napkin calculations show that adding this complexity density scalar field to GR does replicate observed rotation curves of galaxies. Would love to formalize this and run some numerical simulations.
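
(Not the napkin math referred to above; just a minimal toy sketch of what "replicating rotation curves" usually means in practice, with a standard MOND-like interpolating function standing in for the proposed complexity-gradient term. M and a0 are illustrative assumptions.)

  # Toy rotation curves: Newtonian point mass vs. a MOND-like modification.
  import math

  G  = 6.674e-11   # m^3 kg^-1 s^-2
  a0 = 1.2e-10     # m/s^2, assumed acceleration scale
  M  = 1.0e41      # kg, roughly 5e10 solar masses of enclosed matter

  def v_newton(r):
      return math.sqrt(G * M / r)

  def v_modified(r):
      g_n = G * M / r**2                                # Newtonian acceleration
      g   = g_n / 2 + math.sqrt(g_n**2 / 4 + g_n * a0)  # boosted acceleration
      return math.sqrt(g * r)

  kpc = 3.086e19  # metres
  for r_kpc in (2, 5, 10, 20, 40):
      r = r_kpc * kpc
      print(f"{r_kpc:>3} kpc: Newtonian {v_newton(r)/1e3:6.1f} km/s, "
            f"modified {v_modified(r)/1e3:6.1f} km/s")
  # The Newtonian curve keeps falling with radius; the modified one flattens
  # toward roughly 170-200 km/s, the qualitative shape seen in real galaxies.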

Would hope that fitting the free parameters of GR with this complexity density scalar field would yield some testable predictions that differ from current naive assumptions around dark matter and dark energy.

"External computational substrate" is a useful idea if it leads to falsifiable theories. As a "theory of everything" it sucks, because it's clearly not motivated by any specific maths or observations, but by the human need to map nature onto some comprehensible analogue, i.e. taking some simpler subset of nature and pretending the rest of it is like that as well. So far, nature has usually become more incomprehensible the deeper we've looked at it.

Newtonian mechanics and mechanical clocks being the hottest precision technology of the day led scientists at the time to view nature as clockwork. Now that we have computers, we think "nature is like a computer" because it's an appealing analogue.

But it's a false analogue, imo. Just as clocks are a thing enabled by nature (a subset, in every meaning of the word), computers too are a subset of nature. So yes, nature can think (with human brains) and nature can run computations (with CPUs running programs), but that also is just a subset of nature.

Now: games of the mind and helpful analogues rock. And asking "how is nature analogous to a Turing machine?" is interesting for sure. But just because a game is fun or an analogue appealing, one should not forget, in the philosophical sense, that one is playing with only a limited subset of the thing.


There's a Danny Hillis talk on this but I couldn't find it.

As someone who works decently close to, but not in, this area, I am surprised to see this on the front page of HN. The paper authors do not use correct statistical practices (e.g. H_0 cannot be fixed “as a nuisance parameter” to remove a degeneracy with another parameter - nuisance parameters must be marginalized over!) and the authors fail to account for several effects in their model (e.g. stretch/color factors for each supernova must be varied) which are known to be necessary for robust inference of cosmological parameters from supernovae data.
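
For anyone unfamiliar with the distinction being criticized, here is a minimal toy example (a correlated 2D Gaussian likelihood, nothing to do with the paper's actual supernova likelihood) showing why fixing a degenerate nuisance parameter is not the same as marginalizing over it:

  # Toy demo: fixing a degenerate nuisance parameter vs. marginalizing over it.
  # theta is the parameter of interest, h plays the role of something like H_0.
  import numpy as np

  rho = 0.9                          # strong degeneracy between theta and h
  inv_cov = np.linalg.inv(np.array([[1.0, rho], [rho, 1.0]]))

  theta = np.linspace(-4, 4, 401)
  h = np.linspace(-4, 4, 401)
  dtheta = theta[1] - theta[0]
  T, H = np.meshgrid(theta, h, indexing="ij")
  X = np.stack([T, H], axis=-1)
  L = np.exp(-0.5 * np.einsum("...i,ij,...j->...", X, inv_cov, X))

  # (a) Fix h at its best-fit value (h = 0): take a slice of the likelihood.
  slice_h0 = L[:, np.argmin(np.abs(h))]
  p_fixed = slice_h0 / (slice_h0.sum() * dtheta)

  # (b) Marginalize: integrate (sum) the likelihood over h.
  marg = L.sum(axis=1)
  p_marg = marg / (marg.sum() * dtheta)

  def std(p, x):
      dx = x[1] - x[0]
      mean = (p * x).sum() * dx
      return np.sqrt((p * x**2).sum() * dx - mean**2)

  print("sigma(theta), h fixed:       ", std(p_fixed, theta))  # ~0.44
  print("sigma(theta), h marginalized:", std(p_marg, theta))   # ~1.0
  # Fixing the degenerate nuisance parameter makes the constraint look
  # artificially tight; marginalizing recovers the honest uncertainty.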

This is an honest question since I have seen this phenomenon occur a few times now with cosmology/astrophysics papers on HN: How did the original poster find this? And why has it gotten such interest/points? I sincerely hope it is simply a well-intentioned interest in our universe (which it greatly heartens me to see!) combined with naïveté (not meant pejoratively, just to refer to lacking context) wrt the technical nature of this work, but I am interested to hear your thoughts.

It has been circulating on the intertubes. I saw it on a more general interest site earlier today, before seeing it on HN just now.

There's been a general problem in astronomy for a long time: it seems like there just hasn't been enough time for objects to develop.

The oldest version of this I know of can be seen in a diagram of ways that large black holes could possibly form in this book

https://en.wikipedia.org/wiki/Gravitation_(book)

which shows that as early as 1973 people knew they had no idea how supermassive black holes could possibly form. Lately these problems have intensified because Webb seems to show that all sorts of developments happened a lot more quickly than they should have, which leaves one wondering if the first billion years were really the first ten billion years. Could Timescape explain that?

AFAIK one possible explanation for the black hole issues could be primordial black holes, which are also a candidate for at least a component of dark matter.

Yep. There is the idea that you could get little primordial black holes (that maybe weigh as much as a mountain and could be evaporating now) and the idea that you could get huge primordial black holes. Also the occasional strange idea that the universe might be cyclic (not too fashionable but can fill the hole left by inflation) and that black holes can survive the crunch.

Black holes can survive a Big Crunch scenario? That can go a long way to explaining many things. Can you please provide a paper with more references to this, and potentially one with an example mechanism?

This is the opening salvo in cosmology's Battle of Trafalgar. Dave Wiltshire has lined up a set piece 20 years in the making that is going to obliterate Lambda CDM, MOND, and all the rest.

Sounds fascinating. Anywhere I can read more about the build-up to this moment? Has David Wiltshire written about this?

A very compelling argument that the need for dark matter may be an artifact of an incorrect assumption about the universe: the extent to which it is homogeneous and large-scale structures can be ignored in calculations.

Dr Ridden, an author of this paper, has a great explainer video: https://www.youtube.com/watch?v=YhlPDvAdSMw

Typo: Dark Energy*, not Dark Matter

An implication is that you would expect ancient advanced civilizations to form in the voids.

Wouldn't such a civilization slow down as it gathers?

Webb is turning out to be one of the most impactful pieces of scientific apparatus of the last century or so. Not that it took all the relevant data, but that it was the final thing that broke open all the doors being held shut. We're watching a Kuhnian paradigm shift in astronomy unfold in real time.

I’ll be happy to see all the dark matter, dark energy stuff explained away.

We have a century's worth of evidence for dark matter and more than 25 years' worth of evidence for dark energy.

Once an alternative theory stands up to scrutiny, maybe we shouldn't a priori dismiss things we don't understand?

Good Wikipedia article on these types of cosmologies, including timescape cosmology:

https://en.wikipedia.org/wiki/Inhomogeneous_cosmology

Many advancements in science have happened because we stopped for a second and then looked to generalize our assumptions. Consider, e.g.:

Euclidean geometry -> non-Euclidean geometry; Classical analysis -> nonstandard analysis; Linearity -> non-linearity; Homogeneity -> inhomogeneity; Flat spacetime -> curved spacetime; Singular probabilities -> superposition.

All of these were loosening of certain criteria that opened up many possibilities. It is certainly erroneous to assume we must, by necessity, have a homogeneous cosmology.

Is anyone familiar with the (ln B > x) notation being used? What is this value being referenced?

See section 2 of the paper.
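
For what it's worth, in this kind of model comparison ln B usually denotes the natural log of the Bayes factor, i.e. the ratio of the two models' Bayesian evidences (section 2 of the paper gives the exact definition used there). A sketch of the arithmetic with made-up numbers:

  # ln B as the log of the ratio of Bayesian evidences (marginal likelihoods).
  # The values below are placeholders, purely to show the arithmetic.
  ln_Z_timescape = -100.0   # hypothetical log-evidence for model 1
  ln_Z_lcdm      = -103.0   # hypothetical log-evidence for model 2

  ln_B = ln_Z_timescape - ln_Z_lcdm   # ln B = ln(Z1 / Z2)
  print(ln_B)                         # 3.0 -> positive values favour model 1

  # A common rule of thumb (conventions differ between authors):
  #   |ln B| < 1 : barely worth mentioning
  #   1 to 3     : positive evidence
  #   3 to 5     : strong evidence
  #   > 5        : very strong evidence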

I'm surprised cosmology hasn't accounted for differences in clocks, given how central GR is to astronomy. Granted, I am no expert, but apparently adding this dynamic was, until today, a bridge too far, or was thought to average out somehow and not be pertinent.

> cosmology hasn't accounted for differences in clocks given how central GR is to astronomy

Of course it has. Yes, LCDM's FLRW metric, by its defining assumption of spatial homogeneity, doesn't allow the metric (let alone the speed of clocks) to vary spatially. However, it is very common to do perturbation theory on top of the FLRW metric to account for density fluctuations. Besides, there are also models like LTB (Lemaître-Tolman-Bondi) which give up on homogeneity at the non-perturbative level (while still preserving isotropy, though).
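
For concreteness, the textbook linearly perturbed FLRW metric (Newtonian gauge, c = 1) already encodes spatially varying clock rates through the potential Phi; a static observer's proper time follows from it:

  ds^2 = -(1 + 2\Phi)\, dt^2 + a^2(t)\,(1 - 2\Phi)\,\delta_{ij}\, dx^i dx^j,
  \qquad d\tau \approx (1 + \Phi)\, dt

Clocks in potential wells (Phi < 0) tick slightly slower than the volume average. The timescape argument is essentially that such differences, accumulated over cosmic time and treated non-perturbatively, matter far more than this linear treatment suggests.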

All in all, the idea that local voids could explain away the Lambda in LCDM is anything but new. It's just that the OP's timescape approach is the first one that seems to produce promising results. (Disclaimer: I merely skimmed the paper.)